The Wayback Machine - https://web.archive.org/web/20210602011013/https://github.com/topics/hyperopt

hyperopt

Here are 64 public repositories matching this topic...

Hyperparameter optimization, or tuning, is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is set before training to control the learning process; by contrast, the values of other parameters (typically node weights) are learned from the data. Hyperparameters are crucial because they control the overall behavior of a machine learning model, and the goal is to find the combination that minimizes a predefined loss function.
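As a minimal, library-agnostic sketch of the idea above (the objective function and search ranges here are made up for illustration), hyperparameter tuning can be framed as minimizing a loss over sampled configurations; random search is the simplest baseline:

```python
import random

def train_and_score(learning_rate, num_layers):
    """Stand-in for model training; returns a made-up validation loss.
    A real objective would fit a model and score it on held-out data."""
    return (learning_rate - 0.1) ** 2 + abs(num_layers - 3)

def random_search(n_trials=200, seed=0):
    """Sample hyperparameter combinations at random and keep the best one."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {
            "learning_rate": rng.uniform(0.001, 1.0),
            "num_layers": rng.randint(1, 8),
        }
        loss = train_and_score(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best_params, best_loss = random_search()
print(best_params, best_loss)
```

Libraries such as hyperopt replace the random sampling step with a model-based strategy (e.g. Tree-structured Parzen Estimators) that concentrates trials in promising regions of the search space.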
Home-Credit-Default-Risk-Recognition

The project provides a complete end-to-end workflow for building a binary classifier in Python that recognizes the risk of housing-loan default. It covers automated feature engineering across related relational tables, comparison of different classifiers on imbalanced data, and hyperparameter tuning with Bayesian optimization.
  • Updated Jul 1, 2020
  • Jupyter Notebook
