Glossary

Hyperparameters

What are Hyperparameters?

A hyperparameter is a parameter whose value is set before the machine learning process begins. In contrast, the values of other parameters are derived via training. Algorithm hyperparameters affect the speed and quality of the learning process.
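A minimal sketch of this distinction, using scikit-learn (a library not referenced in this glossary, chosen only for illustration): the hyperparameters are fixed before fitting, while the model's coefficients are derived via training.

```python
# Sketch: hyperparameters are set before training; parameters are learned from data.
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Hyperparameters: chosen by the practitioner before the learning process begins.
model = SGDClassifier(alpha=1e-4, learning_rate="constant", eta0=0.01, max_iter=1000)

# Parameters: the coefficients and intercept are derived via training.
model.fit(X, y)
print(model.coef_, model.intercept_)
```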


Why are Hyperparameters Important?

Hyperparameters are important because they directly control the behavior of the training algorithm and have a significant impact on the performance of the trained model. Choosing appropriate hyperparameters plays a critical role in the success of a neural network architecture and strongly shapes the learned model. For example, if the learning rate is too low, training will be slow and the model may miss important patterns in the data; if it is too high, the updates may overshoot and training may fail to converge.
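As a minimal sketch of this learning-rate trade-off (a generic gradient-descent example, not tied to any particular framework), consider minimizing f(w) = w², whose minimum is at w = 0:

```python
# Plain gradient descent on f(w) = w^2 to illustrate the learning-rate trade-off.
def gradient_descent(lr, steps=25, w=5.0):
    for _ in range(steps):
        grad = 2 * w       # derivative of w^2
        w = w - lr * grad  # gradient descent update
    return w

print(gradient_descent(lr=0.01))  # too low: still far from 0 after 25 steps
print(gradient_descent(lr=0.1))   # reasonable: converges close to 0
print(gradient_descent(lr=1.5))   # too high: the updates diverge
```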


C3 AI and Hyperparameters

With hyperparameter optimization on the C3 AI® Platform, hyperparameter searches are parallelized across MLPipelines. C3 AI executes these searches with extensive compute and memory via worker nodes in the C3 AI environment. The C3 AI Platform provides manual early-stop options (the user can stop the iterative process if initial results are not promising) and auto early-stop options, in which a user-defined "good enough" performance target halts further iterations. C3 AI Platform Hyperparameter Optimization also offers model persistence options such as "keep all trained" or "keep best trained," along with custom validation options for hold-outs and non-time-series k-folds. Finally, users can view results during and after a search, organized by hyperparameter combination.

In the absence of the C3 AI Platform, hyperparameter optimizations are usually run in a Jupyter notebook, where the workstation's available resources may constrain the size of the training dataset and limit concurrent training and testing in long-running experiments. The resulting model and "optimal" parameters still need to be deployed to production, monitored, and managed.
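For readers without access to the platform, the following is a minimal, generic sketch of hyperparameter optimization with non-time-series k-fold validation and a "keep best trained" outcome; it uses scikit-learn's GridSearchCV, not C3 AI Platform APIs, and the dataset and grid values are illustrative assumptions.

```python
# Generic hyperparameter search: evaluate each combination with k-fold
# cross-validation and keep the best-trained model.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

param_grid = {                      # hyperparameter combinations to search
    "n_estimators": [50, 100],
    "max_depth": [3, 5, None],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,        # non-time-series k-fold validation (k = 5)
    n_jobs=-1,   # parallelize the search across available CPU workers
    refit=True,  # "keep best trained": refit on all data with the winning combination
)
search.fit(X, y)

print(search.best_params_)  # the winning hyperparameter combination
print(search.best_score_)   # its mean cross-validated score
```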