Hyperparameter tuning is the process of optimizing hyperparameter values to maximize the predictive accuracy of a model.
Katib is a Kubernetes-native project for automated machine learning (AutoML). Katib supports hyperparameter tuning, early stopping, and neural architecture search (NAS).
If you don’t use Katib or a similar system for hyperparameter tuning, you need to run many training jobs yourself, manually adjusting the hyperparameters to find optimal values.
Kale provides both an SDK for using Katib and a user interface (UI) for Katib as part of the Kale JupyterLab extension.
Katib runs several training jobs (known as trials) within each hyperparameter tuning job (experiment). Each trial evaluates a single configuration of hyperparameter values, so the trials together explore the search space. At the end of the experiment, Katib outputs the best hyperparameter values it found.
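The experiment/trial structure above can be illustrated with a plain-Python sketch. This is not Katib's API; it is a minimal random-search analogy in which `run_trial` stands in for a training container reporting its metric, and `run_experiment` plays the role of the experiment that launches trials and returns the best configuration. All names here are illustrative.

```python
import random

def run_trial(learning_rate, num_layers):
    """Stand-in for one training job: returns a 'validation loss'
    for a single hyperparameter configuration. In Katib this would
    be a real training run whose metric is collected."""
    return (learning_rate - 0.01) ** 2 + 0.001 * abs(num_layers - 4)

def run_experiment(max_trials=12, seed=0):
    """Stand-in for an experiment: each trial evaluates one
    configuration; the experiment returns the best one found."""
    rng = random.Random(seed)
    trials = []
    for _ in range(max_trials):
        config = {
            "learning_rate": rng.uniform(0.001, 0.1),  # search range (illustrative)
            "num_layers": rng.randint(1, 8),           # search range (illustrative)
        }
        trials.append((run_trial(**config), config))
    # The "optimized" hyperparameters are those of the best trial.
    return min(trials, key=lambda t: t[0])

if __name__ == "__main__":
    best_loss, best_config = run_experiment()
    print(f"best loss {best_loss:.5f} with {best_config}")
```

In real Katib usage you declare the same pieces, objective metric, search algorithm, parameter ranges, and a trial template, in an Experiment resource, and Katib schedules the trials on Kubernetes for you.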