Preferred Networks, Inc. has released Optuna v1.0, the first major version of the open-source hyperparameter optimization framework for machine learning. Projects using the existing beta version can be updated to Optuna v1.0 with minimal changes to the code.
In machine learning and deep learning, hyperparameters (*1), which control the behavior of an algorithm during training, must be carefully tuned to deliver a trained model with good accuracy.
Optuna automates the trial-and-error process of optimizing hyperparameters, finding values that enable the algorithm to perform well. Since its beta release as open-source software (OSS) in December 2018, Optuna has received development support from numerous contributors and gained a number of new features based on feedback from the OSS community as well as from within the company.
Main features of Optuna v1.0 include:
- Efficient hyperparameter tuning with state-of-the-art optimization algorithms
- Support for various machine learning libraries including PyTorch, TensorFlow, Keras, FastAI, scikit-learn, LightGBM, and XGBoost
- Support for parallel execution across multiple computing machines to significantly reduce the optimization time
- Search spaces that can be described with ordinary Python control statements (conditionals and loops)
- Various visualization techniques that allow users to conduct diverse analyses of the optimization results
Optuna has received many contributions from external developers. PFN will continue to quickly incorporate the results of the latest machine learning research into the development of Optuna and work with the OSS community to promote the use of Optuna.