
Optuna Unveiled: Harnessing the Power of Machine Learning Optimization

- Reading Time: 3 minutes

Optuna is an automated search tool for optimising the hyperparameters of your Machine Learning models. By combining different search methods, this library helps you identify the optimal hyperparameters.

As a reminder, hyperparameters are settings that the developer must choose by hand, in contrast to the parameters a model learns from its training data. They have a significant impact on how the model behaves and on its performance.

For example, if you wanted to train a neural network with linear layers only, potential hyperparameters would include:

  • Number of layers;
  • Units per layer;
  • Learning rate;
  • Regularisation strength;
  • Activation function.


Even if you only have two candidate values per hyperparameter, you can quickly end up with an impressive (and time-consuming) number of experiments.
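To make this concrete, with the five hyperparameters listed above and only two candidate values each, an exhaustive sweep already needs 2⁵ = 32 training runs, and a third candidate value per hyperparameter pushes that to 243:

```python
# With the five hyperparameters above and two candidate values each,
# an exhaustive grid already requires 2**5 training runs.
n_hyperparameters = 5
values_per_hyperparameter = 2
n_experiments = values_per_hyperparameter ** n_hyperparameters
print(n_experiments)  # 32
```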

You can do this by hand, evaluating each candidate configuration with techniques such as cross-validation or simple trial and error, or choose Optuna to automate the process.

How does Optuna work?

The principle behind Optuna is that the user provides a space of hyperparameters to be tested. The tool then determines the combination of hyperparameters that optimises the Machine Learning model.

To do this, you need to provide Optuna with a performance metric, which the tool will do its utmost to optimise.

Optuna separates its algorithms into two categories: the sampling strategy and the pruning strategy. The sampling strategy itself covers three distinct methods:

  1. Random search: this selects the hyperparameters randomly in the defined search space. This is useful when you don’t know what to try.
  2. Grid search: this tries all possible combinations in the defined search space. It is useful when you want to try out a specific idea. On the other hand, it is very time-consuming if your search space is too large.
  3. Bayesian optimisation: this starts with random exploration, then refines the hyperparameters according to the results. Bayesian optimisation is often more effective than random search, because it steers each new trial on the basis of previous results.

In addition to the sampling strategy, the “pruning strategy” cuts down on wasted trials by stopping the least promising ones early. This saves you computing time.

Bear in mind that if you don’t specify a strategy, Optuna defaults to the Bayesian TPE sampler for sampling and the median pruner for pruning, so you get guided search and early stopping out of the box.

What are the advantages of Optuna?

Whenever you need to optimise the hyperparameters of your Machine Learning models, Optuna is the best solution for automating the process. This solution has a number of advantages. The first is its ease of use. The search tool integrates easily into your existing machine learning pipelines. All you have to do is define your hyperparameter search space and the loss function.

Optuna then offers various search methods (detailed above), which let you adjust the search to your use case. What’s more, the tool is model-agnostic: clustering, classification or regression, it handles them all.

With easy-to-view results, Optuna gives you a better understanding of how hyperparameters influence the performance of your model. So you have everything you need to determine the optimal hyperparameter configuration for your model. All thanks to a free, open-source project.

How can I optimise my model's hyperparameters with Optuna?

Now that you’re aware of the importance of Optuna and the time you’ll save by using it, find out how to set it up.

  • Start by installing Optuna (‘pip install optuna’, or ‘!pip install optuna’ in a notebook), then import it along with the other modules you need to define your model and loss function.
  • Create a “study” object with the ‘create_study’ function. The hyperparameter search space itself is declared inside the cost function, using the trial’s ‘suggest_float’ and related ‘suggest_*’ methods.
  • Define the cost function: it takes a trial object, samples a hyperparameter configuration from it, and returns a performance score.
  • Start the search for hyperparameters by calling the ‘optimize’ method of the ‘study’ object with your cost function and a number of trials as arguments. Optuna will then search for the best hyperparameters using the sampling strategy you have chosen.
  • Access the results via the ‘best_params’ and ‘best_value’ attributes of the study.
  • You can also use the Optuna visualisation functions ‘plot_intermediate_values’ and ‘plot_optimization_history’ from the ‘optuna.visualization’ module.

Below is an example of how to optimise the hyperparameters of a regularised linear regression model with Optuna. Since ‘alpha’ and ‘l1_ratio’ are ElasticNet parameters, the example uses scikit-learn’s ElasticNet together with its built-in diabetes dataset:

import optuna
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

# Define the cost function; the search space is declared inside it
def objective(trial):
    alpha = trial.suggest_float("alpha", 1e-4, 1.0, log=True)
    l1_ratio = trial.suggest_float("l1_ratio", 0.0, 1.0)
    model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio)
    # cross_val_score returns R² scores; negate so lower is better
    score = -1.0 * cross_val_score(model, X, y, cv=5).mean()
    return score

# Create the "study" object and run the search
study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)

print(study.best_params)
print(study.best_value)


Optuna is an effective automated search tool for optimising the hyperparameters of your Machine Learning models. Its ease of use, flexible choice of optimisation algorithms and integration with existing pipelines make it a must-have.

Now you know more about Optuna. If you want to master this tool to automate your processes by deploying Machine Learning models, find out more about DataScientest’s training courses.
