Hyperparameter Optimization With Hyperopt — Intro & Implementation | by Farzad Mahmoodinobar | Jun, 2023


2.1. Support Vector Machines and Iris Data Set

In a previous post, I used Grid Search, Random Search, and Bayesian Optimization for hyperparameter optimization using the Iris data set provided by scikit-learn. The Iris data set includes petal and sepal measurements for three iris species and is a commonly used data set for classification exercises. In this post we will use the same data set, but this time with a Support Vector Machine (SVM) as the model, which has two hyperparameters we can optimize (a quick baseline sketch follows the list):

  • C: Regularization parameter, which trades off misclassification of training examples against simplicity of the decision surface.
  • gamma: Kernel coefficient, which defines how much influence a single training example has. The larger gamma is, the closer other examples must be to be affected.
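
As a point of reference before optimizing, the sketch below evaluates an SVM with hand-picked values for C and gamma. The values 1.0 and 0.1 are arbitrary choices of mine, not from the original post, and SVC's default RBF kernel is assumed:

# Baseline SVM with hand-picked hyperparameters (hypothetical values)
from sklearn import datasets
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

iris = datasets.load_iris()

# C=1.0 and gamma=0.1 are arbitrary reference values; SVC defaults to an RBF kernel
baseline = SVC(C=1.0, gamma=0.1)
print(cross_val_score(baseline, iris.data, iris.target, cv=5).mean())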

Since the goal of this exercise is to walk through hyperparameter optimization, I will not go deeper into what SVMs do, but if you are interested, I find this scikit-learn post helpful.

We will generally follow the same steps that we used in the simple example earlier but will also visualize the process at the end:

1. Import necessary libraries and packages
2. Define the objective function and the search space
3. Run the optimization process
4. Visualize the optimization

2.1.1. Step 1 — Import Libraries and Packages

Let’s import the libraries and packages and then load the data set.

# Import libraries and packages
from hyperopt import fmin, tpe, hp, Trials
from sklearn import datasets
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Load Iris dataset
iris = datasets.load_iris()
X = iris.data
y = iris.target

2.1.2. Step 2 — Define Objective Function and Search Space

Let’s first define the objective function, which trains an SVM and returns the negative of the mean cross-validation score; that is what we want to minimize. Note that we minimize the negative of the cross-validation score to be consistent with the general goal of “minimizing” the objective function (instead of “maximizing” the cross-validation score).

def objective_function(parameters):
    # Train an SVM with the sampled hyperparameters and return
    # the negated mean 5-fold cross-validation accuracy
    clf = SVC(**parameters)
    score = cross_val_score(clf, X, y, cv=5).mean()
    return -score

Next, we will define the search space, which consists of the ranges of values that our parameters C and gamma can take. Note that we will use Hyperopt’s hp.uniform(label, low, high), which returns a value uniformly distributed between “low” and “high” (source).

# Search space: uniform ranges for C and gamma
search_space = {
    'C': hp.uniform('C', 0.1, 10),
    'gamma': hp.uniform('gamma', 0.01, 1)
}
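
If you want to sanity-check what kind of values the search space produces, Hyperopt can draw random samples from it. This is a quick aside I am adding, not part of the original walkthrough:

from hyperopt.pyll import stochastic

# Draw a few random samples from the search space to inspect the ranges
for _ in range(3):
    print(stochastic.sample(search_space))
# e.g. {'C': 4.28, 'gamma': 0.53} (values vary from run to run)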

2.1.3. Step 3 — Run Optimization

As in the simple example earlier, we will use the Tree-structured Parzen Estimator (TPE) algorithm and store the results in a Trials object.

# Trials object to store the results
trials = Trials()

# Run optimization
best = fmin(
    fn=objective_function,
    space=search_space,
    algo=tpe.suggest,
    trials=trials,
    max_evals=100
)

Results: fmin returns the best values it found for C and gamma as a dictionary, and the full history of the run is stored in the trials object.
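
The snippet below is one way to inspect the outcome (a sketch I am adding, not output from the original post). space_eval maps fmin’s return value back onto the search space, and best_trial exposes the lowest loss observed:

from hyperopt import space_eval

# Best hyperparameter values found, keyed by the labels in the search space
print(best)

# Resolve the result against the search space (handy for spaces with hp.choice)
print(space_eval(search_space, best))

# Lowest loss (negated mean CV accuracy) seen during the search
print(trials.best_trial['result']['loss'])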

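Step 4 of the outline is visualizing the optimization. As a minimal sketch (assuming matplotlib is available; the original post’s plots are not reproduced here), trials.losses() returns the loss of each evaluation in order, which we can plot to see how the search progressed:

import matplotlib.pyplot as plt

# Loss (negated mean CV accuracy) for each evaluation, in run order
losses = trials.losses()

plt.plot(losses, marker='.')
plt.xlabel('Evaluation')
plt.ylabel('Loss (negative CV accuracy)')
plt.title('Hyperopt optimization progress')
plt.show()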