Scaling input parameters #114

Open
dom-sta opened this issue Apr 8, 2024 · 0 comments

Hey there. Today I tried to plot the optimization progress of a mango run using a contour plot, but I ran into an issue. In my data, there are two continuous input parameters with very different scales (see code example below). After fitting the GaussianProcessRegressor, the results were pretty much unusable, as can be seen in the plot (provided below). Only after scaling both features to the [0, 1] interval did the results match my expectations.

Correct me if I'm wrong, but after looking through the codebase I couldn't find any scaling applied to continuous variables. So my question is: are we supposed to scale our parameters before passing them to Tuner via param_dict? If so, this should probably be mentioned in the docs, as the behavior is not very intuitive. I would also like to make a feature request: make the scaling part of mango. The limits needed for scaling are already known from the distributions defined in param_dict.
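Just as a sketch of what built-in scaling could look like (not mango's actual internals): the normalization can be folded into the surrogate model itself with a scikit-learn Pipeline, so callers never need to scale manually. Here MinMaxScaler learns the per-feature limits from the training data; mango could instead fix them from the bounds in param_dict.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Two features on very different scales, as in the issue
X = np.array([[0.52, 1438.1], [0.59, 1220.6], [0.36, 1403.9], [0.43, 840.6]])
y = np.array([0.99, 0.63, 1.00, 1.03])

# The scaler maps each feature to [0, 1] before the GP sees it,
# so the Matern kernel's single length scale is meaningful for both features.
model = make_pipeline(
    MinMaxScaler(),
    GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True, random_state=1),
)
model.fit(X, y)
pred = model.predict(np.array([[0.45, 1000.0]]))
```

With this wrapper, predictions are made in the original parameter space and the scaling is an internal detail of the model.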

Here's some code to test the behavior:

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Generate data
data = [[0.519712, 1438.106604, 0.9890267207690076],
 [0.588307, 1220.578838, 0.6265765135227674],
 [0.55075, 906.374136, 0.9308240637759998],
 [0.355543, 1403.904598, 1.001475488132783],
 [0.430634, 840.591423, 1.0267490857885169],
 [0.477513, 1062.385949, 0.4883283241359153],
 [0.571532, 392.193584, 0.8084230446619398],
 [0.310248, 1012.246997, 0.9610352617817592],
 [0.464245, 248.483425, 0.701506256396162],
 [0.526382, 591.378226, 0.8894154056980089],
 [0.434667, 1302.143909, 0.9839202744250197],
 [0.391176, 488.536072, 0.9349843519807586],
 [0.474439, 1071.135636, 0.5342269053336064],
 [0.548409, 231.415683, 0.9947894009673353],
 [0.47567, 718.013638, 0.5847223173442994],
 [0.435327, 1207.326743, 1.0041979428572427]]

df = pd.DataFrame(data, columns=("varA", "varB", "result"))

# Specify parameter limits
xmin = np.array([0.3, 200])
xmax = np.array([0.6, 1500])

# Define gp
kernel = Matern(nu=2.5)
gp = GaussianProcessRegressor(
    kernel=kernel,
    n_restarts_optimizer=100,
    random_state=1,
    normalize_y=True,
)

def minmax_scale(X, xmin, xmax):
    return (X - xmin) / (xmax - xmin)

X_train = df.iloc[:, :-1].values
X_train = minmax_scale(X=X_train, xmin=xmin, xmax=xmax)  # comment out for comparison
y_train = df.iloc[:, -1].values

gp.fit(X=X_train, y=y_train)


def generate_contour_plot(X_train, gp_model, resolution=100):
    x_min, x_max = X_train[:, 0].min(), X_train[:, 0].max()
    y_min, y_max = X_train[:, 1].min(), X_train[:, 1].max()
    x_range = np.linspace(x_min, x_max, resolution)
    y_range = np.linspace(y_min, y_max, resolution)
    X, Y = np.meshgrid(x_range, y_range)
    xy = np.column_stack([X.ravel(), Y.ravel()])

    Z_mean = gp_model.predict(xy)
    Z_mean = Z_mean.reshape(X.shape)

    fig, ax = plt.subplots(figsize=(10, 6))
    contour = ax.contourf(X, Y, Z_mean, cmap='viridis', alpha=0.8, levels=50)
    fig.colorbar(contour, ax=ax, label='result')
    ax.scatter(X_train[:, 0], X_train[:, 1], c='red', marker='x', label='Training Data')
    ax.set_xlabel("varA")
    ax.set_ylabel("varB")
    ax.set_title('Gaussian Process Contour Plot')
    ax.legend()
    plt.show()

generate_contour_plot(X_train=X_train, gp_model=gp)

The GP response with scaled data:

[image: contour plot, scaled inputs]

The GP response with raw data:

[image: contour plot, raw inputs]
