Add BO notebook #335
Conversation
Added a notebook introducing Bayesian optimisation. Also fixed some links that had broken after the old kernels notebook was renamed in a previous PR.
Added a `sudo apt-get update` command before `sudo apt-get install`, as recommended in https://docs.github.com/en/actions/using-github-hosted-runners/customizing-github-hosted-runners, to mitigate package installation failures.
Reviewer:
This is fantastic! Just a couple of small comments.
```python
# %%
def generate_optimised_posterior(
```
Reviewer:
I think I would prefer `return_fitted_model` or something else. We don't really generate a posterior (also, aren't all posteriors optimised?).
Author:
I think I'll rename it to `return_optimised_posterior`. The reason I'm calling the posterior optimised is that kernel hyperparameter optimisation is performed. This is also in line with the other notebooks (e.g. https://docs.jaxgaussianprocesses.com/examples/barycentres/#learning-a-posterior-distribution).
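
For context, a minimal sketch of what the renamed function might look like, assuming the `gpx.fit`/`ConjugateMLL` training API from the GPJax regression docs; the notebook's actual body may differ:

```python
import gpjax as gpx
import optax as ox


def return_optimised_posterior(data, prior, key):
    # Build the posterior and optimise its kernel hyperparameters by
    # minimising the negative marginal log-likelihood on the training data.
    likelihood = gpx.likelihoods.Gaussian(num_datapoints=data.n)
    posterior = prior * likelihood
    negative_mll = gpx.objectives.ConjugateMLL(negative=True)
    opt_posterior, _history = gpx.fit(
        model=posterior,
        objective=negative_mll,
        train_data=data,
        optim=ox.adam(learning_rate=0.01),
        num_iters=1000,
        key=key,
    )
    return opt_posterior
```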
```python
mean = gpx.mean_functions.Zero()
kernel = gpx.kernels.Matern52()
prior = gpx.Prior(mean_function=mean, kernel=kernel)
opt_posterior = generate_optimised_posterior(D, prior, subkey)
```
Reviewer:
Maybe we could just have a whole `fit_model` method that just takes in `D`?
Author:
I had originally planned to do this, but in the second example I end up having to use a slightly different prior (with two active dimensions and a slightly higher variance), so being able to pass a prior felt easier in the end.
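
Roughly, the difference looks like this (the `active_dims` and `variance` values below are illustrative, not the notebook's exact settings):

```python
import jax.numpy as jnp

# First example: 1D objective, default kernel parameters.
kernel_1d = gpx.kernels.Matern52()
prior_1d = gpx.Prior(mean_function=gpx.mean_functions.Zero(), kernel=kernel_1d)

# Second example: 2D objective with a slightly higher prior variance
# (values here are hypothetical).
kernel_2d = gpx.kernels.Matern52(active_dims=[0, 1], variance=jnp.array(2.0))
prior_2d = gpx.Prior(mean_function=gpx.mean_functions.Zero(), kernel=kernel_2d)
```

Passing the prior in lets a single fitting helper serve both cases.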
```python
approx_sample = opt_posterior.sample_approx(
    num_samples=1, train_data=D, key=subkey, num_features=500
)
x_star = optimise_sample(
    approx_sample,
    subkey,
    lower_bound,
    upper_bound,
    num_initial_sample_points=1000,
)
```
Reviewer:
We could even have a `Thompson_Sample` function that takes in a model and just returns a chosen point?
Author:
Potentially; I think I'll leave it as it is, though, as we'd also have to pass various other arguments (such as the bounds of the search space, `num_initial_sample_points`, etc.), which seems a bit excessive considering this functionality is only used twice in the notebook. I also think it's nice to separate drawing the sample and optimising it, as this will be the case once we extend the code. Happy to refactor, though, if you think it would be much better to have a `thompson_sample` function.
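
For reference, a hypothetical `thompson_sample` wrapper along these lines (names and defaults taken from the snippet above) shows how the bounds and sampling arguments would still need to be threaded through:

```python
def thompson_sample(
    opt_posterior,
    D,
    key,
    lower_bound,
    upper_bound,
    num_features=500,
    num_initial_sample_points=1000,
):
    # Draw one approximate sample from the posterior, then return the
    # point that optimises it over the search space.
    approx_sample = opt_posterior.sample_approx(
        num_samples=1, train_data=D, key=key, num_features=num_features
    )
    return optimise_sample(
        approx_sample,
        key,
        lower_bound,
        upper_bound,
        num_initial_sample_points=num_initial_sample_points,
    )
```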
Incorporated feedback into the BO notebook and added a link to it from the project README.
Type of changes

Checklist
- Ran `poetry run pre-commit run --all-files --show-diff-on-failure` before committing.

Description
Added a new notebook that serves as an introduction to Bayesian optimisation. Also fixed a few links in the documentation that were broken when the original kernels notebook was renamed.
Issue Number: N/A