Add BO notebook
Added introduction to Bayesian optimisation notebook.

Also fixed some links which had broken after renaming the old kernels
notebook in a previous PR.
Thomas-Christie committed Jul 14, 2023
1 parent ce40904 commit 5375ae4
Showing 9 changed files with 749 additions and 11 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -54,7 +54,7 @@ process modelling.
> - [**Stochastic Variational Inference**](https://docs.jaxgaussianprocesses.com/examples/uncollapsed_vi/)
> - [**BlackJax Integration**](https://docs.jaxgaussianprocesses.com/examples/classification/#mcmc-inference)
> - [**Laplace Approximation**](https://docs.jaxgaussianprocesses.com/examples/classification/#laplace-approximation)
-> - [**Inference on Non-Euclidean Spaces**](https://docs.jaxgaussianprocesses.com/examples/kernels/#custom-kernel)
+> - [**Inference on Non-Euclidean Spaces**](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/#custom-kernel)
> - [**Inference on Graphs**](https://docs.jaxgaussianprocesses.com/examples/graph_kernels/)
> - [**Pathwise Sampling**](https://docs.jaxgaussianprocesses.com/examples/spatial/)
> - [**Learning Gaussian Process Barycentres**](https://docs.jaxgaussianprocesses.com/examples/barycentres/)
@@ -63,7 +63,7 @@ process modelling.
## Guides for customisation
>
-> - [**Custom kernels**](https://docs.jaxgaussianprocesses.com/examples/kernels/#custom-kernel)
+> - [**Custom kernels**](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/#custom-kernel)
> - [**UCI regression**](https://docs.jaxgaussianprocesses.com/examples/yacht/)
## Conversion between `.ipynb` and `.py`
2 changes: 1 addition & 1 deletion docs/examples/barycentres.py
@@ -124,7 +124,7 @@
# optimised. To achieve this, see the
# [Regression notebook](https://docs.jaxgaussianprocesses.com/examples/regression/)
# for advice on optimisation and the
-# [Kernels notebook](https://docs.jaxgaussianprocesses.com/examples/kernels/) for
+# [Kernels notebook](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/) for
# advice on selecting an appropriate kernel.


737 changes: 737 additions & 0 deletions docs/examples/bayesian_optimisation.py

Large diffs are not rendered by default.
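The new notebook is too large to render inline here. As rough context for what an introduction to Bayesian optimisation typically covers, below is a minimal, self-contained sketch in plain JAX: an exact GP surrogate with an RBF kernel and an expected-improvement acquisition maximised over a candidate grid. It is an illustrative assumption, not an excerpt from `docs/examples/bayesian_optimisation.py`, and it deliberately avoids GPJax-specific APIs.

```python
# Illustrative sketch only -- NOT taken from docs/examples/bayesian_optimisation.py.
import jax.numpy as jnp
from jax.scipy.stats import norm


def rbf(x1, x2, lengthscale=0.2, variance=1.0):
    # Squared-exponential kernel between two sets of 1D points.
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return variance * jnp.exp(-0.5 * sq_dist / lengthscale**2)


def gp_posterior(x_train, y_train, x_test, noise=1e-3):
    # Exact GP regression equations with a zero prior mean.
    K = rbf(x_train, x_train) + noise * jnp.eye(x_train.shape[0])
    K_s = rbf(x_train, x_test)
    K_ss = rbf(x_test, x_test)
    mean = K_s.T @ jnp.linalg.solve(K, y_train)
    cov = K_ss - K_s.T @ jnp.linalg.solve(K, K_s)
    return mean, jnp.sqrt(jnp.clip(jnp.diag(cov), 1e-12))


def expected_improvement(mean, std, best_y):
    # EI for minimisation: expected improvement over the incumbent best observation.
    gamma = (best_y - mean) / std
    return std * (gamma * norm.cdf(gamma) + norm.pdf(gamma))


def objective(x):
    # Toy 1D function to minimise (a stand-in for e.g. the Forrester function).
    return jnp.sin(3.0 * x) + x**2 - 0.7 * x


x_train = jnp.array([0.1, 0.5, 0.9])
y_train = objective(x_train)
candidates = jnp.linspace(-1.0, 2.0, 500)

for _ in range(10):  # BO loop: fit surrogate, maximise acquisition, evaluate, repeat.
    mean, std = gp_posterior(x_train, y_train, candidates)
    ei = expected_improvement(mean, std, y_train.min())
    x_next = candidates[jnp.argmax(ei)]
    x_train = jnp.append(x_train, x_next)
    y_train = jnp.append(y_train, objective(x_next))

print("Best observed minimum:", y_train.min())
```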

2 changes: 1 addition & 1 deletion docs/examples/deep_kernels.py
@@ -130,7 +130,7 @@ def __call__(
# activation functions between the layers. The first hidden layer contains 64 units,
# while the second layer contains 32 units. Finally, we'll make the output of our
# network three units wide. The corresponding kernel that we define will then be of
-# [ARD form](https://docs.jaxgaussianprocesses.com/examples/kernels/#active-dimensions)
+# [ARD form](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/#active-dimensions)
# to allow for different lengthscales in each dimension of the feature space.
# Users may wish to design more intricate network structures for more complex tasks,
# functionality which is well supported in Haiku.
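As a reminder of what the ARD form means in practice, here is a small sketch in plain JAX rather than GPJax's own kernel classes; the helper name `ard_rbf` is made up for illustration. It computes an RBF kernel with a separate lengthscale per input dimension:

```python
import jax.numpy as jnp


def ard_rbf(x, y, lengthscales, variance=1.0):
    # RBF kernel with one lengthscale per input dimension (the "ARD form").
    scaled_diff = (x - y) / lengthscales
    return variance * jnp.exp(-0.5 * jnp.sum(scaled_diff**2))


# Three-dimensional feature space (matching the three-unit network output above),
# with a different lengthscale in each dimension.
value = ard_rbf(
    jnp.array([0.1, 0.2, 0.3]),
    jnp.zeros(3),
    lengthscales=jnp.array([1.0, 0.5, 2.0]),
)
```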
2 changes: 1 addition & 1 deletion docs/examples/graph_kernels.py
@@ -5,7 +5,7 @@
# of a graph using a Gaussian process with a Matérn kernel presented in
# <strong data-cite="borovitskiy2021matern"></strong>. For a general discussion of the
# kernels supported within GPJax, see the
-# [kernels notebook](https://docs.jaxgaussianprocesses.com/examples/kernels).
+# [kernels notebook](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels).

# %%
# Enable Float64 for more stable matrix inversions.
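The hunk cuts off before the code that performs this; the usual way to switch JAX to double precision, which is presumably what the hidden line does (an assumption, since it is not shown in this diff), is:

```python
from jax import config

config.update("jax_enable_x64", True)  # run JAX in float64 for more stable matrix inversions
```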
4 changes: 2 additions & 2 deletions docs/examples/intro_to_gps.py
@@ -447,8 +447,8 @@
# that are admissible under the GP prior. A kernel is a positive-definite
# function with parameters $\boldsymbol{\theta}$ that maps pairs of inputs
# $\mathbf{X}, \mathbf{X}' \in \mathcal{X}$ onto the real line. We dedicate the
-# entirety of the [Kernel Guide
-# notebook](https://docs.jaxgaussianprocesses.com/examples/kernels) to
+# entirety of the [Introduction to Kernels
+# notebook](https://docs.jaxgaussianprocesses.com/examples/intro_to_kernels) to
# exploring the different GPs each kernel can yield.
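For concreteness, one kernel satisfying this definition is the squared-exponential (RBF) kernel, shown here as a reminder; this particular parameterisation is a standard choice rather than anything specific to the surrounding notebook:

$$
k_{\boldsymbol{\theta}}(\mathbf{x}, \mathbf{x}') = \sigma^2 \exp\left(-\frac{\lVert \mathbf{x} - \mathbf{x}' \rVert_2^2}{2\ell^2}\right), \qquad \boldsymbol{\theta} = \{\sigma^2, \ell\}.
$$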
#
# ## Gaussian process regression
6 changes: 3 additions & 3 deletions docs/examples/intro_to_kernels.py
@@ -212,11 +212,11 @@ def forrester(x: Float[Array, "N"]) -> Float[Array, "N"]:
test_y = forrester(test_x)

# %% [markdown]
-# First we define our model, using the Matérn32 kernel, and construct our posterior *without* optimising the kernel hyperparameters:
+# First we define our model, using the Matérn52 kernel, and construct our posterior *without* optimising the kernel hyperparameters:

# %%
mean = gpx.mean_functions.Zero()
-kernel = gpx.kernels.Matern32(
+kernel = gpx.kernels.Matern52(
lengthscale=jnp.array(2.0)
) # Initialise our kernel lengthscale to 2.0
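The hunk is truncated at this point. For orientation, here is a minimal sketch of how a mean function and kernel like the ones above are typically combined into a posterior in GPJax; the exact constructor names and the training dataset `D` are assumptions (they may differ between GPJax versions and are not taken from this diff):

```python
# Sketch only: assumes the mid-2023 GPJax model-building API; names may differ by version.
prior = gpx.gps.Prior(mean_function=mean, kernel=kernel)
likelihood = gpx.likelihoods.Gaussian(num_datapoints=D.n)  # D: a gpx.Dataset of training points (assumed)
posterior = prior * likelihood  # conjugate posterior, usable before any hyperparameter optimisation
```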

@@ -672,7 +672,7 @@ def forrester(x: Float[Array, "N"]) -> Float[Array, "N"]:
#
# - [Gaussian Processes for Machine Learning](http://www.gaussianprocess.org/gpml/chapters/RW.pdf) - Chapter 4 provides a comprehensive overview of kernels, diving deep into some of the technical details and also providing some kernels defined on non-Euclidean spaces such as strings.
# - David Duvenaud's [Kernel Cookbook](https://www.cs.toronto.edu/~duvenaud/cookbook/) is a great resource for learning about kernels, and also provides some information about some of the pitfalls people commonly encounter when using the Matérn family of kernels. His PhD thesis, [Automatic Model Construction with Gaussian Processes](https://www.cs.toronto.edu/~duvenaud/thesis.pdf), also provides some in-depth recipes for how one may incorporate their prior knowledge when constructing kernels.
-# - Finally, please check out our [more advanced kernel guide](https://docs.jaxgaussianprocesses.com/examples/kernels/), which details some more kernels available in GPJax as well as how one may combine kernels together to form more complex kernels.
+# - Finally, please check out our [more advanced kernel guide](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/), which details some more kernels available in GPJax as well as how one may combine kernels together to form more complex kernels.
#
# ## System Configuration

2 changes: 1 addition & 1 deletion docs/examples/spatial.py
@@ -133,7 +133,7 @@
# alone isn't enough to do a decent job at interpolating this data. Therefore, we can also use elevation and optimise
# the parameters of our kernel so that more relevance is given to elevation. This is possible by using a
# kernel that has one length-scale parameter per input dimension: an automatic relevance determination (ARD) kernel.
-# See our [kernel notebook](https://docs.jaxgaussianprocesses.com/examples/kernels/) for an introduction to
+# See our [kernel notebook](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/) for an introduction to
# kernels in GPJax.

# %%
1 change: 1 addition & 0 deletions mkdocs.yml
@@ -28,6 +28,7 @@ nav:
- Sparse GPs: examples/uncollapsed_vi.py
- Stochastic sparse GPs: examples/collapsed_vi.py
- Pathwise Sampling for Spatial Modelling: examples/spatial.py
+- Bayesian Optimisation: examples/bayesian_optimisation.py
- 📖 Guides for customisation:
- Kernels: examples/constructing_new_kernels.py
- Likelihoods: examples/likelihoods_guide.py
