Add BO notebook #335

Merged
4 changes: 3 additions & 1 deletion .github/workflows/build_docs.yml
@@ -53,7 +53,9 @@ jobs:
installer-parallel: true

- name: Install LaTex
- run: sudo apt-get install texlive-fonts-recommended texlive-fonts-extra texlive-latex-extra dvipng cm-super
+ run: |
+     sudo apt-get update
+     sudo apt-get install texlive-fonts-recommended texlive-fonts-extra texlive-latex-extra dvipng cm-super

- name: Build the documentation with MKDocs
run: |
4 changes: 3 additions & 1 deletion .github/workflows/test_docs.yml
@@ -43,7 +43,9 @@ jobs:
npm install katex

- name: Install LaTex
- run: sudo apt-get install texlive-fonts-recommended texlive-fonts-extra texlive-latex-extra dvipng cm-super
+ run: |
+     sudo apt-get update
+     sudo apt-get install texlive-fonts-recommended texlive-fonts-extra texlive-latex-extra dvipng cm-super

# Install Poetry and build the documentation
- name: Install and configure Poetry
5 changes: 3 additions & 2 deletions README.md
@@ -54,16 +54,17 @@ process modelling.
> - [**Stochastic Variational Inference**](https://docs.jaxgaussianprocesses.com/examples/uncollapsed_vi/)
> - [**BlackJax Integration**](https://docs.jaxgaussianprocesses.com/examples/classification/#mcmc-inference)
> - [**Laplace Approximation**](https://docs.jaxgaussianprocesses.com/examples/classification/#laplace-approximation)
- > - [**Inference on Non-Euclidean Spaces**](https://docs.jaxgaussianprocesses.com/examples/kernels/#custom-kernel)
+ > - [**Inference on Non-Euclidean Spaces**](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/#custom-kernel)
> - [**Inference on Graphs**](https://docs.jaxgaussianprocesses.com/examples/graph_kernels/)
> - [**Pathwise Sampling**](https://docs.jaxgaussianprocesses.com/examples/spatial/)
> - [**Learning Gaussian Process Barycentres**](https://docs.jaxgaussianprocesses.com/examples/barycentres/)
> - [**Deep Kernel Regression**](https://docs.jaxgaussianprocesses.com/examples/deep_kernels/)
> - [**Poisson Regression**](https://docs.jaxgaussianprocesses.com/examples/poisson/)
+ > - [**Bayesian Optimisation**](https://docs.jaxgaussianprocesses.com/examples/bayesian_optimisation/)

## Guides for customisation
>
- > - [**Custom kernels**](https://docs.jaxgaussianprocesses.com/examples/kernels/#custom-kernel)
+ > - [**Custom kernels**](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/#custom-kernel)
> - [**UCI regression**](https://docs.jaxgaussianprocesses.com/examples/yacht/)

## Conversion between `.ipynb` and `.py`
2 changes: 1 addition & 1 deletion docs/examples/barycentres.py
@@ -124,7 +124,7 @@
# optimised. See the
# [Regression notebook](https://docs.jaxgaussianprocesses.com/examples/regression/)
# for advice on optimisation and the
- # [Kernels notebook](https://docs.jaxgaussianprocesses.com/examples/kernels/) for
+ # [Kernels notebook](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/) for
# advice on selecting an appropriate kernel.


744 changes: 744 additions & 0 deletions docs/examples/bayesian_optimisation.py

Large diffs are not rendered by default.
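The 744-line notebook itself isn't rendered in the diff view. For orientation only, the core Bayesian optimisation loop follows a standard pattern: fit a GP surrogate to the observations, maximise an acquisition function such as expected improvement, evaluate the objective at the chosen point, and repeat. A self-contained NumPy/SciPy sketch of that pattern on the Forrester benchmark — an illustrative stand-in, not the notebook's actual GPJax code; the kernel settings `ls`, `amp`, and `noise` are arbitrary choices here:

```python
import numpy as np
from scipy.stats import norm

def forrester(x):
    # Standard 1D benchmark; global minimum of roughly -6.02 near x = 0.757.
    return (6.0 * x - 2.0) ** 2 * np.sin(12.0 * x - 4.0)

def rbf(a, b, ls=0.1, amp=25.0):
    # Squared-exponential kernel matrix between 1D input vectors a and b.
    return amp * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-4):
    # Exact GP regression: posterior mean and standard deviation at the query points.
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    L = np.linalg.cholesky(K)
    Ks = rbf(x_query, x_obs)
    mu = Ks @ np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    v = np.linalg.solve(L, Ks.T)
    var = rbf(x_query, x_query).diagonal() - (v**2).sum(axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    # EI for minimisation: expected amount by which a point beats the incumbent.
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
x_obs = rng.uniform(0.0, 1.0, 4)      # initial design
y_obs = forrester(x_obs)
grid = np.linspace(0.0, 1.0, 500)     # candidate set for the acquisition

for _ in range(10):
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    ei = expected_improvement(mu, sigma, y_obs.min())
    x_next = grid[np.argmax(ei)]      # acquisition maximiser
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, forrester(x_next))

print(f"best observed: f({x_obs[y_obs.argmin()]:.3f}) = {y_obs.min():.3f}")
```

The notebook presumably builds the surrogate and acquisition from GPJax's own GP and kernel modules rather than raw NumPy; see the file itself for the real implementation.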

2 changes: 1 addition & 1 deletion docs/examples/deep_kernels.py
@@ -130,7 +130,7 @@ def __call__(
# activation functions between the layers. The first hidden layer contains 64 units,
# while the second layer contains 32 units. Finally, we'll make the output of our
# network three units wide. The corresponding kernel that we define will then be of
- # [ARD form](https://docs.jaxgaussianprocesses.com/examples/kernels/#active-dimensions)
+ # [ARD form](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/#active-dimensions)
# to allow for different lengthscales in each dimension of the feature space.
# Users may wish to design more intricate network structures for more complex tasks,
# functionality that is well supported in Haiku.
2 changes: 1 addition & 1 deletion docs/examples/graph_kernels.py
@@ -5,7 +5,7 @@
# of a graph using a Gaussian process with a Matérn kernel presented in
# <strong data-cite="borovitskiy2021matern"></strong>. For a general discussion of the
# kernels supported within GPJax, see the
- # [kernels notebook](https://docs.jaxgaussianprocesses.com/examples/kernels).
+ # [kernels notebook](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels).

# %%
# Enable Float64 for more stable matrix inversions.
4 changes: 2 additions & 2 deletions docs/examples/intro_to_gps.py
@@ -447,8 +447,8 @@
# that are admissible under the GP prior. A kernel is a positive-definite
# function with parameters $\boldsymbol{\theta}$ that maps pairs of inputs
# $\mathbf{X}, \mathbf{X}' \in \mathcal{X}$ onto the real line. We dedicate the
- # entirety of the [Kernel Guide
- # notebook](https://docs.jaxgaussianprocesses.com/examples/kernels) to
+ # entirety of the [Introduction to Kernels
+ # notebook](https://docs.jaxgaussianprocesses.com/examples/intro_to_kernels) to
# exploring the different GPs each kernel can yield.
#
# ## Gaussian process regression
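The positive-definiteness property described in the hunk above is straightforward to check numerically: evaluating a valid kernel on any finite set of inputs yields a Gram matrix that is symmetric with nonnegative eigenvalues, which is exactly what makes it usable as a Gaussian covariance. A small sketch using an RBF kernel — illustrative NumPy, not GPJax code; the lengthscale and input sizes are arbitrary:

```python
import numpy as np

def rbf_kernel(X, Xp, lengthscale=1.0, variance=1.0):
    # k(x, x') = sigma^2 * exp(-||x - x'||^2 / (2 * l^2)): maps input pairs to the real line.
    sq_dists = np.sum((X[:, None, :] - Xp[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(42)
X = rng.normal(size=(20, 3))   # 20 inputs in R^3
K = rbf_kernel(X, X)           # Gram matrix K[i, j] = k(x_i, x_j)

# Symmetric with nonnegative eigenvalues => a valid covariance for the GP prior.
eigvals = np.linalg.eigvalsh(K)
print(np.allclose(K, K.T), eigvals.min() > -1e-8)  # → True True
```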
6 changes: 3 additions & 3 deletions docs/examples/intro_to_kernels.py
@@ -212,11 +212,11 @@ def forrester(x: Float[Array, "N"]) -> Float[Array, "N"]:
test_y = forrester(test_x)

# %% [markdown]
- # First we define our model, using the Matérn32 kernel, and construct our posterior *without* optimising the kernel hyperparameters:
+ # First we define our model, using the Matérn52 kernel, and construct our posterior *without* optimising the kernel hyperparameters:

# %%
mean = gpx.mean_functions.Zero()
- kernel = gpx.kernels.Matern32(
+ kernel = gpx.kernels.Matern52(
lengthscale=jnp.array(2.0)
) # Initialise our kernel lengthscale to 2.0
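The hunk above swaps `Matern32` for `Matern52`. For reference, the Matérn-5/2 covariance has the closed form k(r) = σ²(1 + √5·r/ℓ + 5r²/(3ℓ²))·exp(−√5·r/ℓ); unlike the Matérn-3/2 it yields sample paths that are twice differentiable, which often suits smoother objectives. A plain-NumPy sketch of the formula — illustrative only, not GPJax's implementation, with the lengthscale default mirroring the 2.0 initialisation above:

```python
import numpy as np

def matern52(x1, x2, lengthscale=2.0, variance=1.0):
    # k(r) = sigma^2 * (1 + sqrt(5)*r/l + 5*r^2/(3*l^2)) * exp(-sqrt(5)*r/l)
    r = np.abs(x1[:, None] - x2[None, :])   # pairwise distances, 1D inputs
    s = np.sqrt(5.0) * r / lengthscale
    return variance * (1.0 + s + s**2 / 3.0) * np.exp(-s)

x = np.linspace(0.0, 10.0, 5)
K = matern52(x, x)
print(np.allclose(np.diag(K), 1.0))  # → True: k(x, x) = sigma^2 on the diagonal
```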

Expand Down Expand Up @@ -672,7 +672,7 @@ def forrester(x: Float[Array, "N"]) -> Float[Array, "N"]:
#
# - [Gaussian Processes for Machine Learning](http://www.gaussianprocess.org/gpml/chapters/RW.pdf) - Chapter 4 provides a comprehensive overview of kernels, diving deep into some of the technical details and also providing some kernels defined on non-Euclidean spaces such as strings.
# - David Duvenaud's [Kernel Cookbook](https://www.cs.toronto.edu/~duvenaud/cookbook/) is a great resource for learning about kernels, and also provides some information about some of the pitfalls people commonly encounter when using the Matérn family of kernels. His PhD thesis, [Automatic Model Construction with Gaussian Processes](https://www.cs.toronto.edu/~duvenaud/thesis.pdf), also provides some in-depth recipes for how one may incorporate their prior knowledge when constructing kernels.
- # - Finally, please check out our [more advanced kernel guide](https://docs.jaxgaussianprocesses.com/examples/kernels/), which details some more kernels available in GPJax as well as how one may combine kernels together to form more complex kernels.
+ # - Finally, please check out our [more advanced kernel guide](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/), which details some more kernels available in GPJax as well as how one may combine kernels together to form more complex kernels.
#
# ## System Configuration

2 changes: 1 addition & 1 deletion docs/examples/spatial.py
@@ -133,7 +133,7 @@
# alone isn't enough to do a decent job at interpolating this data. Therefore, we can also use elevation and optimize
# the parameters of our kernel such that more relevance should be given to elevation. This is possible by using a
# kernel that has one length-scale parameter per input dimension: an automatic relevance determination (ARD) kernel.
- # See our [kernel notebook](https://docs.jaxgaussianprocesses.com/examples/kernels/) for an introduction to
+ # See our [kernel notebook](https://docs.jaxgaussianprocesses.com/examples/constructing_new_kernels/) for an introduction to
# kernels in GPJax.

# %%
1 change: 1 addition & 0 deletions mkdocs.yml
@@ -28,6 +28,7 @@ nav:
- Sparse GPs: examples/uncollapsed_vi.py
- Stochastic sparse GPs: examples/collapsed_vi.py
- Pathwise Sampling for Spatial Modelling: examples/spatial.py
+ - Bayesian Optimisation: examples/bayesian_optimisation.py
- 📖 Guides for customisation:
- Kernels: examples/constructing_new_kernels.py
- Likelihoods: examples/likelihoods_guide.py