More simplifications (and duplicate section name fix)
Uri Granta committed Jul 18, 2023
1 parent 465c432 commit f18e637
Showing 6 changed files with 6 additions and 6 deletions.
2 changes: 1 addition & 1 deletion docs/notebooks/asynchronous_greedy_multiprocessing.pct.py
@@ -1,5 +1,5 @@
 # %% [markdown]
-# # Asynchronous Bayesian optimization with Trieste
+# # Asynchronous Bayesian Optimization
 #
 # In this notebook we demonstrate Trieste's ability to perform asynchronous Bayesian optimisation, which is suitable for scenarios where the objective function can be run for several points in parallel but where observations may return at different times. To avoid wasting resources waiting for the evaluation of the whole batch, we immediately request the next point asynchronously, taking into account points that are still being evaluated. Besides saving resources, an asynchronous approach can also potentially [improve sample efficiency](https://arxiv.org/abs/1901.10452) in comparison with synchronous batch strategies, although this is highly dependent on the use case.
 #
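The asynchronous pattern the paragraph above describes (report each observation as soon as it arrives, then immediately request a replacement point) can be sketched as follows. This is a minimal illustration, not Trieste's actual API: `optimizer` here is a hypothetical ask/tell object exposing single-point `ask()` and `tell()` methods, and `objective` is a toy stand-in for an expensive black-box function. A real asynchronous acquisition rule would also take the still-pending points into account when choosing the next one; the toy random-search optimizer below ignores them.

```python
import random
import time
from multiprocessing import Pool


def objective(x):
    # stand-in for an expensive black-box evaluation
    return (x - 0.5) ** 2


def async_optimize(optimizer, num_workers=4, budget=20):
    """Asynchronous loop: whenever any worker finishes, tell the optimizer
    its observation and immediately ask for a fresh point, without waiting
    for the rest of the batch."""
    with Pool(num_workers) as pool:
        # fill every worker with an initial point
        pending = []
        for _ in range(num_workers):
            x = optimizer.ask()
            pending.append((x, pool.apply_async(objective, (x,))))
        completed = 0
        while completed < budget:
            for i, (x, result) in enumerate(pending):
                if result.ready():
                    optimizer.tell(x, result.get())  # report the observation
                    completed += 1
                    x_new = optimizer.ask()          # request a replacement
                    pending[i] = (x_new, pool.apply_async(objective, (x_new,)))
            time.sleep(0.1)  # avoid busy-waiting


class RandomSearch:
    """Trivial ask/tell 'optimizer', used only to make the sketch runnable."""

    def __init__(self):
        self.best = (None, float("inf"))

    def ask(self):
        return random.random()

    def tell(self, x, y):
        if y < self.best[1]:
            self.best = (x, y)


if __name__ == "__main__":
    rs = RandomSearch()
    async_optimize(rs, num_workers=4, budget=20)
    print("best point found:", rs.best)
```

In the actual notebook the same control flow is driven by a Bayesian optimizer with a Local Penalization acquisition function, but the shape of the loop is identical: no worker ever sits idle waiting for the rest of the batch.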
2 changes: 1 addition & 1 deletion docs/notebooks/asynchronous_nongreedy_batch_ray.pct.py
@@ -1,5 +1,5 @@
 # %% [markdown]
-# # Asynchronous batch Bayesian optimization
+# # Asynchronous batch Bayesian Optimization
 #
 # As shown in the [Asynchronous Bayesian Optimization](asynchronous_greedy_multiprocessing.ipynb) tutorial, Trieste provides support for running observations asynchronously. In that tutorial we used a greedy batch acquisition function called Local Penalization and requested one new point whenever an observation was received. We also used the Python multiprocessing module to run distributed observations in parallel.
 #
2 changes: 1 addition & 1 deletion docs/notebooks/data_transformation.pct.py
@@ -1,6 +1,6 @@
 # -*- coding: utf-8 -*-
 # %% [markdown]
-# # Data transformation with the help of Ask-Tell interface.
+# # Data transformation

 # %%
 import os
2 changes: 1 addition & 1 deletion docs/notebooks/deep_ensembles.pct.py
@@ -25,7 +25,7 @@


 # %% [markdown]
-# ## Deep ensembles
+# ## What are deep ensembles?
 #
 # Deep neural networks typically output only mean predictions, not posterior distributions as probabilistic models such as Gaussian processes do. Posterior distributions encode mean predictions but also *epistemic* uncertainty: the type of uncertainty that stems from model misspecification, and which can be eliminated with further data. Aleatoric uncertainty, which stems from the stochasticity of the data-generating process, is not contained in the posterior but can be learned from the data. Bayesian optimization requires probabilistic models because epistemic uncertainty plays a key role in balancing exploration and exploitation.
 #
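To make the distinction above concrete, here is a small sketch of the core deep ensemble idea: train several independently initialized networks on the same data and use the spread of their predictions as an estimate of epistemic uncertainty. This uses plain Keras with an illustrative architecture and toy data rather than Trieste's own deep ensemble wrappers, and it captures only the between-member disagreement; the full method also has each member predict a variance to model aleatoric noise.

```python
import numpy as np
import tensorflow as tf


def make_member():
    # small, illustrative architecture; not Trieste's default
    return tf.keras.Sequential([
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])


# toy 1D regression data
rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, size=(100, 1))
y_train = np.sin(3.0 * x_train) + 0.1 * rng.normal(size=(100, 1))

# train five independently initialized members on the same data
ensemble = []
for _ in range(5):
    member = make_member()
    member.compile(optimizer="adam", loss="mse")
    member.fit(x_train, y_train, epochs=200, verbose=0)
    ensemble.append(member)

# the predictive mean is the ensemble average; the spread across members
# estimates epistemic uncertainty (it shrinks where the training data pins
# all members down, and grows away from it)
x_test = np.linspace(-1.5, 1.5, 61).reshape(-1, 1)
predictions = np.stack([m.predict(x_test, verbose=0) for m in ensemble])
mean = predictions.mean(axis=0)
epistemic_std = predictions.std(axis=0)
```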
2 changes: 1 addition & 1 deletion docs/notebooks/recovering_from_errors.pct.py
@@ -1,5 +1,5 @@
 # %% [markdown]
-# # Recovering from errors during optimization
+# # Recovering from errors

 # %%
 import numpy as np
2 changes: 1 addition & 1 deletion docs/notebooks/visualizing_with_tensorboard.pct.py
@@ -1,5 +1,5 @@
 # %% [markdown]
-# # Tracking and visualizing optimizations using Tensorboard
+# # Visualizing with Tensorboard

 # %%
 import numpy as np
