
Using the jupyterlab services API


This page describes how the Jupyter extension uses the @jupyterlab/services npm module to connect and control Jupyter kernels.

What is it and why do we need it?

The Jupyter org provides the @jupyterlab/services npm module to help clients connect to Jupyter servers and, through the server, talk to Jupyter kernels.

The @jupyterlab/services npm module essentially handles all communication to and from a kernel so that JavaScript code can treat the kernel like a local object. This means the Jupyter extension really only needs to know about IKernelConnection.

(Diagram: the Jupyter extension talking to a kernel through IKernelConnection, with @jupyterlab/services handling the server communication.)
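To give a sense of how little the extension has to care about the transport, here is a minimal sketch (the server URL and token are placeholders) of starting and using a kernel through the npm module:

```typescript
import { KernelManager, ServerConnection } from '@jupyterlab/services';

async function runHello() {
  // Point the library at a running Jupyter server.
  const serverSettings = ServerConnection.makeSettings({
    baseUrl: 'http://localhost:8888',
    token: '<server token>',
  });

  // Start a kernel; the result is the IKernelConnection the extension works against.
  const kernelManager = new KernelManager({ serverSettings });
  const kernel = await kernelManager.startNew({ name: 'python3' });

  // Execute code as if the kernel were a local object; the websocket traffic,
  // message framing and routing are all handled by @jupyterlab/services.
  await kernel.requestExecute({ code: 'print("hello")' }).done;

  await kernel.shutdown();
}
```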

Raw ZMQ

We started with @jupyterlab/services because the first version of the Jupyter extension always talked to a notebook server, either one we started ourselves or one the user provided.

There were frequent issues with that approach, so we switched to a more direct one.

The direct (or "raw", as it's called elsewhere) approach doesn't talk to a notebook server at all. Instead it talks directly to the kernels over ZMQ.
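As a rough sketch of what "raw" means (not the extension's actual implementation): the kernel advertises its ports in a standard Jupyter connection file, and we open ZMQ sockets to those ports directly. Real code also has to sign and frame every message per the Jupyter wire protocol, which is omitted here.

```typescript
import * as zmq from 'zeromq';

// Fields from the standard Jupyter kernel connection file (subset).
interface IKernelConnectionInfo {
  ip: string;
  shell_port: number; // execute_request / execute_reply
  iopub_port: number; // broadcast output: stream, display_data, status, ...
  key: string;        // HMAC key used to sign every message (signing omitted below)
}

function connectRaw(info: IKernelConnectionInfo) {
  // Shell channel: a DEALER socket for request/reply traffic.
  const shell = new zmq.Dealer();
  shell.connect(`tcp://${info.ip}:${info.shell_port}`);

  // IOPub channel: a SUB socket for everything the kernel broadcasts.
  const iopub = new zmq.Subscriber();
  iopub.connect(`tcp://${info.ip}:${info.iopub_port}`);
  iopub.subscribe(); // receive all topics

  return { shell, iopub };
}
```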

This means there is no websocket to connect to. However, since we were using IKernelConnection as the standard way of communicating with kernels, we tricked @jupyterlab/services into thinking it was still talking to a server.

Essentially this part here:

(Sequence diagram excerpt: the websocket connection between @jupyterlab/services and the notebook server.)

The raw approach changes the sequence diagram to look like this:

(Sequence diagram: the Notebook Server replaced by a RawSocket.)

In the diagram, the Notebook Server has now been replaced by a RawSocket. The implementation for this can be found here.
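The general shape of the trick (illustrative names only, not the extension's actual code) is that @jupyterlab/services lets callers supply their own WebSocket class through the server settings, so a websocket-shaped object that really forwards traffic to ZMQ sockets can be dropped in where the library expects a server connection:

```typescript
import { ServerConnection } from '@jupyterlab/services';

// Hypothetical stand-in for the extension's RawSocket: it implements the small
// WebSocket surface @jupyterlab/services uses, but forwards traffic to ZMQ.
class FakeKernelWebSocket {
  onmessage: ((ev: { data: any }) => void) | null = null;
  onopen: ((ev: any) => void) | null = null;
  onclose: ((ev: any) => void) | null = null;
  onerror: ((ev: any) => void) | null = null;

  constructor(public url: string) {
    // Instead of opening a network connection, wire up the ZMQ channels here
    // and then report that the "connection" is open.
    setTimeout(() => this.onopen?.({}), 0);
  }
  send(data: any) {
    // Forward the kernel message onto the appropriate ZMQ channel.
  }
  close() {
    this.onclose?.({});
  }
}

const serverSettings = ServerConnection.makeSettings({
  // Never actually contacted in the raw case; the HTTP side (settings.fetch)
  // has to be intercepted as well, since there is no real server.
  baseUrl: 'http://localhost:8888',
  WebSocket: FakeKernelWebSocket as any,
});
```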

Note: Special services changes

Since the RawSocket is talking directly to the kernel, it needs to serialize/deserialize messages on its own. To facilitate this, we create a copy of the @jupyterlab/services kernel with the serialization functions stubbed out. That happens in an npm postinstall hook.

The raw version uses this modified copy, while the Jupyter server version uses the original.
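A minimal sketch (assumed shape, not the actual post-install script) of what "stubbing out the serialization functions" means: the copied kernel module's serialize and deserialize become pass-throughs, because the RawSocket already produces and consumes fully formed kernel messages.

```typescript
import type { KernelMessage } from '@jupyterlab/services';

// Instead of turning the message into a JSON string for a websocket...
export function serialize(msg: KernelMessage.IMessage): any {
  return msg; // hand the message object straight to the RawSocket
}

// ...and instead of parsing a JSON string coming off a websocket:
export function deserialize(data: any): KernelMessage.IMessage {
  return data as KernelMessage.IMessage; // the RawSocket already built the message
}
```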

Listening to messages

In this part of the sequence diagram:

(Sequence diagram excerpt: the Jupyter extension calling requestExecute.)

the Jupyter extension calling requestExecute will generate something @jupyterlab/services calls a future.

This kernel future is one of the ways @jupyterlab/services signals messages as they come in for a request.

The other way is something called a signal (which is just like a VS Code event). The kernel has a signal for messages on each of the different channels. A future is a promise plus a set of signals.

In the sequence diagram, the signal for an IOPubMessage (a message on the iopub channel) is handled by listening to the signal named onIOPubMessage.
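The difference between the two is easiest to see in code. A short sketch (the kernel variable is assumed to be an existing Kernel.IKernelConnection from @jupyterlab/services):

```typescript
import type { Kernel, KernelMessage } from '@jupyterlab/services';

function executeAndListen(kernel: Kernel.IKernelConnection) {
  // Signals fire for every message on a channel, no matter which request produced it.
  kernel.iopubMessage.connect((_sender, msg: KernelMessage.IIOPubMessage) => {
    console.log('iopub message', msg.header.msg_type);
  });

  // Futures are scoped to one request: requestExecute returns a future whose
  // callbacks only fire for messages belonging to this particular execution.
  const future = kernel.requestExecute({ code: 'print("hello")' });
  future.onIOPub = msg => console.log('output for this execution', msg.header.msg_type);
  future.onReply = reply => console.log('execute_reply status', reply.content.status);

  // The future is also a promise: done resolves once the execute_reply arrives.
  return future.done;
}
```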

Why is this important?

Futures and signals are how messages from the kernel are handled. That in itself is just an implementation detail, but it's important because of ipywidgets.

In the sequence diagram below, we needed to make sure the futures used by widgets are set up correctly on the UI side:

(Sequence diagram: ipywidgets message flow between the extension and the UI side.)

That means understanding what drives a future and what messages we needed to keep in sync so that the future would behave correctly in the widgets. Most widgets also use the @jupyterlab/services npm module, and so we had to make sure the futures on the UI side behaved like the ones on the extension side.
