# JupyterLab

## Long-running notebooks

If running notebooks via Jupyter, we recommend submitting them for computation with [papermill](https://papermill.readthedocs.io/en/latest/) and specifying explicit log files when executing from the Jupyter terminal. This way you can disconnect from the Jupyter application while the notebook execution continues, and still monitor run progress through the log files.

```
papermill --stdout-file /files/my_job.out --stderr-file /files/my_job.err NOTEBOOK_PATH [OUTPUT_PATH]
```

{% hint style="info" %}
Note that your Jupyter notebook will only receive cell output updates while it is kept open in the browser. If you reopen a notebook that is still computing in the background, you won't receive further cell output updates. This is standard Jupyter behavior, unrelated to Nuvolos, and one of the reasons tools like papermill make sense for long-running notebooks.
{% endhint %}
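If you prefer to drive submissions from Python, for example to loop over several notebooks, the same command line can be assembled programmatically. A minimal standard-library sketch; the notebook and log paths are illustrative, and the resulting list can be handed to `subprocess.Popen`:

```python
import shlex

def papermill_cmd(notebook, output, stdout_log, stderr_log):
    """Build the papermill command line shown above as an argument list."""
    return [
        "papermill",
        "--stdout-file", stdout_log,
        "--stderr-file", stderr_log,
        notebook, output,
    ]

cmd = papermill_cmd("/files/analysis.ipynb", "/files/analysis_out.ipynb",
                    "/files/my_job.out", "/files/my_job.err")
# print a copy-pasteable shell command, or pass `cmd` to subprocess.Popen(...)
print(shlex.join(cmd))
```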

## Adding a new launcher <a href="#adding-a-new-launcher" id="adding-a-new-launcher"></a>

In some cases it is useful to have multiple conda environments inside a single JupyterLab application and to launch notebooks from the JupyterLab launcher with kernels that run in those environments. We recommend always installing the kernel specification for a new conda environment into the base conda environment (not the user or system prefix), so that the kernel and its launcher keep working after the application is distributed. Our examples below follow this convention. If you don't want to share the application, you can also follow instructions from other sources, which typically install the kernel specification into the user home directory. The following can be done from a JupyterLab terminal; shortly afterwards a new launcher should appear.

#### Python <a href="#python" id="python"></a>

In this case we recommend creating a new conda environment and installing a launcher **into the environment** as follows:

`conda create --name my_new_env`

`conda activate my_new_env`

`conda install ipykernel`

`ipython kernel install --prefix=/opt/conda --name my_new_env --display-name "My New Env"`
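For reference, the `ipython kernel install` command writes a kernelspec directory containing a `kernel.json` under the given prefix. A sketch of what such a file looks like, built here with the standard library into a temporary directory; the environment name, display name, and interpreter path are illustrative:

```python
import json
import os
import tempfile

# illustrative kernelspec, as written under /opt/conda/share/jupyter/kernels/<name>/
spec = {
    "argv": [
        "/opt/conda/envs/my_new_env/bin/python",   # the environment's own interpreter
        "-m", "ipykernel_launcher",
        "-f", "{connection_file}",
    ],
    "display_name": "My New Env",
    "language": "python",
}

kernel_dir = os.path.join(tempfile.mkdtemp(), "my_new_env")
os.makedirs(kernel_dir)
with open(os.path.join(kernel_dir, "kernel.json"), "w") as f:
    json.dump(spec, f, indent=2)
```

The `argv` entry is what ties the launcher to the environment: it points Jupyter at the environment's Python rather than the base interpreter.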

#### R <a href="#r" id="r"></a>

In this case we recommend creating a new conda environment and installing a launcher **into the environment** as follows:

`conda create --name my_new_env`

`conda activate my_new_env`

`conda install r-recommended r-irkernel`

`R -e 'IRkernel::installspec(prefix="/opt/conda")'`

#### Julia <a href="#julia" id="julia"></a>

Once you have a working Julia installation, execute the following commands in the Julia REPL:

```
using Pkg
Pkg.add("IJulia")           # Install IJulia package if not already installed
using IJulia
installkernel()             # Installs the default Julia kernel spec for Jupyter
```

## Accessing a local webserver in the browser

Some Python tools run as local web servers and need to be opened in a browser. In Nuvolos, these services must be exposed through [Jupyter Server Proxy](https://jupyter-server-proxy.readthedocs.io/en/latest/index.html) rather than by connecting directly to a port from your local browser. The example below uses [Tensorboard](https://www.tensorflow.org/tensorboard).

1. Use a JupyterLab application with version later than 3.0.0.
2. Install TensorBoard:

   ```
   pip install tensorboard
   ```
3. Install Jupyter Server Proxy if it is not already available:

   ```
   pip install jupyter-server-proxy
   ```
4. Create `/opt/conda/etc/jupyter/jupyter_server_config.py` with:

   ```python
   c.ServerProxy.servers = {
     'tensorboard': {
       'command': ['tensorboard', '--logdir', '/files/tensorboard_logdir', '--port', '{port}'],
       'timeout': 120
     }
   }
   ```
5. Restart the Nuvolos application. After restart, a new TensorBoard launcher appears in the Notebook section of the JupyterLab launcher.
6. Run your TensorFlow job and note the output directory, for example `/tmp/my_fit_1`.
7. Create a symbolic link so TensorBoard reads from that run directory:

   ```
   ln -s /tmp/my_fit_1 /files/tensorboard_logdir
   ```

   If `/files/tensorboard_logdir` already exists, remove it first with `rm /files/tensorboard_logdir`.
8. Open the TensorBoard launcher in JupyterLab. TensorBoard opens using the linked log directory. To inspect a different run, update the symlink and launch it again.
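Steps 6–8 boil down to repointing a single symbolic link. A minimal standard-library sketch of that swap, using temporary stand-in paths instead of `/tmp` and `/files`:

```python
import os
import tempfile

base = tempfile.mkdtemp()
run1 = os.path.join(base, "my_fit_1"); os.mkdir(run1)   # stand-in for /tmp/my_fit_1
run2 = os.path.join(base, "my_fit_2"); os.mkdir(run2)   # a second training run
logdir = os.path.join(base, "tensorboard_logdir")       # stand-in for /files/tensorboard_logdir

# equivalent of: ln -s /tmp/my_fit_1 /files/tensorboard_logdir
os.symlink(run1, logdir)

# to inspect a different run, remove the old link first, then relink
os.remove(logdir)
os.symlink(run2, logdir)
print(os.path.realpath(logdir))   # now resolves to the second run directory
```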

{% hint style="warning" %}
For best performance, write TensorBoard event files to `/tmp`, which is the fastest local storage available to the application. Files in `/tmp` are not preserved across restarts, so move any data you need to persistent storage afterward. If the server starts slowly, increase the configured `timeout` value or refresh the page until the service becomes available.
{% endhint %}


## Creating a Plotly Dash application from a notebook

Make sure you have the following packages installed (we suggest installing them via `conda` from the `conda-forge` channel):

* `plotly`
* `dash`

Once these are installed, install the JupyterDash extension:

```
pip install jupyter-dash
```

After this you need to make sure that your dash application has the following logic in it:

```python
from jupyter_dash import JupyterDash
# ... your imports

# the next line is key for the application to be routed to JupyterLab properly
JupyterDash.infer_jupyter_proxy_config()

# ... your code


# now it's time to create the app object
# normally you would create it via app = Dash(__name__);
# here we instantiate JupyterDash, which already carries the correct reverse proxy configuration
# JupyterDash accepts the same arguments you would normally pass to Dash
app = JupyterDash(__name__)

# ... your code

# run the app - the jupyterlab mode opens the app in a new tab
app.run_server(mode="jupyterlab")
```

{% hint style="info" %}
Note that this procedure relies on the dash application being run in the context of a notebook.
{% endhint %}

## Real-time kernel resource usage monitoring

JupyterLab supports real-time resource monitoring through the `jupyter-resource-usage` [extension](https://github.com/jupyter-server/jupyter-resource-usage). Install it with:

```
pip install jupyter-resource-usage
```

Restart the Nuvolos application after installation. A new metering icon appears in the right sidebar. When a notebook tab is active, the panel shows CPU and RAM usage for the attached kernel, together with host-level CPU and RAM utilization.

{% hint style="info" %}
This extension requires IPyKernel 6.10.0 or later, so it may not work in older JupyterLab versions.
{% endhint %}

## Matplotlib plots with LaTeX

To [render](https://matplotlib.org/stable/users/explain/text/usetex.html) `matplotlib` labels and other text with LaTeX, install a LaTeX environment and the required packages:

1. [Install TinyTex](https://docs.nuvolos.com/features/applications/install-a-software-package#install-tinytex).
2. Install the TeX packages required by matplotlib:\
   `tlmgr install type1cm cm-super underscore dvipng`
3. [Configure your notebook's environment](https://docs.nuvolos.com/features/applications/install-a-software-package#tinytex-in-notebooks) as described in the linked setup instructions.
4. Run the notebook cell again with `usetex=True`.
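As a minimal sketch of step 4 (assuming `matplotlib` is installed in the environment), LaTeX rendering is switched on via the `text.usetex` rcParam. The data and labels below are illustrative; note that LaTeX itself is only invoked when the figure is drawn or saved, so the TeX packages above must be in place by then:

```python
import matplotlib
matplotlib.use("Agg")            # headless backend, suitable for a server
import matplotlib.pyplot as plt

# route all text rendering through LaTeX
plt.rcParams.update({"text.usetex": True, "font.family": "serif"})

fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])
ax.set_xlabel(r"time $t$")
ax.set_ylabel(r"$x(t) = t^2$")
# fig.savefig("plot.png")  # uncomment to render; this is the step that calls LaTeX
```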
