Supercharging AI/ML Development with JupyterLab and Docker
December 8, 2025 · 1444 words · 7 min
JupyterLab is an open source application built around the concept of a computational notebook document. It enables sharing and executing code, data processing, and visualization, and offers a range of interactive features for creating graphs.

The latest version, JupyterLab 4.0, was released in early June. Compared to its predecessors, this version features a faster Web UI, improved editor performance, a new Extension Manager, and real-time collaboration.

If you have already installed the standalone 3.x version, evaluating the new features would require rebuilding your current environment, which can be labor-intensive and risky. However, in environments where Docker operates, such as Docker Desktop, you can start an isolated JupyterLab 4.0 instance in a container without affecting your installed JupyterLab environment, and access it on a different port. In this article, we show how to quickly evaluate the new features of JupyterLab 4.0 using the Jupyter Docker Stacks on Docker Desktop, without affecting the host PC.

Users have downloaded the base image of the Jupyter Notebook stack more than 10 million times from Docker Hub. What’s driving this significant download rate? There’s an ever-increasing demand for Docker containers to streamline development workflows, while allowing JupyterLab developers to innovate with their choice of project-tailored tools, application stacks, and deployment environments. The official Jupyter Notebook stack images also support both AMD64 and Arm64/v8 platforms. Containerizing the JupyterLab environment offers numerous benefits, including consistency, easy sharing and collaboration, and scalability.

To use JupyterLab on your computer, one option is the JupyterLab Desktop application. It’s based on Electron, so it operates with a GUI on Windows, macOS, and Linux. Indeed, using JupyterLab Desktop makes the installation process fairly simple.
In a Windows environment, however, you’ll also need to set up the Python language separately, and, to extend the capabilities, you’ll need to use pip to set up packages. Although such a desktop solution may be simpler than building from scratch, we think the combination of Docker Desktop and the Jupyter Docker Stacks is still the more straightforward option. With JupyterLab Desktop, you cannot mix multiple versions or easily delete them after an evaluation. Above all, it does not provide a consistent user experience across Windows, macOS, and Linux.

On a Windows command prompt, execute the following command to launch a basic notebook. This command utilizes the jupyter/base-notebook Docker image, maps the host’s port 10000 to the container’s port 8888, and enables interactive command input and a pseudo-terminal. Additionally, an option is added to delete the container once the process is completed.

After waiting for the Docker image to download, access and token information will be displayed on the command prompt. Because of the port mapping, rewrite the displayed URL to http://localhost:10000/ and then append the token to the end of this URL. Note that the token is specific to each environment, so copying the one shown in this example will not work for you; replace it with the one actually displayed on your command prompt.

Then, after waiting for a short while, JupyterLab will launch (Figure 1). From here, you can start a Notebook, access Python’s console environment, or utilize other work environments. The port 10000 on the host side is mapped to port 8888 inside the container, as shown in Figure 2. In the input form on the screen, enter the token displayed in the command line or in the container logs (the string following token=), and select Log in, as shown in Figure 3.

By the way, in this environment, the data will be erased when the container is stopped. If you want to reuse your data even after stopping the container, create a volume by adding the -v option when launching the Docker container.
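The exact launch command is not reproduced above; a minimal sketch matching the description in the text (the jupyter/base-notebook image and the 10000-to-8888 port mapping are both stated there) would be:

```shell
# Run a disposable JupyterLab container (requires Docker Desktop or
# another Docker daemon):
#   -it  : interactive input plus a pseudo-terminal
#   --rm : delete the container once the process exits
#   -p   : map the host's port 10000 to the container's port 8888
docker run -it --rm -p 10000:8888 jupyter/base-notebook
```

Because of `--rm`, everything created inside the container is discarded on exit; the `-v` option discussed below is what makes notebook data survive restarts.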
To stop this container environment, press Ctrl+C on the command prompt, then respond to the Jupyter server’s prompt with y and press Enter. If you are using Docker Desktop, stop the target container from the Containers list. Once the display changes as follows, the container is terminated and the data is deleted.

When the container is running, data is saved in the /home/jovyan/work directory inside the container. You can either bind mount this as a volume or allocate it as a volume when starting the container. By doing so, even if you stop the container, you can use the same data again when you restart it.

The \ symbol signifies that the command line continues on the next line of the command prompt. You may also write the command on a single line without using the \ symbol; however, on the Windows command prompt, you need to use the ^ symbol instead. With this setup, when launched, the JupyterLab container mounts the folder where the command was executed to its /home/jovyan/work directory. Because the data persists even when the container is stopped, you can continue using your Notebook data as it is when you start the container again.

In the following example, we’ll use the Iris dataset, which consists of 150 records in total, with 50 samples from each of three types of Iris flowers (Setosa, Versicolor, Virginica). Each record consists of four numerical attributes (sepal length, sepal width, petal length, petal width) and one categorical attribute (type of iris). This dataset is included in the Python library scikit-learn, and we will use matplotlib to plot it.

When trying to input the sample code from the scikit-learn page (the code is at the bottom of the page, and you can copy and paste it) into IPython, an error occurs (Figure 4). This is an error message from IPython stating that the matplotlib module does not exist. Additionally, the scikit-learn module is needed. To avoid these errors and enable plotting, run the following command.
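The command itself was not preserved above; given that the next paragraph explains the % prefix as an IPython construct, it was presumably IPython’s %pip magic, along the lines of:

```
%pip install matplotlib scikit-learn
```

Run this in a notebook or IPython cell; %pip installs packages into the environment of the running kernel.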
Here, the % prefix signifies running the command within the IPython environment. By pasting and executing the earlier sample code in the next cell in IPython, you can plot and display the Iris dataset as shown in Figure 5.

Note that it can be cumbersome to use the pip command to add modules every time. Fortunately, you can also add modules in the following way: if you’re familiar with Dockerfile and building images, this five-step method is easy. Also, this approach can help keep the Docker image size in check. To build a Docker image, the first step is to create and navigate to the directory where you’ll place your Dockerfile and build context. Then create a requirements.txt file and list the Python modules you want to add, one per line. The Dockerfile specifies a base image, copies the requirements.txt file from the local directory into the container, and then runs a command to install the Python packages listed in that file. In this example, the image name is the name of the Docker image you built, so make sure you have the appropriate image available on your system. The operation after startup is the same as before; you don’t need to add libraries with the pip command, because the necessary libraries are included from the start.

Next, we will utilize a Docker image called jupyter/scipy-notebook from the Jupyter Docker Stacks. Please note that the currently running Notebook will be terminated: find the running container with docker ps on the command prompt, then enter docker stop and specify that container. Then, enter the following to run a new container. This command will run a container using the jupyter/scipy-notebook image, which provides a Jupyter Notebook environment with additional scientific libraries. The previous image was a minimal Notebook environment; the image we are using this time includes many packages used in the scientific field, such as NumPy and pandas, so it may take some time to download the Docker image. This one is close to 4GB in image size.
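The run command itself is not shown above; a likely form, mirroring the earlier command but with the jupyter/scipy-notebook image and a bind mount so notebook data persists (the /home/jovyan/work path is the Docker Stacks convention, assumed here), would be:

```shell
# Run the scientific-stack JupyterLab image with the same port mapping,
# bind mounting the current directory so notebooks survive restarts.
# %cd% is Windows command-prompt syntax for the current directory.
docker run -it --rm -p 10000:8888 -v "%cd%":/home/jovyan/work jupyter/scipy-notebook
```

On macOS or Linux, replace `"%cd%"` with `"$(pwd)"`.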
Once the container is running, you should be able to run the Iris dataset sample immediately, without having to execute pip as before. Give it a try. Other images in the Jupyter Docker Stacks include TensorFlow’s deep learning library, the R language, the Julia programming language, and Apache Spark.

In a Windows environment, you can easily run JupyterLab in a container this way, and doing so will not affect or conflict with the existing Python language environment. Furthermore, this setup provides a consistent user experience across other platforms, such as macOS and Linux, making it an ideal solution for anyone who wants to try it.

By containerizing JupyterLab with Docker, AI/ML developers gain numerous advantages, including consistency, easy sharing and collaboration, and scalability. It enables efficient management of AI/ML development workflows, making it easier to experiment, collaborate, and reproduce results across different environments. With JupyterLab 4.0 and Docker, the possibilities for supercharging your AI/ML development are limitless. So why wait? Embrace containerization and experience the true power of JupyterLab in your AI/ML projects.
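As a closing illustration, here is a minimal sketch of the kind of Iris scatter plot walked through above. This is not the exact scikit-learn sample the article links to, just an assumed equivalent; it needs scikit-learn and matplotlib, both of which jupyter/scipy-notebook already includes:

```python
# Plot the first two Iris features, colored by species (an assumed
# reconstruction of the kind of sample referenced in the article).
from sklearn import datasets
import matplotlib
matplotlib.use("Agg")  # headless backend; not needed inside JupyterLab
import matplotlib.pyplot as plt

iris = datasets.load_iris()
X, y = iris.data, iris.target  # 150 samples, 4 features, 3 classes

fig, ax = plt.subplots()
for label in range(3):
    mask = y == label
    ax.scatter(X[mask, 0], X[mask, 1], label=iris.target_names[label])
ax.set_xlabel(iris.feature_names[0])  # sepal length (cm)
ax.set_ylabel(iris.feature_names[1])  # sepal width (cm)
ax.legend()
fig.savefig("iris.png")
```

Inside a notebook cell you would drop the `matplotlib.use("Agg")` and `savefig` lines and simply display the figure inline.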