Effortlessly Build Machine Learning Apps with Hugging Face’s Docker Spaces

December 8, 2025 · 1185 words · 6 min

The Hugging Face Hub is a platform that enables collaborative open source machine learning (ML). The Hub works as a central place where users can explore, experiment, collaborate, and build technology with machine learning. On the Hub, you can find more than 140,000 models, 50,000 ML apps (called Spaces), and 20,000 datasets shared by the community.

Spaces make it easy to create and deploy ML-powered applications and demos in minutes. Recently, the Hugging Face team added support for Docker Spaces, enabling users to create any custom app they want simply by writing a Dockerfile. Another great thing about Spaces is that once you have your app running, you can easily share it with anyone around the world. 🌍

This guide will step through the basics of creating a Docker Space, configuring it, and deploying code to it. We'll show how to build a basic app for text generation that will be used to demo the google/flan-t5-small model, which can generate text given input text. Models like this are used to power text completion in all sorts of apps. (You can check out a completed version of the app on the Hub.)

To follow along with the steps presented in this article, you'll need to be signed in to the Hugging Face Hub — you can sign up for free if you don't have an account already.

To get started, create a new Space as shown in Figure 1. Next, you can choose any name you prefer for your project, select a license, and use Docker as the software development kit (SDK), as shown in Figure 2.

Spaces provides pre-built Docker templates like Argilla and Livebook that let you quickly start your ML projects using open source tools. If you choose the "Blank" option, that means you want to create your Dockerfile manually. Don't worry, though; we'll provide a Dockerfile to copy and paste later. 😅

When you finish filling out the form and click the Create Space button, a new repository will be created in your Spaces account. This repository will be associated with the new Space that you have created.
If you're new to the Hugging Face Hub 🤗, check out the Hub's repositories documentation for a nice primer on repositories on the Hub.

OK, now that you have an empty Space repository, it's time to write some code. 😎 The sample app will consist of the following three files: requirements.txt, app.py, and a Dockerfile.

To follow along, create each file shown below directly in your browser. To do that, navigate to your Space's Files tab, then choose Add file → Create a new file (Figure 3). Note that, if you prefer, you can also use Git. Make sure that you name each file exactly as we have done here. Then, copy the contents of each file from here and paste them into the corresponding file in the editor. After you have created and populated all the necessary files, commit each new file to your repository by clicking the commit button.

First, it's time to list all the Python packages and their specific versions that are required for the project to function properly. The contents of requirements.txt typically include the name of each package and its version number, which can be specified in a variety of formats, such as exact version numbers, version ranges, or compatible versions. The file lists fastapi, requests, and uvicorn for the API, along with sentencepiece, torch, and transformers for the text-generation model.

Next, app.py defines a FastAPI web application that uses the transformers library to generate text based on user input. The app itself is a simple single-endpoint API. The endpoint takes in text and uses a transformers pipeline to generate a completion, which it then returns as a response. To give folks something to see, we reroute FastAPI's interactive docs from the default /docs endpoint to the root of the app. This way, when someone visits your Space, they can play with it without having to write any code.

Finally, we will write a Dockerfile that sets up a Python 3.9 environment, installs the packages listed in requirements.txt, and starts a FastAPI app on port 7860. Let's go through this process step by step:

The Dockerfile opens with FROM python:3.9, which specifies that we're going to use the official Python 3.9 Docker image as the base image for our container.
This image is provided by Docker Hub, and it contains all the necessary files to run Python 3.9.

The WORKDIR /code line sets the working directory inside the container to /code. This is where we'll copy our application code and dependencies later on.

The COPY instruction that follows copies the requirements.txt file from our local directory to the /code directory inside the container. This file lists the Python packages that our application depends on.

The next line uses pip to install the packages listed in requirements.txt. The --no-cache-dir flag tells pip not to use any cached packages, the --upgrade flag tells pip to upgrade any already-installed packages if newer versions are available, and the -r flag specifies the requirements file to use.

The lines that follow create a new user named user with a user ID of 1000, switch to that user, and then set the home directory to /home/user. The ENV command sets the HOME and PATH environment variables. PATH is modified to include the .local/bin directory in the user's home directory so that any binaries installed by pip will be available on the command line. See the Docker Spaces documentation to learn more about user permissions.

The second WORKDIR line sets the working directory inside the container to $HOME/app, which is /home/user/app.

The next COPY line copies the contents of our local directory into the /home/user/app directory inside the container, setting the owner of the files to the user that we created earlier.

The final CMD line specifies the command to run when the container starts. It starts the FastAPI app using uvicorn and listens on port 7860. The --host 0.0.0.0 flag specifies that the app should listen on all available network interfaces, and the app:app argument tells uvicorn to look for the app object in the app module in our code.

Once you commit the Dockerfile, your Space will switch to Building, and you should see the container's build logs pop up so you can monitor its status. 👀 If you want to double-check the files, you can find them all in the completed example Space. For a more basic introduction to using Docker with FastAPI, you can refer to the official container deployment guide in the FastAPI docs.
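For reference, here is a requirements.txt consistent with the packages described above. The exact version pins shown are illustrative assumptions on our part and may need updating for your environment:

```
fastapi==0.74.*
requests==2.27.*
uvicorn[standard]==0.17.*
sentencepiece==0.1.*
torch==1.11.*
transformers==4.*
```

Pinning versions (even loosely, as with the `*` wildcards here) helps keep rebuilds of your Space reproducible.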
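The app itself, as described above, can be sketched as the following app.py. The endpoint name (/generate) and the response shape are our assumptions for illustration; running it requires fastapi, uvicorn, and transformers installed, and the first request will download the model weights:

```python
from fastapi import FastAPI
from fastapi.responses import RedirectResponse
from transformers import pipeline

app = FastAPI()

# Load the pipeline once at startup. google/flan-t5-small is a
# text-to-text model, so we use the "text2text-generation" task.
pipe = pipeline("text2text-generation", model="google/flan-t5-small")

@app.get("/generate")
def generate(text: str):
    """Generate a completion for the given input text."""
    output = pipe(text)
    return {"output": output[0]["generated_text"]}

@app.get("/")
def docs_redirect():
    # Send visitors to the interactive Swagger docs instead of a bare 404,
    # so anyone opening the Space can try the API without writing code.
    return RedirectResponse(url="/docs")
```

Once the Space is running, a request like `GET /generate?text=Translate to German: hello` should return a JSON object with the generated completion.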
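Putting the steps above together, the complete Dockerfile looks like this (reconstructed directly from the walkthrough):

```Dockerfile
# Use the official Python 3.9 image as the base
FROM python:3.9

# Working directory for installing dependencies
WORKDIR /code

# Copy and install the Python dependencies first, for better layer caching
COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

# Create a non-root user (UID 1000) and switch to it
RUN useradd -m -u 1000 user
USER user
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

# Working directory for the application code
WORKDIR $HOME/app

# Copy the app code, owned by the non-root user
COPY --chown=user . $HOME/app

# Start the FastAPI app on port 7860, the port Spaces expects
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```

Installing dependencies before copying the application code means that editing app.py alone won't invalidate the (slow) pip install layer on rebuilds.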
If all goes well, your Space should switch to Running once it's done building, and the Swagger docs generated by FastAPI should appear in the App tab. Because these docs are interactive, you can try out the endpoint by expanding its details, clicking Try it out, and then clicking Execute (Figure 4).

This article covered the basics of creating a Docker Space and building and configuring a basic FastAPI app for text generation that uses the google/flan-t5-small model. You can use this guide as a starting point to build more complex and exciting applications that leverage the power of machine learning.

If you're interested in learning more about Docker templates and seeing curated examples, check out the Docker template Spaces on the Hub. There you'll find a variety of templates to use as a starting point for your own projects, as well as tips and tricks for getting the most out of Docker templates. Happy coding!