With your development environment configured and the necessary libraries installed, it's time to create and run your first, albeit minimal, FastAPI application. This exercise will verify your setup and introduce you to the basic structure of a FastAPI program.
Create a new file named main.py in your project directory and add the following Python code:
# main.py
from fastapi import FastAPI

# Create an instance of the FastAPI class
app = FastAPI()

# Define a path operation decorator for the root path ("/")
# This tells FastAPI that the function below handles GET requests to "/"
@app.get("/")
async def read_root():
    """
    This is the root endpoint of the API.
    It returns a simple greeting message.
    """
    return {"message": "Hello from the FastAPI ML Service!"}

# Define another simple endpoint
@app.get("/status")
async def get_status():
    """
    A simple status endpoint.
    """
    return {"status": "API is running"}
Let's break down this code:
- from fastapi import FastAPI: We import the FastAPI class, which provides all the core functionality for your API.
- app = FastAPI(): We create an instance of the FastAPI class. This app variable will be the main point of interaction for creating API routes.
- @app.get("/"): This is a Python decorator. Decorators modify or enhance functions. Here, @app.get tells FastAPI that the function directly below it (read_root) is responsible for handling requests that use the GET HTTP method and target the path / (the root path). This combination of a path (/) and an HTTP method (GET) is often called an "operation", and the function handling it is the "path operation function".
- async def read_root(): This defines an asynchronous function named read_root. FastAPI is built around Python's asyncio library, allowing you to define async functions for your endpoints. This enables handling multiple requests concurrently, which is particularly beneficial for I/O-bound tasks often encountered when interacting with models or external services. Even if your function doesn't perform explicit await operations, defining it as async def allows FastAPI to run it correctly within its asynchronous event loop. (A sketch of an endpoint with an explicit await follows this list.)
- return {"message": "Hello from the FastAPI ML Service!"}: Inside the function, we return a Python dictionary. FastAPI automatically converts this dictionary into a JSON response that will be sent back to the client. This automatic data serialization (and deserialization for incoming data, as we'll see later) is a major convenience.
- @app.get("/status") and async def get_status(): This defines a second endpoint at the path /status, also responding to GET requests with a simple JSON status message.
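To make the point about async functions more concrete, here is a minimal sketch of an extra endpoint you could add to main.py that awaits a simulated slow I/O operation. It is illustrative only: the /slow-prediction path, the one-second delay, and the placeholder response are assumptions made for this sketch, not part of the application above.

# Sketch: an additional endpoint you could add to main.py.
# The path, delay, and response values below are illustrative assumptions.
import asyncio  # add this import near the top of main.py

@app.get("/slow-prediction")
async def slow_prediction():
    """
    Simulates an I/O-bound step, such as calling a model server
    or an external service.
    """
    # Stand-in for a real awaited operation (database query, HTTP call, ...)
    await asyncio.sleep(1)
    # As before, FastAPI serializes this dictionary to JSON automatically
    return {"prediction": "placeholder", "waited_seconds": 1}

While one client is paused at the await, the event loop remains free to handle requests to / and /status.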
To run this application, navigate to your project directory in your terminal (the directory containing main.py) and execute the following command:
uvicorn main:app --reload
Let's analyze this command:
- uvicorn: This is the command to run the Uvicorn ASGI server, which we installed earlier. Uvicorn is responsible for actually serving your FastAPI application over HTTP.
- main:app: This tells Uvicorn where to find your FastAPI application instance.
  - main: Refers to the Python file main.py.
  - app: Refers to the object app = FastAPI() created inside main.py.
- --reload: This flag tells Uvicorn to automatically restart the server whenever it detects changes in your code files. This is incredibly useful during development, as you don't need to manually stop and start the server after every code modification.

If everything is set up correctly, you should see output similar to this in your terminal:
INFO: Will watch for changes in directory '{your_project_directory}'.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started reloader process [{process_id}] using StatReload
INFO: Started server process [{process_id}]
INFO: Waiting for application startup.
INFO: Application startup complete.
This indicates that your FastAPI application is now running and accessible at http://127.0.0.1:8000.
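As a side note, Uvicorn can also be launched from Python rather than from the shell. The sketch below assumes you append it to the bottom of main.py; the host and port values simply mirror the defaults shown above, and reload=True requires passing the application as the import string "main:app".

# Optional sketch: append to the bottom of main.py so that "python main.py"
# starts the same development server.
import uvicorn

if __name__ == "__main__":
    uvicorn.run("main:app", host="127.0.0.1", port=8000, reload=True)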
Open your web browser and navigate to http://127.0.0.1:8000. You should see the JSON response from your read_root function:
{"message":"Hello from the FastAPI ML Service!"}
Now, try navigating to http://127.0.0.1:8000/status. You should see:
{"status":"API is running"}
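A browser is not the only way to check these endpoints. The short sketch below uses only Python's standard library (no additional packages assumed); save it under a name of your choice, such as the hypothetical check_endpoints.py, and run it in a second terminal while the server is still running.

# check_endpoints.py -- a small sketch for checking the running API from code
import json
from urllib.request import urlopen

for path in ("/", "/status"):
    with urlopen(f"http://127.0.0.1:8000{path}") as response:
        # The endpoints return JSON, so decode each body into a Python dict
        print(path, json.load(response))

You should see the same two JSON payloads printed that you saw in the browser.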
One of the standout features of FastAPI is its built-in, automatic interactive documentation. It leverages standards like OpenAPI (formerly Swagger) and JSON Schema to generate documentation directly from your code, including your path operations, parameters, and data models (which we'll cover in the next chapter).
While your application is running, navigate to these two URLs:
- http://127.0.0.1:8000/docs: This provides the Swagger UI interface. It's an interactive environment where you can see all your API endpoints, their expected parameters and responses, and even try them out directly from the browser.
- http://127.0.0.1:8000/redoc: This provides an alternative documentation interface using ReDoc. It offers a clean, hierarchical view of your API specification.

Diagram: the relationship between your code, FastAPI, the server, and the automatically generated documentation interfaces.
Explore these interfaces. You'll see your / and /status endpoints listed. This automatic documentation is a significant productivity booster, making it easier for you and others to understand and interact with your API, especially as it grows in complexity.
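The content of these pages comes from your code: docstrings on path operation functions become endpoint descriptions, and the FastAPI constructor accepts optional metadata that is displayed at the top of the documentation. The sketch below shows the pattern; the specific title, description, and version strings are illustrative choices, not values from the example above.

# Sketch: optional metadata for the FastAPI instance in main.py.
# The strings below are illustrative; pick values that fit your project.
from fastapi import FastAPI

app = FastAPI(
    title="FastAPI ML Service",
    description="A minimal API that will later serve model predictions.",
    version="0.1.0",
)

Both Swagger UI and ReDoc pick this metadata up automatically, and the underlying OpenAPI schema they render is itself served as JSON at http://127.0.0.1:8000/openapi.json.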
You've now successfully created, run, and interacted with your first FastAPI application. This simple example forms the foundation upon which we will build more sophisticated services capable of handling data validation and serving machine learning model predictions.