Getting Started
In Part 1 we walked through the process of building a board game recommendation system using FastAPI and PostgreSQL. In Part 2 we continue this project and show how to deploy it to a cloud service, in this case Render, to make it accessible to users.
To make this a reality, we’ll set up our PostgreSQL database on Render, populate it with our data, Dockerize our FastAPI application, and finally deploy it as a Render Web Service.
Table of Contents
- Deploying a PostgreSQL database on Render
- Deploying a FastAPI app as a Render Web Service
– Dockerizing our application
– Pushing the Docker image to Docker Hub
– Pulling from Docker Hub into Render
Tooling Used
- Render
- Docker Desktop
- Docker Hub
Deploying on Render
We now have a PostgreSQL database and a FastAPI application that work locally, and it’s time to deploy them to a cloud service where they can be accessed by a front-end application or end user (via Swagger). For this project, we’ll use Render; Render is a cloud platform that, for small projects, offers a more straightforward setup experience than larger cloud providers like AWS and Azure.
To get started, navigate to Render and create a new account, then create a new project by selecting the ‘New Project’ button shown below. Note, as of this writing, Render has a trial period that should allow you to follow along at zero cost for the first month. We’re calling this project fastapi-test, and we navigate into the project after it’s created.

Each project contains everything required for that project to work in a self-contained environment. In this case, we need two components: a database and a web server for our FastAPI application. Let’s start by creating our database.

This is very simple: we select ‘Create New Service’ as shown in Figure 3 and then select ‘Postgres’. We’re then taken to the form shown in Figure 4 to set up the database. We name our database “fastapi-database” and select the free tier to get started. Render only allows you to use the free-tier database for a limited time, but it will be fine for this example, and if you needed to maintain a database long term, the pricing is very reasonable.

After entering our database information and selecting ‘Create’, it will take a minute to set up the database, and you’ll then be presented with the screen shown in Figure 5. We’ll save the Internal Database URL and External Database URL variables in our .env file, as we’ll need these to connect from our FastAPI application. We can then test our connection to the database using the External Database URL variable (connecting from our local machine is outside the Render environment) and create the tables from our local machine before moving on to setting up our FastAPI application.
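At this point, the relevant portion of our .env file might look like the following sketch (placeholder values only; copy the actual URLs from the Render database dashboard):

# .env (placeholder values, not real credentials)
Internal_Database_Url=postgresql://user:password@hostname/database_name
External_Database_Url=postgresql://user:password@hostname.region-postgres.render.com/database_name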

We then run our test database connection script, which attempts to connect to our database using the External_Database_Url variable as the connection string and create a test table. Note that External_Database_Url is the full connection string for the database, so we can pass it as our single input. A successful run should result in a printout as shown in Figure 6.

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, Session
from sqlalchemy.ext.declarative import declarative_base
import os
from dotenv import load_dotenv
from utils.db_handler import DatabaseHandler
import pandas as pd
import uuid
import sys
from sqlalchemy.exc import OperationalError
import psycopg2

# Load environment variables from .env file (override=True reloads modified values)
load_dotenv(override=True)

# Loading external database URL
database_url = os.environ.get("External_Database_Url")
if not database_url:
    print("❌ External_Database_Url not found in environment variables")
    print("Please check your .env file contains: External_Database_Url=your_render_postgres_url")
    sys.exit(1)
print(f"Database URL loaded: {database_url[:50]}...")

# Parse the database URL to extract components for testing
from urllib.parse import urlparse
import socket

def parse_database_url(url):
    """Parse database URL to extract connection components"""
    parsed = urlparse(url)
    return {
        'host': parsed.hostname,
        'port': parsed.port or 5432,
        'database': parsed.path.lstrip('/'),
        'username': parsed.username,
        'password': parsed.password
    }

db_params = parse_database_url(database_url)

def test_network_connectivity():
    """Test network connectivity to the Render PostgreSQL endpoint"""
    print("\n=== Network Connectivity Tests ===")
    # 1. Test DNS resolution
    try:
        ip_address = socket.gethostbyname(db_params['host'])
        print("✅ DNS resolution successful")
    except socket.gaierror as e:
        print(f"❌ DNS resolution failed: {e}")
        return False
    # 2. Test port connectivity
    try:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(10)  # 10 second timeout
        result = sock.connect_ex((db_params['host'], int(db_params['port'])))
        sock.close()
        if result == 0:
            print(f"✅ Port {db_params['port']} is accessible")
            return True
        else:
            print(f"❌ Port {db_params['port']} is NOT accessible")
            print("   This might indicate a network connectivity issue")
            return False
    except Exception as e:
        print(f"❌ Port connectivity test failed: {e}")
        return False

# Run connectivity tests
network_ok = test_network_connectivity()
if not network_ok:
    print("\n🔍 TROUBLESHOOTING STEPS:")
    print("1. Check your internet connection")
    print("2. Verify the Render PostgreSQL URL is correct")
    print("3. Ensure your Render PostgreSQL instance is active")
    print("4. Check if there are any Render service outages")
    sys.exit(1)

print("\n=== Attempting Database Connection ===")
# Connect to the database using psycopg2
try:
    conn = psycopg2.connect(
        host=db_params['host'],
        database=db_params['database'],
        user=db_params['username'],
        password=db_params['password'],
        port=db_params['port'],
        connect_timeout=30  # 30 second timeout
    )
    # If the connection is successful, you can perform database operations
    cursor = conn.cursor()
    # Example: Execute a simple query
    cursor.execute("SELECT version();")
    db_version = cursor.fetchone()
    print(f"✅ PostgreSQL Database Version: {db_version[0]}")
    # Test creating a simple table to verify permissions
    cursor.execute("CREATE TABLE IF NOT EXISTS connection_test (id SERIAL PRIMARY KEY, test_time TIMESTAMP DEFAULT NOW());")
    conn.commit()
    print("✅ Database permissions verified - can create tables")
    cursor.close()
    conn.close()
    print("✅ psycopg2 connection successful!")
except psycopg2.OperationalError as e:
    print(f"❌ Database connection failed: {e}")
    if "timeout" in str(e).lower():
        print("\n🔍 TIMEOUT TROUBLESHOOTING:")
        print("- Check your internet connection")
        print("- Verify the Render PostgreSQL URL is correct")
        print("- Check if the Render service is experiencing issues")
    elif "authentication" in str(e).lower():
        print("\n🔍 AUTHENTICATION TROUBLESHOOTING:")
        print("- Verify the database URL contains correct credentials")
        print("- Check if your Render PostgreSQL service is active")
        print("- Ensure the database URL hasn't expired or changed")
    sys.exit(1)
except Exception as e:
    print(f"❌ Unexpected error: {e}")
    sys.exit(1)

# If we get here, the connection was successful
print("\n✅ All tests passed! Render PostgreSQL connection is working.")
print(f"✅ Connected to database: {db_params['database']}")
print("✅ Ready for use in your application!")
Loading the Database
Now that we’ve verified we can connect to our database from our local machine, it’s time to set up our database tables and populate them. To load our database, we’ll use our src/load_database.py file; we previously walked through the individual pieces of this script at the beginning of this article, so we won’t go into further detail on it here. The only notable points are that we’re again using our External_Database_Url as our connection string, and that at the end we use the test_table function defined as part of our DatabaseHandler class. This function attempts to connect to the table name passed to it and returns the number of rows in that table.
Running this script should result in an output like Figure 11, where each of the tables is created, and at the end we verify that we can return data from them and that the output rows match the input rows.

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, Session
from sqlalchemy.ext.declarative import declarative_base
import os
from dotenv import load_dotenv
from utils.db_handler import DatabaseHandler
import pandas as pd
import uuid
import sys
from sqlalchemy.exc import OperationalError
import psycopg2

# Load environment variables from .env file
load_dotenv(override=True)

# PostgreSQL connection URL for Render
URL_database = os.environ.get("External_Database_Url")

# Initialize DatabaseHandler with the connection URL
engine = DatabaseHandler(URL_database)

# Loading initial user data
users_df = pd.read_csv("Data/steam_users.csv")
games_df = pd.read_csv("Data/steam_games.csv")
user_games_df = pd.read_csv("Data/steam_user_games.csv")
user_recommendations_df = pd.read_csv("Data/user_recommendations.csv")
game_tags_df = pd.read_csv("Data/steam_game_tags.csv")

# Defining queries to create tables
user_table_creation_query = """CREATE TABLE IF NOT EXISTS users (
    id UUID PRIMARY KEY,
    username VARCHAR(255) UNIQUE NOT NULL,
    password VARCHAR(255) NOT NULL,
    email VARCHAR(255) NOT NULL,
    role VARCHAR(50) NOT NULL
)
"""
game_table_creation_query = """CREATE TABLE IF NOT EXISTS games (
    id UUID PRIMARY KEY,
    appid VARCHAR(255) UNIQUE NOT NULL,
    name VARCHAR(255) NOT NULL,
    type VARCHAR(255),
    is_free BOOLEAN DEFAULT FALSE,
    short_description TEXT,
    detailed_description TEXT,
    developers VARCHAR(255),
    publishers VARCHAR(255),
    price VARCHAR(255),
    genres VARCHAR(255),
    categories VARCHAR(255),
    release_date VARCHAR(255),
    platforms TEXT,
    metacritic_score FLOAT,
    recommendations INTEGER
)
"""
user_games_query = """CREATE TABLE IF NOT EXISTS user_games (
    id UUID PRIMARY KEY,
    username VARCHAR(255) NOT NULL,
    appid VARCHAR(255) NOT NULL,
    shelf VARCHAR(50) DEFAULT 'Wish_List',
    rating FLOAT DEFAULT 0.0,
    review TEXT
)
"""
recommendation_table_creation_query = """CREATE TABLE IF NOT EXISTS user_recommendations (
    id UUID PRIMARY KEY,
    username VARCHAR(255),
    appid VARCHAR(255),
    similarity FLOAT
)
"""
game_tags_creation_query = """CREATE TABLE IF NOT EXISTS game_tags (
    id UUID PRIMARY KEY,
    appid VARCHAR(255) NOT NULL,
    category VARCHAR(255) NOT NULL
)
"""
# Dropping existing tables so the script can be re-run cleanly
engine.delete_table('user_recommendations')
engine.delete_table('user_games')
engine.delete_table('game_tags')
engine.delete_table('games')
engine.delete_table('users')

# Create tables
engine.create_table(user_table_creation_query)
engine.create_table(game_table_creation_query)
engine.create_table(user_games_query)
engine.create_table(recommendation_table_creation_query)
engine.create_table(game_tags_creation_query)

# Ensuring each row of each dataframe has a unique ID
if 'id' not in users_df.columns:
    users_df['id'] = [str(uuid.uuid4()) for _ in range(len(users_df))]
if 'id' not in games_df.columns:
    games_df['id'] = [str(uuid.uuid4()) for _ in range(len(games_df))]
if 'id' not in user_games_df.columns:
    user_games_df['id'] = [str(uuid.uuid4()) for _ in range(len(user_games_df))]
if 'id' not in user_recommendations_df.columns:
    user_recommendations_df['id'] = [str(uuid.uuid4()) for _ in range(len(user_recommendations_df))]
if 'id' not in game_tags_df.columns:
    game_tags_df['id'] = [str(uuid.uuid4()) for _ in range(len(game_tags_df))]

# Populate the five tables with data from the dataframes
engine.populate_table_dynamic(users_df, 'users')
engine.populate_table_dynamic(games_df, 'games')
engine.populate_table_dynamic(user_games_df, 'user_games')
engine.populate_table_dynamic(user_recommendations_df, 'user_recommendations')
engine.populate_table_dynamic(game_tags_df, 'game_tags')

# Testing whether the tables were created and populated correctly
print(engine.test_table('users'))
print(engine.test_table('games'))
print(engine.test_table('user_games'))
print(engine.test_table('user_recommendations'))
print(engine.test_table('game_tags'))
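For context, here is a minimal sketch of what the DatabaseHandler methods called above might look like; the real implementation lives in utils/db_handler.py in the project repository, so treat these method bodies as assumptions:

from sqlalchemy import create_engine, text
import pandas as pd

class DatabaseHandler:
    def __init__(self, database_url: str):
        self.engine = create_engine(database_url)

    def create_table(self, creation_query: str) -> None:
        # Execute a CREATE TABLE IF NOT EXISTS statement
        with self.engine.begin() as conn:
            conn.execute(text(creation_query))

    def delete_table(self, table_name: str) -> None:
        # Drop the table if it exists so the load script can be re-run cleanly
        with self.engine.begin() as conn:
            conn.execute(text(f"DROP TABLE IF EXISTS {table_name}"))

    def populate_table_dynamic(self, df: pd.DataFrame, table_name: str) -> None:
        # Append the dataframe's rows to the existing table
        df.to_sql(table_name, self.engine, if_exists="append", index=False)

    def test_table(self, table_name: str) -> str:
        # Return a row count to confirm the table was created and populated
        with self.engine.connect() as conn:
            count = conn.execute(text(f"SELECT COUNT(*) FROM {table_name}")).scalar()
        return f"{table_name}: {count} rows"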
Deploying a FastAPI Application on Render
We now have the first half of our project deployed on Render, and it’s time to set up our FastAPI application. To do this, we’re going to use Render’s Web Service hosting, which will allow us to deploy our FastAPI app as a web application that can be accessed by external services. If we wanted to build a full-stack application, we could then have our front end send requests to the FastAPI application on Render and return data to the user. However, because we’re not interested in building a front-end component at this time, we’ll instead interact with our app through the Swagger docs.
Containerizing our Application with Docker
We’ve set up our FastAPI project in a local environment, but now we need to transfer it, with all the code, dependencies, and environment variables, to a container on Render. This could be a daunting challenge. Fortunately, Docker handles all the complicated pieces and allows us to do just that with a simple configuration file and a few commands. For those who haven’t used Docker, there’s a great tutorial here. The brief overview is that Docker is a tool that simplifies the process of deploying and managing applications by allowing us to package our application with all its dependencies as an image and then deploy that image to a service like Render. In this project, we use Docker Hub as our image repository, which serves as a central version-controlled storage area for our image, which we can then pull into Render.
Our overall flow for this project can be thought of like this: FastAPI app running locally → a ‘snapshot’ is taken with Docker and saved as a Docker image → that image is pushed to Docker Hub → Render pulls this image and uses it to spin up a container that runs the application on a Render server. Getting started with this process, which we’ll walk through next, requires having Docker Desktop installed. Docker has a straightforward installation process which you can get started on here: https://www.docker.com/products/docker-desktop/
Additionally, if you don’t have one already, you’ll need a Docker Hub account, as this will serve as the repository where we save Docker images and from which we pull them into Render. You can create a Docker Hub account here: https://hub.docker.com/.
Building a Docker Image
To create a Docker image for our project, first make sure Docker Desktop is running; if it isn’t, you’ll likely get an error when trying to create a Docker image. To ensure it’s running, open the Docker Desktop application from your search bar or desktop, click on the three dots in the bottom left as shown below, and confirm you see the green dot followed by ‘Docker Desktop is running’.

Next, we need to tell Docker how to build our image, which is done by defining a Dockerfile. Our Dockerfile can be seen in Figure 9. We save it in our top-level directory, and it provides the instructions that tell Docker how to package our application into an image that can be deployed on a different piece of hardware. Let’s walk through this file to understand what it’s doing; a sketch of the full file follows the list below.
- FROM: Choosing the base image: The first line in our Dockerfile specifies what base image we want to extend for our application. In this case, we’re using the python:3.13-slim-bullseye image, a lightweight Debian-based image that will serve as the base for our application.
- WORKDIR: Changing the working directory: Here we’re setting the default directory inside our container to /app
- RUN: Checking for updates to system dependencies
- COPY: Copying the requirements.txt file; it’s important that requirements.txt is up to date and contains all libraries required for the project, or the image won’t run correctly when we try to spin it up
- RUN: Installing the dependencies from requirements.txt
- COPY: Copying our entire project from our local directory to /app, which we created in step 2
- RUN: Creating a logs directory at /app/logs
- EXPOSE: Documenting that the port we’ll be exposing is port 8000
- ENV: Setting our Python path to /app
- CMD: Running our FastAPI app with Uvicorn, pointing at the app defined in src.main:app, and serving on port 8000
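Putting those steps together, the Dockerfile looks roughly like the following; this is a sketch reconstructed from the walkthrough above, and the exact file (shown in Figure 9) lives in the project repository:

FROM python:3.13-slim-bullseye

# Set the default working directory inside the container
WORKDIR /app

# Check for updates to system dependencies
RUN apt-get update && apt-get upgrade -y && rm -rf /var/lib/apt/lists/*

# Copy and install Python dependencies first to take advantage of layer caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the entire project into /app
COPY . .

# Create a directory for application logs
RUN mkdir -p /app/logs

# Document the port the app listens on
EXPOSE 8000

# Make the project root importable
ENV PYTHONPATH=/app

# Run the FastAPI app with Uvicorn on port 8000
CMD ["uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "8000"]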

With our Dockerfile defined, we now have a set of instructions we can give Docker to containerize our application into an image that we can then push to Docker Hub. We can do this with a few commands from our VS Code terminal, shown below. Each of these lines needs to be run individually in the VS Code terminal from the top directory of your project.
- First, we build our Docker image, which will likely take a minute or two. In this case, we’re naming our image ‘recommendersystem’
- Next, we tag our image; the syntax here is image_name user_name/docker_hub_folder:image_name_on_dockerhub
- Finally, we push our image to Docker Hub, again specifying user_name/docker_hub_folder:image_name_on_dockerhub
docker build -t recommendersystem .
docker tag recommendersystem seelucas/fastapi_tutorial:fastapi_on_render
docker push seelucas/fastapi_tutorial:fastapi_on_render
After this is completed, we should be able to log in to Docker Hub, navigate to our repository, and see that we have an image whose name matches what we gave it in the previous three commands, in this case fastapi_on_render.

Pulling the Docker Image into Render
Now we have our Docker image on Docker Hub, and it’s time to deploy that image on Render. This can be done by navigating to the same project in which we created our database, “fastapi-test”, selecting “New” in the top right, and then selecting “Web Service”, as our FastAPI app will be deployed as a web application.
Because we’re deploying our image from Docker Hub, we specify that our Source Code is an Existing Image, and as shown in Figure 11, we paste the Docker Hub path of the image we want to deploy into ‘Image URL’ in Render. We then get a notification that this is a private image, which means we’ll need to create a Docker Hub access token that we can use to securely pull the image from Docker Hub into Render.
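The Image URL here is the same user_name/docker_hub_folder:image_name reference we used when tagging and pushing in the previous step:

seelucas/fastapi_tutorial:fastapi_on_render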

Fortunately, creating a Docker Hub access token is simple; we navigate to our Docker Hub account → Settings → Personal Access Token. The screen should look like Figure 12. We provide an access token name, expiration date, and permissions. Since we’re pulling the image into Render, we only need read access rather than write or delete, so we select that.

Finally, selecting ‘Generate’ will generate our token, which we then need to copy over to Render and enter as shown in Figure 13.

Once we’ve selected ‘Add Credential’ as shown above, it will load for a minute while the credentials are saved. We’ll then be taken back to the previous screen, where we can select the credentials to use to connect to Docker Hub. In this case, we’ll use the tutorial credentials we just created and select Connect. We will then have established a connection that we can use to pull our Docker image from Docker Hub to Render for deployment.

On the next page, we continue setting up our Render web application by selecting the free option and then, importantly, under Environment Variables, we copy and paste our .env file. While we don’t use all the variables in this file, we do use ‘Internal_Database_Url’, which is the URL that FastAPI will look for in our main.py file. Without this, we won’t be able to connect to our database, so it’s important that we provide it. Note: for testing, we previously used ‘External_Database_Url’ because we were running the script from our local machine, which is external to our Render environment; however, here both the database and web server are in the same Render environment, so we use Internal_Database_Url in main.py.
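For clarity, the connection setup in main.py presumably mirrors the test script, just reading the internal URL instead of the external one; a minimal sketch under that assumption:

import os
from dotenv import load_dotenv

# On Render, environment variables are injected by the platform;
# load_dotenv is effectively a no-op there but keeps local runs working.
load_dotenv(override=True)
database_url = os.environ.get("Internal_Database_Url")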
After entering our environment variables, we then choose ‘Deploy Web Service’.

The service will take a few minutes to deploy, but then you should get a notification like the one below that the service has deployed, with a Render link at the top where we can access it.

Navigating to this link will take us to the Hello World method; if we add /docs to the end of it, we’ll be taken to the Swagger docs in Figure 17. Here we can test that our FastAPI web application is connected to our database by using the Fetch All Users method. We can see below that this does indeed return data; the same check can also be scripted, as sketched below.
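A sketch of that scripted check; the base URL and the /users route are assumptions, so substitute your Render URL and the actual path listed in the Swagger docs:

import requests

# Hypothetical service URL and route; replace with your own values
BASE_URL = "https://your-service.onrender.com"
response = requests.get(f"{BASE_URL}/users")
print(response.status_code)
print(response.json())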

Finally, we want to check whether our user recommendation system is dynamically updating. In our earlier API call, we can see that there is a user ‘user_username’ in our database. Using the Fetch Recommended Game method with this username, we can see the top match is appid = B08BHHRSPK.

We update our user’s liked games by choosing a random one from our games, appid = B0BHTKGN7F, which turns out to be ‘The Elder Scrolls: Skyrim Boardgame’, and using our user_games POST method, as sketched below.
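The equivalent request outside Swagger would look roughly like this; the route and payload fields are assumptions based on the user_games table schema, so check the Swagger docs for the actual contract:

import requests

# Hypothetical route and payload; the field names mirror the user_games table
BASE_URL = "https://your-service.onrender.com"
payload = {
    "username": "user_username",
    "appid": "B0BHTKGN7F",
    "shelf": "Wish_List",
    "rating": 5.0,
    "review": ""
}
response = requests.post(f"{BASE_URL}/user_games", json=payload)
print(response.status_code, response.json())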

Adding a game to our user_games table is supposed to automatically trigger the recommender pipeline to rerun for that user and generate new recommendations. If we navigate to our console, we can see that this appears to have happened, as we get the new user recommendations generated message shown below.

If we navigate back to our Swagger docs, we can try the fetch recommendation method again, and we see in Figure 21 that we do indeed have a different list of recommendations than before. Our recommender pipeline is now automatically updating as users add more data and is accessible beyond our local environment.

Wrapping Up:
In this project, we’ve shown how to set up and deploy a recommendation system leveraging a FastAPI interaction layer with a PostgreSQL database to generate intelligent board game recommendations for our users. There are further steps we could take to make this system more robust, like implementing a hybrid recommendation system as we gain more user data or enabling user tagging to capture more features. Additionally, although we didn’t cover it, we did use a GitHub workflow to rebuild and push our Docker image whenever there is a new update to our main branch, and this code is available in .github/workflows. This greatly sped up development, as we didn’t have to manually rebuild our Docker image every time we made a small change.
I hope you enjoyed reading and that this helps you build and deploy your projects with FastAPI.
LinkedIn: https://www.linkedin.com/in/lucas-see-6b439188/
Email: [email protected]
Figures: All images, unless otherwise noted, are by the author.
Links:
- GitHub Repository for Project: https://github.com/pinstripezebra/recommender_system
- FastAPI Docs: https://fastapi.tiangolo.com/tutorial/
- Docker Tutorial: https://www.youtube.com/watch?v=b0HMimUb4f0
- Docker Desktop Download: https://www.docker.com/products/docker-desktop/
- Docker Hub: https://hub.docker.com/