A full-text search feature is important for a blog application. It can perform complex searches, retrieve results by similarity, or weight terms based on how often they appear in the text or how important different fields are.
With SQLite, this feature is limited and Django does not support it out of the box. PostgreSQL, on the other hand, has full-text search built in, so it was time to integrate a PostgreSQL database into the project.
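To give an idea of what this makes possible, here is a minimal sketch of a weighted search query using Django's django.contrib.postgres.search module; the Post model with its title and body fields is an assumption based on a typical blog app:

# A sketch of a weighted full-text search query; Post, title and body
# are hypothetical names, not part of this project (yet).
from django.contrib.postgres.search import SearchQuery, SearchRank, SearchVector

from blog.models import Post

# Matches in the title count more (weight "A") than matches in the body ("B").
vector = SearchVector("title", weight="A") + SearchVector("body", weight="B")
query = SearchQuery("django")
results = (
    Post.objects.annotate(rank=SearchRank(vector, query))
    .filter(rank__gte=0.1)
    .order_by("-rank")
)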
Installing PostgreSQL
To use a PostgreSQL database, I pulled the official PostgreSQL image from Docker Hub with docker pull postgres:16.2. After pulling the image, I ran a PostgreSQL container with:
docker run --name=blog_db -e POSTGRES_DB=blog -e POSTGRES_USER=blog -e POSTGRES_PASSWORD=super_secret -p 5432:5432 -d postgres:16.2
Legend:
docker run: This Docker command creates a container from the specified image and runs it immediately.
--name: This flag is used to name the Docker container.
-e: This flag allows you to set environment variables, such as:
POSTGRES_DB: This is an optional environment variable that sets the name of the default database created when the image is first started. If not set, the POSTGRES_USER value is used as the database name.
POSTGRES_USER: This environment variable is also optional. If POSTGRES_USER is not set, the default user is "postgres".
POSTGRES_PASSWORD: This is a required environment variable and sets the password for using the PostgreSQL database.
-p: This option binds a host port to a container port, here host port 5432 to container port 5432.
-d: This is detached mode; the container will run in the background and will not block a terminal window.
postgres:16.2: This is the name of the Docker image with its tag, where postgres is the name of the Docker image and 16.2 is the tag.

To check if the container is running, you can use docker ps. This command lists all running Docker containers with their current status and exposed ports. If you want to see all Docker containers, even the ones that have already stopped, you can use docker ps -a.
The next step was to install the psycopg PostgreSQL adapter for Python: python -m pip install psycopg==3.1.18.
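As an optional check of my own, you can confirm that the adapter reaches the container before touching the Django project; the connection parameters simply mirror the docker run flags above:

# check_db.py -- optional connectivity check, using the credentials
# from the docker run command above.
import psycopg

with psycopg.connect(
    dbname="blog",
    user="blog",
    password="super_secret",
    host="localhost",
    port=5432,
) as conn:
    # psycopg 3 allows executing queries directly on the connection
    print(conn.execute("SELECT version();").fetchone()[0])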
Dumping the existing data
All my previous data was stored in the SQLite database of my blog project. There is a way to dump all the data from the SQLite database to a file, change the database in the blog project, and import that data into the PostgreSQL database:
python manage.py dumpdata --indent=2 --output=blog_project.json --exclude=auth --exclude=contenttypes
Legend:
dumpdata: This command exports the data in the database to a file.
--indent: This flag specifies the number of indentation spaces to use in the output.
--output: This flag specifies the file to write the serialised data to.
--exclude: This flag excludes certain applications or models from being dumped.
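As a small sanity check of my own (not part of the official workflow), the dump is plain JSON, so you can count the exported objects per model before switching databases:

# check_dump.py -- counts the exported objects per model.
import json
from collections import Counter

with open("blog_project.json") as f:
    objects = json.load(f)

# every fixture entry has "model", "pk" and "fields" keys
print(Counter(obj["model"] for obj in objects))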
Changing the database
The SQLite database configuration within the settings.py file:
# blog_project/settings.py
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "blog-project.sqlite3",
    }
}
This configuration has been changed to a PostgreSQL configuration:
# blog_project/settings.py
from decouple import config

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": config("DB_NAME"),
        "USER": config("DB_USER"),
        "PASSWORD": config("DB_PASSWORD"),
        "HOST": config("DB_HOST", default="localhost"),
        "PORT": "5432",
    }
}
The database configuration variables are stored in a local environment file called .env.
# .env
# PostgreSQL
DB_NAME=blog
DB_USER=blog
DB_PASSWORD=super_secret
DB_HOST=localhost
To create the schema in the new database, run the migrations with python manage.py migrate.
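As a quick check of my own, you can confirm from python manage.py shell that Django is now talking to PostgreSQL:

# run inside: python manage.py shell
from django.db import connection

print(connection.vendor)  # expected: postgresql
with connection.cursor() as cursor:
    cursor.execute("SELECT version();")
    print(cursor.fetchone()[0])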
Loading the data into the new PostgreSQL database
We are going to load the data fixture we created earlier into our new PostgreSQL database using:
python manage.py loaddata blog_project.json
Legend:
loaddata: This command searches for the specified file and loads its contents (the fixture) into the configured database.
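To verify that everything made it across, a quick count in python manage.py shell should match the old SQLite data; the blog app and its Post model are assumptions based on a typical blog project:

# run inside: python manage.py shell
from blog.models import Post  # hypothetical app and model names

# should match the number of posts that existed in SQLite
print(Post.objects.count())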