Wednesday, November 30, 2022

Django and SQL: Your Dynamic Duo for Scaling Databases

Scaling and optimizing databases to meet the needs of your applications can be a significant challenge. If you haven’t read my recent blog about how Django can do the heavy lifting for Python and SQL database applications, I highly recommend you check it out. But the TL;DR version is that SQL is optimized for SQL databases, Python isn’t, and Django is a fantastic intermediary to help you build more effective applications, with less friction, complexity, and code when using these two languages together.

So while Django does the heavy lifting of creating the database app, you are still responsible for the day-to-day management and monitoring of your databases. Some of these management tasks can be deferred to your cloud provider, using services like Linode Managed Databases, but you may discover new roadblocks as you scale, such as:

  • Database Migrations. Converting an existing database to a new, desired state with controlled changes to the database schema.
  • Multi-Database Deployments. To optimize performance, developers can design their applications to use separate databases for segmented functions. For example, a primary read/write database and a read replica database for common queries.
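As a sketch of what a multi-database setup looks like in Django, here is a minimal `DATABASES` setting plus a database router. The hostnames, database names, and credentials below are placeholders, not real services, and a production router would usually also define `allow_relation` and `allow_migrate`:

```python
# settings.py sketch: a primary read/write database plus a read replica.
# All hostnames and credentials here are placeholders.
DATABASES = {
    "default": {  # primary read/write database
        "ENGINE": "django.db.backends.mysql",
        "HOST": "primary.example.internal",
        "NAME": "app",
        "USER": "app",
        "PASSWORD": "change-me",
    },
    "replica": {  # read replica for common queries
        "ENGINE": "django.db.backends.mysql",
        "HOST": "replica.example.internal",
        "NAME": "app",
        "USER": "app_readonly",
        "PASSWORD": "change-me",
    },
}

class ReadReplicaRouter:
    """Route reads to the replica and writes to the primary."""

    def db_for_read(self, model, **hints):
        return "replica"

    def db_for_write(self, model, **hints):
        return "default"
```

With a router like this listed in Django’s `DATABASE_ROUTERS` setting, ordinary ORM queries go to the replica while saves go to the primary, without changing application code.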

If one of your databases uses SQL, you can use Django to reduce friction and make your life a lot easier while handling a significant amount of data.

This introduction to two key database management concepts pairs with the step-by-step instructions to building a production-ready Django application found in the Understanding Databases book and my new educational video series. Either learning path will help you get Django to do the SQL heavy lifting for you.

Database Migrations
When you’re starting out, getting the data types right for any given column can be a bit tricky, especially since your data needs will inevitably change over time. What if you wanted your title field to be just 80 characters long? What if you need to add a timestamp field so you can track exactly when items were added to the database?

Changing a table after it has been created can get pretty messy for a few reasons:

  • What do you do with pre-existing values?
  • What if pre-existing rows are missing data for new columns/fields?
  • What if you remove a column/field? What happens to the data?
  • What if you add a relation that didn’t exist before (i.e. foreign keys)?

Luckily for Django developers, we have something called makemigrations and migrate.

Let’s take a look at how it works in action.

Here’s our example Django data model:

class BlogArticle(models.Model):
    user = models.ForeignKey(User, default=1, on_delete=models.SET_DEFAULT)
    title = models.CharField(max_length=120)
    slug = models.SlugField(blank=True, null=True)
    content = models.TextField(blank=True, null=True)
    publish_timestamp = models.DateTimeField(blank=True, null=True)

Let’s add the field:

updated_by = models.ForeignKey(
    User, related_name="editor", null=True, blank=True, on_delete=models.SET_NULL
)

This field will allow us to track the last user to make a change to our model.

Let’s update our model:

class BlogArticle(models.Model):
    user = models.ForeignKey(User, default=1, on_delete=models.SET_DEFAULT)
    title = models.CharField(max_length=120)
    slug = models.SlugField(blank=True, null=True)
    content = models.TextField(blank=True, null=True)
    publish_timestamp = models.DateTimeField(blank=True, null=True)
    # our new field
    updated_by = models.ForeignKey(
        User, related_name="editor", null=True, blank=True, on_delete=models.SET_NULL
    )

Now, when we save the file in which this BlogArticle class is declared, how do we let our database know this change occurred?

There are two steps:

  1. python manage.py makemigrations
  2. python manage.py migrate

Let’s discuss what these two commands do:

python manage.py makemigrations

python manage.py makemigrations looks for changes to your models across all of the apps in your Django project. If changes are found, a new Python file will be created with the proposed changes that our SQL database needs to make. The proposed changes look something like:

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('articles', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='blogarticle',
            name='updated_by',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='editor', to=settings.AUTH_USER_MODEL),
        ),
    ]

This, of course, is just another Python file. This file lets us (the developers) know what should happen in our database. It’s written in Python and not SQL to maintain cohesion and to leverage the Django ORM’s built-in features.

But why is this a file describing what should happen? Well, there are a few reasons:

  • If we need to review what should happen before it does, we can catch it here.
  • The makemigrations command doesn’t check with the database to see if this change can even happen.
  • The database may have already been modified to fit these requirements (depending on a number of factors related to who/what is managing the database).
  • If we need to run tests prior to changing a production database, right now would be a great time to do so.

Assuming this change is valid (as far as we can tell), we can commit the change:

python manage.py migrate

python manage.py migrate will attempt to change our database for us: all fields, columns, tables, foreign keys, you name it. Django will do the work to help ensure the database is updated in the way we intended.

It’s important to note that Django can fail to make these changes for a number of reasons. For new Django developers, this is almost always due to adding and removing fields and columns and failing to run migrations correctly.

When done correctly, python manage.py migrate ensures a stable system that matches our Python code with our SQL tables, allowing us all the awesomeness that both Django and SQL databases provide.
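To build intuition for what migrate is doing, here is a toy sketch of the bookkeeping, not Django’s actual implementation (which lives in django.db.migrations and records applied migrations in a django_migrations table). The migration names and SQL strings below are illustrative only:

```python
# Toy sketch: apply each recorded migration exactly once, in order,
# tracking which ones have already been applied.
applied = set()  # Django persists this in a django_migrations table

migration_plan = [
    ("0001_initial", "CREATE TABLE articles_blogarticle (...)"),
    ("0002_blogarticle_updated_by", "ALTER TABLE articles_blogarticle ADD ..."),
]

def migrate(run_sql):
    """Apply every pending migration, skipping ones already applied."""
    for name, sql in migration_plan:
        if name not in applied:
            run_sql(sql)       # hand the generated SQL to the database
            applied.add(name)  # remember it so re-running migrate is a no-op
```

Because applied migrations are remembered, running migrate twice is safe: the second run finds nothing pending and does nothing.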

How does this give us more flexibility?

Python has broad applications where SQL doesn’t; Structured Query Language has its limitations written right in the name. Who’s creating Pixar animations with just SQL?

OK, all of this is to say that the simplicity of Python actually helps developers adopt the power of SQL, possibly without even realizing it.

Why Managed Databases and Django Make Sense

When it comes to creating web applications, including with Django, you’ll need to decide a few things:

  • Which data storage solution(s) do we want? MySQL, Postgres, MongoDB, Redis, Object Storage, etc.
  • How do we run/integrate with the data storage solution?
  • How do we recover from interruption or downtime?
  • How do we maintain the storage solution?
  • How do we secure our storage solution?
  • How do we back up our storage solution?

The answers to these questions may change as your project grows in complexity, but they all start in the same place: deciding between self-managed versus third-party managed.


Self-managed databases:

  • Pros: Control and cost.
  • (Significant) Con: You’re responsible for everything.

Managed services typically cost more money from the start, whereas self-managing means you can use your preferred Linux distro that’s somehow (or somewhat) optimized for what you need. This could include running a forked version of MySQL that your team has modified. You may save dollars on running your service, but it will always take more time to maintain.

Third-party managed databases: 

Yes, it can be slightly more expensive in dollars and cents, but it will take significantly less time to maintain. This option and managed data storage solutions are my de facto choice for my web applications. In this example, we’re already using Django to handle database transactions. SQLAlchemy shares this strength as well, as it’s used with frameworks such as FastAPI, Flask, and many others. If you’re already outsourcing your SQL writing to a Python package, why not outsource running your SQL servers?

Now, given the effectiveness of Python ORMs (like Django ORM and SQLAlchemy), I recommend that you use managed database and/or managed data storage services whenever possible. Here’s what you stand to gain if you do:

  • Reduced development time
  • Reduced management time
  • Reduced recovery time
  • Reduced service interruptions
  • Reduced deployment and development complexity
  • Reduced complexity in database migrations (from other services)
  • Reduced repetitive/ineffective/inefficient actions for SQL developers
  • Reduced DevOps/Ops complexity
  • Increased effectiveness of non-SQL developers
  • Increased deployment and development velocity
  • Increased reliability (often backed by a Service Level Agreement)
  • Increased security
  • Increased maintainability
  • Increased backups and redundancy
  • Marginal increase in cost

I made the list above with the mindset of using a Managed MySQL Database Cluster on Linode as well as Linode Object Storage (for storing files like CSS, JavaScript, images, videos, etc.). Practically speaking, using these services helps us keep our focus on building an excellent web application with Django, FastAPI, Flask, Node.js, or whatever. To put it another way: we shift the focus to building the tools and software your users actually want. You know, where the real value is to them.

MySQL, PostgreSQL, Redis, and Django

For a long time, Django’s primary database was PostgreSQL. I’d argue this is largely due to the fact that you could use a JSONField with Postgres only. With Django 3.2+ and MySQL 5.7.8+, the JSONField is now available for MySQL as well.

Why is this important?

Storing unstructured data, like JSON, is often required when handling user-generated content or storing data from other API services. Let’s see how:

from django.db import models

class Pet(models.Model):
    name = models.CharField(max_length=200)
    data = models.JSONField(null=True)

    def __str__(self):
        return self.name

Here’s the data I want to store in relation to this Pet:

pet1 = {
    "name": "Bruno",
    "type": "Rat",
    "nickname": "We don't talk about it",
    "age": 2,
    "age_interval": "months"
}

pet2 = {
    "name": "Tom",
    "type": "Cat",
    "breed": "Mixed",
    "age": 4,
    "age_interval": "years",
    "favorite_food": [{"brand": "Acme", "flavor": "Tuna"}]
}

pet3 = {
    "name": "Stewey",
    "type": "Dog",
    "breed": "unknown",
    "age": 34,
    "age_interval": "dog years",
    "nickname": "Soccer"
}
This data shows us when we might want a JSONField. We can store all the pet names (using the name key) and keep the rest stored in the JSONField. The cool thing about JSONFields is that they can be queried much like any other standard Django field, even with these varying schemas.
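In Django, such a lookup reads like Pet.objects.filter(data__type="Cat"), assuming the JSON field is named data. To see the underlying idea without a running Django project, here is a self-contained sketch using Python’s bundled SQLite and its json_extract function (MySQL and Postgres expose similar JSON functions; the table and column names here are illustrative):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pet (name TEXT, data TEXT)")  # JSON stored as text

pets = [
    ("Bruno", {"type": "Rat", "age": 2, "age_interval": "months"}),
    ("Tom", {"type": "Cat", "age": 4, "age_interval": "years"}),
    ("Stewey", {"type": "Dog", "age": 34, "age_interval": "dog years"}),
]
conn.executemany(
    "INSERT INTO pet VALUES (?, ?)",
    [(name, json.dumps(data)) for name, data in pets],
)

# Query by a key inside the JSON blob, even though schemas vary per row.
cats = conn.execute(
    "SELECT name FROM pet WHERE json_extract(data, '$.type') = ?", ("Cat",)
).fetchall()
print(cats)  # [('Tom',)]
```

This is roughly what the ORM compiles a JSONField key lookup into, which is why the varying per-row schemas are not a problem.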

There’s an ongoing debate among Django developers as to which database to use: MySQL or PostgreSQL. For the longest time, I always opted for PostgreSQL because the JSONField was only available on PostgreSQL, and that’s no longer the case. I say pick one and stick with it until it no longer serves your needs.

But what do we use Redis for?

Redis is an in-memory datastore that is incredibly fast and often used as a temporary database (more on this in a second), a caching service, and/or a messaging queue. The reason I call it a temporary database is that it’s in-memory. Memory is typically more expensive than disk storage, which often makes long-term in-memory storage impractical.

My primary use cases for Redis and Django are caching and queuing:

Caching: Let’s say you have a number of web pages that users visit a lot. You want the data on those pages shown to users as quickly as possible. Redis, as a caching system for Django, makes doing this incredibly easy. The data within those pages might be rendered from a SQL database, but Redis can store that rendered data in the cache. In other words, using Redis with SQL can often speed up your responses while reducing the number of queries to your SQL databases.
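A minimal sketch of that cache-aside pattern, using a plain dict to stand in for Redis (a real deployment would use a client like redis-py, or Django’s cache framework; the function and variable names here are illustrative):

```python
import time

cache = {}  # stand-in for Redis
CACHE_TTL = 60  # seconds to keep a rendered page around

def render_page_from_db(slug):
    # Stand-in for an expensive SQL query plus template rendering.
    return f"<h1>{slug}</h1>"

def get_page(slug):
    entry = cache.get(slug)
    if entry is not None and entry[1] > time.time():
        return entry[0]  # cache hit: no SQL query needed
    html = render_page_from_db(slug)  # cache miss: hit the database
    cache[slug] = (html, time.time() + CACHE_TTL)  # store for next time
    return html
```

The first request for a page pays the full database cost; every request within the TTL window is served straight from memory.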

Queuing: Another popular use case of Redis is to offload long-running tasks to another process (often via a Python package called Celery). When you need to do this, you can use Redis as a queue of the tasks to be completed at another time.

For example, if you have a user who needs a report of all of their transactions for the past five years, the software might take hours to actually generate that report. Obviously, no one is going to stare at a machine for hours. So we would offload this request from our user to a Redis queue. Once in Redis, we can have a worker process running (like using Celery with Django) to actually generate the report. Once the report is done, no matter how long it took, the user would be notified. This notification, as with other notifications, could also be done through a Redis queue coupled with a Celery/Django worker process.
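That flow can be sketched with a plain deque standing in for the Redis list and a function standing in for the worker. None of the names below are Celery’s actual API; this is just the shape of the producer/consumer hand-off:

```python
from collections import deque

task_queue = deque()  # stand-in for a Redis list used as a task queue

def enqueue_report(user_id, years=5):
    # The web request returns immediately after queueing the slow job.
    task_queue.append({"task": "transaction_report", "user_id": user_id, "years": years})

def worker_step():
    # A separate worker process (e.g. Celery) would loop on this,
    # pulling one task at a time and notifying the user when done.
    if not task_queue:
        return None
    task = task_queue.popleft()
    return f"report ready for user {task['user_id']}"
```

The key point is that the web process and the worker process only share the queue, so a slow report never blocks a request/response cycle.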

This is all to say that Redis and MySQL complement each other very well. You can deploy a self-managed Redis database server through the Linode Marketplace.

Object Storage

The final data-related managed service I recommend using is Linode Object Storage. Object Storage is responsible for all the other kinds of data you might need to store. For example, we would not store all the bytes of a video in MySQL. Instead, we’d store metadata related to that video and store the video itself in Object Storage.

Here are a few things you’ll use object storage for:

  • Cascading Style Sheets (CSS)
  • JavaScript (like React.js, Vue.js, Vanilla.js, etc.)
  • Videos
  • Images (raw and compressed)
  • CSVs, XLSX
  • Database backups
  • Docker container image layers (if self-managed)
  • Iterations of trained machine learning algorithms
  • Terraform state files
  • PDFs (both large and small)
  • Any persistent file that needs to be downloaded (or uploaded) often


After reading this, I hope you’re feeling motivated to leverage the power of managed services with your web application projects. Django is an excellent solution for building web apps on top of SQL databases, but it’s certainly not the only one. Even if you want to dive into the internals of SQL and SQL servers, I think it’s a worthwhile exercise to see how many successful applications leverage Django to handle the bulk of the work.

Here are a few (okay, many) highlights that make Django with Managed MySQL on Linode awesome:

  • Django does the heavy SQL lifting for you (so do tools like SQLAlchemy for Flask/FastAPI)
  • Django allows raw SQL commands too (again, so do tools like SQLAlchemy)
  • Django helps beginners learn SQL commands
  • Django has built-in support for MySQL and PostgreSQL (along with a db-specific Python client)
  • Increases speed to production deployments
  • Increased reliability and recoverability
  • Enables development and production environments to match database technology almost exactly
  • Makes container-based Django easier and more reliable
  • Unlocks scaling from a single-node deployment to multi-node, or even a full-fledged transition to Kubernetes
  • Easier for new Django/Python developers to use production-grade systems
  • Sharing databases across multiple Python-based apps is easier and safer (such as a FastAPI application reading/writing from/to a Django-managed MySQL database)
  • Django’s JSONField is now supported using MySQL (previously only PostgreSQL)
  • Easy to test (during CI/CD or in local development environments)
  • Scales to meet Django’s demands
  • Support for multiple databases in a single Django project, such as using MySQL as the primary read/write database and a MySQL read replica database for common queries
  • Strict access controls (Linode private IPs, local development)
  • Requires SSL certificates for connection (adds complexity to deployments but also increases security)
  • Enables private connections (in the same region; lowers connection costs)

If you’re interested in a modern approach to deploying Django applications on Linode along with a managed MySQL database, GitHub Actions for CI/CD, Terraform, and Ansible, jump into tons of free step-by-step educational content:

To help get you started, the Coding for Entrepreneurs GitHub has a repository of code that goes with each step in the series. Good luck, and be sure to let me know how things are going via Twitter @JustinMitchel.


