Mike Slinn

Django Notes

Published 2021-02-12.
Time to read: about 4 minutes.

Mozilla Developer Network

Although the Django documentation is very good, the Mozilla Developer Network (MDN) has possibly the best Django tutorial.

Variables

The syntax for defining and using Django template variables is almost identical to the syntax for Jekyll / Liquid variables.
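
For instance, this hypothetical template fragment (page_title and products are made-up context variables, not from any real Frobshop view) uses Django template syntax; the variable and for-loop markup would be identical in Liquid, while the {# … #} comment would instead be written with Liquid’s {% comment %} tag:

Django template
{# page_title and products are provided by the view’s context #}
<h1>{{ page_title }}</h1>
<ul>
  {% for product in products %}
    <li>{{ product.title }}</li>
  {% endfor %}
</ul>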

Settings for Production, Development, Testing, and Continuous Integration

Django has a simple and effective mechanism for establishing settings for various environments. Common environments include production (prod), development (dev), testing (test), and continuous integration (ci). The Frobshop tutorial just sets up one settings environment. The process necessary to refine that collection of settings into multiple environments is well documented and easy to work through.
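
A common layout keeps the shared settings in frobshop/settings/base.py and adds one small module per environment that imports from it; the dev.py shown later in this article follows this pattern. The prod.py below is just a sketch of the idea, and the ALLOWED_HOSTS value is a placeholder:

frobshop/settings/prod.py
# Hypothetical sketch of a production settings module
from .base import *

DEBUG = False

# Placeholder; replace with the real production host name(s)
ALLOWED_HOSTS = ['www.example.com']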

django-admin

django-admin can be run for a specific Django webapp as manage.py. To set this up, right after running django-admin startproject to create a new Django webapp, make manage.py in the new directory executable:

Shell
$ chmod a+x manage.py

Bash Tab Completion

Tab completion for django-admin and manage.py can be enabled as follows.

Shell
$ wget -O ~/django_bash_completion \
  https://raw.githubusercontent.com/django/django/master/extras/django_bash_completion

$ echo "source django_bash_completion" >> .bashrc

$ source ~/django_bash_completion

Tab completion for django-admin and manage.py in derivative projects, such as django-oscar, is automatic.

Subcommands

Notes on django-admin / manage.py subcommands follow.

dumpdata Subcommand

The dumpdata subcommand of manage.py dumps some or all of the django-oscar webapp’s data from the database. It provides transparency into the database contents.

Both the dumpdata and loaddata subcommands have the same documentation error regarding the --database option. The documentation states that the option “Specifies the database from which data will be dumped. Defaults to default.” Instead, the documentation should say that the option “Specifies the named database entry in settings. Defaults to default.”
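
In other words, the --database option names a key of the DATABASES dictionary in the settings module. A minimal sketch follows; the engine and database name match the Frobshop PostgreSQL setup shown later by dbshell, but the user, password, host, and port values are placeholders:

frobshop/settings/base.py
DATABASES = {
    # 'default' is the named database entry that --database refers to
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'frobshop',
        'USER': 'frobshop_user',   # placeholder
        'PASSWORD': 'changeme',    # placeholder
        'HOST': 'localhost',       # placeholder
        'PORT': '5432',            # placeholder
    },
}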

By convention, database dumps should be stored in the fixtures directory of every app. However, when we make dumps for an app distributed via pip, such as django-oscar, we need to provide a different directory in which to store the fixtures. I created a directory called frobshop/fixtures/ for this purpose.

Shell
(oscar) $ mkdir frobshop/fixtures/

Following the Django documentation on fixtures, I also added the following to frobshop/settings/base.py. However, I found no benefit in doing so, except to document to whoever might read the settings that fixtures live there. The dumpdata and loaddata subcommands of manage.py do not seem to use this information.

frobshop/settings/base.py
FIXTURE_DIRS = [
    'fixtures',
]

By default, the dumpdata subcommand of manage.py returns results in JSON format. The following example dumps all data for the entire django-oscar instance, and uses jq to pretty-print the JSON result before saving into frobshop/fixtures/all.json.

Shell
(oscar) $ ./manage.py dumpdata | \
  jq > frobshop/fixtures/all.json

Jq can also parse JSON data and extract data subsets using queries. The following command saves formatted JSON for the catalogue app’s data into frobshop/fixtures/catalogue.json; the dump includes data for the models catalogue.productclass, catalogue.category, catalogue.productattribute, catalogue.attributeoptiongroup, and catalogue.attributeoption:

Shell
(oscar) $ ./manage.py dumpdata catalogue | \
  jq > frobshop/fixtures/catalogue.json

The following saves formatted JSON for all catalogue.attributeoption records into a file called frobshop/fixtures/catalogue.attributeoption.json, which I display.

Shell
(oscar) $ ./manage.py dumpdata catalogue.attributeoption | \
  jq > frobshop/fixtures/catalogue.attributeoption.json

(oscar) $ cat frobshop/fixtures/catalogue.attributeoption.json
[
  {
    "model": "catalogue.attributeoption",
    "pk": 1,
    "fields": {
      "group": 2,
      "option": "Soft red"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 3,
    "fields": {
      "group": 2,
      "option": "Soft green"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 4,
    "fields": {
      "group": 2,
      "option": "Soft blue"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 5,
    "fields": {
      "group": 2,
      "option": "Soft orange"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 6,
    "fields": {
      "group": 1,
      "option": "Unscented"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 7,
    "fields": {
      "group": 1,
      "option": "Bergamot - Bergaptene Free"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 8,
    "fields": {
      "group": 1,
      "option": "Cedarwood - Atlas"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 9,
    "fields": {
      "group": 1,
      "option": "Frankincense - India"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 10,
    "fields": {
      "group": 1,
      "option": "Ginger Root"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 11,
    "fields": {
      "group": 1,
      "option": "Lavender - natural blend"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 12,
    "fields": {
      "group": 1,
      "option": "Lemongrass"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 13,
    "fields": {
      "group": 1,
      "option": "Lemon - natural blend"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 14,
    "fields": {
      "group": 1,
      "option": "Orange - natural blend"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 15,
    "fields": {
      "group": 1,
      "option": "Spearmint"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 16,
    "fields": {
      "group": 1,
      "option": "Ylang Ylang"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 17,
    "fields": {
      "group": 1,
      "option": "Tea Tree - Organic"
    }
  },
  {
    "model": "catalogue.attributeoption",
    "pk": 18,
    "fields": {
      "group": 1,
      "option": "Turmeric - Organic"
    }
  }
] 

The following saves formatted JSON for only the catalogue.attributeoption records associated with the group having id 1 into a file called frobshop/fixtures/catalogue.attributeoption.fields.group.1.json:

Shell
(oscar) $ ./manage.py dumpdata catalogue.attributeoption | \
  jq '.[] | select(.fields.group == 1)' > \
  frobshop/fixtures/catalogue.attributeoption.fields.group.1.json

extract Script

It seems natural to pipe the JSON output of manage.py dumpdata into jq for parsing, formatting, and processing. I wrote a bash script called extract, which does that for four use cases:

  1. Dump the entire database and pretty-print the JSON.
    Shell
    (oscar) $ bin/extract
    This does the same as the above:
    Shell
    (oscar) $ bin/extract all
  2. Only dump a specified app's data. This dumps all of the catalogue app’s data:
    Shell
    (oscar) $ bin/extract catalogue
  3. Only dump a property of the specified app’s data. This dumps the attributeoption property value in the catalogue app’s data:
    Shell
    (oscar) $ bin/extract catalogue.attributeoption
  4. Only dump a specified app’s data where a jq selector matches a value. This dumps the attributeoption property values in the catalogue app’s data where the fields.group subproperty has value 1:
    Shell
    (oscar) $ bin/extract catalogue.attributeoption .fields.group 1

This is the source code for the script, which I place in the bin directory I created for the Frobshop webapp.

bin/extract
#!/bin/bash
# Dump some or all of the Django database as pretty-printed JSON.
# Syntax: extract [app [expression [value]]]

unset APP
unset EXPRESSION
unset VALUE

# First argument selects the app (or app.model) to dump; "all" or no argument dumps everything
if [ "$1" ]; then
  APP="$1"
  if [ "$APP" == all ]; then unset APP; fi
  shift
fi

# Second argument is a jq expression applied to each dumped record
if [ "$1" ]; then EXPRESSION="$1"; shift; fi

# Third argument is the value the jq expression must equal
if [ "$1" ]; then VALUE="$1"; shift; fi

#echo "APP='$APP'; EXPRESSION='$EXPRESSION'; VALUE='$VALUE'"

if [ -z "$EXPRESSION" ]; then
  ./manage.py dumpdata $APP | jq
elif [ -z "$VALUE" ]; then
  ./manage.py dumpdata $APP | jq ".[]$EXPRESSION"
else
  ./manage.py dumpdata $APP | jq ".[] | select($EXPRESSION == $VALUE)"
fi

I also made a Python version of extract, using jq.py:

bin/extract
"""
Valid call syntaxes:
extract.py
extract.py my_app
extract.py my_app.app_property
extract.py my_app.app_property .property.sub_property
extract.py my_app.app_property .property.sub_property value


It would be nice to be able to integrate this into manage.py. See
https://docs.djangoproject.com/en/3.1/howto/custom-management-commands/

One way to do this would be to override `django.contrib.admin`.
If this code was called `mslinn.admin.extras`, then it should be listed in
`INSTALLED_APPS` before `django.contrib.admin`, like this:

INSTALLED_APPS = [
    'mslinn.admin.extras',
    'django.contrib.admin',
     ...
]
"""

import json
import jq
import sys

def pprint(value):
    # value = json.loads(value)
    return json.dumps(value, indent=True, sort_keys=True)

def run(command):
    """Return the output of running command in a subshell."""
    import subprocess
    return subprocess \
               .run(command, stdout=subprocess.PIPE, shell=True, check=True) \
               .stdout \
               .decode("utf-8")


def main(argv):
    app, expression, value, *extra = argv + ['' for x in range(3-len(argv))]
    app = '' if app == 'all' else app
    if extra or app=='-h':
        sys.exit(f"Syntax: {sys.argv[0]} [app [expression [value]]]")

    dumpdata = run(f'./manage.py dumpdata {app}')

    if not expression:
        result = json.loads(dumpdata)
    elif not value:
        result = jq.compile(f'.[] | {expression}').input(text=dumpdata).all()
    else:
        result = jq.compile(f'.[] | select({expression} == {value})').input(text=dumpdata).all()

    print(pprint(result))

main(sys.argv[1:])

dbshell Subcommand

The dbshell subcommand of manage.py picks up the database connection parameters from the webapp’s Django settings and passes them to the PostgreSQL psql command. A psql subshell can be spawned:

Shell
(oscar) $ ./manage.py dbshell
psql (12.5 (Ubuntu 12.5-0ubuntu0.20.10.1))
SSL connection (protocol: TLSv1.3, cipher: TLS_AES_256_GCM_SHA384, bits: 256, compression: off)
Type "help" for help.

frobshop=# 

A single SQL statement can be executed:

Shell
(oscar) $ ./manage.py dbshell -- -tc 'select current_user'
postgres 

diffsettings Subcommand

The diffsettings subcommand displays differences between the current settings file and Django’s default settings.

Shell
(oscar) $ ./manage.py diffsettings | less

loaddata Subcommand

Round-tripping the data involves first dumping the data from the database using the dumpdata subcommand, and then loading it from the dump file using the loaddata subcommand, like this:

Shell
(oscar) $ ./manage.py loaddata frobshop/fixtures/all.json
Installed 706 object(s) from 1 fixture(s) 

The Django docs suggest that loading data from a fixture deletes pre-existing data in the affected tables. I wonder what the corner cases for this look like; for example, what happens with a fixture that affects just one property of an app?

“Each time you run loaddata, the data will be read from the fixture and re-loaded into the database. Note this means that if you change one of the rows created by a fixture and then run loaddata again, you’ll wipe out any changes you’ve made.”

To test my understanding of this, I created an empty fixture:

Shell
(oscar) $ echo '{}' > frobshop/fixtures/empty.json

Now I loaded the empty fixture:

Shell
(oscar) $ ./manage.py loaddata frobshop/fixtures/empty.json
/var/work/django/oscar/lib/python3.8/site-packages/django/core/management/commands/loaddata.py:211: RuntimeWarning: No fixture data found for 'empty'. (File format may be invalid.)
  warnings.warn(
Installed 0 object(s) from 1 fixture(s) 

(oscar) $ ./manage.py dumpdata
Lots and lots of JSON is output 

That did not go as expected. The documentation should say that records whose primary keys appear in the fixture will be overwritten. That is different from my understanding of the current phrasing, which suggests to me that the entire contents of affected tables are replaced.

runserver Subcommand

Run Django for development purposes on all IP addresses and port 8001:

Shell
(oscar) $ ./manage.py runserver 0.0.0.0:8001

By default, the development server doesn’t serve static files. To change this, read Managing static files.
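
For example, one documented approach is to append the staticfiles URL patterns in urls.py. This sketch assumes django.contrib.staticfiles is listed in INSTALLED_APPS; the helper returns no patterns unless DEBUG is True:

urls.py
from django.contrib.staticfiles.urls import staticfiles_urlpatterns

# Serve static files from the development server; only active when DEBUG is True
urlpatterns += staticfiles_urlpatterns()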

Django Debug Toolbar

The Django Debug Toolbar is a very useful tool, and is easy to set up. This is a good video on YouTube about installing Django Debug Toolbar.

The Django Debug Toolbar is only needed for development, so it should be installed after separate settings for dev have been established.

I put the Django Debug Toolbar settings in settings/dev.py. Below is my complete settings/dev.py for Frobshop; the only line not associated with Django Debug Toolbar is the one starting with EMAIL_BACKEND:

settings/dev.py
from .base import *

DEBUG = True

EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'

DEBUG_TOOLBAR_CONFIG = {
    'JQUERY_URL': '',
}

INSTALLED_APPS += [
    'debug_toolbar',
]

INTERNAL_IPS = [
    '127.0.0.1',
]

MIDDLEWARE += [
    'debug_toolbar.middleware.DebugToolbarMiddleware',
]

The only other changes required in order for the Django Debug Toolbar to operate are:

  1. Add the following to urls.py:
    urls.py
    import debug_toolbar
    
    urlpatterns = [
        # ... existing URL patterns ...
        path('__debug__/', include(debug_toolbar.urls)),
    ]
  2. Add the .dev qualifier to frobshop.settings in manage.py:
    manage.py
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'frobshop.settings.dev')

Now install Django Debug Toolbar and update requirements.txt:

Shell
(oscar) $ python -m pip install django-debug-toolbar

(oscar) $ pip freeze > requirements.txt