Django 2 Web Development Cookbook - Third Edition

By Jake Kronika and Aidas Bendoraitis

About this book

Django is a framework designed to balance rapid web development with high performance. It handles high levels of user traffic and interaction, integrates with a variety of databases, and collects and processes data in real time. This book follows a task-based approach to guide you through developing with the Django 2.1 framework, starting with setting up and configuring Docker containers and a virtual environment for your project.

You'll learn how to write reusable pieces of code for your models and manage database changes. You'll work with forms and views to enter and list data, applying practical examples using templates and JavaScript together for the optimum user experience. This cookbook helps you to adjust the built-in Django administration to fit your needs and sharpen security and performance to make your web applications as robust, scalable, and dependable as possible. You'll also explore integration with Django CMS, the popular content management suite.

In the final chapters, you'll learn programming and debugging tricks and discover how collecting data from different sources and providing it to others in various formats can be a breeze. By the end of the book, you'll know how to test and deploy projects to a remote dedicated server and scale your application to meet user demands.

Publication date:
October 2018
Publisher
Packt
Pages
544
ISBN
9781788837682

 

Chapter 1. Getting Started with Django 2.1

In this chapter, we will cover the following topics:

  • Working with a virtual environment
  • Creating a virtual environment project file structure
  • Working with Docker
  • Creating a Docker project file structure
  • Handling project dependencies with pip
  • Including external dependencies in your project
  • Configuring settings for development, testing, staging, and production environments
  • Defining relative paths in the settings
  • Creating and including local settings
  • Setting up STATIC_URL dynamically for Subversion users
  • Setting up STATIC_URL dynamically for Git users
  • Setting UTF-8 as the default encoding for MySQL configuration
  • Setting the Subversion ignore property
  • Creating a Git ignore file
  • Deleting Python-compiled files
  • Respecting the import order in Python files
  • Creating app configuration
  • Defining overwritable app settings
 

Introduction


In this chapter, we will see a few good practices when starting a new project with Django 2.1 on Python 3. Some of the tricks introduced here are the best ways to deal with the project layout, settings, and configurations, whether using virtualenv or Docker to manage your project. However, for some tricks, you might want to find some alternatives online or in other books about Django. Feel free to evaluate and choose the best bits and pieces for yourself while digging deep into the Django world.

We are assuming that you are already familiar with the basics of Django, Subversion and Git version control, MySQL and PostgreSQL databases, and command-line usage. Also, we assume that you are using a Unix-based operating system, such as macOS or Linux. It makes more sense to develop with Django on Unix-based platforms, as websites will most likely be published on a similar server; you can therefore establish routines that work the same way in development as in deployment. If you are working locally with Django on Windows, the routines are similar; however, they are not always the same.

Using Docker for your development environment, regardless of your local platform, can improve the portability of your applications through deployment, since the environment within the Docker container can be matched precisely to that of your deployment server. Finally, whether developing with Docker or not, we assume that you have the appropriate version control system and database server already installed to your local machine.

Note

You can download the example code files for all Packt books that you have purchased from your account at http://www.packtpub.com. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register in order to have the files emailed directly to you.

 

Working with a virtual environment


It is very likely that you will develop multiple Django projects on your computer. Some modules, such as Pillow (the friendly fork of the Python Imaging Library) and mysqlclient (a fork of MySQLdb), can be installed once and then shared across all projects. Other modules, such as Django, third-party Python libraries, and Django apps, will need to be kept isolated from each other. The virtualenv tool is a utility that keeps each Python project's packages in its own realm. In this recipe, we will see how to use it.

Getting ready

To manage Python packages, you will need pip. It is included in your Python installation if you are using Python 3.4+. If you are using another version of Python, install pip by executing the installation instructions at http://pip.readthedocs.org/en/stable/installing/. Let's install the shared Python modules, Pillow and MySQLdb, and the virtualenv utility, using the following commands:

$ sudo pip3 install Pillow~=5.2.0
$ sudo pip3 install mysqlclient~=1.3.0
$ sudo pip3 install virtualenv~=16.0.0

How to do it...

Once you have your prerequisites installed, create a directory where all your Django projects will be stored, for example, virtualenvs under your home directory. Perform the following steps after creating the directory:

  1. Go to the newly created directory and create a virtual environment that uses the shared system site packages:
$ cd ~/virtualenvs
$ mkdir myproject_env
$ cd myproject_env
$ virtualenv --system-site-packages .
Using base prefix '/usr/local'
New python executable in ./bin/python3.6
Also creating executable in ./bin/python
Installing setuptools, pip, wheel...done.
  2. To use your newly created virtual environment, you need to execute the activation script in your current shell. This can be done with the following command:
$ source bin/activate
  3. Depending on the shell you are using, the source command may not be available. Another way to source a file is with the following command, which has the same result (note the space between the dot and bin):
$ . bin/activate
  4. You will see that the prompt of the command-line tool gets a prefix of the project name, as follows:
(myproject_env)$
  5. To get out of the virtual environment, type the following command:
(myproject_env)$ deactivate

How it works...

When you create a virtual environment, a few specific directories (bin, include, and lib) are created in order to store a copy of the Python installation, and some shared Python paths are defined. When the virtual environment is activated, whatever you install with pip or easy_install will go into, and be used from, the virtual environment's site packages, rather than the global site packages of your Python installation.
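This isolation can be observed from within Python itself. The following is a minimal sketch (not from the book) that reports whether the running interpreter belongs to a virtual environment, by comparing sys.prefix against the base installation prefix:

```python
import sys

def in_virtualenv():
    """Heuristic check for an active virtual environment.

    Classic virtualenv sets sys.real_prefix; the newer venv-style
    layout instead makes sys.prefix differ from sys.base_prefix.
    """
    if getattr(sys, "real_prefix", None) is not None:
        return True
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print("Inside a virtual environment:", in_virtualenv())
```

Running this before and after `source bin/activate` is a quick way to confirm that the activation script took effect.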

To install the latest Django 2.1.x in your virtual environment, type the following command:

(myproject_env)$ pip3 install "Django~=2.1.0"

See also

  • The Creating a virtual environment project file structure recipe
  • The Working with Docker recipe
  • The Deploying on Apache with mod_wsgi recipe in Chapter 12, Testing and Deployment
 

Creating a virtual environment project file structure


A consistent file structure for your projects makes you well organized and more productive. When you have the basic workflow defined, you can get in the business logic more quickly and create awesome projects.

Getting ready

If you haven't done this yet, create a virtualenvs directory, where you will keep all your virtual environments (read about this in the Working with a virtual environment recipe). This can be created under your home directory.

Then, create a directory for your project's environment, for example, myproject_env. Start the virtual environment in it. We would suggest adding a commands directory for local shell scripts that are related to the project, a db_backups directory for database dumps, and a project directory for your Django project. Also, install Django in your virtual environment if you haven't already done so.

How to do it...

Follow these steps in order to create a file structure for your project:

  1. With the virtual environment activated, go to the project directory and start a new Django project as follows:
(myproject_env)$ django-admin.py startproject myproject

For clarity, we will rename the newly created directory django-myproject. This is the directory that you will put under version control, therefore, it will have .git, .svn, or similar subdirectories.

  2. In the django-myproject directory, create a README.md file to describe your project to new developers. You can also put the pip requirements with the Django version and include other external dependencies (read about this in the Handling project dependencies with pip recipe).
  3. The django-myproject directory will also contain the following:
    • Your project's Python package, named myproject
    • Django apps (we recommend having an app called utils for different functionalities that are shared throughout the project)
    • A locale directory for your project translations if it is multilingual
    • The externals directory for external dependencies that are included in this project if you decide not to use pip requirements
  4. In your project's root, django-myproject, create the following:
    • A media directory for project uploads
    • A site_static directory for project-specific static files
    • A static directory for collected static files
    • A tmp directory for the upload procedure
    • A templates directory for project templates
  5. The myproject directory should contain your project settings in settings.py and a config directory (read about this in the Configuring settings for development, testing, staging, and production environments recipe), as well as the urls.py URL configuration.
  6. In your site_static directory, create the site directory as a namespace for site-specific static files. Then, divide the static files between categorized subdirectories in it. For instance, see the following:
    • scss for Sass files (optional)
    • css for the generated minified Cascading Style Sheets (CSS)
    • img for styling images and logos
    • js for JavaScript and any third-party modules that combine all types of files, such as the TinyMCE rich-text editor
  7. Besides the site directory, the site_static directory might also contain overwritten static directories of third-party apps, for example, cms overwriting static files from Django CMS. To generate the CSS files from Sass and to minify the JavaScript files, you can use applications with a graphical user interface, such as CodeKit or Prepros.
  8. Put your templates, separated by app, in your templates directory. If a template file represents a page (for example, change_item.html or item_list.html), then put it directly in the app's template directory. If the template is included in another template (for example, similar_items.html), put it in the includes subdirectory. Also, your templates directory can contain a directory called utils for globally reusable snippets, such as pagination and a language chooser.
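As a convenience, the directory skeleton described in the steps above could be generated with a short script. This is an illustrative sketch, not part of the book's recipe; the directory list mirrors the recipe, and the scaffold() helper is hypothetical:

```python
from pathlib import Path

# Subdirectories of the project root, as described in the recipe
PROJECT_DIRS = [
    "externals/apps", "externals/libs", "locale", "media",
    "site_static/site/scss", "site_static/site/css",
    "site_static/site/img", "site_static/site/js",
    "static", "templates/utils", "tmp",
]

def scaffold(root="django-myproject"):
    """Create the directory skeleton and return the created paths."""
    base = Path(root)
    created = []
    for rel in PROJECT_DIRS:
        path = base / rel
        path.mkdir(parents=True, exist_ok=True)
        created.append(path.as_posix())
    return created

# scaffold() would create the tree under ./django-myproject
```

The app directories themselves are still created by django-admin (or startapp), so they are deliberately left out of the list.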

How it works...

The whole file structure for a complete project in a virtual environment will look similar to the following:

myproject_env/
├── bin/
├── commands/
├── db_backups/
├── include/
├── lib/
└── project/
    └── django-myproject/
        ├── externals/
        │   ├── apps/
        │   └── libs/
        ├── locale/
        ├── media/
        ├── myapp1/
        ├── myapp2/
        ├── myproject/
        │   ├── config/
        │   │   ├── __init__.py
        │   │   ├── base.py
        │   │   ├── dev.py
        │   │   ├── prod.py
        │   │   ├── staging.py
        │   │   └── test.py
        │   ├── tmp/
        │   ├── __init__.py
        │   ├── settings.py
        │   ├── settings.py.example
        │   ├── urls.py
        │   └── wsgi.py
        ├── requirements/
        │   ├── dev.txt
        │   ├── prod.txt
        │   ├── staging.txt
        │   └── test.txt
        ├── site_static/
        │   └── site/
        │       ├── css/
        │       ├── img/
        │       └── js/
        ├── static/
        ├── templates/
        │   ├── admin/
        │   ├── myapp1/
        │   │   └── includes/
        │   └── myapp2/
        │       └── includes/
        ├── utils/
        │   ├── __init__.py
        │   └── misc.py
        ├── README.md
        ├── fabfile.py
        └── manage.py*

See also

  • The Handling project dependencies with pip recipe
  • The Including external dependencies in your project recipe
  • The Configuring settings for development, testing, staging, and production environments recipe
  • The Deploying on Apache with mod_wsgi recipe in Chapter 12, Testing and Deployment
 

Working with Docker


Sometimes, more flexibility is needed across projects than simply differentiating Python package versions. For example, it might be necessary to support an application on an existing version of Python itself, or perhaps of MySQL, while simultaneously developing an update that relies upon a newer version of the software. Docker is capable of that level of isolation.

Docker is a system for creating configured, customized, isolated environments called containers. It allows you to duplicate the setup of any production server precisely. In some cases, it is even possible to deploy pre-built containers directly to remote servers as well.

Getting ready

First, you will need to install the Docker Engine, following the instructions at https://www.docker.com/get-started. This usually includes the Compose tool, which makes it simple to manage systems that require multiple containers, ideal for a fully isolated Django project. If needed, installation details for Compose are available at https://docs.docker.com/compose/install/.

How to do it...

With Docker and Compose installed, we will start by creating a myproject_docker directory. Within this, create subdirectories named apps, config, media, project, static, and templates. Then, we will create three configuration files:

  • A requirements.txt file defining Python dependencies, under the config directory
  • Dockerfile for the Django application container, in the myproject_docker root
  • A docker-compose.yml file identifying all of the services making up the application environment, also in the myproject_docker root

The requirements.txt file, which lives under the config subdirectory, is much the same as it would be when using a virtual environment (see the Working with a virtual environment recipe), though here we include all dependencies, not just those that differ from other projects. Because we are likely trying to match our Docker environment to that of the production server, we will generally require very specific versions of each module. In this case, we limit each package to the latest patch release within a minor version range. For example, here, we would prefer mysqlclient 1.3.13 over mysqlclient 1.3.3, but we would not yet upgrade to mysqlclient 1.4.0:

# config/requirements.txt
Pillow~=5.2.0
mysqlclient~=1.3.0
Django~=2.1.0
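The ~= operator is the compatible-release specifier from PEP 440: ~=1.3.0 allows any 1.3.x release at or above 1.3.0, but never 1.4.0. The following is a simplified sketch of that rule for plain numeric versions (pip itself applies the full PEP 440 semantics via its bundled packaging library):

```python
def satisfies_compatible(version, spec):
    """Simplified check for 'version ~= spec' with plain numeric versions:
    version must be >= spec and must share spec's prefix with the final
    segment dropped (so ~=1.3.0 pins the 1.3.* series)."""
    v = [int(part) for part in version.split(".")]
    s = [int(part) for part in spec.split(".")]
    prefix = s[:-1]
    return v >= s and v[:len(prefix)] == prefix

print(satisfies_compatible("1.3.13", "1.3.0"))  # True: a newer patch in 1.3.*
print(satisfies_compatible("1.3.3", "1.3.0"))   # True
print(satisfies_compatible("1.4.0", "1.3.0"))   # False: outside the 1.3.* series
```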

Dockerfile will define how to build the environment within the container:

# Dockerfile
FROM python:3
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        mysql-client libmysqlclient-dev
WORKDIR /usr/src/app
ADD config/requirements.txt ./
RUN pip3 install --upgrade pip; \
    pip3 install -r requirements.txt
RUN django-admin startproject myproject .; \
    mv ./myproject ./origproject

We start with the official image for Python 3, install some dependencies for MySQL, set our working directory, add and install Python requirements, and then start a Django project.

Finally, docker-compose.yml puts together the Django application container with other services, such as a MySQL database, so that we can run them together with ease:

# docker-compose.yml
version: '3'
services:
  db:
    image: 'mysql:5.7'
  app:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8000
    volumes:
      - './project:/usr/src/app/myproject'
      - './media:/usr/src/app/media'
      - './static:/usr/src/app/static'
      - './templates:/usr/src/app/templates'
      - './apps/external:/usr/src/app/external'
      - './apps/myapp1:/usr/src/app/myapp1'
      - './apps/myapp2:/usr/src/app/myapp2'
    ports:
      - '8000:8000'
    links:
      - db

As we can see in the volumes section, we will also need to add subdirectories within myproject_docker named project, media, static, and templates, plus each of the apps for the project. These directories will house the code, configuration, and other resources that are exposed within the container.

How it works...

With our basic configuration in place, we can now issue commands to Docker to build and start up our services. If the system we built was using only Dockerfile, this could be done without Compose, using direct docker engine commands. However, in a Compose setup there is a special docker-compose wrapper command that makes it easier to coordinate multiple interconnected containers.

The first step is to build our containers, as defined by the docker-compose.yml file. The first time that you build, any images used as starting points need to be loaded locally, and then each instruction in the Dockerfile is performed sequentially within the resultant machine:

myproject_docker/$ docker-compose build
db uses an image, skipping
Building app
Step 1/6 : FROM python:3
3: Pulling from library/python
f49cf87b52c1: Pull complete
7b491c575b06: Pull complete
b313b08bab3b: Pull complete
51d6678c3f0e: Pull complete
09f35bd58db2: Pull complete
0f9de702e222: Pull complete
73911d37fcde: Pull complete
99a87e214c92: Pull complete
Digest: sha256:98149ed5f37f48ea3fad26ae6c0042dd2b08228d58edc95ef0fce35f1b3d9e9f
Status: Downloaded newer image for python:3
 ---> c1e459c00dc3
Step 2/6 : RUN apt-get update && apt-get install -y --no-install-recommends mysql-client libmysqlclient-dev
 ---> Running in 385946c3002f
Get:1 http://security.debian.org jessie/updates InRelease [63.1 kB]
Ign http://deb.debian.org jessie InRelease
Get:2 http://deb.debian.org jessie-updates InRelease [145 kB]
Get:3 http://deb.debian.org jessie Release.gpg [2434 B]
Get:4 http://deb.debian.org jessie Release [148 kB]
Get:5 http://security.debian.org jessie/updates/main amd64 Packages [607 kB]
Get:6 http://deb.debian.org jessie-updates/main amd64 Packages [23.1 kB]
Get:7 http://deb.debian.org jessie/main amd64 Packages [9064 kB]
Fetched 10.1 MB in 10s (962 kB/s)
Reading package lists...
Reading package lists...
Building dependency tree...
Reading state information...
The following extra packages will be installed:
  libdbd-mysql-perl libdbi-perl libmysqlclient18 libterm-readkey-perl
  mysql-client-5.5 mysql-common
Suggested packages:
  libclone-perl libmldbm-perl libnet-daemon-perl libsql-statement-perl
The following NEW packages will be installed:
  libdbd-mysql-perl libdbi-perl libterm-readkey-perl mysql-client
  mysql-client-5.5
The following packages will be upgraded:
  libmysqlclient-dev libmysqlclient18 mysql-common
3 upgraded, 5 newly installed, 0 to remove and 8 not upgraded.
Need to get 4406 kB of archives.
After this operation, 39.8 MB of additional disk space will be used.
Get:1 http://security.debian.org/ jessie/updates/main libmysqlclient-dev amd64 5.5.59-0+deb8u1 [952 kB]
Get:2 http://deb.debian.org/debian/ jessie/main libdbi-perl amd64 1.631-3+b1 [816 kB]
Get:3 http://security.debian.org/ jessie/updates/main mysql-common all 5.5.59-0+deb8u1 [80.2 kB]
Get:4 http://deb.debian.org/debian/ jessie/main libdbd-mysql-perl amd64 4.028-2+deb8u2 [119 kB]
Get:5 http://security.debian.org/ jessie/updates/main libmysqlclient18 amd64 5.5.59-0+deb8u1 [674 kB]
Get:6 http://deb.debian.org/debian/ jessie/main libterm-readkey-perl amd64 2.32-1+b1 [28.0 kB]
Get:7 http://security.debian.org/ jessie/updates/main mysql-client-5.5 amd64 5.5.59-0+deb8u1 [1659 kB]
Get:8 http://security.debian.org/ jessie/updates/main mysql-client all 5.5.59-0+deb8u1 [78.4 kB]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 4406 kB in 5s (768 kB/s)
(Reading database ... 21636 files and directories currently installed.)
Preparing to unpack .../libmysqlclient-dev_5.5.59-0+deb8u1_amd64.deb ...
Unpacking libmysqlclient-dev (5.5.59-0+deb8u1) over (5.5.58-0+deb8u1) ...
Preparing to unpack .../mysql-common_5.5.59-0+deb8u1_all.deb ...
Unpacking mysql-common (5.5.59-0+deb8u1) over (5.5.58-0+deb8u1) ...
Preparing to unpack .../libmysqlclient18_5.5.59-0+deb8u1_amd64.deb ...
Unpacking libmysqlclient18:amd64 (5.5.59-0+deb8u1) over (5.5.58-0+deb8u1) ...
Selecting previously unselected package libdbi-perl.
Preparing to unpack .../libdbi-perl_1.631-3+b1_amd64.deb ...
Unpacking libdbi-perl (1.631-3+b1) ...
Selecting previously unselected package libdbd-mysql-perl.
Preparing to unpack .../libdbd-mysql-perl_4.028-2+deb8u2_amd64.deb ...
Unpacking libdbd-mysql-perl (4.028-2+deb8u2) ...
Selecting previously unselected package libterm-readkey-perl.
Preparing to unpack .../libterm-readkey-perl_2.32-1+b1_amd64.deb ...
Unpacking libterm-readkey-perl (2.32-1+b1) ...
Selecting previously unselected package mysql-client-5.5.
Preparing to unpack .../mysql-client-5.5_5.5.59-0+deb8u1_amd64.deb ...
Unpacking mysql-client-5.5 (5.5.59-0+deb8u1) ...
Selecting previously unselected package mysql-client.
Preparing to unpack .../mysql-client_5.5.59-0+deb8u1_all.deb ...
Unpacking mysql-client (5.5.59-0+deb8u1) ...
Setting up mysql-common (5.5.59-0+deb8u1) ...
Setting up libmysqlclient18:amd64 (5.5.59-0+deb8u1) ...
Setting up libmysqlclient-dev (5.5.59-0+deb8u1) ...
Setting up libdbi-perl (1.631-3+b1) ...
Setting up libdbd-mysql-perl (4.028-2+deb8u2) ...
Setting up libterm-readkey-perl (2.32-1+b1) ...
Setting up mysql-client-5.5 (5.5.59-0+deb8u1) ...
Setting up mysql-client (5.5.59-0+deb8u1) ...
Processing triggers for libc-bin (2.19-18+deb8u10) ...
Removing intermediate container 385946c3002f
 ---> 6bca605a6e41
Step 3/6 : WORKDIR /usr/src/app
Removing intermediate container 3b23729581ef
 ---> 75bf10f0bee4
Step 4/6 : ADD config/requirements.txt ./
 ---> 31a62236f4b9
Step 5/6 : RUN pip3 install --upgrade pip; pip3 install -r requirements.txt
 ---> Running in 755a1b397b5d
Requirement already up-to-date: pip in /usr/local/lib/python3.6/site-packages
Collecting Pillow~=5.2.0 (from -r requirements.txt (line 2))
  Downloading Pillow-5.2.0-cp36-cp36m-manylinux1_x86_64.whl (5.9MB)
Collecting mysqlclient~=1.3.0 (from -r requirements.txt (line 3))
  Downloading mysqlclient-1.3.0.tar.gz (76kB)
Collecting Django~=2.1.0 (from -r requirements.txt (line 4))
  Downloading Django-2.1.1-py3-none-any.whl (7.1MB)
Collecting pytz (from Django~=2.1.0->-r requirements.txt (line 4))
  Downloading pytz-2017.3-py2.py3-none-any.whl (511kB)
Building wheels for collected packages: mysqlclient
  Running setup.py bdist_wheel for mysqlclient: started
  Running setup.py bdist_wheel for mysqlclient: finished with status 'done'
  Stored in directory: /root/.cache/pip/wheels/0e/11/a1/e81644c707456461f470c777f13fbd11a1af8eff0ca71aaca0
Successfully built mysqlclient
Installing collected packages: Pillow, mysqlclient, pytz, Django
Successfully installed Django-2.1.1 Pillow-5.2.0 mysqlclient-1.3.0 pytz-2017.3
Removing intermediate container 755a1b397b5d
 ---> 12308a188504
Step 6/6 : RUN django-admin startproject myproject .; mv ./myproject ./origproject
 ---> Running in 746969588bd3
Removing intermediate container 746969588bd3
 ---> 8bc2b0beb674
Successfully built 8bc2b0beb674
Successfully tagged myprojectdocker_app:latest

This will create a local image based on the code in the myproject_docker directory. We can see a list of the built images available, as follows:

myproject_docker/$ docker images
REPOSITORY           TAG     IMAGE ID      CREATED         SIZE
myprojectdocker_app  latest  6a5c66f22a02  39 seconds ago  814MB
python               3       c1e459c00dc3  4 weeks ago     692MB

The state of the machine, after each step, is cached so that future build commands do as little work as possible, based only on the steps after which a change was made. For example, if we build again right away, then everything should come from the cache:

myproject_docker/$ docker-compose build
db uses an image, skipping
Building app
Step 1/6 : FROM python:3
 ---> c1e459c00dc3
Step 2/6 : RUN apt-get update && apt-get install -y --no-install-recommends mysql-client libmysqlclient-dev
 ---> Using cache
 ---> f2007264e96d
Step 3/6 : WORKDIR /usr/src/app
 ---> Using cache
 ---> 9621b97ef4ec
Step 4/6 : ADD config/requirements.txt ./
 ---> Using cache
 ---> 6a87941c7876
Step 5/6 : RUN pip3 install --upgrade pip; pip3 install -r requirements.txt
 ---> Using cache
 ---> 64a268b8cba6
Step 6/6 : RUN django-admin startproject myproject .; mv ./myproject ./origproject
 ---> Using cache
 ---> 8bc2b0beb674
Successfully built 8bc2b0beb674
Successfully tagged myprojectdocker_app:latest

Although we added a project to the container via the Dockerfile, the project volume set up for the app would mask some files when the container is running. To get around this, we moved the project files within the container aside to an origproject directory. Compose allows us to easily run commands against our services, so we can copy those project files so they are accessible in the volume by executing the following command:

myproject_docker/$ docker-compose run app cp \
> origproject/__init__.py \
> origproject/settings.py \
> origproject/urls.py \
> origproject/wsgi.py \
> myproject/

We can see that the previously masked project files are now exposed for us to easily edit outside of the container, too:

myproject_docker/$ ls project
__init__.py   settings.py   urls.py   wsgi.py

Once our services are built and the Django project is created, we can use docker-compose to bring up the environment, passing an optional -d flag to detach the process from our terminal. Detaching runs the containers in exactly the same way, except we can use the terminal to invoke other commands in the meantime. With the containers attached, we are only able to view logs that are exposed by the container (generally what is output to stdout or stderr). The first time we start our Compose environment, any pure image-based services will also need to be pulled down. For example, we might see something like this:

myproject_docker/$ docker-compose up -d
Creating network "myprojectdocker_default" with the default driver
Pulling db (mysql:5.7)...
5.7: Pulling from library/mysql
f49cf87b52c1: Already exists
78032de49d65: Pull complete
837546b20bc4: Pull complete
9b8316af6cc6: Pull complete
1056cf29b9f1: Pull complete
86f3913b029a: Pull complete
f98eea8321ca: Pull complete
3a8e3ebdeaf5: Pull complete
4be06ac1c51e: Pull complete
920c7ffb7747: Pull complete
Digest: sha256:7cdb08f30a54d109ddded59525937592cb6852ff635a546626a8960d9ec34c30
Creating myprojectdocker_db_1 ... done
Creating myprojectdocker_app_1 ... done

At this point, Django is accessible at http://localhost:8000/, just as it would be when run directly on your machine.

It is often necessary to execute commands within an already up-and-running container, and Docker provides a simple way to do this, as well. As an example, we can connect to the machine at a command-line prompt, similarly to how we might access a remote machine over SSH, as follows:

myproject_docker/$ docker exec -it myprojectdocker_app_1 /bin/bash
root@<container_id>:/usr/src/app# ls
db.sqlite3  external     manage.py         media   myapp1    myapp2
myproject   origproject  requirements.txt  static  templates
root@<container_id>:/usr/src/app# ls myproject
__init__.py  __pycache__  settings.py  urls.py  wsgi.py
root@<container_id>:/usr/src/app# exit
myproject_docker/$

The preceding code instructs Docker to execute /bin/bash on the myprojectdocker_app_1 container. The -i flag makes the connection interactive, and -t allocates a TTY shell. Shutting down is just as easy. If the container is running in attached mode, simply issue a Ctrl-C keyboard command to end the process. When using the -d flag to start the container, however, we instead issue a command to shut it down:

myproject_docker/$ docker-compose down
Stopping myprojectdocker_app_1 ... done
Removing myprojectdocker_app_1 ... done
Removing myprojectdocker_db_1 ... done
Removing network myprojectdocker_default

There's more...

Read more from the extensive documentation of Docker at https://docs.docker.com/, and specifically about using Compose with Django at https://docs.docker.com/compose/django/. In the Creating a Docker project file structure recipe, we also go into greater depth around the organization of files and configuration to replicate a production environment.

See also

  • The Working with a virtual environment recipe
  • The Creating a Docker project file structure recipe
 

Creating a Docker project file structure


Although Docker provides an isolated environment within which to configure and run your project, development code and certain configurations can still be stored outside the container. This enables such files to be added to version control, and persists the files when a container is shut down. In addition, Docker adds flexibility that allows us to directly recreate an environment that might be used in production, helping to ensure that the conditions in development will much more closely match the real world.

Getting ready

Before you begin, set up a Docker environment as described in the Working with Docker recipe.

How to do it...

The basic structure already created separates aspects of our project into logical groups:

  • All applications to be used in the project are stored under the apps directory, which allows them to be pulled in individually either from version control or other source locations.
  • project and templates are also distinct, which makes sense since the settings and templates for one project will seldom be shared, whereas applications are commonly intended to be reusable.
  • The static and media files are separated as well, allowing them to be deployed to separate static content containers (and servers) easily.

To make full use of these features, let's update the docker-compose.yml file with some enhancements:

# docker-compose.yml
version: '3'
services:
  proxy:
    image: 'jwilder/nginx-proxy:latest'
    ports:
      - '80:80'
    volumes:
      - '/var/run/docker.sock:/tmp/docker.sock:ro'
  db:
    image: 'mysql:5.7'
    ports:
      - '3306'
    volumes:
      - './config/my.cnf:/etc/mysql/conf.d/my.cnf'
      - './mysql:/var/lib/mysql'
      - './data:/usr/local/share/data'
    environment:
      - 'MYSQL_ROOT_PASSWORD'
      - 'MYSQL_USER'
      - 'MYSQL_PASSWORD'
      - 'MYSQL_DATABASE'
  app:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8000
    volumes:
      - './project:/usr/src/app/myproject'
      - './media:/usr/src/app/media'
      - './static:/usr/src/app/static'
      - './templates:/usr/src/app/templates'
      - './apps/external:/usr/src/app/external'
      - './apps/myapp1:/usr/src/app/myapp1'
      - './apps/myapp2:/usr/src/app/myapp2'
    ports:
      - '8000'
    links:
      - db
    environment:
      - 'SITE_HOST'
      - 'MEDIA_HOST'
      - 'STATIC_HOST'
      - 'VIRTUAL_HOST=${SITE_HOST}'
      - 'VIRTUAL_PORT=8000'
      - 'MYSQL_HOST=db'
      - 'MYSQL_USER'
      - 'MYSQL_PASSWORD'
      - 'MYSQL_DATABASE'
  media:
    image: 'httpd:latest'
    volumes:
      - './media:/usr/local/apache2/htdocs'
    ports:
      - '80'
    environment:
      - 'VIRTUAL_HOST=${MEDIA_HOST}'
  static:
    image: 'httpd:latest'
    volumes:
      - './static:/usr/local/apache2/htdocs'
    ports:
      - '80'
    environment:
      - 'VIRTUAL_HOST=${STATIC_HOST}'

With these changes, there are some corresponding updates needed in the Django project settings as well. The end result should look similar to the following:

# project/settings.py
# ...

ALLOWED_HOSTS = []
if os.environ.get('SITE_HOST'):
    ALLOWED_HOSTS.append(os.environ.get('SITE_HOST'))

# ...

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

if os.environ.get('MYSQL_HOST'):
    DATABASES['default'] = {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': os.environ.get('MYSQL_HOST'),
        'NAME': os.environ.get('MYSQL_DATABASE'),
        'USER': os.environ.get('MYSQL_USER'),
        'PASSWORD': os.environ.get('MYSQL_PASSWORD'),
    }

# ...

# Logging
# https://docs.djangoproject.com/en/dev/topics/logging/
LOGGING = {
    'version': 1,
    'formatters': {
        'verbose': {
            'format': '%(levelname)s %(asctime)s %(module)s %(process)d %(thread)d %(message)s'
        },
        'simple': {
            'format': '%(levelname)s %(message)s'
        },
    },
    'handlers': {
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
            'formatter': 'simple'
        },
        'file': {
            'level': 'DEBUG',
            'class': 'logging.FileHandler',
            'filename': '/var/log/app.log',
            'formatter': 'simple'
        },
    },
    'loggers': {
        'django': {
            'handlers': ['file'],
            'level': 'DEBUG',
            'propagate': True,
        },
    }
}

if DEBUG:
    # make all loggers use the console.
    for logger in LOGGING['loggers']:
        LOGGING['loggers'][logger]['handlers'] = ['console']

# ...

# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.1/howto/static-files/

STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
if os.environ.get('STATIC_HOST'):
    STATIC_DOMAIN = os.environ.get('STATIC_HOST')
    STATIC_URL = 'http://%s/' % STATIC_DOMAIN

MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
if os.environ.get('MEDIA_HOST'):
    MEDIA_DOMAIN = os.environ.get('MEDIA_HOST')
    MEDIA_URL = 'http://%s/' % MEDIA_DOMAIN

Furthermore, the my.cnf file is referenced in docker-compose.yml as a volume attached to the db service. Although no error would occur if it were left out, a directory would be automatically created to satisfy the volume requirement. At a minimum, we can add an empty file under the config folder, or we might add options for MySQL right away, such as the following:

# config/my.cnf
[mysqld]
sql_mode=STRICT_TRANS_TABLES

Then, add a bin subdirectory in myproject_docker, inside of which we will add a dev script (or dev.sh, if the extension is preferred):

#!/usr/bin/env bash
# bin/dev
# environment variables to be defined externally for security
# - MYSQL_USER
# - MYSQL_PASSWORD
# - MYSQL_ROOT_PASSWORD
DOMAIN=myproject.local

DJANGO_USE_DEBUG=1 \
DJANGO_USE_DEBUG_TOOLBAR=1 \
SITE_HOST="$DOMAIN" \
MEDIA_HOST="media.$DOMAIN" \
STATIC_HOST="static.$DOMAIN" \
MYSQL_HOST="localhost" \
MYSQL_DATABASE="myproject_db" \
  docker-compose $*

Make sure the script is executable by modifying the permissions, as in the following:

myproject_docker/$ chmod +x bin/dev

Finally, the development hosts need to be mapped to a local IP address, such as via /etc/hosts on macOS or Linux. Such a mapping for our project would look something like this:

127.0.0.1    myproject.local media.myproject.local static.myproject.local

How it works...

In docker-compose.yml, we have added more services and defined some environment variables. These make our system more robust and allow us to replicate the multi-host paradigm for serving static files that is preferred in production.

The first new service is a proxy, based on the jwilder/nginx-proxy image. This service attaches to port 80 in the host machine and passes requests through to port 80 in the container. The purpose of the proxy is to allow use of friendly hostnames rather than relying on everything running on localhost.

Two other new services are defined toward the end of the file, one for serving media and another for static files:

  • These both run the Apache httpd static server and map the associated directory to the default htdocs folder from which Apache serves files.
  • We can also see that they each define a VIRTUAL_HOST environment variable, whose value is drawn from corresponding host variables MEDIA_HOST and STATIC_HOST, and which is read automatically by the proxy service.
  • The services listen on port 80 in the container, so requests made for resources under that hostname can be forwarded by the proxy to the associated service dynamically.

The db service has been augmented in a few ways:

  • First, we ensure that it is listening on the expected port 3306 in the container network.
  • We also set up a few volumes so that content can be shared outside the container—a my.cnf file allows changes to the basic running configuration of the database server; the database content is exposed as a mysql directory, in case there is a desire to back up the database itself; and we add a data directory for SQL scripts, so we can connect to the database container and execute them directly if desired.
  • Lastly, there are four environment variables that the mysql image makes use of: MYSQL_ROOT_PASSWORD, MYSQL_USER, MYSQL_PASSWORD, and MYSQL_DATABASE. These are declared, but no values are given, so that the values will be taken from the host environment itself when we run docker-compose up.

The final set of changes in docker-compose.yml are for the app service itself, the nature of which are similar to those noted previously:

  • The port definition is changed so that port 8000 is only connected to within the container network, rather than binding to that port on the host, since we will now access Django via the proxy.
  • More than simply depending on the db service, our app now links directly to it over the internal network, which makes it possible to refer to the service by its name rather than an externally accessible hostname.
  • As with the database, several environment variables are indicated to supply external data to the container from the host. There are pass-through variables for MEDIA_HOST and STATIC_HOST, plus SITE_HOST and a mapping of it to VIRTUAL_HOST used by the proxy.
  • While the proxy connects to virtual hosts via port 80 by default, we are running Django on port 8000, so the proxy is instructed to use that port instead via the VIRTUAL_PORT variable.
  • Last but not least, the MYSQL_HOST, MYSQL_USER, MYSQL_PASSWORD, and MYSQL_DATABASE variables are passed into the app for use in the project settings.

This brings us to the updates to settings.py, which are largely centered around connectivity and security:

  • To ensure that access to the application is limited to expected connections, we add SITE_HOST to ALLOWED_HOSTS if one is given for the environment.
  • For DATABASES, the original sqlite3 settings are left in place, but we replace that default with a configuration for MySQL if we find the MYSQL_HOST environment variable has been set, making use of the MySQL variables passed into the app service.
  • As noted in the Working with Docker recipe, we can only view logs that are exposed by the container. By default, the Django runserver command does not output logging to the console, so no logs are technically exposed. The next change to settings.py sets up a LOGGING configuration so that a simple format will always be logged to the console when DEBUG is True.
  • Finally, instead of relying upon Django to serve static and media files, we check for the corresponding STATIC_HOST and MEDIA_HOST environment variables and, when those exist, set the STATIC_URL and MEDIA_URL settings accordingly.
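The static and media URL logic in particular can be condensed into a small standalone helper to see how the environment drives the final settings. This is a sketch of the same conditionals shown in settings.py above; the function name and hostnames are hypothetical:

```python
def media_static_urls(environ):
    """Mirror the settings.py logic: fall back to local paths unless a
    dedicated host is configured via environment variables."""
    static_url = '/static/'
    if environ.get('STATIC_HOST'):
        static_url = 'http://%s/' % environ['STATIC_HOST']

    media_url = '/media/'
    if environ.get('MEDIA_HOST'):
        media_url = 'http://%s/' % environ['MEDIA_HOST']

    return static_url, media_url


# With no hosts configured, Django serves everything itself.
print(media_static_urls({}))
# With STATIC_HOST set, static assets are addressed via the proxy.
print(media_static_urls({'STATIC_HOST': 'static.myproject.local'}))
```

This is why the same settings.py works unchanged in a plain virtual environment (no hosts set) and inside the Docker setup (hosts injected by docker-compose).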

With all of the configurations updated, we need an easy way to run the container so that the appropriate environment variables are supplied. Although it might be possible to export the variables, that would negate much of the benefit of isolation we otherwise gain from using Docker. Instead, it is possible to run docker-compose with inline variables, so that a single execution will have those variables set in a specific way. This is, ultimately, what the dev script does.

Now we can run docker-compose commands for our development environment—which includes a MySQL database, separate Apache servers for media and static files, and the Django server itself—with a single, simplified form:

myproject_docker/$ MYSQL_USER=myproject_user \
> MYSQL_PASSWORD=pass1234 \
> ./bin/dev up -d
Creating myprojectdocker_media_1 ... done
Creating myprojectdocker_db_1 ... done
Creating myprojectdocker_app_1 ... done
Creating myprojectdocker_static_1 ... done

In the dev script, the appropriate variables are all defined for the command automatically, and docker-compose is invoked at once. The script mentions in comments three other, more sensitive variables that should be provided externally, and two of those are included here. If you are less concerned about the security of a development database, these could just as easily be included in the dev script itself. A less secure, but more convenient, way of providing the variables across runs would be to export them, after which they become global environment variables, as in the following example:

myproject_docker/$ export MYSQL_USER=myproject_user
myproject_docker/$ export MYSQL_PASSWORD=pass1234
myproject_docker/$ ./bin/dev build
myproject_docker/$ ./bin/dev up -d

Any commands or options passed into dev, such as up -d in this case, are forwarded along to docker-compose via the $* wildcard variable included at the end of the script. With the host mapping complete, and our container up and running, we should be able to access the system by SITE_HOST, as in http://myproject.local/.

The resultant file structure for a complete Docker project might look something like this:

myproject_docker/
├── apps/
│   ├── external/
│   ├── myapp1/
│   ├── myapp2/
├── bin/
│   ├── dev*
│   ├── prod*
│   ├── staging*
│   └── test*
├── config/
│   ├── my.cnf
│   └── requirements.txt
├── data/
├── media/
├── mysql/
│   ├── myproject_db/
│   ├── mysql/
│   ├── performance_schema/
│   ├── sys/
│   ├── ibdata1
│   └── ibtmp1
├── project/
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── static/
├── templates/
├── Dockerfile
├── README.md
└── docker-compose.yml

There's more...

You can find additional details about the configuration options that can be specified in my.cnf in the MySQL documentation for Using Option Files, found at https://dev.mysql.com/doc/refman/5.7/en/option-files.html.

See also

  • The Creating a virtual environment project file structure recipe
  • The Working with Docker recipe
  • The Handling project dependencies with pip recipe
  • The Including external dependencies in your project recipe
  • The Configuring settings for development, testing, staging, and production environments recipe
  • The Setting UTF-8 as the default encoding for MySQL configuration recipe
  • The Deploying on Apache with mod_wsgi recipe in Chapter 12, Testing and Deployment
 

Handling project dependencies with pip


The most convenient tool to install and manage Python packages is pip. Rather than installing the packages one by one, it is possible to define a list of packages that you want to install as the contents of a text file. We can pass the text file into the pip tool, which will then handle installation of all packages in the list automatically. An added benefit to this approach is that the package list can be stored in version control. If you have gone through the Working with Docker recipe, then you have already seen this.

Generally speaking, it is ideal and often sufficient to have a single requirements file that directly matches your production environment. When changing versions or adding and removing dependencies, this can be done on a development machine and then managed through version control. It can then be as simple as switching branches to go from one set of dependencies (and associated code changes) to another.

In some cases, environments differ enough that you will need to have at least two different instances of your project: the development environment, where you create new features, and the public website environment that is usually called the production environment, in a hosted server. There might be development environments for other developers, or special tools that are needed during development but are unnecessary in production. Also, you may have a testing and staging environment in order to test the project locally and in a public website-like situation.

For good maintainability, you should be able to install the required Python modules for development, testing, staging, and production environments. Some of the modules will be shared and some of them will be specific to a subset of the environments. In this recipe, we will see how to organize the project dependencies for multiple environments and manage them with pip.

Getting ready

Before using this recipe, you need to have a Django project ready, either with pip installed and a virtual environment activated, or via Docker. For more information on how to do this, read the Working with a virtual environment recipe, or the Working with Docker recipe, respectively.

How to do it...

Execute the following steps one by one to prepare pip requirements for your virtual environment Django project:

  1. Let's go to your Django project that you have under version control and create a requirements directory with these text files, if you haven't already done so:
    • base.txt for shared modules
    • dev.txt for the development environment
    • test.txt for the testing environment
    • staging.txt for the staging environment
    • prod.txt for production
  2. Edit base.txt and add the Python modules that are shared in all environments, line by line. For example, we might migrate our original requirements.txt as base.txt, which would give us this in our virtual environment project:
# base.txt
Django~=2.1.0
djangorestframework
-e git://github.com/omab/[email protected]#egg=python-social-auth
  3. If the requirements of a specific environment are the same as in base.txt, add the line including base.txt in the requirements file of that environment, as in the following example:
# prod.txt
-r base.txt
  4. If there are specific requirements for an environment, add them after the base.txt inclusion, as shown in the following:
# dev.txt
-r base.txt
django-debug-toolbar
selenium
  5. You can run the following command in a virtual environment in order to install all of the required dependencies for the development environment (or the analogous command for other environments), as follows:
(myproject_env)$ pip3 install -r requirements/dev.txt

With a Docker setup, we follow steps 1-4 in almost precisely the same manner, except the requirements directory would live underneath the config directory. From there, a few additional steps are needed to install the correct requirements by environment:

  1. The Dockerfile file will need to be updated to select the appropriate requirements file based on a build argument, which here defaults to prod:
# Dockerfile
FROM python:3
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        less mysql-client libmysqlclient-dev
WORKDIR /usr/src/app
ARG BUILD_ENV=prod
ADD config/requirements ./requirements
RUN pip3 install --upgrade pip; \
    pip3 install -r requirements/$BUILD_ENV.txt
RUN django-admin startproject myproject .; \
    mv ./myproject ./origproject
  2. The docker-compose.yml file needs to pass through this argument using the current environment variable, as in the following:
# docker-compose.yml
version: '3'
services:
  db:
    image: "mysql:5.7"
  app:
    build:
      context: .
      args:
        BUILD_ENV: $BUILD_ENV
    command: "python3 manage.py runserver 0.0.0.0:8000"
    volumes:
      - "./project:/usr/src/app/myproject"
      - "./media:/usr/src/app/media"
      - "./static:/usr/src/app/static"
      - "./templates:/usr/src/app/templates"
      - "./apps/external:/usr/src/app/external"
      - "./apps/myapp1:/usr/src/app/myapp1"
      - "./apps/myapp2:/usr/src/app/myapp2"
    ports:
      - "8000:8000"
    depends_on:
      - db
  3. Scripts under bin for each environment are then updated to set the appropriate value for the BUILD_ENV variable. For example, we would update the dev script as follows:
#!/usr/bin/env bash
# bin/dev
# ...

BUILD_ENV="dev" \
#...
  docker-compose $*
  4. We simply use the environment-specific script when building the container, and the argument passes through automatically, causing the correct requirements file to be added to the container:
myproject_docker/$ MYSQL_USER=myproject_user \
> MYSQL_PASSWORD=pass1234 \
> ./bin/dev build

How it works...

The preceding pip3 install command, whether it is executed explicitly in a virtual environment or during the build process for a Docker container, downloads and installs all of your project dependencies from requirements/base.txt and requirements/dev.txt. As you can see, you can specify the version of a module that you need for the Django framework, and even install directly from a specific commit in a Git repository, as done for python-social-auth in our example.

Note

In practice, installing from a specific commit is rarely useful; it is mainly warranted when your project relies on specific functionality of a third-party dependency that is not available in any released version.

When you have many dependencies in your project, it is good practice to stick to a narrow range of release versions for each Python module. Then you can have greater confidence that the project's integrity will not be broken by updates to your dependencies, which might contain conflicts or backward incompatibilities. This is particularly important when deploying your project or handing it off to a new developer.
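Version specifiers in a requirements file express such narrow ranges directly. The exact version numbers below are illustrative, not prescribed by the project:

# requirements/base.txt (illustrative version pins)
Django>=2.1,<2.2            # any 2.1.x patch release, but not 2.2
djangorestframework~=3.8.2  # 3.8.2 or a newer 3.8.x release

The compatible-release operator (~=) allows patch-level updates while excluding the next minor version, which is usually a safe default for production pins.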

If you have already manually installed the project requirements with pip one by one, you can generate the requirements/base.txt file using the following command within your virtual environment:

(myproject_env)$ pip3 freeze > requirements/base.txt

The same can be executed within the Docker app container, as in the following:

myproject_docker/$ docker exec -it myproject_docker_app_1 \
> /bin/bash
root:/usr/src/app# pip3 freeze > requirements/base.txt

There's more...

If you want to keep things simple and are sure that you will be using the same dependencies for all environments, you can use just one requirements file named requirements.txt, generated as follows:

(myproject_env)$ pip3 freeze > requirements.txt

To install the modules in a new virtual environment, simply call the following command:

(myproject_env)$ pip3 install -r requirements.txt

Note

If you need to install a Python library from another version control system or from a local path, you can learn more about pip from the official documentation at http://pip.readthedocs.org/en/latest/reference/pip_install.html.

See also

  • The Working with a virtual environment recipe
  • The Working with Docker recipe
  • The Including external dependencies in your project recipe
  • The Configuring settings for development, testing, staging, and production environments recipe
 

Including external dependencies in your project


Sometimes, it is better to include external dependencies directly within your project. This ensures that whenever a developer upgrades third-party modules, all of the other developers will receive the upgraded version in the next update from the version control system (Git, Subversion, or others).

Also, it is better to have external dependencies included in your project when the libraries are taken from unofficial sources, that is, somewhere other than the Python Package Index (PyPI) or different version control systems.

Getting ready

Start with a virtual environment with a Django project in it.

How to do it...

Execute the following steps one by one for a virtual environment project:

  1. If you haven't done so already, create an externals directory under your Django project's django-myproject directory. Then, create the libs and apps directories under it. The libs directory is for the Python modules that are required by your project, for example, Boto, Requests, Twython, and Whoosh. The apps directory is for third-party Django apps, for example, Django CMS, Django Haystack, and django-storages.

Note

We highly recommend that you create README.md files in the libs and apps directories, where you mention what each module is for, what the used version or revision is, and where it is taken from.

  2. The directory structure should look similar to the following:
externals/
├── apps/
│   ├── cms/
│   ├── haystack/
│   ├── storages/
│   └── README.md
└── libs/
    ├── boto/
    ├── requests/
    ├── twython/
    └── README.md
  3. The next step is to put the external libraries and apps under the Python path so that they are recognized as if they were installed. This can be done by adding the following code in the settings:
# settings.py
import os, sys
BASE_DIR = os.path.dirname(os.path.dirname(
    os.path.abspath(__file__)))

EXTERNAL_BASE = os.path.join(BASE_DIR, "externals")
EXTERNAL_LIBS_PATH = os.path.join(EXTERNAL_BASE, "libs")
EXTERNAL_APPS_PATH = os.path.join(EXTERNAL_BASE, "apps")
sys.path = ["", EXTERNAL_LIBS_PATH, EXTERNAL_APPS_PATH] + sys.path

How it works...

A module is on the Python path if you can run Python and import that module. One of the ways to put a module under the Python path is to modify the sys.path variable before importing a module that is in an unusual location. The value of sys.path, as modified by the settings.py file, is a list of directories starting with an empty string for the current directory, followed by the directories in the project, and finally the globally shared directories of the Python installation. You can see the value of sys.path in the Python shell, as follows:

(myproject)$ ./manage.py shell
>>> import sys
>>> sys.path

The same could be done for a Docker project, assuming the container name were django_myproject_app_1, as follows:

myproject_docker/$ docker exec -it django_myproject_app_1 \
> python3 manage.py shell
>>> import sys
>>> sys.path

When trying to import a module, Python searches for the module in this list and returns the first result that is found.

Therefore, we first define the BASE_DIR variable, which is the absolute path to one level higher than the settings.py file. Then, we define the EXTERNAL_LIBS_PATH and EXTERNAL_APPS_PATH variables, which are relative to BASE_DIR. Lastly, we modify the sys.path property, adding new paths to the beginning of the list. Note that we also add an empty string as the first path to search, which means that the current directory of any module should always be checked first before checking other Python paths.
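The mechanism can be demonstrated outside Django with a throwaway directory standing in for EXTERNAL_LIBS_PATH. All file and module names here are illustrative:

```python
import os
import sys
import tempfile

# Create a temporary directory and drop a module into it, mimicking
# a vendored library under externals/libs.
external_libs = tempfile.mkdtemp()
with open(os.path.join(external_libs, 'vendored.py'), 'w') as f:
    f.write("SOURCE = 'externals/libs'\n")

# Prepend the directory, exactly as the settings snippet does.
sys.path = [external_libs] + sys.path

# The import machinery now finds our local copy first.
import vendored
print(vendored.SOURCE)
```

Because the search stops at the first match, a copy in a prepended directory shadows any identically named package installed globally.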

Note

This way of including external libraries doesn't work cross-platform with the Python packages that have C language bindings, for example, lxml. For such dependencies, we would recommend using the pip requirements that were introduced in the Handling project dependencies with pip recipe.

There's more...

With a Docker project, there is significantly more control of the libraries and apps that are installed within the container:

  • For Python libraries needed for the project, we can use version specifications in the requirements.txt file to require a version known to be compatible. Furthermore, it was demonstrated in the Handling project dependencies with pip recipe that we can differentiate these requirements by environment, as well as being so precise as to require an exact repository version using the -e flag.
  • All Django applications are stored under the apps directory. Here would reside not only the code for ones specifically under development, but also any external apps that are not made available globally via the requirements.txt dependency list.

See also

  • The Creating a virtual environment project file structure recipe
  • The Creating a Docker project file structure recipe
  • The Handling project dependencies with pip recipe
  • The Defining relative paths in the settings recipe
  • The Using the Django shell recipe in Chapter 11, Bells and Whistles
 

Configuring settings for development, testing, staging, and production environments


As noted earlier, you will be creating new features in the development environment, testing them in the testing environment, then putting the website onto a staging server to let other people try the new features, and lastly, the website will be deployed to the production server for public access. Each of these environments can have specific settings and you will see how to organize them in this recipe.

Getting ready

In a Django project, we'll create settings for each environment: development, testing, staging, and production.

How to do it...

Follow these steps to configure project settings:

  1. In the myproject directory, create a config Python module with the following files:
    • __init__.py
    • base.py for shared settings
    • dev.py for development settings
    • test.py for testing settings
    • staging.py for staging settings
    • prod.py for production settings
  2. Put all of your shared settings in config/base.py.
  3. If the settings of an environment are the same as the shared settings, then just import everything from base.py there, as follows:
# myproject/config/prod.py
from .base import *
  4. Apply the settings that you want to attach or overwrite for your specific environment in the other files, for example, the development environment settings should go to dev.py, as shown in the following:
# myproject/config/dev.py
from .base import *
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
  5. At the beginning of myproject/settings.py, import the configurations from one of the environment settings and then additionally attach specific or sensitive configurations, such as DATABASES or API keys that shouldn't be under version control, as follows:
# myproject/settings.py
from .config.dev import *

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "myproject",
        "USER": "root",
        "PASSWORD": "root",
    }
}
  6. Create a settings.py.example file that contains all of the sensitive settings necessary for the project to run, but with empty values set.
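Such a template might look like the following minimal sketch, assuming the config module layout described above (the values shown are placeholders to be filled in per environment):

```python
# settings.py.example
# Copy this file to settings.py and fill in the sensitive values.
from .config.dev import *

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "",
        "USER": "",
        "PASSWORD": "",
    }
}
```

Because settings.py.example contains no real credentials, it can safely live in version control and serve as documentation of which settings each deployment must provide.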

How it works...

By default, the Django management commands use the settings from myproject/settings.py. Using the method that is defined in this recipe, we can keep all of the required non-sensitive settings for all environments under version control in the config directory. On the other hand, the settings.py file itself would be ignored by version control and will only contain the settings that are necessary for the current development, testing, staging, or production environments.

There's more...

In the Creating a Docker project file structure recipe, we introduced an alternative approach using environment variables to store sensitive or environment-specific settings. We go into greater depth into this method of differentiating settings in the Creating and including local settings recipe as well.

See also

  • The Creating a Docker project file structure recipe
  • The Creating and including local settings recipe
  • The Defining relative paths in the settings recipe
  • The Setting the Subversion ignore property recipe
  • The Creating a Git ignore file recipe
 

Defining relative paths in the settings


Django requires you to define different file paths in the settings, such as the root of your media, the root of your static files, the path to templates, and the path to translation files. For each developer of your project, the paths may differ as the virtual environment can be set up anywhere and the user might be working on macOS, Linux, or Windows. Even when your project is wrapped in a Docker container, it reduces maintainability and portability to define absolute paths. In any case, there is a way to define these paths dynamically so that they are relative to your Django project directory.

Getting ready

Have a Django project started, and open settings.py.

How to do it...

Modify your path-related settings accordingly, instead of hardcoding the paths to your local directories, as follows:

# settings.py
import os
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# ...

TEMPLATES = [{
    # ...
    'DIRS': [
        os.path.join(BASE_DIR, 'templates'),
    ],
    # ...
}]

# ...

LOCALE_PATHS = [
    os.path.join(BASE_DIR, 'locale'),
]

# ...

MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, 'site_static'),
]

FILE_UPLOAD_TEMP_DIR = os.path.join(BASE_DIR, 'tmp')
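To see what the two nested dirname() calls compute, here is the same expression applied to a hypothetical absolute path for settings.py:

```python
import os

# Hypothetical location of settings.py; not a real path in the book.
settings_path = '/home/user/myproject_docker/project/settings.py'

# One dirname() gives the project package directory; a second gives
# the directory containing manage.py, which becomes BASE_DIR.
base_dir = os.path.dirname(os.path.dirname(os.path.abspath(settings_path)))

print(base_dir)
print(os.path.join(base_dir, 'templates'))
```

Every path-related setting is then built from base_dir with os.path.join, so moving the project or running it on another machine requires no changes.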

How it works...

By default, Django settings include a BASE_DIR value, which is an absolute path to the directory containing manage.py (usually one level higher than the settings.py file). Then, we set all of the paths relative to BASE_DIR using the os.path.join function.

Based on the directory layout we set down in the Creating a virtual environment project file structure recipe, we would insert 'myproject' as an intermediary path segment for each of the previous examples, since the associated folders were created within that one. For Docker projects, as shown in the Creating a Docker project file structure recipe, we set the volumes for media, static, and so forth to be alongside manage.py in BASE_DIR itself.

See also

  • The Creating a virtual environment project file structure recipe
  • The Creating a Docker project file structure recipe
  • The Including external dependencies in your project recipe
 

Creating and including local settings


Configuration doesn't necessarily need to be complex. If you want to keep things simple, you can work with a single settings.py file for common configuration and use environment variables for settings that should be kept private and not in version control.

Getting ready

Most of the settings for a project will be shared across all environments and saved in version control. These can be defined directly within the settings.py file. However, there will be some settings that are specific to the environment of the project instance, or sensitive and require additional security such as database or email settings. We will expose these using environment variables.

How to do it...

To use local settings in your project, first we must draw values from environment variables for any configurations in settings.py that will differ across environments or that would be a security risk if stored in version control. It is a good practice to be very clear and unique when naming these variables, but also take into account those that already exist in the environment. Some examples follow:

  1. Whether or not to use DEBUG mode will generally differ per environment, where debugging would be on in development, but not by default:
# settings.py
DEBUG = False
if os.environ.get('DJANGO_USE_DEBUG'):
    DEBUG = True
  2. Similarly, we might want the debug_toolbar to be active in development, or perhaps only in certain situations even then, so we could add it only when necessary:
# settings.py
INSTALLED_APPS = [
    # ...
]
if os.environ.get('DJANGO_USE_DEBUG_TOOLBAR'):
    INSTALLED_APPS += ('debug_toolbar',)

MIDDLEWARE = [
    # ...
]
if os.environ.get('DJANGO_USE_DEBUG_TOOLBAR'):
    MIDDLEWARE += (
        'debug_toolbar.middleware.DebugToolbarMiddleware',)
  3. Perhaps we use a SQLite3 database in testing, but a MySQL database in development, staging, and production. Also, in development, the MySQL database might be on localhost, but have its own separate domain in staging and production. Finally, storing the credentials for the connection in any environment is a security risk. We can handle all of these scenarios just as easily with the following updates to settings.py:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}
if os.environ.get('MYSQL_HOST'):
    DATABASES['default'] = {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': os.environ.get('MYSQL_HOST'),
        'NAME': os.environ.get('MYSQL_DATABASE'),
        'USER': os.environ.get('MYSQL_USER'),
        'PASSWORD': os.environ.get('MYSQL_PASSWORD'),
    }

How it works...

As you can see, the local settings are not stored directly in settings.py; rather, they are drawn from externally defined environment variables and evaluated in the settings.py file itself. This allows you not only to create or overwrite existing settings, but also to adjust the tuples or lists in the settings.py file. For example, here we add debug_toolbar to INSTALLED_APPS, plus its associated MIDDLEWARE, in order to be able to debug SQL queries, template context variables, and so on.
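One caveat: os.environ.get() returns strings, so a value such as "0" or "False" is still truthy and would switch DEBUG on. If you want friendlier semantics, a small helper can normalize the value; the following is a minimal sketch (the name env_flag is our own, not part of Django):

```python
import os


def env_flag(name, default=False):
    """Interpret an environment variable as a boolean flag.

    Unset or empty values fall back to the default, and common
    "false-y" strings such as "0", "false", or "no" are treated
    as False rather than as truthy non-empty strings.
    """
    value = os.environ.get(name)
    if value is None or value.strip() == "":
        return default
    return value.strip().lower() not in ("0", "false", "no", "off")


# settings.py usage (sketch):
# DEBUG = env_flag('DJANGO_USE_DEBUG')
```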

Defining the values of these variables can be done in one of two ways. In development, we can declare them within runtime commands, as in the following:

$ DJANGO_USE_DEBUG=1 python3 manage.py runserver 8000

This sets the DJANGO_USE_DEBUG variable for this particular process, resulting in DEBUG=True in settings.py as per the examples listed earlier. If there are many variables to define, or the same values will be set every time the server starts, it may be handy to create a reusable script to do so. For example, in the development environment, we can create a dev shell script, such as the following:

#!/usr/bin/env bash
# bin/dev
# environment variables to be defined externally for security
# - MYSQL_USER
# - MYSQL_PASSWORD
# - MYSQL_ROOT_PASSWORD

DJANGO_USE_DEBUG=1 \
DJANGO_USE_DEBUG_TOOLBAR=1 \
MYSQL_HOST=localhost \
MYSQL_DATABASE=myproject_db \
  python3 manage.py runserver 8000

Store the above in a bin directory alongside manage.py in your project, and make sure it is executable, as follows:

$ chmod +x bin/dev

Then, in a terminal, we can now start our development server, with all of the appropriate settings, as in the following:

$ MYSQL_USER=username MYSQL_PASSWORD=pass1234 bin/dev

The resultant runserver command will receive values not only for the MySQL username and password given here, but also all of the variables set in the dev script itself.
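For settings that must always be supplied externally, such as MYSQL_PASSWORD, it can help to fail fast at startup rather than at the first database query. A small helper along these lines could work (the name require_env is our own; in a Django project you might raise django.core.exceptions.ImproperlyConfigured instead of RuntimeError):

```python
import os


def require_env(name):
    """Return the value of a required environment variable.

    Failing at startup with a clear message beats a confusing
    connection error later. In a Django project, you might raise
    django.core.exceptions.ImproperlyConfigured instead.
    """
    try:
        return os.environ[name]
    except KeyError:
        raise RuntimeError(
            f"Required environment variable {name} is not set")


# settings.py usage (sketch):
# MYSQL_PASSWORD = require_env('MYSQL_PASSWORD')
```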

See also

  • The Creating a virtual environment project file structure recipe
  • The Creating a Docker project file structure recipe
  • The Toggling the Debug Toolbar recipe in Chapter 11, Bells and Whistles
 

Setting up STATIC_URL dynamically for Subversion users


If you set STATIC_URL to a static value, then each time you update a CSS file, a JavaScript file, or an image, you will need to clear the browser cache in order to see the changes. There is a trick to work around this: include the revision number of the version control system in STATIC_URL. Whenever the code is updated, the visitor's browser will be forced to load all of the static files anew.

This recipe shows how to put a revision number in STATIC_URL for Subversion users.

Getting ready

Make sure that your project is under Subversion version control and that you have BASE_DIR defined in your settings, as shown in the Defining relative paths in the settings recipe.

Then, create the utils module in your Django project, and also create a file called misc.py there.

How to do it...

The procedure to put the revision number in the STATIC_URL settings consists of the following two steps:

  1. Insert the following content:
# utils/misc.py
import subprocess


def get_media_svn_revision(absolute_path):
    repo_dir = absolute_path
    svn_revision = subprocess.Popen(
        "svn info | grep 'Revision' | awk '{print $2}'",
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE, 
        shell=True,
        cwd=repo_dir,
        universal_newlines=True)
    rev = svn_revision.communicate()[0].partition('\n')[0]
    return rev
  2. Modify the settings.py file and add the following lines:
# settings.py
# ... somewhere after BASE_DIR definition ...
from utils.misc import get_media_svn_revision
STATIC_URL = f'/static/{get_media_svn_revision(BASE_DIR)}/'

How it works...

The get_media_svn_revision() function takes the absolute_path directory as a parameter and calls the svn info shell command in that directory to find out the current revision. We pass BASE_DIR to the function, as we are sure that it is under version control. Then, the revision is parsed, returned, and included in the STATIC_URL definition.
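As an aside, the shell pipeline above is somewhat fragile to quote. Newer Subversion releases (1.9+) can print the revision directly via `svn info --show-item revision`, which allows a variant without shell=True; this is an alternative sketch, not the recipe's own code:

```python
import subprocess


def get_svn_revision(absolute_path):
    """Fetch the working-copy revision without a shell pipeline.

    Relies on `svn info --show-item revision` (Subversion 1.9+).
    Returns an empty string when svn is missing or the directory
    is not a working copy, so STATIC_URL still renders sensibly.
    """
    try:
        result = subprocess.run(
            ["svn", "info", "--show-item", "revision"],
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            cwd=absolute_path,
            universal_newlines=True,
        )
    except FileNotFoundError:  # svn is not installed
        return ""
    if result.returncode != 0:  # not a working copy
        return ""
    return result.stdout.strip()
```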

See also

  • The Setting up STATIC_URL dynamically for Git users recipe
  • The Setting the Subversion ignore property recipe
 

Setting up STATIC_URL dynamically for Git users


If you don't want your visitors to have to clear the browser cache each time you change your CSS files, JavaScript files, or images, you need to give STATIC_URL a varying path component. With a dynamically changing URL, whenever the code is updated, the visitor's browser will be forced to load the new, uncached static files. In this recipe, we will set a dynamic path component for STATIC_URL when you use the Git version control system.

Getting ready

Make sure that your project is under Git version control and that you have BASE_DIR defined in your settings, as shown in the Defining relative paths in the settings recipe.

If you haven't done so yet, create the utils module in your Django project. Also, create a misc.py file there.

How to do it...

The procedure to put the Git timestamp in the STATIC_URL setting consists of the following two steps:

  1. Add the following content to the misc.py file placed in utils/:
# utils/misc.py
import subprocess
from datetime import datetime

def get_git_changeset(absolute_path):
    repo_dir = absolute_path
    git_show = subprocess.Popen(
        "git show --pretty=format:%ct --quiet HEAD",
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        shell=True,
        cwd=repo_dir,
        universal_newlines=True)
    timestamp = git_show.communicate()[0].partition('\n')[0]
    try:
        timestamp = datetime.utcfromtimestamp(int(timestamp))
    except ValueError:
        return ""
    changeset = timestamp.strftime('%Y%m%d%H%M%S')
    return changeset
  2. Import the newly created get_git_changeset() function in the settings and use it for the STATIC_URL path, as follows:
# settings.py
# ... somewhere after BASE_DIR definition ...
from utils.misc import get_git_changeset
STATIC_URL = f'/static/{get_git_changeset(BASE_DIR)}/'

How it works...

The get_git_changeset() function takes the absolute_path directory as a parameter and calls the git show shell command with parameters that print the Unix timestamp of the HEAD revision in that directory. As stated in the previous recipe, we pass BASE_DIR to the function, as we are sure that it is under version control. The timestamp is parsed, converted to a string consisting of the year, month, day, hours, minutes, and seconds, and then returned and included in the definition of STATIC_URL.
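To see what the conversion produces, you can feed a known Unix timestamp through the same strftime() format string; for instance, 1538395200 corresponds to 2018-10-01 12:00:00 UTC:

```python
from datetime import datetime

# The same conversion get_git_changeset() applies, on a fixed timestamp
timestamp = datetime.utcfromtimestamp(1538395200)  # 2018-10-01 12:00:00 UTC
changeset = timestamp.strftime('%Y%m%d%H%M%S')
print(changeset)  # → 20181001120000
```

The resulting setting would then be STATIC_URL = '/static/20181001120000/', which changes with every new commit.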

See also

  • The Setting up STATIC_URL dynamically for Subversion users recipe
  • The Creating the Git ignore file recipe
 

Setting UTF-8 as the default encoding for MySQL configuration


MySQL proclaims itself as the most popular open source database. In this recipe, we will tell you how to set UTF-8 as the default encoding for it. Note that if you don't set this encoding in the database configuration, you might get into a situation where LATIN1 is used by default with your UTF-8-encoded data. This will lead to database errors whenever symbols such as € are used. Also, this recipe will save you from the difficulties of converting the database data from LATIN1 to UTF-8, especially when you have some tables encoded in LATIN1 and others in UTF-8.

Getting ready

Make sure that the MySQL database management system and the MySQLdb Python module are installed and you are using the MySQL engine in your project's settings.

How to do it...

Open the /etc/mysql/my.cnf MySQL configuration file in your favorite editor and ensure that the following settings are set in the [client], [mysql], and [mysqld] sections, as follows:

# /etc/mysql/my.cnf
[client]
default-character-set = utf8

[mysql]
default-character-set = utf8

[mysqld]
collation-server = utf8_unicode_ci
init-connect = 'SET NAMES utf8'
character-set-server = utf8

If any of the sections don't exist, create them in the file. If the sections do already exist, add these settings to the existing configurations. Then, restart MySQL in your command-line tool, as follows:

$ /etc/init.d/mysql restart

How it works...

Now, whenever you create a new MySQL database, the databases and all of their tables will be set in UTF-8 encoding by default. Don't forget to set this on all computers on which your project is developed or published.
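On the Django side, you can additionally make sure that the database connection itself uses UTF-8 by passing a charset in the OPTIONS of your database settings. A minimal sketch follows; note that MySQL's utf8 charset only covers up to three bytes per character, whereas utf8mb4 covers the full UTF-8 range, including emoji:

```python
# settings.py (sketch)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        # ... HOST, NAME, USER, PASSWORD as usual ...
        'OPTIONS': {
            'charset': 'utf8mb4',
        },
    }
}
```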

There's more...

For a Docker project, these settings can be added to the config/my.cnf file and saved to version control. This file will automatically be added as /etc/mysql/my.cnf within the container at build time. Furthermore, any developer that pulls down the code will automatically gain the configuration.

See also

  • The Creating a virtual environment project file structure recipe
  • The Creating a Docker project file structure recipe
 

Setting the Subversion ignore property


Getting ready

Make sure that your Django project is under Subversion version control.

How to do it...

  1. Open your command-line tool and set your default editor as nano, vi, vim, or any other that you prefer, as follows:
$ export EDITOR=nano

Note

If you don't have a preference, we recommend nano, which is a very intuitive and simple text editor for the terminal.

  2. Then, go to your project directory and type the following command:
$ svn propedit svn:ignore myproject
  3. This will open a temporary file in the editor, where you need to put the following file and directory patterns for Subversion to ignore:
# Project files and directories
static
media
tmp
# Byte-compiled / optimized / DLL files
__pycache__
*.py[cod]
*$py.class
# C extensions
*.so
# PyInstaller
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov
.tox
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
# Translations
*.pot
# Django stuff:
*.log
# PyBuilder
target
  4. Save the file and exit the editor. For every other Python package in your project, you will need to ignore several files and directories too. Just go to a directory and type the following command:
$ svn propedit svn:ignore .
  5. Then, put this in the temporary file, save it, and close the editor:
# Byte-compiled / optimized / DLL files
__pycache__
*.py[cod]
*$py.class
# C extensions
*.so
# PyInstaller
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov
.tox
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
# Translations
*.pot
# Django stuff:
*.log
# PyBuilder
target

How it works...

In Subversion, you need to define the ignore properties for each directory of your project. Mainly, we don't want to track the Python-compiled files, for instance, *.pyc. We also want to ignore the static directory, where static files from different apps are collected; media, which contains uploaded files that change together with the database content; and tmp, which is used temporarily for file uploads.

Note

If you keep all your settings in a config Python package, as described in the Configuring settings for development, testing, staging, and production environments recipe, add settings.py to the ignored files too.

See also

  • The Creating and including local settings recipe
  • The Creating the Git ignore file recipe
 

Creating the Git ignore file


If you are using Git, the most popular distributed version control system, ignoring some files and directories from version control is much easier than with Subversion.

Getting ready

Make sure that your Django project is under Git version control.

How to do it...

Using your favorite text editor, create a .gitignore file at the root of your Django project, and put the following files and directories there:

# .gitignore
# Project files and directories
/myproject/static/
/myproject/tmp/
/myproject/media/
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# PyInstaller
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
# Translations
*.pot
# Django stuff:
*.log
# Sphinx documentation
docs/_build/
# PyBuilder
target/

How it works...

The .gitignore file specifies patterns that should intentionally be left untracked by the Git version control system. The .gitignore file that we created in this recipe will ignore the Python-compiled files, collected static files, the temporary directory for uploads, and the media directory with the uploaded files.

Note

If you keep all of your settings in a config Python package, as described in the Configuring settings for development, testing, staging, and production environments recipe, add settings.py to the ignored files too.

There's more...

With Git ignore files, we have the ability to follow a whitelist pattern rather than a blacklist, which means we can indicate what files we want to include rather than those we should omit. In addition, the patterns given in .gitignore are honored for all levels of the tree below where the file resides, making them extremely powerful. For example, the file could be written in this manner for a Docker project:

# .gitignore
# ignore everything in the root by default
/*
# allow this file of course
!.gitignore
# allowed root directories
!/apps/
!/bin/
!/config/
!/data/
!/project/
!/static/
!/templates/
# allowed root files
!/Dockerfile
!/docker-compose.yml
# files allowed anywhere
!README.md
# specifically ignore certain deeper items
__pycache__/

See also

  • The Creating a virtual environment project file structure recipe
  • The Creating a Docker project file structure recipe
  • The Setting the Subversion ignore property recipe
 

Deleting Python-compiled files


When you run your project for the first time, Python compiles all of your *.py code into bytecode files (*.pyc), which are used later for execution.

Normally, when you change the *.py files, the *.pyc files are recompiled; however, sometimes, when switching branches or moving directories, you need to clean up the compiled files manually.

Getting ready

Use your favorite editor and edit or create a .bash_profile file in your home directory.

How to do it...

  1. Add this alias at the end of .bash_profile, as follows:
# ~/.bash_profile
alias delpyc='find . -name "*.pyc" -delete'
  2. Now, to clean the Python-compiled files, go to your project directory and type the following command on the command line:
$ delpyc

How it works...

First, we create a Unix alias that searches for *.pyc files and deletes them in the current directory and its children. The .bash_profile file is executed whenever you start a new session in your command-line tool.
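Since Python 3 stores most compiled files inside __pycache__ directories rather than next to the source, you may want to extend the alias to remove those directories as well. A sketch of such an alias:

```shell
# ~/.bash_profile (sketch)
# Delete stray .pyc files, then remove whole __pycache__ directories;
# -prune stops find from descending into directories it just removed
alias delpyc='find . -name "*.pyc" -delete; find . -type d -name "__pycache__" -prune -exec rm -rf {} +'
```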

See also

  • The Setting the Subversion ignore property recipe
  • The Creating the Git ignore file recipe
 

Respecting the import order in Python files


When you create the Python modules, it is good practice to stay consistent with the structure in the files. This makes it easier for other developers and yourself to read the code. This recipe will show you how to structure your imports.

Getting ready

Create a virtual environment and create a Django project in it.

How to do it...

Use the following structure in a Python file that you create. Starting with the first line in the file, put the imports categorized in sections, as follows:

# System libraries
import os
import re
from datetime import datetime

# Third-party libraries
import boto
from PIL import Image

# Django modules
from django.db import models
from django.conf import settings

# Django apps
from cms.models import Page

# Current-app modules
from . import app_settings

How it works...

We have five main categories for the imports, as follows:

  • System libraries for packages in the default installation of Python
  • Third-party libraries for the additionally installed Python packages
  • Django modules for different modules from the Django framework
  • Django apps for third-party and local apps
  • Current-app modules for relative imports from the current app

There's more...

When coding in Python and Django, use the official style guide for Python code, PEP 8. You can find it at https://www.python.org/dev/peps/pep-0008/.

See also

  • The Handling project dependencies with pip recipe
  • The Including external dependencies in your project recipe
 

Creating app configuration


When developing a website with Django, you create one module for the project itself, and then multiple Python modules called applications (or, more commonly, apps) that combine the different modular functionalities and usually consist of models, views, forms, URL configurations, management commands, migrations, signals, tests, and so on. The Django framework has an application registry, where all apps and models are collected and later used for configuration and introspection. Since Django 1.7, meta information about apps can be saved in an AppConfig instance for each installed app. Let's create a sample magazine app to take a look at how to use the app configuration there.

Getting ready

You can create a Django app in one of three ways:

  • Generate all of the files manually, which can be an excellent tool for learning, but is far from the most efficient approach.
  • Use the startapp command in your virtual environment, as follows:
(myproject_env)$ django-admin.py startapp magazine

Learn how to use virtual environments in the Working with a virtual environment and Creating a virtual environment project file structure recipes.

  • Use the startapp command in a Docker project, as follows:
myproject_django/$ docker-compose run app django-admin.py startapp magazine

Note

Learn how to use Docker in the Working with Docker and Creating a Docker project file structure recipes.

With your magazine app created, add a NewsArticle model to models.py, create administration for the model in admin.py, and put "magazine" into INSTALLED_APPS in the settings.py file. If you are not yet familiar with these tasks, study the official Django tutorial at https://docs.djangoproject.com/en/2.1/intro/tutorial01/.

How to do it...

Follow these steps to create and use the app configuration:

  1. Create the apps.py file and put the following content in it, as follows:
# magazine/apps.py
from django.apps import AppConfig
from django.utils.translation import ugettext_lazy as _


class MagazineAppConfig(AppConfig):
    name = "magazine"
    verbose_name = _("Magazine")

    def ready(self):
        from . import signals
  2. Edit the __init__.py file in the magazine module to contain the following content:
# magazine/__init__.py
default_app_config = "magazine.apps.MagazineAppConfig"
  3. Let's create a signals.py file and add some signal handlers there:
# magazine/signals.py
from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver
from django.conf import settings

from .models import NewsArticle


@receiver(post_save, sender=NewsArticle)
def news_save_handler(sender, **kwargs):
    if settings.DEBUG:
        print(f"{kwargs['instance']} saved.")


@receiver(post_delete, sender=NewsArticle)
def news_delete_handler(sender, **kwargs):
    if settings.DEBUG:
        print(f"{kwargs['instance']} deleted.")

How it works...

When you run an HTTP server or invoke a management command, django.setup() is called. It loads the settings, sets up logging, and prepares the app registry. This registry is initialized in three steps, as follows:

  • First, Django imports the configurations for each item in INSTALLED_APPS from the settings. These items can point to app names or directly to configuration classes, for example, "magazine" or "magazine.apps.MagazineAppConfig".
  • Next, Django tries to import models.py from each app in INSTALLED_APPS and collects all of the models.
  • Finally, Django runs the ready() method for each app configuration. This method, which is optional, is the correct place to register signal handlers, if you have any.

In our example, the MagazineAppConfig class sets the configuration for the magazine app. The name parameter defines the name of the current app. The verbose_name parameter is used in the Django model administration, where models are presented and grouped by apps. The ready() method imports and activates the signal handlers that, when in DEBUG mode, print to the terminal that a NewsArticle object was saved or deleted.

There's more...

After calling django.setup(), you can load the app configurations and models from the registry as follows:

>>> from django.apps import apps as django_apps
>>> magazine_app_config = django_apps.get_app_config("magazine")
>>> magazine_app_config
<MagazineAppConfig: magazine>
>>> magazine_app_config.models_module
<module 'magazine.models' from '/usr/src/app/magazine/models.py'>
>>> NewsArticle = django_apps.get_model("magazine", "NewsArticle")
>>> NewsArticle
<class 'magazine.models.NewsArticle'>

You can read more about app configuration in the official Django documentation at https://docs.djangoproject.com/en/2.1/ref/applications/.

See also

  • The Working with a virtual environment recipe
  • The Working with Docker recipe
  • The Defining overwritable app settings recipe
  • Chapter 6, Model Administration
 

Defining overwritable app settings


This recipe will show you how to define settings for your app that can then be overwritten in your project's settings.py file. This is especially useful for reusable apps.

Getting ready

Follow the steps for Getting ready in the Creating app configuration recipe to create your Django app.

How to do it...

  1. If you just have one or two settings, you can use the following pattern in your models.py file. If the settings are extensive and you want to have them organized better, create an app_settings.py file in the app and put the settings in the following way:
# magazine/models.py or magazine/app_settings.py
from django.conf import settings
from django.utils.translation import ugettext_lazy as _

SETTING1 = getattr(settings, "MAGAZINE_SETTING1", "default value")
MEANING_OF_LIFE = getattr(settings, "MAGAZINE_MEANING_OF_LIFE", 42)
STATUS_CHOICES = getattr(settings, "MAGAZINE_STATUS_CHOICES", (
    ("draft", _("Draft")),
    ("published", _("Published")),
    ("not_listed", _("Not Listed")),
))
  2. If the settings were defined in an app_settings.py file, then you can import and use them in models.py, as follows:
# magazine/models.py
from django.db import models
from django.utils.translation import ugettext_lazy as _

from .app_settings import STATUS_CHOICES


class NewsArticle(models.Model):
    # ...
    status = models.CharField(_("Status"),
                              max_length=20,
                              choices=STATUS_CHOICES)
  3. If you want to overwrite the STATUS_CHOICES setting for a given project, you simply open settings.py for that project and add the following:
# settings.py
from django.utils.translation import ugettext_lazy as _

# ...

MAGAZINE_STATUS_CHOICES = (
    ("imported", _("Imported")),
    ("draft", _("Draft")),
    ("published", _("Published")),
    ("not_listed", _("Not Listed")),
    ("expired", _("Expired")),
)

How it works...

The getattr(object, attribute_name[, default_value]) Python function tries to get the attribute_name attribute from object and returns default_value if it is not found. In this case, each setting is looked up in the Django project's settings.py module; if it is not defined there, the default value is assigned.
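The fallback behaviour is easy to see in isolation; here, the Django settings object is stood in for by a plain namespace with only one override defined (an illustration only, not Django code):

```python
from types import SimpleNamespace

# A stand-in for django.conf.settings, with only one override defined
settings = SimpleNamespace(MAGAZINE_MEANING_OF_LIFE=14)

SETTING1 = getattr(settings, "MAGAZINE_SETTING1", "default value")
MEANING_OF_LIFE = getattr(settings, "MAGAZINE_MEANING_OF_LIFE", 42)

print(SETTING1)         # not set, so the default is returned
print(MEANING_OF_LIFE)  # overridden, so 14 is returned
```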

About the Authors

  • Jake Kronika

    Jake Kronika, a senior software engineer with nearly 25 years' experience, has been working with Python since 2005, and Django since 2007. Evolving alongside the web development space, his skillset encompasses HTML5, CSS3, and ECMAScript 6 on the frontend, plus Python, Django, Ruby on Rails, Node.js, and much more besides on the server side.

    Currently a senior software engineer and development team lead, he collaborates with skilled designers, business stakeholders, and developers around the world to architect robust web applications. In his spare time, he also provides full-spectrum web services as sole proprietor of Gridline Design and Development.

    Prior to this book, he has acted as a technical reviewer for several other Packt titles.

  • Aidas Bendoraitis

    Aidas Bendoraitis has been professionally building websites for the past 18 years. For the last 14 years, he has been working at a design company, studio 38 pure communication, in Berlin. Together with a small dedicated team, he has mostly used Django in the backend and jQuery in the frontend to create cultural and touristic web platforms. Among different side projects, he is bootstrapping a SaaS business with the strategic prioritizer 1st things 1st. Aidas Bendoraitis is active on Twitter and other social media under the username DjangoTricks.

