Django in Production: Part 3 - Automation & Monitoring
This is the third part in a series about Django in production. If you haven’t already, read Part 1 and Part 2 first.
In the first two posts of this series, I described the core stack which powers a Django application in production, and the Celery task queue which can be used to execute code asynchronously. In this third post, I’ll describe how a production Django application can be monitored, and how common tasks such as deployment can be automated.
Monitoring Django Applications
There are many ways to monitor a Django application, but one that is particularly useful is Graphite.
Graphite
Graphite provides real-time visualization and storage of numeric time-series data on an enterprise level.

The key thing is not to be scared by the word “enterprise”. Graphite is a relatively simple 3-part system: Whisper is an efficient, pure-Python implementation of a round-robin database, and Carbon is a daemon which manages the Whisper database and provides caching. Finally, the Graphite “webapp” provides a Django frontend to the data stored in the Whisper database.
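Whisper stores each metric at fixed resolutions which are declared up front in Carbon’s storage-schemas.conf. As a minimal sketch (the pattern and retention values below are illustrative, in the spirit of the defaults commonly suggested for statsd, not taken from this article):

# /opt/graphite/conf/storage-schemas.conf (values illustrative)
[stats]
pattern = ^stats\.
# 10-second resolution for 6 hours, 1-minute for a week,
# then 10-minute datapoints for roughly 5 years
retentions = 10:2160,60:10080,600:262974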
Graphite’s web interface is, admittedly, hard to use. Its redeeming feature is a powerful URL-based API which allows you to compose graphs programmatically - which is in some ways easier than navigating the difficult menu system. Once mastered, though, Graphite can produce some amazingly detailed stats about the very deepest internals of your application.
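For example, a single bookmarkable URL both defines and renders a graph. A sketch of what such a URL might look like - the hostname and metric name are placeholders, while summarize and the from/width/format parameters are standard render API options:

http://graphite.example.com/render?from=-24hours&width=600&format=png&target=summarize(stats.response.200,"1hour")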
Clearly, the key to great graphs is lots (and lots) of data. In fact, 37signals report that their servers recorded an incredible 100,000,000 measurements in the first 10 days of 2012. Those measurements were made using statsd, a Node.js daemon which collects statistics over a simple UDP protocol and sends them to Graphite.
StatsD
The great thing about statsd is that it uses UDP to collect statistics. This means that a client can be written in almost any language, and that when the server isn’t running, the client is completely unaffected. After all, statistics are important, but not important enough to knock the entire application offline when they aren’t being collected.
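The wire protocol is correspondingly simple: a counter increment is just a small datagram. A minimal sketch in Python (the metric name and hostname are illustrative; 8125 is statsd’s conventional port):

import socket

# Fire-and-forget: if the statsd server is down, the datagram is
# silently lost and the application never blocks or sees an error.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b'myapp.response.200:1|c', ('statsd.example.com', 8125))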
So far, there’s been nothing Django-specific (save for the fact that Graphite is itself written in Django). In fact, these systems can be used to monitor just about anything.
django-statsd provides a very useful set of basic statistics for Django applications, though, and is highly recommended. Strangely, it’s installed from PyPI with pip install django-statsd-mozilla, as the name was already taken by another app.
Once installed, just enable a couple of middleware classes, which record every request/response and their execution time to statsd (a settings sketch follows the example below). For more specific stats, though, the client library is very easy to use:
from django_statsd.clients import statsd

# Increment a counter each time this code path is hit
statsd.incr('response.200')
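The middleware setup itself is a couple of lines in settings.py. A sketch, assuming the class names documented by django-statsd (check the version you install; GraphiteMiddleware counts responses and errors, GraphiteRequestTimingMiddleware times each view):

# settings.py
MIDDLEWARE_CLASSES = (
    'django_statsd.middleware.GraphiteRequestTimingMiddleware',
    'django_statsd.middleware.GraphiteMiddleware',
) + MIDDLEWARE_CLASSES  # prepend to the existing middleware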
Interestingly, it also integrates with django-debug-toolbar, and can “monkeypatch” Django to enable model/cache statistics - but I’ll leave that as an exercise for the reader.
Automation
Often, in their quest for the least effort possible, programmers automate too much. In the case of deploying applications, though - a simple but repetitive task - I think we’re justified.
Fabric
If you haven’t already, you should spend some time reading about Fabric. Essentially, it’s a Python library that allows SSH commands to be scripted, whether they run on one remote host or many.
Fabric is a Python library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.

Fabric uses a file at the top level of a project, called fabfile.py, to define the functions which can be used on the command line.
There are some very complete examples of “fabfiles” which can be used for Django deployment, such as this one by Gareth Rushgrove.
For my projects, which involve only one server, I use a rather simpler version:
import os

from fabric.api import env, require, run, sudo, cd

# Project settings: fill in project_name and server_name for your setup
env.project_name = ''
env.server_name = ''
env.webapps_root = '/opt/webapps/'
env.project_root = os.path.join(env.webapps_root, env.project_name)
env.activate_script = os.path.join(env.project_root, 'env/bin/activate')
env.wsgi_file = os.path.join(env.project_root, 'django.wsgi')
env.repo_root = os.path.join(env.project_root, 'repository')
env.search_index = os.path.join(env.project_root, 'search_index')
env.requirements_file = os.path.join(env.repo_root, 'requirements.txt')
env.manage_dir = os.path.join(env.repo_root, env.project_name)


def production():
    env.hosts = [env.server_name]
prod = production  # shorthand, so "fab prod deploy" works


def virtualenv(command, use_sudo=False):
    # Run a command inside the project's virtualenv
    if use_sudo:
        func = sudo
    else:
        func = run
    func('source "%s" && %s' % (env.activate_script, command))


def update():
    require('hosts', provided_by=[production])
    with cd(env.repo_root):
        run('git pull origin master')


def install_requirements():
    require('hosts', provided_by=[production])
    virtualenv('pip install -q -r %(requirements_file)s' % env)


def manage_py(command, use_sudo=False):
    require('hosts', provided_by=[production])
    with cd(env.manage_dir):
        virtualenv('python manage.py %s' % command, use_sudo)


def syncdb(app=None):
    require('hosts', provided_by=[production])
    manage_py('syncdb --noinput')


def migrate():
    require('hosts', provided_by=[production])
    manage_py('migrate')


def rebuild_index():
    require('hosts', provided_by=[production])
    manage_py('rebuild_index --noinput', use_sudo=True)
    sudo('chown -R www-data:www-data %(search_index)s' % env)


def collectstatic():
    require('hosts', provided_by=[production])
    manage_py('collectstatic -l --noinput')


def reload():
    # Extract the gunicorn PID from supervisorctl and send it SIGHUP
    require('hosts', provided_by=[production])
    sudo('supervisorctl status | grep %(project_name)s '
         '| sed "s/.*[pid ]\([0-9]\+\)\,.*/\\1/" '
         '| xargs kill -HUP' % env)


def deploy():
    require('hosts', provided_by=[production])
    update()
    install_requirements()
    syncdb()
    migrate()
    collectstatic()
    reload()
As you can probably tell from the code, this performs the following operations at the push of a button:
- Updates the server’s git repository
- Installs any new requirements with pip
- Synchronises the database to create any new tables
- Executes any South migrations that haven’t yet been run
- Links any static media
- Reloads the webserver (see Part 1 for the gunicorn setup)
Once this file is in place, executing fab prod deploy from anywhere in the project’s source tree will automatically deploy the latest code to the web server in seconds. With public key authentication, you’re not even prompted for a password. As far as I’m concerned, that’s well worth the effort!
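And because each step is a Fabric task of its own, any of them can be run individually in the same way:

$ fab prod deploy    # the whole pipeline
$ fab prod migrate   # or just one step, e.g. pending South migrations
$ fab prod reload    # or just HUP the gunicorn workers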
Source: http://www.robgolding.com/blog/2012/01/14/django-in-production-part-3---automation-and-monitoring/