How to manage local vs production settings in Django?
Solution 1
In settings.py:

    try:
        from local_settings import *
    except ImportError as e:
        pass

You can override whatever is needed in local_settings.py; it should stay out of your version control. But since you mention copying, I'm guessing you use none ;)
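A minimal local_settings.py might look like this (the names and values below are illustrative examples, not prescribed by the answer):

```python
# local_settings.py -- development-only overrides, kept out of version control.
# All values here are illustrative.
DEBUG = True

# Use a throwaway SQLite database for local work
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    }
}

# Print outgoing mail to the console instead of sending it
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
```

Because the wildcard import runs at the end of settings.py, any name defined here wins over the shared defaults.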
Solution 2
Instead of settings.py, use this layout:
.
└── settings/
├── __init__.py <= not versioned
├── common.py
├── dev.py
└── prod.py
common.py is where most of your configuration lives.

prod.py imports everything from common.py, and overrides whatever it needs to override:
    from __future__ import absolute_import  # optional, but I like it
    from .common import *

    # Production overrides
    DEBUG = False
    # ...
Similarly, dev.py imports everything from common.py and overrides whatever it needs to override.
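A dev.py along those lines could be as small as this (the overrides shown are illustrative examples):

```python
from __future__ import absolute_import
from .common import *

# Development overrides (example values)
DEBUG = True
ALLOWED_HOSTS = ['localhost', '127.0.0.1']
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
```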
Finally, __init__.py is where you decide which settings to load, and it's also where you store secrets (therefore this file should not be versioned):
    from __future__ import absolute_import
    from .prod import *  # or .dev if you want dev

    ##### DJANGO SECRETS
    SECRET_KEY = '(3gd6shenud@&57...'
    DATABASES['default']['PASSWORD'] = 'f9kGH...'

    ##### OTHER SECRETS
    AWS_SECRET_ACCESS_KEY = "h50fH..."
What I like about this solution is:

- Everything is in your version control system, except secrets.
- Most configuration is in one place: common.py.
- Prod-specific things go in prod.py, dev-specific things go in dev.py. It's simple.
- You can override stuff from common.py in prod.py or dev.py, and you can override anything in __init__.py.
- It's straightforward Python. No re-import hacks.
Solution 3
I use a slightly modified version of the "if DEBUG" style of settings that Harper Shelby posted. Obviously, depending on the environment (win/linux/etc.), the code might need to be tweaked a bit.
I was using "if DEBUG" in the past, but I found that occasionally I needed to do testing with DEBUG set to False. What I really wanted was to distinguish whether the environment was production or development, which gave me the freedom to choose the DEBUG level.
    import os

    PRODUCTION_SERVERS = ['WEBSERVER1', 'WEBSERVER2']

    if os.environ['COMPUTERNAME'] in PRODUCTION_SERVERS:
        PRODUCTION = True
    else:
        PRODUCTION = False

    DEBUG = not PRODUCTION
    TEMPLATE_DEBUG = DEBUG
    # ...

    if PRODUCTION:
        DATABASE_HOST = '192.168.1.1'
    else:
        DATABASE_HOST = 'localhost'
I'd still consider this way of handling settings a work in progress. I haven't seen any one way of handling Django settings that covers all the bases and at the same time isn't a total hassle to set up (I'm not down with the 5x settings file methods).
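As several comments below note, os.environ['COMPUTERNAME'] raises a KeyError outside Windows (e.g. on AWS Ubuntu or PythonAnywhere). A more portable sketch of the same idea, assuming socket.gethostname() returns a stable machine name — the server names here are placeholders, not from the original answer:

```python
import socket

# Hypothetical production host names -- replace with your own
PRODUCTION_SERVERS = ['webserver1', 'webserver2']

# gethostname() works on Windows, Linux and macOS alike,
# unlike os.environ['COMPUTERNAME'], which is Windows-only
PRODUCTION = socket.gethostname().lower() in PRODUCTION_SERVERS

DEBUG = not PRODUCTION
TEMPLATE_DEBUG = DEBUG

DATABASE_HOST = '192.168.1.1' if PRODUCTION else 'localhost'
```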
Solution 4
I use a settings_local.py and a settings_production.py. After trying several options I've found that it's easy to waste time with complex solutions when simply having two settings files feels easy and fast.
When you use mod_python/mod_wsgi for your Django project you need to point it to your settings file. If you point it to app/settings_local.py on your local server and app/settings_production.py on your production server then life becomes easy. Just edit the appropriate settings file and restart the server (Django development server will restart automatically).
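One way to do that pointing, sketched here with a hypothetical app.settings_production module path, is to set DJANGO_SETTINGS_MODULE before Django initializes (e.g. at the top of wsgi.py):

```python
import os

# Select the settings module before Django loads anything.
# 'app.settings_production' is a hypothetical dotted path;
# the local server would use 'app.settings_local' instead.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings_production')

# A wsgi.py would then continue with:
# from django.core.wsgi import get_wsgi_application
# application = get_wsgi_application()
```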
Solution 5
TL;DR: The trick is to modify os.environ before you import settings/base.py in any settings/&lt;purpose&gt;.py; this will greatly simplify things.
Just thinking about all these intertwining files gives me a headache.
Combining, importing (sometimes conditionally), overriding, patching of what was already set in case the DEBUG setting changed later on.
What a nightmare!
Through the years I went through all different solutions. They all somewhat work, but are so painful to manage.
WTF! Do we really need all that hassle? We started with just one settings.py file. Now we need documentation just to combine all of these together in the correct order!
I hope I finally hit the (my) sweet spot with the solution below.
Let's recap the goals (some common, some mine):

- Keep secrets a secret — don't store them in a repo!
- Set/read keys and secrets through environment settings, 12 factor style.
- Have sensible fallback defaults. Ideally, for local development you don't need anything beside the defaults.
- …but try to keep the defaults production safe. It's better to miss a setting override locally than to have to remember to adjust defaults to be safe for production.
- Have the ability to switch DEBUG on/off in a way that can have an effect on other settings (eg. using javascript compressed or not).
- Switching between purpose settings, like local/testing/staging/production, should be based only on DJANGO_SETTINGS_MODULE, nothing more.
- …but allow further parameterization through environment settings like DATABASE_URL.
- …also allow them to use different purpose settings and run them locally side by side, eg. a production setup on a local developer machine, to access the production database or smoke test compressed style sheets.
- Fail if an environment variable is not explicitly set (requiring an empty value at minimum), especially in production, eg. EMAIL_HOST_PASSWORD.
- Respond to the default DJANGO_SETTINGS_MODULE set in manage.py during django-admin startproject.
- Keep conditionals to a minimum; if the condition is the purposed environment type (eg. for production, set the log file and its rotation), override settings in the associated purposed settings file.
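The "fail if not explicitly set" goal can be expressed with a tiny helper; require_env below is a hypothetical name for illustration, not part of django-environ:

```python
import os

def require_env(name):
    """Return the variable's value. An explicitly empty value is accepted,
    but a missing key fails loudly instead of silently defaulting."""
    if name not in os.environ:
        raise RuntimeError("Required environment variable not set: %s" % name)
    return os.environ[name]

os.environ['EMAIL_HOST_PASSWORD'] = ''   # explicitly empty: accepted
print(require_env('EMAIL_HOST_PASSWORD'))  # prints an empty line
```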
Do not's

- Do not let django read the DJANGO_SETTINGS_MODULE setting from a file. Ugh! Think of how meta this is. If you need to have a file (like a docker env file), read that into the environment before starting up a django process.
- Do not override DJANGO_SETTINGS_MODULE in your project/app code, eg. based on hostname or process name. If you are too lazy to set the environment variable (like for setup.py test), do it in tooling just before you run your project code.
- Avoid magic and patching of how django reads its settings; preprocess the settings, but do not interfere afterwards.
- No complicated logic-based nonsense. Configuration should be fixed and materialized, not computed on the fly. Providing fallback defaults is just enough logic here. Do you really want to debug why you have the correct set of settings locally, but in production, on a remote server, on one of a hundred machines, something computed differently? Oh! Unit tests? For settings? Seriously?
Solution
My strategy consists of the excellent django-environ used with ini-style files, providing os.environ defaults for local development, and some minimal and short settings/&lt;purpose&gt;.py files that import settings/base.py AFTER os.environ was set from an INI file. This effectively gives us a kind of settings injection.

The trick here is to modify os.environ before you import settings/base.py.
To see the full example, go to the repo: https://github.com/wooyek/django-settings-strategy
    │   manage.py
    ├───data
    └───website
        ├───settings
        │   │   __init__.py   <-- imports local for compatibility
        │   │   base.py       <-- almost all the settings, reads from the process environment
        │   │   local.py      <-- a few modifications for local development
        │   │   production.py <-- ideally empty; everything is in base
        │   │   testing.py    <-- mimics production with reasonable exceptions
        │   │   .env          <-- for local use, not kept in repo
        │   __init__.py
        │   urls.py
        │   wsgi.py
settings/.env

Defaults for local development. A secret file, mostly to set required environment variables. Set them to empty values if they are not required in local development. We provide defaults here, and not in settings/base.py, so that any other machine fails if they're missing from the environment.
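A settings/.env for local development might look like this — every key and value below is an illustrative example:

```ini
# settings/.env -- local development defaults, never committed to the repo
SECRET_KEY=local-dev-only-key
DEBUG=True
# Required in production; empty here so base.py does not fail locally
EMAIL_HOST_PASSWORD=
# DATABASE_URL=postgres://user:password@localhost/mydb
```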
settings/local.py

What happens in here is loading the environment from settings/.env, then importing common settings from settings/base.py. After that we can override a few of them to ease local development.
    import logging
    import environ

    logging.debug("Settings loading: %s" % __file__)

    # This will read missing environment variables from a file
    # We want to do this before loading the base settings, as they may depend on the environment
    environ.Env.read_env(DEBUG='True')

    from .base import *

    ALLOWED_HOSTS += [
        '127.0.0.1',
        'localhost',
        '.example.com',
        'vagrant',
    ]

    # https://docs.djangoproject.com/en/1.6/topics/email/#console-backend
    EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
    # EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'

    LOGGING['handlers']['mail_admins']['email_backend'] = 'django.core.mail.backends.dummy.EmailBackend'

    # Sync task testing
    # http://docs.celeryproject.org/en/2.5/configuration.html?highlight=celery_always_eager#celery-always-eager
    CELERY_ALWAYS_EAGER = True
    CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
settings/production.py

For production we should not expect an environment file, but it's easier to have one if we're testing something. But anyway, let's provide a few defaults inline, so settings/base.py can respond accordingly.
    import environ
    from pathlib import Path

    environ.Env.read_env(Path(__file__).parent / "production.env", DEBUG='False', ASSETS_DEBUG='False')

    from .base import *
The main points of interest here are the DEBUG and ASSETS_DEBUG overrides; they will be applied to os.environ ONLY if they are MISSING from the environment and the file.

These will be our production defaults; no need to put them in the environment or a file, but they can be overridden if needed. Neat!
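This "only if missing" behaviour is essentially that of os.environ.setdefault; a stdlib-only sketch of the same idea (read_env itself comes from django-environ):

```python
import os

# Simulate read_env(..., DEBUG='False'): the default lands in
# os.environ only when the variable is absent.
os.environ.pop('DEBUG', None)           # not set anywhere
os.environ.setdefault('DEBUG', 'False')
print(os.environ['DEBUG'])              # -> False

os.environ['ASSETS_DEBUG'] = 'True'     # already set by the platform
os.environ.setdefault('ASSETS_DEBUG', 'False')  # default is ignored
print(os.environ['ASSETS_DEBUG'])       # -> True
```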
settings/base.py

These are your mostly vanilla django settings, with a few conditionals and lots of reading them from the environment. Almost everything is in here, keeping all the purposed environments consistent and as similar as possible.

The main differences are below (I hope these are self-explanatory):
    import os
    import environ

    # https://github.com/joke2k/django-environ
    env = environ.Env()

    # Build paths inside the project like this: os.path.join(BASE_DIR, ...)
    BASE_DIR = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

    # Where BASE_DIR is a django source root, ROOT_DIR is a whole project root
    # It may differ from BASE_DIR, eg. when your django project code is in a `src` folder
    # This may help to separate python modules and *django apps* from other stuff
    # like documentation, fixtures, docker settings
    ROOT_DIR = BASE_DIR

    # Quick-start development settings - unsuitable for production
    # See https://docs.djangoproject.com/en/1.11/howto/deployment/checklist/

    # SECURITY WARNING: keep the secret key used in production secret!
    SECRET_KEY = env('SECRET_KEY')

    # SECURITY WARNING: don't run with debug turned on in production!
    DEBUG = env('DEBUG', default=False)

    INTERNAL_IPS = [
        '127.0.0.1',
    ]

    ALLOWED_HOSTS = []
    if 'ALLOWED_HOSTS' in os.environ:
        hosts = os.environ['ALLOWED_HOSTS'].split(" ")
        BASE_URL = "https://" + hosts[0]
        for host in hosts:
            host = host.strip()
            if host:
                ALLOWED_HOSTS.append(host)

    SECURE_SSL_REDIRECT = env.bool('SECURE_SSL_REDIRECT', default=False)

    # Database
    # https://docs.djangoproject.com/en/1.11/ref/settings/#databases
    if "DATABASE_URL" in os.environ:  # pragma: no cover
        # Enable database config through environment
        DATABASES = {
            # Raises ImproperlyConfigured exception if DATABASE_URL not in os.environ
            'default': env.db(),
        }

        # Make sure we have all the settings we need
        # DATABASES['default']['ENGINE'] = 'django.contrib.gis.db.backends.postgis'
        DATABASES['default']['TEST'] = {'NAME': os.environ.get("DATABASE_TEST_NAME", None)}
        DATABASES['default']['OPTIONS'] = {
            'options': '-c search_path=gis,public,pg_catalog',
            'sslmode': 'require',
        }
    else:
        DATABASES = {
            'default': {
                'ENGINE': 'django.db.backends.sqlite3',
                # 'ENGINE': 'django.contrib.gis.db.backends.spatialite',
                'NAME': os.path.join(ROOT_DIR, 'data', 'db.dev.sqlite3'),
                'TEST': {
                    'NAME': os.path.join(ROOT_DIR, 'data', 'db.test.sqlite3'),
                }
            }
        }

    STATIC_ROOT = os.path.join(ROOT_DIR, 'static')

    # django-assets
    # http://django-assets.readthedocs.org/en/latest/settings.html
    ASSETS_LOAD_PATH = STATIC_ROOT
    ASSETS_ROOT = os.path.join(ROOT_DIR, 'assets', "compressed")
    ASSETS_DEBUG = env('ASSETS_DEBUG', default=DEBUG)  # Disable when testing compressed files in DEBUG mode
    if ASSETS_DEBUG:
        ASSETS_URL = STATIC_URL
        ASSETS_MANIFEST = "json:{}".format(os.path.join(ASSETS_ROOT, "manifest.json"))
    else:
        ASSETS_URL = STATIC_URL + "assets/compressed/"
        ASSETS_MANIFEST = "json:{}".format(os.path.join(STATIC_ROOT, 'assets', "compressed", "manifest.json"))
    ASSETS_AUTO_BUILD = ASSETS_DEBUG
    ASSETS_MODULES = ('website.assets',)
The last bit shows the power here. ASSETS_DEBUG has a sensible default, which can be overridden in settings/production.py, and even that can be overridden by an environment setting! Yay!
In effect we have a mixed hierarchy of importance:

- settings/&lt;purpose&gt;.py - sets defaults based on purpose, does not store secrets
- settings/base.py - is mostly controlled by environment
- process environment settings - 12 factor baby!
- settings/.env - local defaults for easy startup
akv

Updated on March 27, 2020

Comments
- akv about 4 years: What is the recommended way of handling settings for local development and the production server? Some of them (like constants, etc.) can be changed/accessed in both, but some of them (like paths to static files) need to remain different, and hence should not be overwritten every time new code is deployed. Currently, I am adding all constants to settings.py. But every time I change some constant locally, I have to copy it to the production server and edit the file for production-specific changes... :( Edit: looks like there is no standard answer to this question, I've accepted the most popular method.
- akv over 14 years: And what about the local development server? Is there a way to tell the django webserver (run using python manage.py runserver) which settings file to use?
- Edi over 14 years: This is the kind of thing that Django's settings being an actual code file allows, and what I was hinting at. I haven't done anything like this myself, but it's definitely the sort of solution that might be a better general answer than mine.
- T. Stone over 14 years: @akv if you add --settings=[module name] (no .py extension) to the end of the runserver command, you can specify which settings file to use. If you're going to do that, do yourself a favor and make a shell script/batch file with the development settings configured. Trust me, your fingers will thank you.
- George Godik over 14 years: This is the solution I use. Hacking up a settings file to be used for both production and development is messy.
- Andre Bossard almost 14 years: I think it's better to use settings.py in development, as you don't have to specify it all the time.
- John Mee almost 14 years: To ease tracking/deployment of new settings, use a "local_settings.py" on the production/testing machines and none on development.
- daonb over 13 years: That's the way I do it - adding those lines at the end of settings.py so they can override the default settings.
- Will over 13 years: Am I correct in assuming this method requires importing the settings module via the proxy, django.conf.settings? Otherwise you'd need to edit import declarations to point at the correct settings file when pushing live.
- GrayedFox about 13 years: Cleanest way, especially if you're using version control.
- Joe Golton over 11 years: I just ran into this for the first time and chose to (successfully!) use your solution, with a slight difference: I used uuid.getnode() to find the uuid of my system. So I'm testing if uuid.getnode() == 12345678901 (actually a different number) instead of the os.environ test you used. I couldn't find documentation to convince me that os.environ['COMPUTERNAME'] is unique per computer.
- Indrajeet Kumar over 11 years: This approach means you have unversioned code running in development and production, and every developer has a different code base. I call anti-pattern here.
- Tupteq about 11 years: @pydanny The problem is that Django stores its configuration in a .py file. You can't expect that all developers and the production server will use the same settings, so you need to alter this .py file or implement some alternative solution (.ini files, environment, etc.).
- kaleissin almost 11 years: Do you have an example of how you load the settings from the ini into Django's settings?
- rewritten almost 11 years: See docs.python.org/2/library/configparser.html. You can load a parser with config = ConfigParser.ConfigParser(), then read your files with config.read(array_of_filenames), and get values using config.get(section, option). So first you load your config, and then you use it to read values for settings.
- teewuane almost 10 years: I'm still trying to figure out what to set in my project.wsgi and manage.py files for the settings file. Will you shed some light on this? Specifically, in my manage.py file I have os.environ.setdefault("DJANGO_SETTINGS_MODULE", "foobar.settings"). foobar is a folder with an __init__.py file, and settings is a folder with an __init__.py file that contains my secrets and imports dev.py, which then imports common.py. EDIT: Nevermind, I didn't have a module installed that was required. My bad! This works great!!
- João dos Reis over 9 years: Two things: 1) better to set Debug=True in your dev.py rather than =False in your prod.py. 2) Rather than switching in __init__.py, switch using the DJANGO_SETTINGS_MODULE environment var. This will help with PAAS deployments (e.g. Heroku).
- João dos Reis over 9 years: Better to just maintain different config files, and pick one using the Django standard env variable DJANGO_SETTINGS_MODULE.
- nu everest over 9 years: os.environ['COMPUTERNAME'] doesn't work on Amazon AWS Ubuntu. I get a KeyError.
- nu everest over 9 years: When using the UUID, this solution has proven to be the best and simplest for me. It doesn't require lots of complicated and over-modularized patchwork. In a production environment, you still need to place your database passwords and SECRET_KEY in a separate file that resides outside of version control.
- polarcare over 8 years: When I use this setup in django 1.8.4 and try runserver I get "django.core.exceptions.ImproperlyConfigured: The SECRET_KEY setting must not be empty.", even though I have SECRET_KEY in my __init__.py file. Am I missing something?
- JL Peyret over 8 years: Isn't the use of something like AWS_SECRET_ACCESS_KEY = os.getenv("AWS_SECRET_ACCESS_KEY") more secure? Honest question - I know why you don't want it versioned, but the other alternative is to get it from the environment. Which begs the question of setting the environment variable, of course, but that can be left to your deployment mechanism, no?
- fmalina over 8 years: I prefer calling the module settings_local as opposed to local_settings, to group it with settings.py in alphabetical folder listings. Keep settings_local.py out of version control using .gitignore, as credentials don't belong in Git. Imagine open sourcing them by accident. I keep in git a template file called settings_local.py.txt instead.
- abbood almost 8 years: I tried that.. ran into a wall once I tried to run my django unit tests.. I just couldn't figure out how to specify which settings file to read from.
- sobolevn almost 8 years: I have created a gist for you: gist.github.com/sobolevn/006c734f0520439a4b6c16891d65406c
- abbood almost 8 years: Here is another question though: my uwsgi.ini file has different settings across dev/prod.. any idea of how to make it pick values from my settings file?
- sobolevn almost 8 years: Sorry, I don't get the setup. You can ask a separate question with more details and I will try to help you.
- gtd over 7 years: @Tupteq Can't you expect that? In the age of virtualization and containers I think you can and should strive for that. Maybe not always possible, but you can selectively address individual issues by defining environment variables to allow overrides as necessary. Sure beats the "Works for Me" shrug when the junior asks the senior why something isn't working.
- Costantin about 7 years: Hi, can I ask why except ImportError as e:? Isn't except ImportError: just enough?
- nbeuchat almost 7 years: os.environ['COMPUTERNAME'] unfortunately does not work on PythonAnywhere. You get a KeyError.
- MadPhysicist almost 7 years: How does Django know what is local and what is production? Is there a default that states that whatever is in local_settings.py belongs to the local environment? Also, will having this file in production have an effect on performance and cause unintended behavior?
- cezar over 6 years: @MadPhysicist local_settings.py is out of your version control; you don't push that file to the production server. This file is available only in your local development environment. If this file gets to the production server, it will certainly cause unintended behaviour.
- Red over 6 years: Definitely an antipattern, but very popular. Developers should know what settings are used in the production environment. Everything should be versioned except secrets - those should be pulled from environment variables. Deployment tools like Ansible can set them.
- Štefan Schindler over 5 years: If you want to modify, not override, variables like lists (e.g. INSTALLED_APPS), you can import them in local_settings.py like: from .settings import INSTALLED_APPS, and then in local_settings.py: INSTALLED_APPS += ['autotranslate']
- dbinott over 5 years: Hey Janusz... so in the .env file would go all the API keys and auth keys and passwords etc.? Just like TWILLIO_API = "abc123"? Or TWILLIO_API = env("TWILLIO_API")?
- Janusz Skonieczny over 5 years: Yes, but this is only a fallback for environment settings. This file comes in handy for development, but it is not saved in the repo or pushed to production, where you should strictly use environment settings or your platform's equivalent that will in turn set environment settings for the server process.
- code_dredd over 5 years: @pydanny I'd like to hear what your better solution to this problem is. Mind sharing it?
- code_dredd over 5 years: BTW, what I'm currently doing is relying on environment variables and having the project.settings module rely on pulling them from os.environ.
- alkadelik over 3 years: Do settings in local_settings automatically override those in settings.py?
- arnon cohen almost 3 years: How do I define production settings? For example, when I'm explicitly defining my DJANGO_SETTINGS_MODULE as website/settings/production, the __init__ file is still loading the local.py settings. How can I avoid it, or am I doing something wrong? @JanuszSkonieczny