What’s a djangonautic way of handling default settings in an app if one isn’t defined in settings.py?

I’ve currently placed a default_settings file in the app, and I’ve considered a few options. I’m leaning towards the first option, but there may be pitfalls I’m not aware of in using globals().

I’ve mostly seen apps do a FOO = getattr(settings, 'FOO', False) at the top of the file that uses the setting, but I think this approach has readability/repetition problems if the values/names are long.


1: Place settings in a function and iterate over locals / set globals

from django.conf import settings

def setup_defaults():
    FOO = 'bar'
    # copy each local default into module globals, preferring the
    # value from django.conf.settings when one is defined there
    for key, value in locals().items():
        globals()[key] = getattr(settings, key, value)

setup_defaults()

Pros:

  • Only have to write var name once to pull default of same name from django settings.

Cons:

  • Not used to using globals() and don’t know of any implications

2: Write getattr(settings, 'MY_SETTING', default_settings.MY_SETTING) every call

Pros:

  • Very clear

Cons:

  • Repetitive


3: Always define settings as FOO = getattr(settings, 'FOO', '...setting here...')

Pros:

  • Defaults are always overridden by project settings

Cons:

  • Repetitive (must define var twice – once in string form, once in var)
  • Setting is not as readable since it’s now the third argument

4: Create utility function to get_or_default(setting)

Pros:

  • Simple
  • Don’t have to repeat string representation of setting

Cons:

  • Have to call it
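A minimal sketch of option 4. The DEFAULTS dict, the module name, and the optional source parameter are assumptions for illustration; the parameter exists only so the helper can be exercised without a configured Django project:

```python
# hypothetical myapp/conf.py -- names here are illustrative, not from the question
DEFAULTS = {
    'FOO': 'bar',
    'BAR': 'baz',
}

def get_or_default(name, source=None):
    """Return the project-level setting if defined, else the app default.

    `source` normally stays None and falls back to django.conf.settings;
    it is parameterised only so the helper can be tested in isolation.
    """
    if source is None:
        from django.conf import settings as source  # read at call time
    return getattr(source, name, DEFAULTS[name])
```

At each call site you would then write get_or_default('FOO'), which removes the repetition of option 2 at the cost of a function call.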

5: Create a settings class

from django.conf import settings

class Settings(object):
    FOO = 'bar'

    def __init__(self):
        # override each class-level default with the value from
        # django.conf.settings, skipping dunder attributes
        for key, value in self.__class__.__dict__.items():
            if not key.startswith('__'):
                setattr(self, key, getattr(settings, key, value))

my_settings = Settings()

Cons:

  • Can’t do from foo.bar.my_settings import FOO (actually, that’s a terrible deal breaker!)

I’d love to hear feedback.

I think it’s quite common to create a settings.py in your app’s package, where you define your settings like this:

from django.conf import settings
FOO = getattr(settings, 'FOO', "default_value")

In your app you can import them from your app’s settings module:

from myapp.settings import *

def print_foo():
    print(FOO)

But I think everybody agrees that Django lacks a better generic architecture for this! If you’re looking for a more sophisticated way to handle it, there are third-party apps like django-appconf, but it’s your decision whether or not to introduce one more dependency for your app.

Updated for 2020

In your app’s settings.py, prefix the name with settings. so the value is assigned onto the settings object itself:

from django.conf import settings
settings.FOO = getattr(settings, 'FOO', "default_value")

It seems that every solution I see tends to create an internal copy of the application settings, a proxy, a wrapper, or whatever. This is confusing and creates problems when settings are modified at run time, as they are in tests.

To me, all settings belong in django.conf.settings and only there. You should not read them from somewhere else nor copy them for later use (as they may change). You should set them once and not worry about defaults later on.

I understand the impulse to drop the app prefix when an app setting is used internally, but this too is IMHO a bad idea. When you’re in trouble, searching for SOME_APP_FOO will yield no results, because it’s used as just FOO internally. Confusing, right? And for what, a few letters? Remember that explicit is better than implicit?

IMHO the best way is to just set those defaults in Django’s own settings, using the plumbing that is already there: no module import hooks, no hijacking the always-imported models.py to initialize some extra, complicated metaclass machinery.

Why not use AppConfig.ready for setting defaults?

from django.apps import AppConfig

class FooBarConfig(AppConfig):
    name = "foo_bar"

    def ready(self):
        from django.conf import settings
        # reach past the lazy wrapper and use dict.setdefault
        # on the underlying holder's __dict__
        settings = settings._wrapped.__dict__
        settings.setdefault('FOO_BAR_SETTING', 'whatever')

Or, better yet, define them in a clean, simple way in a separate module and import them the way Django’s own Settings class does (or close to it):

from django.apps import AppConfig

class FooBarConfig(AppConfig):
    name = "foo_bar"

    def ready(self):
        from . import app_settings as defaults
        from django.conf import settings
        # copy every ALL_CAPS default that the project has not defined
        for name in dir(defaults):
            if name.isupper() and not hasattr(settings, name):
                setattr(settings, name, getattr(defaults, name))

I’m not sure the use of __dict__ is the best solution, but you get the idea: you can always use a hasattr/setattr combo to get the same effect.

This way your app settings are:

  1. exposed to others, in case other apps should rely on them in some rare cases (provided, of course, that the apps are configured in an order that lets them rely on each other)
  2. read normally, like any other setting
  3. nicely declared in their own module
  4. lazy enough
  5. set in django.conf.settings in a controlled way: you can implement some transposition of names if you want to

PS. There is a warning about not modifying settings at run time, but it does not explain why. So I think this one time, during initialization, may be a reasonable exception 😉

PS2. Don’t name the separate module just settings as this may get confusing when you import settings from django.conf.

How about this?

In myapp/settings.py:

from django.conf import settings

FOO = 'bar'
BAR = 'baz'

# iterate over a copy, and only touch ALL_CAPS names so the loop
# does not clobber imports or dunder globals
_g = globals()
for key, value in list(_g.items()):
    if key.isupper():
        _g[key] = getattr(settings, key, value)

In myapp/other.py:

import myapp.settings

print(myapp.settings.FOO)

Given this answer by ncoghlan, I feel ok using globals() this way.

In response to Phil Gyford’s comment, which exposes the problem of settings not being overwritten in tests (since the modules using them have already imported them), what I did was to define an AppSettings class in __init__.py with:

  • an __init__ method to initialize each setting to None
  • a load method to load every setting from its getter
  • static getters for each setting
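A sketch of that class, following the three bullets above. The names some_setting/SOME_SETTING are placeholders, and the optional source parameter on the getter is an assumption added here so the sketch can run without a configured Django project:

```python
class AppSettings(object):
    """Hypothetical sketch of the pattern described above."""

    def __init__(self):
        # each setting starts as None until load() is called
        self.some_setting = None

    def load(self):
        # populate every instance attribute from its static getter
        self.some_setting = AppSettings.get_some_setting()

    @staticmethod
    def get_some_setting(source=None):
        # read the live settings at call time, so override_settings
        # in tests is honoured; `source` normally falls back to
        # django.conf.settings and exists only for testability
        if source is None:
            from django.conf import settings as source
        return getattr(source, 'SOME_SETTING', 'default_value')
```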

Then in the code:

from . import AppSettings

def in_some_function():
    some_setting = AppSettings.get_some_setting()

Or, if you want to load them all at once (but overriding settings in tests won’t work for the impacted module):

from . import AppSettings

app_settings = AppSettings()
app_settings.load()

def in_some_function():
    print(app_settings.some_setting)

You can then use the override_settings decorator in your tests and still have a DRY, clear way of using app settings, at the cost of a few more instructions executed each time you want to get a setting (just for tests…).

You can use django-zero-settings, which lets you define your defaults plus a settings key under which users override them; it also offers import strings, removed-settings management, caching, pre-checks, etc.

To create app settings like your example:

from zero_settings import ZeroSettings

app_settings = ZeroSettings(
    key="APP",
    defaults={
        "FOO": "bar"
    },
)

then you can use it like:

from app.settings import app_settings

print(app_settings.FOO)  # which prints bar

User settings will automatically override the defaults:

# this is settings.py, Django settings file
SECRET_KEY = "some_key"
# other settings ...
# the `APP` key is the same as the `key` argument passed to ZeroSettings
APP = {
    "FOO": "not_bar"
}

and then:

from app.settings import app_settings

print(app_settings.FOO)  # this time you get not_bar

Number 3 is best because it is the simplest and has the most consistent look.

Number 1: it is easy to overlook. If I open your code and don’t scroll to the bottom, I’ll miss it, and I will think that settings can’t be overridden in my own module.

Number 2: it is not only repetitive, it is also harder to read because it is so long, and the default values end up defined multiple times and scattered all over your code.

Number 4: non-consistent look, repetitive calls.

Number 5: non-consistent; we expect settings to be defined in a module, not in a class. At least I expect to find them defined in a module, because I’ve seen many apps using method 3, and I use it myself, so I might be biased.