Default settings in Django

A problem with Django is that you can't register default settings for your own code unless you're a core Django developer.

This would be nice to have per-app or even per-funky-module, so I decided I'd implement something to support it!

Here's how I did it: create a base class, with a metaclass. When this base class is subclassed, take the attributes and loop through them. If the attribute is all uppercase and isn't set in the main configuration (yet), copy it there.

This ended up looking like:

class GlobalDefaultSettingsBase(type):
    def __new__(cls, name, bases, attrs):
        from django.conf import settings
        new = super(GlobalDefaultSettingsBase, cls).__new__
        new_cls = new(cls, name, bases, attrs)
        # The base class itself shouldn't contribute any settings.
        if name == "GlobalDefaultSettings":
            return new_cls
        for (attr, value) in attrs.iteritems():
            # Copy uppercase attributes the project hasn't already set.
            if attr == attr.upper() and not hasattr(settings, attr):
                setattr(settings, attr, value)
        return new_cls

class GlobalDefaultSettings(object):
    """A base class for adding default global settings to the Django settings

    All attributes which are in the upper case will be merged into the existing
    configuration, unless they're already set.

    This is a simple solution for having default settings without `getattr`.
    All you have to insure is that your subclass is constructed before the
    access of the defaults you provide -- which should be fairly easy.
    __metaclass__ = GlobalDefaultSettingsBase

To use it, simply make a subclass and set some settings!

class MyDefaults(GlobalDefaultSettings):
    MY_SETTING_A = "Hello world!"
    MY_SETTING_B = ["Hey guys"]

And that's all you need. If either setting is already defined in the project's settings module, that value takes precedence!
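The precedence behavior can be sketched without Django at all, using a plain dict in place of django.conf.settings (a Python 3 sketch; SETTINGS, DefaultsBase, and Defaults are illustrative names, not part of the code above):

```python
# A stand-in for django.conf.settings: just a module-level dict.
SETTINGS = {"MY_SETTING_A": "project override"}

class DefaultsBase(type):
    def __new__(mcls, name, bases, attrs):
        cls = super().__new__(mcls, name, bases, attrs)
        if not bases:  # skip the abstract base class itself
            return cls
        for attr, value in attrs.items():
            # Only uppercase attributes count, and the project wins.
            if attr == attr.upper() and attr not in SETTINGS:
                SETTINGS[attr] = value
        return cls

class Defaults(metaclass=DefaultsBase):
    pass

class MyDefaults(Defaults):
    MY_SETTING_A = "Hello world!"   # ignored: already set by the project
    MY_SETTING_B = ["Hey guys"]     # copied in as a default

print(SETTINGS["MY_SETTING_A"])  # -> project override
print(SETTINGS["MY_SETTING_B"])  # -> ['Hey guys']
```

The only real difference from the Django version is that hasattr/setattr on the settings object become dict lookups.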

Obviously, there's no great place to put this code since it really belongs to core Django.

In fact, Django should itself be using this style - as it is now, all default settings are specified in one huge file, with no grouping by the component they belong to.

What I'd like for logging

Python's logging package is great and all, but it's only half of the solution.

I often find myself in a situation where I'd like to debug live processes, and that's pretty hard when they're FastCGI scripts or other daemons.

Sure, you could have a log file which you spam with information on a large scale, but that's not very helpful.

Sure, you could have N log files with individual output, but that's not very manageable.

What I'd like to have is a good log server which receives these masses of information, and couples every log line with arbitrary key-value pairs.

Then, you'd tap into the stream of information, specifying expressions that must hold true. For example, you could specify that "pid must be one of 1324, 1325, or 1326" and that "level must be above 10".

That way, you'd be able to narrow down one process, and only the output you'd like to have. You'd also avoid having to store all data -- if nothing taps the log lines received, they're simply discarded (or stored to disk based on arbitrary conditions).
$ logtap 'pid in (1234, 1235, 1236) and level > 10'
20:26:01,513    INFO the fox is now red
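The tap side of this can be sketched in a few lines (a hypothetical design, not an existing tool: log records are dicts of arbitrary key-value pairs, and a predicate decides which records a tap receives):

```python
def matches(record, pid_whitelist, min_level):
    """Return True if a log record passes the tap's conditions."""
    return (record.get("pid") in pid_whitelist
            and record.get("level", 0) > min_level)

records = [
    {"pid": 1234, "level": 20, "msg": "the fox is now red"},
    {"pid": 9999, "level": 20, "msg": "wrong process"},
    {"pid": 1235, "level": 5,  "msg": "level too low"},
]

# Equivalent to: logtap 'pid in (1234, 1235, 1236) and level > 10'
tapped = [r["msg"] for r in records
          if matches(r, {1234, 1235, 1236}, 10)]
print(tapped)  # -> ['the fox is now red']
```

Records that no tap matches would simply be discarded by the server, which is what keeps the scheme cheap.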

pylibmc & handmc released!

The all-new libmemcached Python wrapper is out now!
