It's been a couple of months, but I finally remembered that I wrote part one of a multipart post without actually putting any thought into the remaining parts. In part 1 of this series, I described our data tier's layout and the problems that we have around putting a decent framework in front of it. I got the DATABASES setting to correctly handle the conversion between MSSQL connection strings and the database configuration object that Django's ORM expects, and got database access to any of our databases through Django's Model.objects.using('databasename') syntax.

The next challenge to overcome was that of dynamic schemata: models that exist in multiple databases and have fields that can be added or removed by the system's users. At any given time, there are no guarantees on which columns are present in certain tables. Therefore, we have to determine at query-time which items to retrieve and which ones to update.
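
The idea of discovering columns at query time can be illustrated outside MSSQL. As a sketch, here is the same pattern against SQLite, whose `PRAGMA table_info` stands in for `INFORMATION_SCHEMA.COLUMNS`; the table and its "custom" column are made up for illustration:

```python
import sqlite3

# In-memory table standing in for a user-extensible table; "custom_budget"
# plays the role of a column a user added after the model was written.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE contacts (id INTEGER PRIMARY KEY, name TEXT, custom_budget REAL)")

def get_column_names(conn, table_name):
    # SQLite's analogue of querying INFORMATION_SCHEMA.COLUMNS at query time.
    rows = conn.execute("PRAGMA table_info(%s)" % table_name).fetchall()
    return [row[1] for row in rows]  # row[1] is the column name

print(get_column_names(conn, "contacts"))  # -> ['id', 'name', 'custom_budget']
```

Because the catalog is consulted on every sync, a column added by a user five minutes ago shows up without any code change.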

This, of course, seemed like unnecessary overhead in the average case: most queries only need the base columns, so I opted for a second, optional call that retrieves the dynamic fields after the base object has been fetched.
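
That lazy second call boils down to a property that populates a cache on first access. A minimal standalone sketch of the pattern (class and field names are hypothetical, and the database round trip is stubbed out):

```python
class LazyFields(object):
    def __init__(self):
        self._dynamic_fields = None  # not fetched yet

    def _load_dynamic_fields(self):
        # Stand-in for the real database round trip.
        return {"custom_budget": 1000}

    @property
    def custom_fields(self):
        # First access triggers the fetch; subsequent accesses hit the cache.
        if self._dynamic_fields is None:
            self._dynamic_fields = self._load_dynamic_fields()
        return self._dynamic_fields

obj = LazyFields()
print(obj.custom_fields["custom_budget"])  # -> 1000
```

Callers that never touch `custom_fields` never pay for the extra query.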

I first set up a base class for our models to give some across-the-board functionality.

import copy  
from django.db import models  
from sfdb.database.query import query, nonquery

class SfBaseModel(models.Model):  
    def _get_field_column_names(self):
        return [field.db_column for field in self._meta.fields]

    def _get_column_names_from_db(self):
        # Pull all column names from INFORMATION_SCHEMA for the current model's 
        # table name.
        # The table name comes from our own model Meta, not user input, so
        # plain string interpolation is acceptable here.
        information_schema_query = """SELECT COLUMN_NAME
                                      FROM INFORMATION_SCHEMA.COLUMNS
                                      WHERE TABLE_NAME = '%s'""" % self._meta.db_table

        results = query(information_schema_query, using=self._state.db)

        return [r['COLUMN_NAME'] for r in results]

    @classmethod
    def _get_model_field_names(cls):
        return [f.name for f in cls._meta.fields]

    @classmethod
    def get_field_db_column_name(cls, field_name):
        for f in cls._meta.fields:
            if f.name == field_name:
                return f.db_column
        return None

    @classmethod
    def get_field_name_by_db_column(cls, db_column):
        c_name = db_column.lower()
        for f in cls._meta.fields:
            if f.db_column and f.db_column.lower() == c_name:
                return f.name
        return None

    class Meta:
        abstract = True
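
The trick the subclass below leans on is a plain set difference between what the database reports and what the model declares. In isolation, with made-up column lists:

```python
# Columns reported by INFORMATION_SCHEMA for the table (example data).
db_table_columns = ["contact_id", "first_name", "last_name",
                    "custom_budget", "custom_region"]
# Columns declared as fields on the Django model.
model_columns = ["contact_id", "first_name", "last_name"]

# Anything present in the table but absent from the model must be a
# user-added dynamic column.
dynamic_columns = sorted(set(db_table_columns) - set(model_columns))
print(dynamic_columns)  # -> ['custom_budget', 'custom_region']
```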

Then I was able to write an inheriting model class that implements the dynamic-schema behavior and overrides the base save method.

class DynamicSchemaModel(SfBaseModel):  
    _dynamic_field_columns = None
    _dynamic_field_dbstate = None
    _dynamic_fields = None

    custom_fields = property(lambda self: self._get_dynamic_fields())

    def _diff_dynamic_fields(self):
        diff = {}
        for k, v in self._dynamic_field_dbstate.iteritems():
            if self._dynamic_fields[k] != v:
                diff[k] = self._dynamic_fields[k]
        return diff

    def sync_dynamic_fields(self, force_refresh=False):
        # Cache field names on the object; it's unlikely that we'll have a 
        # change in schema between calls.
        # TODO: For performance, these should probably be cached somewhere with 
        # a wider scope.
        if force_refresh or self._dynamic_field_columns is None:
            db_table_columns = self._get_column_names_from_db()
            model_columns = self._get_field_column_names()

            # Dynamic columns are whatever the table has that the model does not.
            self._dynamic_field_columns = list(
                set(db_table_columns) - set(model_columns))

        if not self._dynamic_field_columns:
            # No user-added columns on this table; avoid issuing an empty SELECT.
            self._dynamic_field_dbstate = {}
            self._dynamic_fields = {}
            return

        column_name_string = ','.join(self._dynamic_field_columns)

        # PK should always be an Auto or Integer field currently, unless this is
        # expanded to tables other than:
        # Contacts, Accounts, Opportunities
        query_string = """SELECT %s FROM %s WHERE %s = %s""" % (
            column_name_string, self._meta.db_table,
            self._meta.pk.column, str(self.pk))

        results = list(query(query_string, using=self._state.db))
        self._dynamic_field_dbstate = results[0] if results else {}
        self._dynamic_fields = copy.copy(self._dynamic_field_dbstate)

    def _get_dynamic_fields(self):
        if self._dynamic_fields is None:
            self.sync_dynamic_fields()
        return self._dynamic_fields

    def get_field_value(self, field_name):
        value = getattr(self, field_name, None)
        # Check "is None" rather than falsiness, so legitimate values like 0
        # or '' on the base model aren't shadowed by a dynamic field.
        if value is None:
            value = self.custom_fields.get(field_name, None)
        return value

    def get_attribute_by_column(self, column_name, default=None):
        field_name = self.get_field_name_by_db_column(column_name)
        if field_name:
            return getattr(self, field_name, default)
        return self.custom_fields.get(column_name, default)

    def save(self, *args, **kwargs):
        # We need to save the dynamic fields to the database after the base
        # fields have been saved, but only if needed
        super(DynamicSchemaModel, self).save(*args, **kwargs)

        if not (self._dynamic_fields and self._dynamic_field_dbstate):
            return

        # XOR the dicts to get the modified things
        modified_fields = set(self._dynamic_field_dbstate.items()) ^ set(self._dynamic_fields.items())

        if len(modified_fields) > 0:
            # we have modified dynamic fields, so we need to save them
            update_query = """UPDATE %s SET %s WHERE %s = %s"""

            dict_diff = self._diff_dynamic_fields()
            assignments = []
            for k, v in dict_diff.iteritems():
                try:
                    # float() raises ValueError for non-numeric values, so
                    # numbers are written unquoted and everything else quoted.
                    float(v)
                    assignments.append("%s = %s" % (k, str(v)))
                except (ValueError, TypeError):
                    assignments.append("%s = '%s'" % (k, str(v)))

            assignments_string = ','.join(assignments)

            update_query = update_query % (
                self._meta.db_table, assignments_string,
                self._meta.pk.column, str(self.pk))

            nonquery(update_query, using=self._state.db)

            self._dynamic_field_dbstate = copy.copy(self._dynamic_fields)

    class Meta(SfBaseModel.Meta):
        abstract = True
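
The change detection inside save() can be exercised on its own: the symmetric difference ("XOR") of the two dicts' item sets is non-empty exactly when something changed, and the diff against the DB snapshot yields the values to write back. With example data only:

```python
db_state = {"custom_budget": 1000, "custom_region": "SE"}
current = {"custom_budget": 2500, "custom_region": "SE"}

# Unchanged (key, value) pairs cancel out; a modified key contributes its
# old pair from one dict and its new pair from the other.
modified = set(db_state.items()) ^ set(current.items())

# The values to persist come from diffing against the DB snapshot.
diff = {k: current[k] for k in db_state if current[k] != db_state[k]}
print(diff)  # -> {'custom_budget': 2500}
```

One caveat of the item-set approach: it only works while the dynamic values stay hashable, which holds here since they come straight from database columns.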

I know I promised to get into the custom authentication backend in part 2, but I think that the above wall of code is good enough. Next time!