Database Migrations

SQLAlchemy Migrate 0.5 was released last week, and it got me thinking about how important database migration tooling is.

Evolutionary database design and schema refactoring are critical to most projects these days, and pretty much everywhere I’ve worked over the past 10 years, we’ve handled database upgrades via a set of SQL upgrade and downgrade scripts, a schema version number stored in the database, and an upgrader/downgrader Python thingamabob.
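In case that pattern isn’t familiar, here’s a rough sketch of the shape such a thingamabob tends to take: a version table in the database plus an ordered set of upgrade statements. Everything below (table names, statements, file name) is invented for illustration, not taken from any real project.

    # A minimal home-grown upgrader sketch, not a real project's scripts.
    import sqlite3

    UPGRADES = {
        1: "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
        2: "ALTER TABLE users ADD COLUMN email TEXT",
    }

    def current_version(conn):
        # The schema version lives in the database itself.
        conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
        row = conn.execute("SELECT version FROM schema_version").fetchone()
        return row[0] if row else 0

    def upgrade(conn, target):
        # Apply every script between the stored version and the target,
        # recording the new version as we go.
        for v in range(current_version(conn) + 1, target + 1):
            conn.execute(UPGRADES[v])
            conn.execute("DELETE FROM schema_version")
            conn.execute("INSERT INTO schema_version (version) VALUES (?)", (v,))
        conn.commit()

    if __name__ == "__main__":
        upgrade(sqlite3.connect("app.db"), target=max(UPGRADES))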

A few years ago I did a couple projects in Rails, and Rails Migrations made this easier, by providing a standard API. There were limitations, and worries about lost data, and sometimes it wasn’t clear what SQL would be generated — but it was good enough, and worked well enough, that everybody used it.

But what I don’t like about ActiveRecord, and other ORMs like it, is that every object-property change ultimately requires an underlying database change. On large projects I really want the flexibility of a data mapper to insulate some of my OO code and OO design decisions from the need for database-level changes. I think this is incredibly important on very complex projects where the “data model” needs to change quite frequently but the underlying database schema needs to change less often. SQLAlchemy, on the other hand, provides a clear, simple, elegant solution to that problem: it decouples the Objects from the Relations with an explicit Mapper.
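To make that concrete, here’s a rough sketch of what an explicit Mapper buys you, written in the classical sqlalchemy.orm.mapper style; the table, class, and attribute names are made up for the example, so treat it as an illustration rather than a recipe.

    from sqlalchemy import Table, Column, Integer, String, MetaData
    from sqlalchemy.orm import mapper

    metadata = MetaData()

    # The relational side: a table designed on its own terms.
    users_table = Table(
        'users', metadata,
        Column('id', Integer, primary_key=True),
        Column('name', String(50)),
        Column('home_street', String(100)),
    )

    # The OO side: a plain class, free to evolve on its own.
    class User(object):
        def __init__(self, name=None, street=None):
            self.name = name
            self.street = street

    # The explicit mapper ties the two together. Renaming or regrouping
    # object properties is a mapper change, not an ALTER TABLE.
    mapper(User, users_table, properties={
        'street': users_table.c.home_street,  # expose the column under a new name
    })

If I later decide that User.street should come from somewhere else, only the mapper call changes; the users table stays put.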

But even with the flexibility of a data-mapper-based ORM, you still need to update your database schema sometimes. And that’s where SQLAlchemy-Migrate comes in. I think it, or something like it, is an indispensable tool in the SA user’s toolkit.
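For anyone who hasn’t used it, a change script in SQLAlchemy-Migrate is just a Python module with an upgrade and a downgrade function that you fill in; the versioning machinery runs them in order and records the current version in the database. Here’s a rough sketch of adding a column: the table and column names are invented, and the exact calling convention has shifted between migrate releases (newer ones pass the engine in explicitly), so check the docs for the version you’re on.

    from sqlalchemy import Table, Column, String, MetaData
    from migrate import *   # the standard imports from the migrate change-script template

    meta = MetaData()

    def upgrade(migrate_engine):
        # Reflect the existing table and add the new column to it.
        meta.bind = migrate_engine
        users = Table('users', meta, autoload=True)
        Column('nickname', String(40)).create(users)

    def downgrade(migrate_engine):
        # Reverse the upgrade: drop the column again.
        meta.bind = migrate_engine
        users = Table('users', meta, autoload=True)
        users.c.nickname.drop()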

TG2 ships with both SQLAlchemy and SQLAlchemy-Migrate, and provides a little bit of help to make migration development even easier.

There’s more we can do to integrate migrations into TG2, mainly by documenting them better and setting them up a bit more in quickstart. I know there’s a trade-off there too: for a lot of smaller applications, migrations are too much overhead at the beginning, and you can more easily wipe and recreate your database as you make changes. We should support that way of working, so nothing we do should make it harder.

With that said, I’ll know we have arrived at the right place when we’ve made evolutionary database development and schema refactoring easy enough that it feels natural, and not doing it feels uncomfortable.

3 Responses to “Database Migrations”


  1. I am using it, and it is a pain. The migration tool should really save the SQLAlchemy state when I say there is an update, and automatically compute a transition from the old schema to the new one, at least for the simple cases of adding and removing tables and columns, and for changes to column types where automatic conversion is possible (such as varchar(20) -> varchar(40)). It is nontrivial to write correct migration scripts for such things, especially in a way that will support both SQLite and MySQL, for example.

  2. Bob

    Mark,

    Would you say more about SQLAlchemy’s ability to allow the data model to change while keeping the underlying db schema stable? If more properties are added to the data model, where would the additional data go if not into the db?

    Thanks!

  3. Well, if you are adding new properties, you need to reflect those somewhere in the schema. But if you just want a particular address (home rather than business) to appear directly on every User object now, you can just update your mappers. So what I’m talking about is rearranging object properties without having to rearrange your tables.

    It’s all about the ability to decouple the OO design from the DB design. Sure, you may have to change both sometimes, but not every time. And you can move some stuff around on either side more easily when they aren’t 100% coupled.
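    For instance, here’s a rough sketch of the kind of remapping I mean, in the classical mapper style and with made-up table and attribute names, so take it as an illustration rather than a recipe: map User against a join of users and addresses, and the street shows up as a plain attribute on User with no schema change at all.

        from sqlalchemy import Table, Column, Integer, String, ForeignKey, MetaData, join
        from sqlalchemy.orm import mapper

        metadata = MetaData()

        users = Table('users', metadata,
            Column('id', Integer, primary_key=True),
            Column('name', String(50)),
        )

        addresses = Table('addresses', metadata,
            Column('id', Integer, primary_key=True),
            Column('user_id', Integer, ForeignKey('users.id')),
            Column('street', String(100)),
        )

        class User(object):
            pass

        # Map User against the join of the two tables, so street is just
        # another attribute on User. Same schema, different object shape.
        users_addresses = join(users, addresses, users.c.id == addresses.c.user_id)
        mapper(User, users_addresses, properties={
            'id': [users.c.id, addresses.c.user_id],  # the equated join columns
            'address_id': addresses.c.id,             # avoid the 'id' name clash
            'home_street': addresses.c.street,
        })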
