Superscript

…back to Change Management.

In my earlier post, I talked about the strong change management ethic at my current company.  It was interesting to see how my predecessor here adapted database schema evolution to that ethic.

I should add here that I find “ancient” software development organizational ideas like “chief programmer” or “surgical” teams fascinating, in particular because each model describes “super” programmers who end up being responsible for 80-90% of the code anyway through sheer talent and dominant personality.  Almost everywhere I’ve gone has had a local expert who seemed to qualify for that kind of title.  The relevance of ideas from Mills and Brooks is commented on here.

The biggest thing I’ve taken away from those ideas of “super” programmers is that, as a manager, I need to help every member of my staff become as close to “super” as possible.  That means removing as many non-productive, non-programming, non-learning tasks as possible.

Another way to put this is to find ways to reduce the effort required to do non-programming tasks.

This is where the DDL generation scripts came into the picture.

When I arrived, every programmer hand-coded the scripts that were checked into the source code control system.  Everyone had to remember to include the source code control tags, script names weren’t obvious, and it wasn’t always clear that every changed object and procedure was being captured with each release.

To address those issues, I:

  1. Required that all code checked into the source code control system be generated via the DDL generation scripts.  We modified the scripts to include the source code control tags automatically, to add primary key indexes, grants and an initial stats analysis to table creation scripts, and to inject source code control tags into the stored procedure and package code so that they could be seen when browsing the database.
  2. Modified the scripts to include the object type, schema name and object name in the generated file name.  So, a script that created the SCOTT.EMP table ended up being called mk-tbl-scott-emp.sql (we used mk for create, rm for drop and ch for alter).  There’s a sketch of such a generator after this list.
  3. Turned on DDL auditing for all objects in the development database.  This allowed us to double-check and make sure we captured and verified which objects were changed leading up to a release.
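
To make that concrete, here’s a minimal sketch of the kind of generator I mean (not our actual scripts): a SQL*Plus script using DBMS_METADATA, with SCOTT.EMP hard-coded and a generic $Header$ tag standing in for whatever tag your source control system expands.

    SET LONG 100000
    SET PAGESIZE 0
    SET LINESIZE 200
    SET TRIMSPOOL ON
    SET FEEDBACK OFF
    SET VERIFY OFF

    REM Build the file name from the convention: mk-tbl-<schema>-<object>.sql
    COLUMN fname NEW_VALUE fname NOPRINT
    SELECT 'mk-tbl-' || LOWER('SCOTT') || '-' || LOWER('EMP') || '.sql' fname
      FROM dual;

    SPOOL &fname
    PROMPT REM $Header$
    SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMP', 'SCOTT') FROM dual;
    REM (our real scripts also appended the grants, the primary key index
    REM  and an initial stats call to the same file)
    SPOOL OFF

The shape is the point: the database generates the DDL, and the script owns the naming convention and the tags, so nobody has to remember either.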

A note on DDL auditing — I try to have this turned on in every database I work with.  Development, Test, QA, Production, you name it.  I think it’s always useful to have a log of structure changes to the database — and why code up a special trigger when the database will track it automatically?  The only objection I’ve heard to this is space — and the space required is ridiculously small.  Sometimes people complain about performance — and I say that if you have a system that relies on 1000s of DDLs per day, you’ve got a system that needs work.  Who does DDL as part of an OLTP transaction?
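
For reference, turning it on with Oracle’s traditional statement auditing looks roughly like this (assuming the audit_trail initialization parameter is set to DB; the statement options shown are just examples, not the full list we used):

    REM TABLE audits CREATE TABLE, DROP TABLE and TRUNCATE TABLE
    AUDIT TABLE;
    AUDIT ALTER TABLE;
    REM PROCEDURE audits creation and dropping of procedures, functions and packages
    AUDIT PROCEDURE;

    REM Leading up to a release, review what actually changed:
    SELECT timestamp, username, owner, obj_name, action_name
      FROM dba_audit_trail
     ORDER BY timestamp;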

Our biggest remaining challenge with this approach was how to create the “master” installation “super” scripts.  I’ll cover that in a later post…

(One other concern is the sheer number of files this approach can generate.  If Oracle did this for the SYS schema, you’d have over 4,000 individual files.  On the other hand, it works well when you only need to re-create a single object without searching through 4-5 cat scripts to find the object definition.  Other systems use object definition files that are then read by specialized programs to generate installation scripts — that seems like a lot of work for an internal corporate application.)
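
If you’re curious what that file count would look like for a given schema, a query along these lines sketches it out (the type abbreviations other than tbl are hypothetical extensions of our convention):

    SELECT 'mk-'
           || DECODE(object_type,
                     'TABLE',   'tbl',
                     'INDEX',   'idx',
                     'PACKAGE', 'pkg',
                     LOWER(object_type))
           || '-' || LOWER(owner) || '-' || LOWER(object_name) || '.sql' script_name
      FROM dba_objects
     WHERE owner = 'SYS'
       AND object_type IN ('TABLE', 'INDEX', 'PROCEDURE', 'PACKAGE')
     ORDER BY object_type, object_name;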

3 Responses to “Superscript”

  1. Hilarious Diff : Oracle Musings Says:

    [...] In my last post, I talked about how I like to remove non-technical work from my staff by giving them tools to do “mundane” things like check code into source code control automatically and how our DDL generation scripts do most of the work. [...]

  2. John Hawksworth Says:

    I too have just gotten pig sick of clever build scripts, and am coding up (in sql plus naturally) a bunch of scripts to extract all the necessary DDL to build a new schema, from an extant db.

    I too put a DBMS_OUTPUT statement in each file, so it precisely logs its activity. Like you, each file “does what it says on the tin”. A file to add table ‘Fred’ to schema ‘test’ will be called ‘CREATE_FRED_IN_TEST.SQL’.

    We’ve just changed from VSS to SourceOffSite (SOS) for our source control. What do you use? Like you, I end up with lots of files, but fault finding and de-bugging is a piece of cake. Since each file only contains a few lines, even a junior can easily fix it.

    I’d be very interested to hear how you progress with this method, as independently of you, I have adopted roughly the same methods, lots of small files, generated by DBMS_METADATA and sqlplus.

    With the upgrade issue, I too wonder about extracting a couple of thousand files, just to alter one or two. Any thoughts on this yet?

    Regards

    J

  3. ddelmoli Says:

    We used a product from McCabe Software called TrueChange — I’ve never seen it anywhere else, but it did the job just fine.

    Our packaging method was simply a tarball of the source directories which was untarred on the target server.

    I suppose you could simply untar the specific upgrade directory — that would reduce the footprint of the delivery.
