Rodger Nixon
People and Portraits    8/24/2006 6:42:33 PM

Currently Vice President, Data Architecture at Credit Suisse in New York, Rodger is an expert Data Architect with more than three decades of experience in all phases of the software development life cycle.  Over the years he has played lead roles in the support of Data and Information Architecture development, Data Modeling, Database Design and Data Management.  He has gained particular expertise in developing very large-scale Data Models, including a number of Enterprise Data Models.

Originally from New Zealand, Rodger came to the United States twenty years ago to market an innovative software product, EXSYS, developed by a company he founded.  The product, regularly referred to in the press of the time as a “fifth generation language”, used natural language and expert system technology to perform the analysis and design tasks that would normally be carried out by a human analyst.  Rodger received a patent for this idea, one of several software patents he currently holds.  Following the sale of the company, Rodger worked extensively throughout the world as a consultant specializing in data architecture and database design.

In his present role he has been responsible for developing a broad-based data architecture program for the continuous improvement of data management and database design practices.  Rather than new applications, the program’s emphasis is a structured approach to the ongoing improvement and rationalization of some 4,000 legacy databases.  For new applications, the emphasis is on developers using a metadata-repository-based canonical data model to obtain database design patterns that enhance productivity and design quality.  From Rodger’s point of view, one strategy involves making it right, the other getting it right the first time.

Synopsis: Using Data Patterns to Improve Project Delivery


The talk describes a process to improve the speed and quality of database design and development.

The process is based on the iterative development and use of a Canonical Data Model (CDM), supported by the Unicorn Metadata Repository.

Rodger Nixon has developed a large, fully attributed, fully normalized, approved model of the data that supports major business processes at Credit Suisse.  The model takes the form of an ERwin data model and provides complete coverage of both logical and physical data structures; it is a template for best practice.  The idea is to kick-start projects by providing reusable structures of high quality.  The purpose of the CDM is practical; serving as an EDM-like definition of the business is only a secondary goal.  Though the term “patterns” is useful for conveying the concept of reuse, the model is a single, fully integrated object rather than a series of disconnected ideas.
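To make the idea of a reusable, fully attributed structure concrete, the sketch below shows what a single pattern covering both logical and physical form might look like.  This is a minimal illustration only: the “Party” entity, its attributes and the DDL are invented for the example, not taken from the Credit Suisse model.

    # Hypothetical "Party" pattern: one canonical structure carrying
    # both its logical definition and a standards-compliant physical
    # form. All names and columns are invented for illustration.

    LOGICAL_ENTITY = {
        "name": "Party",
        "definition": "A person or organization of interest to the business.",
        "attributes": [
            {"name": "Party Identifier", "domain": "identifier", "primary_key": True},
            {"name": "Party Name", "domain": "name", "primary_key": False},
            {"name": "Party Type Code", "domain": "code", "primary_key": False},
        ],
    }

    # The matching physical structure a project could generate directly,
    # making its schema compliant with standards from day one.
    PHYSICAL_DDL = """\
    CREATE TABLE party (
        party_id      BIGINT       NOT NULL,
        party_name    VARCHAR(255) NOT NULL,
        party_type_cd CHAR(4)      NOT NULL,
        CONSTRAINT pk_party PRIMARY KEY (party_id)
    );
    """

    if __name__ == "__main__":
        print(LOGICAL_ENTITY["name"])
        print(PHYSICAL_DDL)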


Through an “adapter” the CDM is regularly loaded into the Unicorn Metadata Repository, where it forms the basis of the ontology.  Think of the ontology as a very sophisticated, highly structured data dictionary.  Unicorn provides facilities that let the content of the repository be easily searched and examined, and a custom interface that makes it easy to compile a list of the data falling within the scope of any project.  Once all the potentially useful data has been identified, it is extracted from the CDM as a base Project Data Model that will typically contain over half the data a project needs.  That data model is fully compliant with standards and suitable for schema generation from day one.  The project spends the rest of its time enhancing the model and making it project specific.  At project end, the content of the project model is examined; additions and corrections are fed into the CDM, from there into a new release of the ontology, and the cycle repeats.
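The scope-and-extract step lends itself to a short sketch.  The Python below is a minimal illustration under stated assumptions: the flat in-memory representation of the CDM, the entity names and the extraction rule are all invented for the example, since the real process operates on an ERwin model through the Unicorn repository.

    from dataclasses import dataclass, field

    @dataclass
    class Entity:
        name: str
        attributes: list[str] = field(default_factory=list)

    @dataclass
    class CanonicalDataModel:
        entities: dict[str, Entity] = field(default_factory=dict)
        relationships: set[tuple[str, str]] = field(default_factory=set)  # (parent, child)

        def extract_project_model(self, in_scope: set[str]) -> "CanonicalDataModel":
            """Return the subset of the CDM covering the entities a project
            selected during scoping, keeping only the relationships whose
            ends both fall inside the scope."""
            return CanonicalDataModel(
                entities={n: e for n, e in self.entities.items() if n in in_scope},
                relationships={(p, c) for (p, c) in self.relationships
                               if p in in_scope and c in in_scope},
            )

    # Example: a project scopes three entities and receives a base
    # Project Data Model it can refine for the rest of its life cycle.
    cdm = CanonicalDataModel(
        entities={
            "Party": Entity("Party", ["party_id", "party_name"]),
            "Account": Entity("Account", ["account_id", "party_id", "opened_on"]),
            "Trade": Entity("Trade", ["trade_id", "account_id", "trade_date"]),
            "Instrument": Entity("Instrument", ["instrument_id", "symbol"]),
        },
        relationships={("Party", "Account"), ("Account", "Trade"),
                       ("Instrument", "Trade")},
    )
    project_model = cdm.extract_project_model({"Party", "Account", "Trade"})
    print(sorted(project_model.entities))       # ['Account', 'Party', 'Trade']
    print(sorted(project_model.relationships))  # [('Account', 'Trade'), ('Party', 'Account')]

At project end the enriched project model would travel the other way: its additions and corrections are merged back into the CDM before the next ontology release.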