Data Management Forum  -  PO Box 20303, Greeley Square Station, New York, N.Y. 10001-0007


:: PLATINUM SPONSOR::
Dataguise

:: GOLD SPONSORS::
Commvault
Scientel Information Technology, Inc.

:: MEDIA SPONSORS::
DAMA International
DAMA NCR
Database Trends and Applications
ibmdatamag
Information Management
Inmon Consulting

:: PATRONS::
BI Ready
Embarcadero
EW Solutions
by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

DW2.0 is the next generation of data warehousing. Data warehousing began in the mid-1980s. Since then, there have been many advances in architecture, technology, and information systems. Today those advances have been woven into the next generation of data warehousing. The first generation of data warehousing featured transaction data that was integrated and placed on disk storage. There were other features of early first-generation data warehousing, such as the advent of ETL. But there were also many missing features and functions that, at the time, were not recognized as belonging in a data warehouse.

by Cliff Longman, CTO, Kalido Content type: Downloads
Category: Data Warehouse Document

Data warehouses provide accurate, timely management information for large enterprises. In a business world characterized by frequent, far-reaching change, data warehouses quickly lose their relevance if they do not adapt to this change. When built using traditional software development methodologies, data warehouses are costly, and do not adapt efficiently to change. Data Warehouse Lifecycle Management (DWLM) is a new discipline for the creation and ongoing management of a data warehouse throughout its operational life. DWLM delivers enterprise-scale data warehouses that adapt efficiently to change, at lower cost than traditional software development methodologies. DWLM employs two development methods: Rapid Iteration and Federation. These methods enable organizations to implement an enterprise-wide data warehouse in bite-sized pieces.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

Glossary of Data Warehousing Terms

Compiled by Bill Inmon
President, Inmon Data Systems

by Wayne P. Milano Content type: Downloads
Category: Data Warehouse Document



by Pete Stiglich Content type: Downloads
Category: Data Warehouse Document

Enabling high quality analytics through a Data Validity Dimension
While working on an Enterprise Data Warehouse for a state court system the issue of poor data quality in the source systems became apparent. Referential integrity was not strictly enforced and there was very little in the way of attribute level constraints. One normally expects that these types of constraints be enforced for an OLTP application, whether through the application, in the database, or both. Of course, one should never be surprised when there is poor data quality in the source systems – poor data quality is the norm rather than the exception. According to The Data Warehouse Institute (TDWI) over $600 billion a year is lost due to poor data quality.
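The core idea of a validity dimension can be sketched in a few lines. This is a hypothetical illustration, not Stiglich's design: the row fields (`case_id`, `case_type`), the reference sets, and the dimension keys are all invented. Each incoming fact row is checked against the constraints the source system failed to enforce, and the combination of failures maps to a row in a small validity dimension, so analysts can filter or flag suspect facts.

```python
# Hypothetical data-validity dimension: each tuple of constraint failures
# maps to one dimension key. Key 1 means the row passed every check.
VALIDITY_DIM = {
    (): 1,                                  # fully valid
    ("missing_fk",): 2,                     # referential-integrity failure
    ("bad_attribute",): 3,                  # attribute-level constraint failure
    ("missing_fk", "bad_attribute"): 4,     # both failures
}

KNOWN_CASE_TYPES = {"CIVIL", "CRIMINAL", "FAMILY"}  # assumed reference data

def validity_key(row, known_case_ids):
    """Return the validity-dimension key for one source row."""
    failures = []
    if row.get("case_id") not in known_case_ids:
        failures.append("missing_fk")       # FK target does not exist
    if row.get("case_type") not in KNOWN_CASE_TYPES:
        failures.append("bad_attribute")    # value outside the allowed domain
    return VALIDITY_DIM[tuple(failures)]

rows = [
    {"case_id": 101, "case_type": "CIVIL"},
    {"case_id": 999, "case_type": "CIVIL"},    # unknown case_id
    {"case_id": 101, "case_type": "TRAFFIC"},  # unknown case_type
]
print([validity_key(r, known_case_ids={101, 102}) for r in rows])  # [1, 2, 3]
```

Loading the fact row anyway, with a foreign key into this dimension, preserves the data while making its trustworthiness queryable.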

by Pete Stiglich Content type: Downloads
Category: Data Warehouse Document

Facilitate Customer Integration using Generic Dimensional Modeling Techniques
If you are undertaking a Customer Data Integration (CDI) or Customer Master Data Management (MDM) project as part of a dimensional modeling endeavor, how will you tackle the problem of storing customer addresses?
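One hedged sketch of the generic approach (an assumed structure, not necessarily Stiglich's): instead of fixed BILLING_STREET / SHIPPING_STREET columns on the customer dimension, each address is a separate row tagged with a role, so adding a new address role requires no schema change. All names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CustomerAddress:
    """One role-tagged address row in a generic customer-address structure."""
    customer_key: int
    address_role: str      # e.g. "BILLING", "SHIPPING", "LEGAL" - open-ended
    line1: str
    city: str
    postal_code: str

addresses = [
    CustomerAddress(1, "BILLING", "1 Main St", "Greeley", "80631"),
    CustomerAddress(1, "SHIPPING", "2 Dock Rd", "Greeley", "80631"),
]

def addresses_for(customer_key, role=None):
    """All addresses for a customer, optionally filtered by role."""
    return [a for a in addresses
            if a.customer_key == customer_key
            and (role is None or a.address_role == role)]

print(len(addresses_for(1)))                  # 2
print(addresses_for(1, "BILLING")[0].line1)   # 1 Main St
```

The trade-off is classic generic modeling: flexibility in exchange for joins and role lookups that a fixed-column design would avoid.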

by Philip Russom Content type: Downloads
Category: Data Warehouse Document

Master data consists of facts that define a business entity, facts that may be used to model one or more definitions or views of an entity. Entity definitions based on master data provide business consistency and data integrity when multiple IT systems across an organization (or beyond) identify the same entity differently.
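The consistency problem Russom describes can be made concrete with a tiny cross-reference sketch. This is an illustrative assumption, not his method: the system names, local IDs, and master keys are invented. Each system's local identifier resolves through a master cross-reference to a single golden-record key.

```python
# Hypothetical master cross-reference: (system, local_id) -> master entity key.
# Three systems identify the same customer three different ways.
XREF = {
    ("CRM", "C-1001"): 7,
    ("BILLING", "8837"): 7,
    ("WEB", "u-42"): 7,
    ("CRM", "C-2002"): 9,
}

def master_key(system, local_id):
    """Resolve a system-local identifier to its master entity key, if known."""
    return XREF.get((system, local_id))

# All three local IDs collapse to one master entity:
ids = [("CRM", "C-1001"), ("BILLING", "8837"), ("WEB", "u-42")]
print({master_key(s, i) for s, i in ids})  # {7}
```

Any report that joins through `master_key` instead of the raw local IDs sees one customer where the source systems see three.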

by Neil Raden Content type: Downloads
Category: Data Warehouse Document

ROI: Cost/Benefit Justification for the Data Warehouse (PowerPoint Slides)
Some topics include:
The Elusive ROI for IT
Is a data warehouse an infrastructure investment?
Basics: Present Value (PV)
A Simple Example
Break-Even Time
Break-Even Analysis
Let’s Start With Benefits
Are The Usual Promises Reasonable?
IT Investment Appraisal Method
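Two of the basics listed above - present value and break-even time - can be shown with a short worked example. The figures are illustrative, not taken from Raden's slides.

```python
def present_value(cash_flows, rate):
    """PV of cash flows received at the end of years 1..n, discounted at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def break_even_year(initial_cost, annual_benefit):
    """First year in which cumulative (undiscounted) benefits cover the cost."""
    cumulative, year = 0.0, 0
    while cumulative < initial_cost:
        year += 1
        cumulative += annual_benefit
    return year

# Illustrative: a $1M warehouse returning $400k/year for 5 years at a 10% rate.
pv = present_value([400_000] * 5, rate=0.10)
print(round(pv))                             # 1516315 - PV of the benefit stream
print(break_even_year(1_000_000, 400_000))   # 3 - benefits cover cost in year 3
```

A positive PV net of the initial cost (here roughly $516k) is the simple go/no-go signal the slides' "Basics: Present Value" section refers to.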


by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

In the beginning were simple applications. They were shaped around requirements, which were used to determine the content, structure, and processing of an application. Soon there were many applications. With the large number of applications came a great deal of discomfort. Simply building applications led to many problems, such as:
•The inability to look at data across the enterprise. Each application had its own version of reality. No two applications had the same notion of what reality was, so there were as many interpretations as there were applications. Management's ability to make decisions was greatly impaired by the inaccurate and incomplete definitions of information that appeared in the applications environment.
•The inability to make changes gracefully. Once a change was needed, it had to be made in many places, which required coordination of effort. Simply stated, once many applications appeared, there was no way the organization's information could be changed gracefully.
•The inability to make simple accesses of data. Once data became locked up in the applications environment, it was "in jail." The application structure of the data optimized storage and speedy online transaction processing; no thought was given to the need to access data. (Inmon)

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

For years the IT organization has had a dilemma. The legacy applications developed over the years are unintegrated and require so many resources for maintenance and other care that the IT organization does little more than take care of the legacy environment. No resources are left over for new development or experimentation with other promising technologies. The legacy application environment has become the child that consumes the parent.

by Bill Inmon and Robert H. Terdeman Content type: Downloads
Category: Data Warehouse Document

The Corporate Information Factory consists of many existing process-based systems, each interlinked by information. For the factory to function successfully, each component in the production line must share a consistent and robust infrastructure. This infrastructure has certain requirements, which include, but are not limited to: change resilience, redundancy, non-disruptive upgradeability, and expandability. Few infrastructure providers treat technology as an enterprise-level architectural requirement in support of the Corporate Information Factory. In the next decade, it is mission critical to plan proactively for an enterprise infrastructure in order to retain competitive advantage in the e-commerce age. The companies that do plan for the right infrastructure will prosper.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

This white paper is divided into three sections. Parts I and II describe the Corporate Information Factory (CIF) and several important perspectives on building it. Part III describes a high-level methodology covering the different activities of CIF development and the order in which they are to be executed.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

The Zachman framework specifies that many different aspects of an architecture be identified and described. In particular, it requires that systems and components of the information systems architecture be identified and positioned. The Government Information Factory fulfills all of these requirements. Furthermore, it takes the components that have been specified and describes them in greater detail. Specifically, the GIF identifies and positions a government's information needs in terms of:
• Definition of scope,
• Identification of the enterprise model,
• Articulation of needed systems,
• The technology underpinning the architecture,
• Needed architectural components,
• The functioning systems of the architecture.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

The GIF (Government Information Factory) is Inmon Associates' information systems blueprint for government agencies. The blueprint is for federal, state, and local agencies, and takes into consideration the needs for:
• Operational processing,
• Informational processing,
• Multidimensional processing,
• Managing very large amounts of data,
• Being responsive to changing and unknown conditions,
• High availability,
• Good response time for transactions,
• Multidimensional reporting,
• Data mining and exploration, and so forth.
In addition, the GIF takes into account the need for:
• Interagency passage of data,
• Integrated electronic security,
• Predictive security (the ability to use data to anticipate threats before they occur),
• Reconciliation of data,
• Addressing the challenges of stovepipe systems.
When it comes to systems modernization, the GIF is the premier blueprint for government agencies.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

The first introduction of SAP to most corporations is as an ERP vendor. SAP's ERP products address the challenge of legacy applications. ERP technology allows organizations to get a handle on the integration and modernization of daily transactions that run the operations of the business. This paper examines how SAP fits into the CIF model.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

From an age of applications, and the confusion over application-based information in the corporation, arose the concept of a data architecture and data warehousing. Into the miasma came Bill Inmon's best-selling book, Building the Data Warehouse. And there was Kimball's software company, Red Brick Systems. And soon the world of data warehousing was born. It was the late 1980s, and the world was about to witness the rise of analytical processing, business intelligence, and a whole host of technologies never before seen that would change the world forever.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

The data warehouse environment - like all other computer environments - requires hardware resources. Given the volume of data and the type of processing that goes against the data, the data warehouse is capable of consuming large amounts of resources. For organizations that want to be in a proactive stance - where hardware resource utilization is not a surprise and the response time of a system is anticipated before the system is built - capacity planning for the data warehouse environment is a very important exercise. Several aspects of the data warehouse environment make capacity planning for it a unique exercise. The first factor is that the data warehouse workload is highly variable. In many ways, anticipating the DSS workload requires imagination. Unlike the operational workload, which has an air of regularity to it, the data warehouse DSS workload is much less predictable. This factor, in and of itself, makes capacity planning for the data warehouse a chancy exercise.
