Data Management Forum  -  PO Box 20303, Greeley Square Station, New York, N.Y. 10001-0007


by Krish Krishnan Content type: Downloads
Category: Big Data Document

One of the richest and deepest sources of data is the world of telco and wireless. There is data from switches, towers, call centers, billing systems, devices (search, applications and other data), customer data, service data, contracts, inter-carrier relationship and revenue data, customer relationship management systems, sales databases, website activity data and search appliance datasets. The biggest issue that telcos, wireless carriers, service providers and licensed operators face is limited access to the transactional data collected across these different systems, with no ability to access data at the tower, switch, call center and other sources such as social media.

Content type: Images
Category: People and Portraits

Nicole Bradley

by Joab Jackson,Government Computer News Content type: Downloads
Category: Metadata Design Document

Michael Daconta, the Homeland Security Department’s metadata program manager, served as the lead architect for the DRM. The working group submitted the revised draft to the CIO Council in October, which reviewed the document and submitted it to OMB a month later.
Reference: http://appserv.gcn.com/cgi-bin/udt/im.display.printable?client.id=gcndaily2&story.id=37824

by Philip Russom Content type: Downloads
Category: Data Warehouse Document

Master data consists of facts that define a business entity, facts that may be used to model one or more definitions or views of an entity. Entity definitions based on master data provide business consistency and data integrity when multiple IT systems across an organization (or beyond) identify the same entity differently.
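A minimal sketch of the idea above — one "golden" master record reconciling the different local keys that two systems use for the same entity. The system names, keys, and field names here are invented for illustration, not taken from the paper:

```python
# Illustrative only: two systems identify the same customer differently.
crm_record = {"cust_id": "C-1001", "name": "Acme Corp."}
billing_record = {"acct_no": "881-42", "name": "ACME Corporation"}

# Master data: one golden record plus a crosswalk of system-local keys.
master = {
    "master_id": "M-001",
    "legal_name": "Acme Corporation",
    "keys": {"crm": "C-1001", "billing": "881-42"},
}

def resolve(system, local_key, masters):
    """Return the master record that owns a system-local key, if any."""
    for m in masters:
        if m["keys"].get(system) == local_key:
            return m
    return None

# Both local identifiers resolve to the same business entity.
assert resolve("crm", "C-1001", [master])["master_id"] == "M-001"
assert resolve("billing", "881-42", [master])["master_id"] == "M-001"
```

The point of the crosswalk is exactly the consistency the abstract describes: however a local system names the entity, every consumer sees one agreed definition.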

by Neil Raden Content type: Downloads
Category: Data Warehouse Document

ROI: Cost/Benefit Justification for the Data Warehouse (PowerPoint Slides)
Some topics include:
The Elusive ROI for IT
Is a data warehouse an infrastructure investment?
Basics: Present Value (PV)
A Simple Example
Break-Even Time
Break-Even Analysis
Let’s Start With Benefits
Are The Usual Promises Reasonable?
IT Investment Appraisal Method
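The present-value and break-even ideas listed above can be sketched in a few lines. The figures below (build cost, annual benefit, discount rate) are invented for illustration and are not taken from the slides:

```python
def present_value(cash_flows, rate):
    """Discount a list of yearly cash flows (years 1..n) back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

initial_cost = 500_000           # assumed year-0 data warehouse build cost
yearly_benefit = [200_000] * 5   # assumed annual benefit over 5 years
rate = 0.10                      # assumed discount rate

# Net present value: discounted benefits minus up-front cost.
npv = present_value(yearly_benefit, rate) - initial_cost

# Break-even time: first year where cumulative discounted benefit
# covers the initial cost.
cumulative, break_even_year = 0.0, None
for year, cf in enumerate(yearly_benefit, start=1):
    cumulative += cf / (1 + rate) ** year
    if cumulative >= initial_cost and break_even_year is None:
        break_even_year = year
```

With these assumed numbers the project has a positive NPV and breaks even in year 4 — the kind of "simple example" and break-even analysis the slide topics refer to.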


by Services and Components Based Architectures Committee Content type: Downloads
Category: Documents

This document serves as an “Executive Strategy” for planning and implementing modern information technology (IT) architectures within the Federal Government. The specific architecture it describes, Services and Components Based Architecture (SCBA), leverages the Federal Enterprise Architecture (FEA) and builds upon the concepts, principles, and benefits of Service Oriented Architecture (SOA) – an architecture designed to maximize the reuse of components and services, and one of the most promising and widely accepted architectural approaches to date. SCBA represents a practical, results-oriented approach to modernizing enterprises. It is intended to help organizations reduce long-term costs, improve quality of service, improve information sharing, and achieve a vision of flexible business processes supported by customer-focused applications that can be altered in a matter of days instead of months.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

In the beginning were simple applications. They were shaped around requirements, which were used to determine the content, structure, and processing of an application. Soon there were many applications. With the large number of applications came a great deal of discomfort. Simply building applications led to many problems, such as:
• The inability to look at data across the enterprise. Each application had its own version of reality. The problem was that no two applications had the same notion of what was reality, so there were as many interpretations as there were applications. Management’s ability to make decisions was greatly impaired by the inaccurate and incomplete definitions of information that appeared in the applications environment.
• The inability to make changes gracefully. Once a change was needed, it had to be made in many places, which required coordination of efforts. Simply stated, once many applications appeared, there was no way the information of the organization could be changed gracefully.
• The inability to make simple accesses of data. Once data became locked up in the applications environment, it was “in jail.” The application structure of the data optimized the storage of data and the ability to do speedy online transaction processing. There was no thought given to the need for access of data. (Inmon)

by Robert Seiner Content type: Downloads
Category: Documents

To effectively manage your company’s information assets – data, content & knowledge – it is important to identify and record information about the individuals that are accountable for the quality of these assets. Accountability, especially formal (and communicated) accountability, has been known to drive behavior. Accountability for what … you may ask. In the field of information asset management … Accountable for defining and making certain that information assets are defined according to the way the business intends to use them; Accountable for making certain information assets are re-used rather than re-created; Accountable for making certain the data assets are of high quality both in terms of design and accuracy. And the list of accountabilities can go on …

by Ivar Jacobson Content type: Downloads
Category: Object Design Document

Agenda
• Gaps in enterprise IT
• Realities in enterprise IT
• Need for enterprise architecture
• Ingredients for a good enterprise architecture
• Our approach towards enterprise architecture
• Applying our approach in practice
• Turn enterprise architecture to reality

by Neil Raden Content type: Downloads
Category: Metadata Design Document

Semantics for the People
The Emerging Role of Semantic Technology in the Enterprise
Neil Raden
Principal Analyst
Hired Brains Research
March 2006
www.metatomix.com
Sponsored by Metatomix

Do you remember the first time someone told you about the World Wide Web?
As a technologist or someone directly affected by technology, did you wonder how useful this defense department project could be to your business? Do you remember the first time you used the World Wide Web? Did you wander around the small number of sites available and look at old articles and newspaper headlines? Did you imagine that less than 10 years later, you would be using it many times a day to conduct business? Did you predict that its global reach and impact would be so large? It took vision in the early 1990s to understand the pervasive impact the Internet would have. Those who saw the opportunity early reaped large rewards for their organizations.

Content type: Downloads
Category: Metadata Design Document

The Data Reference Model (DRM) is one of the five reference models of the Federal Enterprise Architecture (FEA). The DRM is a framework whose primary purpose is to enable information sharing and reuse across the federal government via the standard description and discovery of common data and the promotion of uniform data management practices. The DRM describes artifacts which can be generated from the data architectures of federal government agencies. The DRM provides a flexible and standards-based approach to accomplish its purpose. The scope of the DRM is broad, as it may be applied within a single agency, within a Community of Interest (COI), or cross-COI.

by James G. Kobielus Content type: Downloads
Category: Big Data Document

In Forrester’s 15-criteria evaluation of enterprise Hadoop solution providers, we found that in the Leaders category, Amazon Web Services led the pack due to its proven, feature-rich Elastic MapReduce subscription service; IBM and EMC Greenplum offer Hadoop solutions within strong EDW portfolios; MapR and Cloudera impress with best-of-breed enterprise-grade distributions; and Hortonworks offers an impressive Hadoop professional services portfolio. Strong Performer Pentaho provides an impressive Hadoop data integration tool. Of the Contenders, DataStax provides a Hadoop platform for real-time, distributed, transactional deployments; Datameer has a user-friendly Hadoop/MapReduce modeling tool; Platform Computing and Zettaset offer best-of-breed Hadoop cluster management tools; and Outerthought has optimized its Hadoop platform for high-volume search and indexing. HStreaming is a Risky Bet with a solution that is strong in real-time Hadoop.

by Jane Griffin Content type: Downloads
Category: Documents

The best place to begin a data quality initiative is with the data that tells you how much money you're making and how much money you're spending - for example, data about your customers, vendors and products. This is your organization's master data. It's also the most valuable nonmonetary asset your organization owns.

by Bill Inmon Content type: Downloads
Category: Data Warehouse Document

For years the IT organization has had a dilemma. The legacy applications developed over the years are unintegrated and require so many resources for maintenance and other care that the IT organization does little more than take care of the legacy environment. No resources are left over for new development or experimentation with other promising technologies. The legacy application environment has become the child that consumes the parent.

by Dr. Richard Mark Soley et al Content type: Downloads
Category: Object Design Document

As organizations, products, customers and technologies continue to change at an increasingly rapid rate, managers have sought overviews that will allow them to understand how everything within their organization fits together. The currently popular term for such an overview is an architecture. Some architectures – a data architecture, for example – provide overviews of a specific part of the overall organization. Increasingly, the term enterprise architecture refers to a set of architectures, which, taken together, provide a complete view of an organization. The Zachman Framework is one popular way of conceptualizing how all of the more specific architectures that an organization might create can be integrated into a comprehensive picture. The Zachman Framework is an analytic model or classification scheme that organizes descriptive representations. It does not describe an implementation process and is independent of specific methodologies.

Content type: Images
Category: Business Images

This is Tom Barrett of Data Advantage Group. Tom has been a vendor participant at our conferences for many years now. Here he is speaking to the attendees at our annual Data Warehouse Conference in Arlington, Virginia on June 16, 2005.

by Bill Inmon Content type: Downloads
Category: Metadata Design Document

There is a simple definition for metadata: “metadata is data about data”. The problem is that this definition is so simplistic and general that it is almost worthless. One of the reasons for the travails of metadata is that there is no theoretical understanding of it; metadata is ghastly complex. Another reason is that other arenas of computing have always had more glamour and more marketplace appeal than metadata. Many corporations see minimal value in investing in a metadata infrastructure. It is only over time, as the information processing environment ages and grows, that the investment value of metadata becomes obvious. No other aspect of information is so important and holds so much promise as metadata. This white paper will address some metadata complexities and will provide a theoretical underpinning for some of the more important aspects of metadata.
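The "data about data" distinction can be made concrete with a small sketch. The table, column names, and ownership details below are invented examples, not from Inmon's paper:

```python
# Illustrative only: the data itself vs. the metadata that describes it.
data = [
    ("2005-06-16", 412),   # (event_date, attendee_count)
    ("2006-06-15", 455),
]

# Metadata: what the columns mean, where they came from, how fresh they are.
metadata = {
    "table": "conference_attendance",
    "columns": [
        {"name": "event_date", "type": "DATE", "source": "registration system"},
        {"name": "attendee_count", "type": "INTEGER", "source": "badge scans"},
    ],
    "owner": "events team",
    "last_refreshed": "2006-07-01",
}

# The data answers "how many attended?"; the metadata answers
# "what does this column mean, and can I trust it?"
column_names = [c["name"] for c in metadata["columns"]]
assert column_names == ["event_date", "attendee_count"]
```

Even this toy case hints at why the simple definition falls short: ownership, lineage, and freshness are all "data about data", yet each behaves differently and needs its own infrastructure.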

by Ivar Jacobson Content type: Downloads
Category: Object Design Document

• Service Oriented Architecture
• Enterprise Service Bus
• Reusable Component Library
• Asset Based Development
• Model Driven Architecture (MDA)
• Enterprise Architecture
• Product Line Engineering
• Controlled Outsourcing
• Legacy Reengineering
• Aspect-Oriented Software Development
• Agile Development
• Active Software
• etc.


by Mike Wolcott Content type: Downloads
Category: Other Document

Ask a dozen tech pundits to describe Web 2.0 and you're likely to get two dozen explanations as to what it is. The precise definition remains open to debate — and in some ways, that's exactly the point. This much is clear: Web 2.0 represents an important shift in the way digital information is created, shared, stored, distributed, and manipulated. In the years ahead, it will have a significant impact on the way businesses use both the Internet and enterprise-level IT applications.
