Compiled and edited by Tony Shaw, Program Chair, Wilshire Conferences, Inc


PANEL: Is Data Modeling Alive and Well?

Davida Berger

Graham Witt

Terry Quatrani

Terry Halpin

Alec Sharp

We live in a world of ERP, packages, short-term solutions, tactical projects and outsourcing. Some organizations have replaced corporate data architecture areas with project teams that include database administrators but may not include data modelers. Others develop only physical models and skip development of the logical model.

Nonetheless in many organizations, data modeling is indeed alive and well. Some positive moves in these organizations (that others would do well to emulate) include:

  • Moving up the food chain – getting out of tool-driven detail and doing more high-value conceptual modeling

  • Adapting to the new reality – reverse-engineering conceptual models out of existing systems to highlight business impact

  • Passing our selfish gene to new hosts – helping every developer want to become a competent data modeler

  • Acting local while thinking global – giving up on producing the perfect, enterprise-wide model

  • Moving beyond being a speed bump – helping out in any possible way as a means of building up credibility

Enterprise Common Data Architecture - Roadmap to Integration

Daniel Paolini

Director, Data Management Services

State of NJ Office of Information Technology

An Enterprise Common Data Architecture (ECDA) is a collection of related tools and technologies, along with standards and policies and the methodology and the expertise to employ them. The architecture enables real-time operational integration as well as the delivery of integrated analytical data to different communities in the format that each requires. The creation of an ECDA is a major yet essential commitment to any long-term strategic initiative to support data reusability. This architecture forms the foundation for collecting, storing, managing and controlling privacy of and access to data on an enterprise basis. The presentation discussed:

  • The problems and challenges of a legacy environment

  • The opposing needs of operational and analytical systems

  • The process to create integrated operational systems

  • The rational creation of analytical solutions

  • How to get IT staff, executive management and business users to see the big picture

Can the Warehousing Center of Excellence implement a Meta Data Repository?

Lowell Fryman


Metrics Consulting Corp

This session explored the real-world implementation of an enterprise meta data repository by the Center of Excellence (CoE) at an international automobile-manufacturing firm. Many companies are implementing a model where the CoE is responsible for pattern definition, tools, practices, and guidance to application teams. Attendees were presented with a case study of how this firm resolved the issue.

Working with the Dublin Core Metadata Standard

David White

Lead Architect, Metadata

BellSouth Corporation

The Dublin Core is fast becoming the standard metadata representation for the semantic meaning of web-based documents. This presentation introduced the basics of the Dublin Core initiative and how organizations can implement the standard.

  • Dublin Core Schema, Elements, Qualifiers, Expansion

  • Dublin Core within HTML (Metatag)

  • Dublin Core within XML (RDF)

  • Dublin Core Syntax

  • Recommended starting points for working with the standard
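
As a rough sketch of the HTML (meta tag) bullet above, a few Dublin Core elements can be rendered as meta tags in a document head. This is a minimal illustration; the element names (DC.title, DC.creator, and so on) follow the standard's 1.1 element set, while the sample document values are invented:

```python
# Sketch: emit Dublin Core elements as HTML <meta> tags.
# The DC element names (title, creator, date, subject) come from the
# Dublin Core 1.1 element set; the sample values are invented.

def dublin_core_meta(elements):
    """Render a dict of Dublin Core elements as HTML meta tags."""
    lines = ['<link rel="schema.DC" href="http://purl.org/dc/elements/1.1/">']
    for name, value in elements.items():
        lines.append(f'<meta name="DC.{name}" content="{value}">')
    return "\n".join(lines)

doc = {
    "title": "Working with the Dublin Core Metadata Standard",
    "creator": "David White",
    "date": "2004-05-05",
    "subject": "metadata",
}
print(dublin_core_meta(doc))
```

The same elements can equally be serialized as RDF/XML, which is the route the presentation's XML bullet refers to.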

Metadata and IT Service Management

Charlie Betz

Technical Consultant

Best Buy Corporation

What is configuration management? What is the difference between incident and problem management? What is the purpose of a capacity planning capability? These IT process areas have increasingly formalized definitions, based on the emerging general area of IT Service Management and its flagship standard, the UK-based Information Technology Infrastructure Library (ITIL). ITIL has been sweeping US Fortune 500 corporations over the past 2 years and, while it never mentions the word “metadata,” calls for something termed a “Configuration Management Database” (CMDB). What is the relationship between this and the concept of a metadata repository? There are significant overlaps and opportunities for both synergy and disharmony.

This presentation discussed the ITIL concepts as they relate to metadata. In particular, the highly generalized data model of the ITIL CMDB concept was critiqued, and richer, more accurate approaches discussed.

Enterprise Information Architecture for Unstructured Data

Michael Reed

Principal Consultant

Enterprise Warehousing Solutions, Inc.

This session discussed the integration of an unstructured data framework within an overall Enterprise Architecture. Efforts currently underway at the Federal Bureau of Investigation were used as an example of building a basic framework for the customer's understanding of unstructured data issues.

Unstructured data in the business, information, data, systems, and computer architectures was covered, with examples of current successes and shortcomings. An overview of security requirements at the document level and their effect on table population was explained. The audience was surprised to learn that surveys show that any government agency building an enterprise-wide taxonomy should expect over one million categories. Ontologies, glossaries, and data dictionaries were also briefly discussed.

Data Privacy and Security: What It Is, Why You Need It, and How To Do It

David Schlesinger

Intel Data Policy Governance Manager

Intel Corporation

New data laws require an understanding of who is using what corporate data. These laws, ordinances, regulations, and policies cross all departments and are a corporate issue. If each department tries to manage data regulations individually, it will exacerbate system complexity, add data definition confusion, and inflate costs. Since data laws and policies are proliferating, we need an extensible way to manage them all. Thinking of "Regulated Data" as a class allows you to define sub-classes to meet present and future regulatory requirements. Data Administrators need to lead the way. Here's how:

- Focus on solutions to share and re-use enterprise data according to data regulations.

- Simplify redundant and costly account request processes into one standard and informed system.

- Drive standard data regulatory classification standards in metadata to reduce complexity, inform the business, and comply with data regulations.
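
The "Regulated Data" class idea above can be pictured as a small hierarchy in which each subclass carries the handling rules for one family of regulations. This is an illustrative sketch only; the class names and rules below are invented, not Intel's actual scheme:

```python
# Sketch of "Regulated Data" as a class with regulation-specific
# subclasses. Class names and handling rules are illustrative only.

class RegulatedData:
    """Base class: any data element subject to a data regulation."""
    regulation = "generic"

    def handling_rules(self):
        return ["access must be requested and logged"]

class PersonalData(RegulatedData):
    regulation = "privacy law"

    def handling_rules(self):
        return super().handling_rules() + ["mask identifiers in reports"]

class FinancialData(RegulatedData):
    regulation = "financial reporting"

    def handling_rules(self):
        return super().handling_rules() + ["retain change history for audit"]

# A new regulation becomes a new subclass; existing access-request
# processes keyed to RegulatedData keep working unchanged.
for cls in (PersonalData, FinancialData):
    print(cls.regulation, "->", cls().handling_rules())
```

The point of the class view is extensibility: the shared account-request process is written once against the base class, and each new law only adds rules.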


Wednesday, May 5

7:15 am – 8:15 am

Outsourcing of Data Management: Will You Be Next?

Michael Scofield

Assistant Professor

Loma Linda University

"Outsourcing" is one of the most ominous words in the business lexicon because it typically means that people are losing their jobs. And as the pace and reach of outsourcing increases in the data management area - for everything from database design, to real-time monitoring of systems, and now the latest trend of outsourced analytics and BI - then many of us are asking ourselves, could I be the next in line? Mike Scofield led this discussion of outsourcing trends and where they will go next.

Post-Secondary Education in Data Management

Deborah Henderson

Data Warehousing and Reporting


Anne Marie Smith

Practice Director, Ed. Svcs. and Info. Mgmt.

Staccato Consulting Group, LLC

The DAMA Education/Curriculum team presented the work to-date on the DAMA-sponsored IRM/DRM curriculum. The session was well-attended, despite the very early hour. Team members Deborah Henderson, Patricia Cupoli, Anne Marie Smith and Eva Smith (committee members Brett Champlin and Jeff Hoffer were not in attendance at the conference) presented the rationale behind the development of a curriculum for IRM/DRM education, described its current focus on the North American educational system (2-3 year colleges, 4-year colleges, graduate schools), explained the structure of the proposed curriculum, and outlined some of the coming work for the committee (finalizing the current version of the curriculum's outline, presenting it for approval, and marketing/promoting the curriculum). After the formal presentation, the attendees were invited to contribute questions, comments and suggestions. This led to a lively discussion, and some attendees offered their services to the committee. The committee hopes to present the curriculum formally at the DAMA International Conference in 2005.


Wednesday, May 5

8:30 am – 9:30 am

What do you do when the CIO says "forget the gurus - go save me some money!"

David Newman

Senior Director of Enterprise Data Management

Radio Shack

David Newman is creating a comprehensive information asset management environment that will challenge some of the conventional wisdom around the role of today's data management organizations. He is responsible for defining and implementing a comprehensive information management program in a tight economic climate and an increasingly competitive industry. He is introducing a new value proposition for Enterprise Data Management, including:

Teaming with Internal Audit to help the CIO and CFO, especially in areas such as SARBOX: assisting Internal Audit functions with meta data management, ETL, data quality management, and data model management

Helping the CIO on Operational Efficiency
Identifying inefficiencies in the current application portfolio, such as redundant point-to-point interfaces, and creating common views for a Single Version of the Truth.

Architecting a better way - EDM support for EAI
Migrating to lower cost platforms, establishing a common data layer, and EDM's role in Enterprise Data Warehousing

UML for Database Design

Terry Quatrani

UML Evangelist


The Unified Modeling Language is the standard language for visualizing, specifying, constructing and documenting all of the artifacts of a software system. Database modelers can now reap the same rewards from the Unified Modeling Language that application developers have enjoyed for years. Application and database modelers can now speak one language: UML. Linking the object and data models together also improves the understanding of both, helping yield higher quality systems.

Success in a Storm: A Case Study on Wyeth Consumer Healthcare Data Architecture

April Reeve

Data Architect

Wyeth Consumer Healthcare

While many corporations were trimming or eliminating their IS Data Architecture functions during these last two years, Wyeth Consumer Healthcare continued to invest in and grow its Data Architecture team. Data Architecture has become a key contributor and trusted advisor to its client organizations while other Data Architecture organizations have struggled to prove their value within Information Technology and their corporate organizations. This presentation discussed the factors and behaviors that have made Data Architecture a success at Wyeth Consumer Healthcare.

Meta Data Exploitation for Financial Gain

Christine Craven

Customer Insight Development Programme Manager


Increasing competition, strategic partners, mergers and acquisitions represent some major challenges for the organisations of today. However, effective development and exploitation of meta data can significantly facilitate information integration requirements associated with these challenges and at the same time provide substantial financial benefits and improved customer service.

This presentation provided practical examples of how Abbey has developed and exploited its meta data to improve its Cost : Income Ratio and eliminate some major customer service eyesores.

The Ins and Outs of Semantic Integration

JP Morgenthal

Chief Services Architect

Software AG

Semantic integration is a powerful superset of data and systems integration. It allows us to integrate data and systems behind an abstract representation that provides additional context. JP discussed how semantics allows machines to understand how data and systems relate to one another and then provide abstraction, reusability and agility for future application development. He demonstrated the critical importance of metadata and explained how ontologies improve the capability for semantic integration by providing classifications that help to create order and improve the understanding of relationships between entities.

XML vs Relational: Comparisons and Contrasts

David Plotkin

Data Quality Manager

Wells Fargo Bank Consumer Credit Group

This presentation compared and contrasted XML vs. relational data management and technologies. Many of the differences between them are quite clear, and their advantages and disadvantages obvious. However, that doesn't necessarily make it easy to choose between the two, as trade-offs will always exist. David Plotkin identified the principal differences and similarities so that the audience could better understand each one and make better architectural decisions in the future.

The Human Side of Data Modeling:

Proven Techniques for Improving Communication With Subject Matter Experts

Alec Sharp


Clariteq Systems Consulting Ltd.

The human side referred to understanding and taking advantage of typical human likes, dislikes, etc. to make modeling more effective. We need to do this because:

- Time is more precious than ever, so we must help our clients quickly become comfortable and productive with modeling

- The human aspects of modeling can’t easily be outsourced or offshored

Seven specific human side behaviors that modelers can use are -

- Accessibility – make it easy for non-modelers to get involved

- Directionality – models are clearer when drawn with a sense of direction

- Simplicity – get out of the detail into high-value conceptual modeling

- Consistency – do the same things the same way every time

- Visibility – don’t hide patterns or symmetries in randomly drawn models

- Relevance – relate the model to familiar artifacts, business issues, etc.

- Plurality – use multiple techniques and appeal to multiple styles (visual, auditory, kinesthetic)


Wednesday, May 5

10:00 am – 11:00 am

Challenges for Data Architecture in a Package Environment

Dawn Michels

Senior Data Architect

Guidant Corporation

The data architecture discipline has relevance and value to add in an integrated application environment (IAE). Just because a corporation chooses to purchase "best of breed" packages to solve some application challenges does not relieve the Data Architect of the responsibility to ensure that the needs of the business are reflected in the purchased package.

The highlights of this presentation included the following:

  • A DA can provide value by defining business requirements and developing the conceptual, logical and physical data modeling artifacts that illustrate those requirements.

  • Participate in the RFP process early – especially by providing data-focused criteria for the evaluation.

  • Do your utmost to secure a copy of the physical data structures from the vendors, and begin mapping that data to the design of your data models. Document this mapping in a spreadsheet and make it available to all.

  • Above all, remain involved throughout the lifecycle of the project. Put on your “scouting for the future” thinking caps, and point out opportunities for the future.

A couple of additional ideas from the audience included:

  • Be sure a member of the security team is involved

  • Involve your DBA early in RFP so they are on board with what is happening

  • Insist the vendor share their underlying proprietary data model – make it a “deal breaker” during negotiations.

Creating a Global Master Data Organization and Process: A Real World Case Study

Terry Haas

Director, Customer Analytics

Air Products and Chemicals, Inc.

This session discussed how:

  • Changing from a dispersed, decentralized data updating environment to a single, global master data organization can be accomplished.

  • Success requires first defining a formal process and associated roles and then the organization.

  • Key elements of a successful change plan include:

  • Data standards

  • Process and Quality Measures

  • Responsive Service Level Agreement

  • Clear Accountability and Control

  • Being part of a Business/Functional unit vs. IT

  • Sustaining success requires pro-actively solving business problems – i.e. going beyond data maintenance.

Making Your Entities Behave: Entity Life Histories

David Hay


Essential Strategies, Inc.

The characteristic of object classes in object models that distinguishes them from entity types in entity-type relationship diagrams is the specification of their behavior. It seems a reasonable extension to suggest, then, that when entity types representing things in the world are specified in requirements analysis, entity/relationship models could also be extended to represent the behavior of these entity types. The behavior of a real-world entity type, however, is often much more complex than the list of steps that could be captured in the lower third of a UML class box. So, while it is appropriate to want to know what functions are performed around the entity type, the description requires much more than a simple narrative. Enter the entity life history. This is a technique for representing the set of events that affect a particular entity type and the operations that are carried out in response to those events. It represents the complexity of different events and their effects.
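
One minimal way to picture an entity life history is as a table of permitted events per state, with an operation attached to each event. The Order entity, its states, and its events below are invented purely for illustration:

```python
# Sketch: an entity life history for a hypothetical Order entity type.
# Each state lists the events allowed in it; each event names the
# operation carried out in response and the resulting state.
LIFE_HISTORY = {
    "created":  {"approve": ("record approver", "approved"),
                 "cancel":  ("log reason",      "cancelled")},
    "approved": {"ship":    ("assign carrier",  "shipped"),
                 "cancel":  ("log reason",      "cancelled")},
    "shipped":  {"deliver": ("close order",     "delivered")},
}

def apply_event(state, event):
    """Return (operation, new_state), or raise if the event is illegal."""
    try:
        return LIFE_HISTORY[state][event]
    except KeyError:
        raise ValueError(f"event {event!r} not allowed in state {state!r}")

op, state = apply_event("created", "approve")
print(op, "->", state)   # record approver -> approved
```

The table captures what a narrative in a class box cannot: which events are legal in which state, and what happens when each one fires.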

End-to-End Enterprise Data Integration

Toyota's best practices for integrated metadata, data migration and data quality management

John Gonzales, Elizabeth Kemper & Ken Karacsony

Toyota Motor Sales

In this session, Toyota Motor Sales described the best practices it successfully implemented via an integrated data integration suite – enabling them to incorporate metadata into data quality analyses, as well as data migration and transformation logic, and vice versa. Toyota explained its data integration architecture, and discussed the key benefits it has achieved since the implementation. These benefits include the ability to provide timely and critical information to business experts, as well as data integration and quality analysts. By capturing information about data modeling and architecture, source-to-target mappings, data transformation logic and data validations, Toyota is able to incorporate the results of its data quality analyses into its data migration logic and quickly address potential data anomalies.

Proactive Metadata-Driven Information Quality

David Loshin


Knowledge Integrity, Inc.

"Active metadata" is a concept that is beginning to gain a foothold in a number of organizations, as was evidenced by some of the questions and comments that were made during my presentation on Proactive Metadata-Driven Information Quality.

As more data is used for multiple purposes across an enterprise, the perception of information quality depends on the information consumer and the corresponding application context. Fitness for use is directly related to compliance with the expectations of how data is used by the information client, and being able to measure compliance with those expectations can provide an objective assessment of the quality of the data. For any data set, many consumer data quality expectations can be refined into formal relational constraints to be incorporated as metadata. In turn, these constraints can be used as business logic for proactively identifying noncompliant data instances before they are integrated or committed. This workshop focused on a process for successively refining naturally defined data quality requirements into formal metadata constraints and rules, as well as techniques for using these rules as a component of a proactive information quality assessment and monitoring program.
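
A rough sketch of the idea: consumer expectations captured as declarative constraint metadata and applied to records before they are committed. The field names and rules below are hypothetical examples, not from the presentation:

```python
# Sketch: data quality expectations refined into metadata constraints,
# then applied to flag noncompliant records before they are loaded.
# Field names and rules are hypothetical examples.

CONSTRAINTS = {
    "customer_id": lambda v: v is not None and str(v).strip() != "",
    "age":         lambda v: isinstance(v, int) and 0 <= v <= 130,
    "email":       lambda v: isinstance(v, str) and "@" in v,
}

def noncompliant(record):
    """Return the names of fields that violate their constraint."""
    return [f for f, rule in CONSTRAINTS.items()
            if not rule(record.get(f))]

good = {"customer_id": "C042", "age": 34, "email": "a@example.com"}
bad  = {"customer_id": "",     "age": 270, "email": "not-an-address"}
print(noncompliant(good))  # []
print(noncompliant(bad))   # ['customer_id', 'age', 'email']
```

Because the rules live in a metadata table rather than in application code, the same constraints can drive both load-time gating and ongoing quality monitoring.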

Questions that arose during the session involved the ability to capture state associated with the firing of information quality business rules, the ability to accurately state and transform information quality business rules, and how classification, object concepts, and taxonomies are increasingly relevant to information quality programs. Lastly, there were comments made about the important connection between information quality, data standards, and legislative/regulatory compliance.

A Checklist for Building a Taxonomy

Malcolm Chisholm

President, Inc.

Taxonomies are more common than most IT professionals think. For instance, reference data tables often consist of classification schemes, even though they are not formally termed “taxonomies”. This presentation described a checklist of points that experience has shown to be useful when implementing a taxonomy. Three broad areas were covered:

  • Design of the taxonomy

  • Helping system operators use the taxonomy to accurately classify things

  • Ensuring proper governance of the taxonomy, especially after production release

Examples of what can go wrong with taxonomies focused on misclassification of things by individuals using a taxonomy, over-expectation of what a taxonomy can deliver, and misuse of a taxonomy in analytical contexts for which it was not intended.
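
The observation that reference data tables often amount to classification schemes can be made concrete with a tiny example: a taxonomy stored as a code table with parent links, plus a roll-up routine. The codes and labels here are invented:

```python
# Sketch: a taxonomy stored as a reference data table, each code
# pointing to its parent category (codes are invented examples).
TAXONOMY = {
    "RETAIL":   None,       # top-level category
    "FOOD":     "RETAIL",
    "BAKERY":   "FOOD",
    "HARDWARE": "RETAIL",
}

def ancestors(code):
    """Walk parent links from a code up to the taxonomy root."""
    path = []
    while code is not None:
        path.append(code)
        code = TAXONOMY[code]
    return path

print(ancestors("BAKERY"))  # ['BAKERY', 'FOOD', 'RETAIL']
```

Governance questions from the checklist show up even at this scale: who may add a code, how reclassified codes are versioned, and whether analytical roll-ups along these links are actually valid for the question being asked.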

PANEL: Data Management Imperatives for Sarbanes Oxley Compliance

Bonnie O’Neil, Westridge Consulting

Christine Craven, Abbey National