This report compiled and edited by Tony Shaw, Program Chair, Wilshire Conferences


Conference Trip Report

2003 DAMA International Symposium


Wilshire Meta-Data Conference

Renaissance Resort, Orlando, Florida April 27 - May 1, 2003


The 2003 DAMA International Symposium and Wilshire Meta-Data Conference was held before an audience of almost 900 attendees and speakers. To receive more information about this conference, and related future events, go to

This report contains a chronological summary of the key discussions and conclusions from almost all of the workshops, tutorials, conference sessions, and special interest groups.

Reproduction Policy

This document is © Copyright 2003 Wilshire Conferences. It may be copied, linked to, quoted and/or redistributed without fee or royalty provided that all copies and excerpts provide attribution to Wilshire Conferences and the appropriate speaker(s). Any questions regarding reproduction or distribution may be addressed to


Sunday, April 27

3:30 pm – 6:45 pm

Data Modeling Basics

Everything you need to know to get started

Marcie Barkin Goodwin


Axis Software Designs

This workshop for novices covered the basics of data modeling, including an explanation of:

* IDEF & IE Methods

* Entities, Attributes & Relationships

* Keys, Cardinality, Recursion & Generalization Hierarchies

* Normalization

* And let’s not forget Standards & Procedures!

Creative Data Modeling: Ideas and Techniques for Advanced Practitioners

Graeme Simsion

Senior Fellow

University of Melbourne

The speaker's basic premise is that data modeling is a creative design activity, and he presented some compelling evidence to support this. He drew heavily on the analogy with architecture (useful for presenting data modeling ideas and process to non-initiates). Some key conclusions:

- There is no "requirements model" - just iterative design.

- There is no "one right answer" - it's worth getting a second opinion for important models.

- If we want to become better modelers, we should work on building our stock of patterns.

- We need to beware of developing personal styles of modeling that may not always provide the best solutions for the business.

Practice Made Perfect: A Business Rule Discovery and Analysis Workshop

Barbara von Halle


Knowledge Partners Inc.

This tutorial attracted very motivated and enthusiastic BR practitioners to hear case studies from the instructor, share BR experiences among tutorial participants, and unleash the BR expertise in each attendee. The attendees willingly shared the most valuable lessons they learned over the 3-hour interactive session. These included:

- The tutorial validated ideas that one attendee had been discussing at their organization, specifically how to capture business rules as separate requirements rather than embedding them in other deliverables.

- For one attendee, the most valuable aspect was a simple, practical technique by which to separate rules from Use Cases and, instead, reference rules in a separate repository (possibly using existing tools) for reuse and independent change management.

- The tutorial solidified one attendee’s vague ideas about business rules into detailed how-to instructions, supported with simple, understandable examples.

- Discussions suggested that, while many people aim to discover and manage data validation rules as the first business rules project, the significant value to the business is likely to come instead from discovering and managing complex constraints, computations, and significant inference rules. The reason is that most of us are likely to know what the data validation rules are (or we know where we can find them) and these don't usually change often. The more complex rules are the ones that drive the business, that underpin business agility, and are the ones we have lost knowledge of and control over in our previous automation approaches.

- The most useful realization for another attendee was that terms, facts, and the types of rules each have appropriate manifestations and enforcement that should be separate and distinct.

- The introduction of a term-fact or fact model as a model by and for business people, serving as an inventory of terms and facts for writing rules, was new to almost everyone.

- A most intriguing aspect of the presentation was the idea of a rule family because this concept provides the architectural underpinnings for rule analysis, rule design, and rule impact analysis.

- The concept of rule families can be added to the metadata management environment of one attendee relatively easily and may prove quite useful.

- The most valuable aspect to many attendees was the very tight integration among process modeling, data and object modeling, and business rule discovery and analysis - an integration that has been elusive in the past.
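The lessons above center on keeping rules out of use cases and in a repository of their own. A minimal sketch of that idea, in Python with entirely hypothetical rule ids and data, might look like this: use cases reference rules by id, so a rule can change (or be reused) without touching any deliverable that cites it.

```python
# Sketch (hypothetical names and rules): rules live in a small repository
# keyed by rule id, instead of being hard-coded inside each use case.
rule_repository = {
    "R1": lambda order: order["total"] > 0,          # a validation rule
    "R2": lambda order: order["total"] * 0.05        # a computation rule:
          if order["total"] > 1000 else 0.0,         # 5% discount over 1000
}

def check(rule_id, fact):
    """Look a rule up by id so use cases reference rules, not embed them."""
    return rule_repository[rule_id](fact)

order = {"total": 1500}
print(check("R1", order))  # True - order total is positive
print(check("R2", order))  # 75.0 - discount computed by rule R2
```

Because the use case holds only the rule id, impact analysis ("who uses R2?") and independent change management become simple lookups against the repository.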

The Agile Data Method

Scott Ambler

Senior Consultant

Ronin International

Modern development methodologies, including the RUP, EUP, XP, DSDM, FDD, and even forthcoming offerings from the IEEE have only one thing in common: they are all iterative and incremental (evolutionary) in nature. For data professionals to remain relevant they must be able to work in this manner -- no more creating, reviewing, then baselining data models early in a project. Luckily there are techniques for data professionals to work in an evolutionary manner, such as database refactoring and Agile Model-Driven Development (AMDD); practitioners merely need to choose to work that way. The Agile Data web site describes these techniques in detail.
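Database refactoring, one of the evolutionary techniques the talk names, can be sketched with an in-memory SQLite database (the table and names here are hypothetical, not from the talk): the schema changes in small steps, and the old column survives a transition period so existing code keeps working.

```python
import sqlite3

# Sketch of an evolutionary database refactoring (hypothetical schema):
# split an overloaded "name" column into first/last in small steps.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO person (id, name) VALUES (1, 'Ada Lovelace')")

# Step 1: add the new columns alongside the old one (schema stays valid).
conn.execute("ALTER TABLE person ADD COLUMN first_name TEXT")
conn.execute("ALTER TABLE person ADD COLUMN last_name TEXT")

# Step 2: backfill the new columns; the old column is kept during a
# transition period so callers can migrate incrementally.
rows = conn.execute("SELECT id, name FROM person").fetchall()
for pid, name in rows:
    first, _, last = name.partition(" ")
    conn.execute("UPDATE person SET first_name=?, last_name=? WHERE id=?",
                 (first, last, pid))

row = conn.execute(
    "SELECT first_name, last_name FROM person WHERE id=1").fetchone()
print(row)  # ('Ada', 'Lovelace')
```

The point of the transition period is exactly the evolutionary discipline described above: no big-bang baseline, just a sequence of small, individually safe schema changes.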

Developing a Meta Data Management Strategy

Patricia Graham

Systems Director

Prudential Financial Services

The presentation urged participants to develop and adhere to a strategy for meta data management, whether creating a new program or managing an existing one. The strategy should align with corporate objectives and lead to specific end-of-year accomplishments. Three strategies were covered: do nothing, create a large centralized repository, or create a small centralized site with links out to meta data. An example of an implementation using a purchased meta data repository demonstrated that there are seven points to success:

- support from key executives

- identification of users

- relentless marketing

- easy access

- high quality

- sufficient quantity

- attention paid to ROI

Areas where meta data practices can be expanded to grow with the business were explored as well as the expansion of meta data management to include the management of XML assets.

XML for Data Practitioners

David Plotkin

Senior Data Administrator

This one-day program provided a comprehensive introduction to XML as it relates to various data management functions and responsibilities. It provided an understanding of the importance of XML to meta data management, and the new breed of repository for XML meta data. It introduced the essential aspects of XML-based systems, including DTDs, XML Schema and namespaces. The "data" applications for XML are highly significant and varied, including Corporate Portals, Data Warehouses and Data Marts, Meta Data Management, Business Rules and Data Integration.
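Namespaces, one of the essential XML aspects the program introduced, can be shown with Python's standard-library parser (the document below is hypothetical): the same local name `title` appears in two namespaces, and a namespace-aware query is needed to tell them apart.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of namespace-aware XML handling (hypothetical document):
# two vocabularies both define "title", distinguished only by namespace.
doc = """<catalog xmlns:bk="urn:example:book" xmlns:mg="urn:example:magazine">
  <bk:title>Data and Reality</bk:title>
  <mg:title>DM Review</mg:title>
</catalog>"""

root = ET.fromstring(doc)
# Map prefixes to namespace URIs so find() can qualify the local names.
ns = {"bk": "urn:example:book", "mg": "urn:example:magazine"}
print(root.find("bk:title", ns).text)  # Data and Reality
print(root.find("mg:title", ns).text)  # DM Review
```

For meta data management this matters because a repository of XML assets must key its entries on the qualified name (namespace URI plus local name), not the bare tag.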

Practical Techniques in Assessing Data Behavior, Meaning, and Quality

Michael Scofield

Asst. Professor, Health Information Management

Loma Linda University

The speaker showed practical techniques for using reporting tools already in your shop to assess the behavior and quality of legacy data, as well as data imported from external (and uncontrolled) sources. These center on the "domain study," which shows the distribution of values found in an individual column. These techniques uncover data anomalies which could either be erroneous data, or accurate descriptions of anomalous business behavior (also a valid concern). A good data analyst must be very interested in the business which the data describes, and bring a healthy skepticism to testing what the documentation says about the data. They must also interact with the business experts and "owners" of the data in a non-threatening way to understand the data problems, and improve the data quality.
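A domain study in the sense described above is little more than a frequency count over one column. A minimal sketch in Python (the column data is hypothetical):

```python
from collections import Counter

# A minimal "domain study": profile the distribution of values found in a
# single column and inspect the tail for anomalies (hypothetical data).
state_column = ["CA", "CA", "FL", "FL", "FL", "fl", "XX", None, "CA"]

distribution = Counter(state_column)
for value, count in distribution.most_common():
    print(repr(value), count)

# 'fl', 'XX', and None are what a domain study is meant to surface: each
# could be erroneous data, or a real (if surprising) business fact that
# only a business expert can confirm.
```

The low-frequency values at the bottom of the listing are exactly where the analyst's skepticism, and the follow-up conversation with the data's business "owners", should be directed.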

The Dangerous Illusion: Normalization, Performance, Integrity and the Logical-Physical Confusion

Fabian Pascal

Data Management Specialist, Editor and Publisher


One of the most egregiously abused aspects of data modeling and database design is normalization. Although they have been repeatedly debunked, arguments against normalization and for denormalization continue to sway practitioners, experienced and novice alike. This costs dearly and reveals the poor understanding of sound design principles by even those who profess to be experts. It is both a major reason for and a consequence of SQL deficiencies and technology regressions such as ODBMS and OLAP, that have come to haunt data management.

Even if current data management systems performed better with denormalized databases, denormalization would still be unjustified, because performance gains, if any, come only at the expense of integrity. If the integrity consequences of denormalization are taken into account, they cancel out any performance gains. This workshop showed why the notion of "denormalization for performance" is a misconception due to the logical-physical confusion prevalent in the industry.
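The integrity cost the workshop refers to is the classic update anomaly. A small sketch with hypothetical data: once a customer's city is copied into every order row, an update that misses one copy leaves the database asserting two contradictory "facts."

```python
# Sketch of a denormalization update anomaly (hypothetical data): the
# customer's city is redundantly stored on every order row.
orders = [
    {"order_id": 1, "customer_id": "C1", "city": "Orlando"},
    {"order_id": 2, "customer_id": "C1", "city": "Orlando"},
]

# Customer C1 moves; a careless update fixes only the first copy.
orders[0]["city"] = "Miami"

cities = {o["city"] for o in orders if o["customer_id"] == "C1"}
print(cities)  # {'Miami', 'Orlando'} - two "truths" for one customer
```

Preventing this requires extra integrity enforcement (application code or triggers touching every copy), which is precisely the cost that offsets any performance gain from the redundancy.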


Sunday, April 27

7:00 pm – 8:00 pm

Process Analysis Basics

Thom Harrington

Senior IT Analyst

SAFECO Corporation

Eva Smith

Instructor, Computer Information Systems

Edmonds Community College

Why do managers see process and data as mutually exclusive approaches to systems requirements and design? In truth, they are complementary, and both must be performed and correlated to assure successful implementation. Understanding process is paramount to understanding system design requirements – front-end user interfaces as well as back-end infrastructure and middleware.

Data Management and ICCP Certification

Patricia Cupoli

ICCP Liaison

DAMA International

DAMA has been working with the Institute for Certification of Computing Professionals (ICCP) since 1993 in the area of professional data certification. There are new DAMA certification exam efforts in Data Management, Data Warehousing, and Database Administration that will eventually replace the current Data Resource Management exam. Reasons to certify include professional growth, credentials, self-assessment, professionalism, and challenge. Certification benefits include increased credibility, assessed ability, and increased compensation (up to 15% at some organizations). Attendees were offered the opportunity during the Meta-Data Conference / DAMA International Symposium to take ICCP exams for either the Certified Computing Professional (CCP) or Proficiency certificate.

Putting User Language On Data Models

Joseph Maguire

Consultant, Author, Trainer

Independent Consultant, Author, Trainer

Session attendees learned some culture-focused data-modeling principles and recommendations that contrast starkly with conventional, syntax-focused principles.

To admit the most user vocabulary while honoring the software engineer's need for rigor, Joe Maguire suggested specific guidelines that include: using singular nouns for all named types (e.g., entities and attributes); using a single, shared namespace for all of those nouns; labeling relationships with verbs that will yield forceful, evocative sentences that help users evaluate the accuracy of in-progress models; and recognizing that identifiers, typically thought of as constraints, should be considered part of the essential definition of categories because identifiers solve the homonym problem.

To ensure that data models remain about data, rather than about data-plus-process, the speaker suggested excluding most constraints from a data model.

One distinguished member of the audience, Terry Halpin, colorfully disputed these recommendations; Professor Halpin is the main proponent of another data-modeling technique. Nevertheless, Mr. Maguire provided real examples from business clients illustrating that constraints -- even simple constraints such as minimum cardinality -- indeed are typically the result of processing requirements, not data requirements, and that the proper time to express most constraints is during the conceptual process-modeling phase, not the conceptual data-modeling phase.

Orientation to Geographic Information Systems and Geospatial Data Management

Mike Walls

Executive Consultant

Plangraphics, Inc.

This Night School session described how geospatial data fits into the enterprise data resource management function. Walls stressed that geospatial data -- data about location expressed as coordinates such as latitude/longitude on the earth's surface -- is just like any other data in the enterprise. The differences are that (a) the description of location is much more precise and formalized, taking advantage of surveying and cartographic descriptions to describe location using geometry and topology. This means that (b) a query can take advantage of these geospatial attributes to add a new type of relationship between the coordinates describing location of the two entities. However, adding the graphics attributes to the rest of the database means (c) the level of metadata required to effectively work with GIS data is dramatically larger than that needed for data with only non-graphic attributes.
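The "new type of relationship" in point (b) is a computable spatial predicate between two entities' coordinates. A minimal sketch using the haversine great-circle formula (the coordinates are approximate, and the function name is ours, not from the talk):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Approximate Orlando-to-Tampa distance: the spatial predicate
# "within 150 km" becomes a queryable relationship between two entities
# that exists nowhere in their non-spatial attributes.
d = haversine_km(28.54, -81.38, 27.95, -82.46)
print(round(d), "km")
```

A GIS generalizes this idea from point distance to geometry and topology (containment, adjacency, intersection), which is why the supporting metadata burden grows so sharply.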

Building and Using Taxonomies

Malcolm Chisholm

Senior Consultant

- Taxonomies are forward-engineered tools for understanding information. They should not be used to try to impose personal or corporate views of how the Universe should be arranged.

- Taxonomies are not easy to get right. Biologists have been working with taxonomy for 250 years, and have created many significant problems that others should avoid.

- A taxonomy should serve one purpose. Multiple overlapping goals can destroy the coherence of a taxonomy.

- Identification of the "specimens" being classified in a taxonomy dramatically affects how the taxonomy can be implemented, yet is often given little attention.
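The single-purpose point above can be sketched as a strict tree in Python (the categories are hypothetical): every node has exactly one parent, so each specimen classifies to exactly one lineage, and the taxonomy stays coherent for its one purpose.

```python
# Sketch of a single-purpose taxonomy (hypothetical categories): a strict
# tree mapping each node to its one parent; the root maps to None.
taxonomy = {
    "document": None,
    "report": "document",
    "trip_report": "report",
    "specification": "document",
}

def lineage(node):
    """Walk from a node up to the root, yielding the classification path."""
    path = []
    while node is not None:
        path.append(node)
        node = taxonomy[node]
    return path

print(lineage("trip_report"))  # ['trip_report', 'report', 'document']
```

Allowing a node two parents (to serve a second, overlapping goal) would turn the tree into a graph and break the unambiguous lineage that makes the taxonomy useful, which is the coherence risk the speaker warns about.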
