Surviving the Trek to Data Element Standardization To a High Payoff
Whitemarsh Information Systems Corporation
Senior Information Engineer
On Tuesday afternoon, Hank Lavender and Mike Gorman presented the proof of the pudding: yes, you can survive the trek to establishing an environment of data sharing. On Wednesday evening, attendees from businesses totaling about $500 B in annual revenue attended a five-hour marathon session of kicking tires, looking under the hood, and plotting long-range strategies. We all walked through a real example wherein operational schemas were imported, upper abstraction layers were built, and an example data mart was designed and built from the now commonly available data model abstraction layer. Then the data mart application itself was generated via Clarion for Windows right from the new, forward-engineered operational data model layer. QED. Victory was snatched from the jaws of data semantics chaos. The revised set of slides will be posted May 10th to www.wiscorp.com/dama2003.zip.
Data Architecture/Data Management
- How do you sustain your career in a changing discipline?
Data Architect Lead
Fair Isaac Inc
Data Architecture professionals have an opportunity and a challenge to demonstrate the bottom-line value they and their work provide their companies. Developing collaborative, sharing relationships with peers in closely aligned data fields provides input into better designs. Extending your skills by trying new, related types of data work expands your breadth, depth, and personal value to the company. This might include project management, data validation and analysis, and may even include ETL processing or report writing. Continuing to educate yourself via symposiums, conferences, reading, classes, brown bags, and associations with assorted professional organizations keeps you vital & passionate about your work.
Data Warehouse Architect/Systems Integrator
This presentation provided an overview of a metadata driven ETL process. This process is in use to load net change data in over 700 tables three times a day. Company drivers, metadata model, process and integration utilities were discussed.
* Key components of process - meta model, metadata databases, integration utilities
* Key Points included:
- Meta Data Driven approach allows for quick changes to production scripts
- Meta Data Driven approach allows for conformity
- Meta Data Driven approach provides for metrics capture, improving Data Quality
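The metadata-driven idea behind these points can be made concrete with a small sketch. This is a hypothetical illustration, not the presenter's actual meta model: a metadata table describes each target, and the net-change load SQL is generated from that metadata, so a production change means editing a metadata row rather than rewriting a script. All table and column names are invented.

```python
# Hypothetical sketch of a metadata-driven ETL step using Python's stdlib
# SQLite driver. Load definitions live in a metadata table; the loader
# generates its SQL from that metadata instead of from hand-written scripts.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE etl_metadata (
        target_table  TEXT,
        source_table  TEXT,
        key_column    TEXT,
        change_column TEXT    -- column holding the net-change timestamp
    );
    INSERT INTO etl_metadata VALUES
        ('dw_customer', 'stg_customer', 'cust_id',  'updated_at'),
        ('dw_order',    'stg_order',    'order_id', 'updated_at');
""")

def build_load_sql(row):
    """Generate a net-change upsert statement from one metadata row."""
    target, source, key, change = row
    return (f"INSERT OR REPLACE INTO {target} "
            f"SELECT * FROM {source} WHERE {change} > :last_load")

statements = [build_load_sql(r) for r in
              conn.execute("SELECT * FROM etl_metadata")]
for sql in statements:
    print(sql)
```

Because every load statement comes from one generator, conformity across the 700+ tables falls out for free, and metrics capture can be added in one place.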
Unstructured Data and Content Management
Managing the Metadata
Robert Seiner (moderator)
Founder & Principal, KIK Consulting
Sr Product Manager, Digital Asset Management
Adobe Systems, Inc.
Metadata is critical to managing unstructured data. XMP is an open source framework that tags unstructured data (rich media assets/documents) with XML metadata. It is based on the W3C's standard for metadata and is integrated within ten Adobe applications and a growing number of third party systems.
- Extends metadata beyond the context of a database
- Enables creation of “smart assets”
- Fosters re-use, re-purposing, re-expression across multiple domains
- Promotes brand equity, intellectual capital & other intangible assets
- Enables metadata capture, preservation & propagation - across devices, applications, file formats, institutions
- Is self-describing, and not limited to a specific or predefined schema
- Enables easy access to relevant properties, including rights and relationship information, expressed in XML
- The XMP spec is available at www.adobe.com/products/xmp
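To illustrate the "tagged with XML metadata" idea, here is a minimal sketch of an XMP-style packet built with Python's standard library. The RDF and Dublin Core namespace URIs are the W3C conventions XMP builds on; the asset title and creator values are invented, and the XMP spec at the URL above remains the authoritative reference for real packet structure.

```python
# Minimal sketch of an XMP-style metadata description: an RDF Description
# element carrying Dublin Core properties, embedded as XML alongside an asset.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("rdf", RDF)
ET.register_namespace("dc", DC)

rdf = ET.Element(f"{{{RDF}}}RDF")
desc = ET.SubElement(rdf, f"{{{RDF}}}Description", {f"{{{RDF}}}about": ""})

title = ET.SubElement(desc, f"{{{DC}}}title")
title.text = "Q2 product brochure"          # illustrative asset title
creator = ET.SubElement(desc, f"{{{DC}}}creator")
creator.text = "Marketing"                  # illustrative creator

packet = ET.tostring(rdf, encoding="unicode")
print(packet)
```

Because the metadata travels inside the asset rather than in a database row, it survives movement across devices, applications, and institutions, which is the "smart asset" point above.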
Data Management for E-Commerce
Program and Architecture Manager
Mellon Financial Corporation
Developing an E-Commerce system introduces issues like naming conventions, data ownership/stewardship, data quality, data conversion/transformation, and, on occasion, system documentation. Currently, projects need to address these issues around technologies like Web Services, UDDI, SOAP, and, under all of these, XML and its related technologies. The proliferation of so-called XML "standards" also complicates data management efforts. How can we use tried-and-true data management methods from prior technologies when tackling these issues in a progressive E-Commerce development environment?
Data Management professionals will gain an understanding of web services, UDDI, and XML, and how data management principles apply to these technologies.
Implementing Complex Data Integrity Rules Inside Your RDBMS
Application Database Specialist
The speaker focused on the benefits of storing data integrity rules within a database and the options available for doing so. Benefits highlighted included being able to write the integrity checking code once for the entire application, the ability to ensure that the rules are always followed, and the development of a single repository for data integrity rules, thereby promoting easy documentation and understanding of the rules and simplified maintenance in the event of rule changes. Typical database options for implementing these rules, such as row-level check constraints and triggers, were reviewed along with some sample code. The sample code, written for Oracle, was presented as a starting point for attendees, who could apply some of the ideas and concepts from the code to their own database platforms and/or business situations.
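The session's sample code was written for Oracle and is not reproduced here; the following is a hypothetical adaptation of the same two options, a row-level CHECK constraint and a trigger, to SQLite via Python's standard library. The table names and the three-account limit are invented for illustration.

```python
# Two ways to keep an integrity rule inside the database itself, so every
# application writing these tables is bound by it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE account (
        acct_id INTEGER PRIMARY KEY,
        balance NUMERIC NOT NULL CHECK (balance >= 0)  -- row-level rule
    );
    -- Cross-row rule: no more than 3 accounts per owner (limit is
    -- illustrative), enforced by a trigger rather than a constraint.
    CREATE TABLE owner_account (owner_id INTEGER, acct_id INTEGER);
    CREATE TRIGGER max_accounts BEFORE INSERT ON owner_account
    WHEN (SELECT COUNT(*) FROM owner_account
          WHERE owner_id = NEW.owner_id) >= 3
    BEGIN
        SELECT RAISE(ABORT, 'owner already has 3 accounts');
    END;
""")

# The CHECK constraint rejects a negative balance no matter which
# application attempts the write.
try:
    conn.execute("INSERT INTO account VALUES (1, -50)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

The rule text lives in the schema, which also gives the "single repository" benefit the speaker described: documenting the rules is reading the DDL.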
SPECIAL INTEREST GROUPS
Wednesday, April 30
7:15 am – 8:15 am
PANEL: Marketing and Selling Data Management
Danette McGilvray, Agilent
Todd Stephens, BellSouth
Richard Hecht, DATA Architects Technicians Analysts, Inc.
- Marketing and selling is essential to the success of data management.
- These are skills that can be developed.
- Data management must back up the marketing and selling by always delivering value to the business.
- If you want to market your services you have to think and act like a marketer.
- The key to marketing starts with adopting the marketing philosophy - you have to think about your customer first.
Selling, Marketing and Branding Data Management is about creating a perception of value. Data Management will have a perception, either by design or by default. A solid data management strategy will contain a solid set of products, services, processes and procedures. Adding the element of branding will round out a complete and effective strategy.
It's Not Just Arts and Crafts: Quality Assurance for Data Models
Data models serve as specifications or "blueprints" that are essential to the successful creation, maintenance, and operation of our information systems. Quality data models enable the organization to break free from "data jails", making data an asset available to the entire organization. Despite the importance of quality data models, as a profession we do not apply strong rigor to ensuring data model quality. In this session the following issues were discussed:
- Data models must meet basic standards for definitions, use of structured modeling patterns and referential integrity
- Model reviews at critical points of the model development life cycle must be executed to ensure that data models meet quality standards
- Model review templates were given that identified critical quality points that must be met at each stage of data model development
- There is a need in our industry to set clearly defined standards for data modeling, expected skill sets for data modelers, and training for data modelers that would lead to certification in this profession.
Wednesday, April 30
8:30 am – 9:30 am
Analytical Modeling Manifesto
- The difference between analytical and operational data modeling lies in the data, not the modeling method.
- Data design should be based on principles, not just patterns.
- It is unreasonable to say a priori that one type of design is better than another.
- ER modeling and dimensional modeling should not be compared because they solve different problems.
- People, vendors, and authors should stop making this comparison.
- ER modeling is a logical modeling method; dimensional modeling is a physical design method.
- Don't believe everything you read or hear; test things to prove them to your projects.
- A project should allow sufficient time for prototyping of the DB in a data warehousing project.
Information Quality in an Integrated ERP Production Environment
Enterprise Information Quality Program Manager
Information management is the coordination of components that, working together, will result in high quality data required by the business. These components are information quality processes and tools, enterprise architecture, data standards, meta-data management, data stewardship, enterprise data model, change management etc. Agilent Technologies has created an Enterprise Information Management function with a roadmap that consists of a logical sequence of manageable worksteps that yield short term successes while building towards a long term vision. One example: Agilent has identified data stewards that resolve strategic data questions for a specific subject area and engage other IM functions where appropriate.
Developing a Business Rules Strategy using a Business Rules Special Interest Group Approach
Project Manager - Infrastructure
After attending the presentation 'Developing Business Rules using a Special Interest Group (SIG) Approach', attendees have a better appreciation for the challenges a business rules practitioner faces. In addition to the challenges, attendees now have an approach and plan for presenting business rules management to senior management, application developers, and users. Finally, success metrics and recommended next steps should now be part of a strategy for defining, capturing, and presenting business rules.
Web based Dynamic Data Dictionary
Data Warehouse Architect
Lee Arnett discussed the construction and deployment of a web based dynamic data dictionary as a case history. He accomplished this by creating a web based data dictionary sourced from meta data.
1. Current - The dictionary is an "active" dictionary that runs off production meta data.
2. Self Service - The dictionary empowers analysts that write reports by providing them information without calling the data warehouse team.
3. Power Users - Power Users are the primary customers of the data dictionary.
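A minimal sketch of the "sourced from metadata" idea, using SQLite's own catalog as a stand-in for the production metadata (the actual implementation was not detailed in this summary, and the `customer` table is invented):

```python
# An "active" data dictionary: instead of a hand-maintained document, the
# dictionary content is generated on request from the catalog the database
# already keeps, so it can never drift out of date.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer (
    cust_id INTEGER PRIMARY KEY,   -- surrogate key
    name    TEXT NOT NULL,
    region  TEXT
)""")

def data_dictionary(conn):
    """Return {table: [(column, type, nullable), ...]} from the live catalog."""
    dictionary = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})")
        dictionary[table] = [(name, ctype, not notnull)
                             for _, name, ctype, notnull, _, _ in cols]
    return dictionary

print(data_dictionary(conn))
```

A web front end rendering this structure is what gives report-writing analysts self-service access without a call to the data warehouse team.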
FGDC MetaData and Oracle Spatial Data Integration
Mike Walls described how the author's consulting firm has performed several projects in which geospatial metadata from multiple, disparate GIS sources had to be integrated for inventory and searching by location. The FGDC (Federal Geographic Data Committee) has produced a standard for digital geospatial metadata, which was used in these efforts. A simple approach to storing the metadata within an Oracle database was described. The presentation stressed that geospatial metadata is manageable and cost-effective to implement, and that RDBMS technology can be used without requiring specialized tools. One benefit of this approach is that the metadata search process can be tightly integrated into an application.
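To illustrate the claim that plain RDBMS technology suffices, here is a hypothetical sketch: each metadata record carries its bounding coordinates, and search-by-location is an ordinary range predicate. The column names are illustrative, not actual FGDC element names, and SQLite stands in for Oracle.

```python
# Geospatial metadata in an ordinary table: a dataset's bounding box is four
# numeric columns, and "which datasets cover this point?" is plain SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE geo_metadata (
    title TEXT, west REAL, east REAL, south REAL, north REAL)""")
conn.executemany("INSERT INTO geo_metadata VALUES (?,?,?,?,?)", [
    ("City parcels",   -77.2, -76.9, 38.8, 39.0),   # illustrative records
    ("State highways", -79.5, -75.0, 37.9, 39.7),
])

def datasets_covering(conn, lon, lat):
    """Find metadata records whose bounding box contains the point."""
    return [row[0] for row in conn.execute(
        "SELECT title FROM geo_metadata "
        "WHERE west <= ? AND east >= ? AND south <= ? AND north >= ?",
        (lon, lon, lat, lat))]

print(datasets_covering(conn, -77.0, 38.9))
```

Because the search is just SQL, it can be embedded directly in an application query, which is the tight-integration benefit noted above.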
Data Challenges in Getting to the Single Customer View
Director of Strategic Marketing
Director, Professional Services
Innovative Systems, Inc.
Senior Data Administrator
Every sales group asks for the "full view" of customers -- a substantial task, invariably requiring the integration of multiple customer touch points (and therefore, data sources). In this unique session we asked three experienced data specialists to respond to the specific questions below, and asked them to explain how each one would approach the problems and issues involved.
- How do you handle the modeling of customer hierarchies -- including management org charts as well as corporate parent/subsidiary relationships
- What's the best approach to data cleansing, merge-purge, cleaning up duplicates, etc.?
- What about data integration and loading? What works, what doesn't?
- What unique data & business rule challenges arise in real-time vs batch or event-triggered marketing scenarios?
- What metrics (business and/or technical) do you recommend for measuring success with CRM efforts?
Improving Your Data Modeling by Developing Custom Patterns and Generalizations
The power associated with the use of data model patterns and design generalizations is widely recognized. While many excellent patterns and generalizations have been published, what happens when you can't find an existing pattern or generalization that meets your needs? This presentation addressed how to identify and model your own patterns: the things to look for, the questions to ask, and the sub-patterns (yes, there are sub-patterns) to apply. A series of data models showed how the evolution of understanding of both the problem domain and the modeling techniques led to a straightforward solution to a complex problem.
Wednesday, April 30
10:00 am – 11:00 am
Data Modeling for Authentication and Entitlement
Senior Technology Specialist
Cambridge Technology Partners
Data Modeling remains the best practice for capturing business requirements for Authentication and Authorization. Current Authentication and Authorization implementation technologies such as LDAP directories require more than "one button push" for logical-to-physical translation.
The benefits of current authentication and authorization technologies include standards for schemas, classes, attributes, and interchange formats. Data Management professionals should get involved with authentication and authorization requirements, and drive the implementation from the data requirements!
Data Quality @ Bulgari: a real case study.
Data Quality Manager
Alberto Villari gave an insightful case study of his data quality initiative at Bulgari, the Italian-based, world-wide supplier of upscale consumer goods, including jewelry, timepieces and clothing. His experience showed that:
1. Data Quality can be done.
2. It will impact positively on the bottom line.
3. The whole organization will work better.
4. DQ projects must have strong commitment and sponsorship to start and to succeed.
Business Rules and Rule Engines: Opening New Doors of Opportunity
Ronald G. Ross
Business Rule Solutions, LLC
Mr. Ross discussed eight core ideas taken from his popular new book, Principles of the Business Rule Approach:
* When a rule becomes important enough, it is always written down (made explicit).
* A rule important enough to write down is worth writing down plainly.
* Rules can exist independent of procedures.
* The purpose of a rule is always to guide or influence behavior in desired ways.
* Business rules are always motivated by identifiable and important business factors.
* Business logic should always be specified directly by those with the relevant knowledge.
* If rules are important enough to be enforced, they are important enough to be single-sourced.
* The best way to ensure rules are followed is to get them right in front of people at the exact point where the guidance is relevant.
Understanding and Leveraging the Object Management Group’s Metadata Standards
Metadata Management Office
Over the past several years, the Object Management Group has organized a very substantial series of collaborative intellectual processes. These efforts have been aimed at defining a lingua franca for metadata and modeling language description, and have resulted in specifications such as the Unified Modeling Language, the Meta-Object Facility, XML Metadata Interchange, and the Common Warehouse Metamodel (as well as many others). This presentation provided an overview of these efforts and their industry relevance, from the perspective of an interested Fortune 100 data administrator. He covered:
* The OMG 4-level metamodeling concept
* MOF, UML, CWM, and how they are related
* XML Metadata Interchange - a lingua franca for models
* MOF Query/View/Transformation: a high-stakes, high-participation standard
* Current academic, open-source, and commercial products and initiatives
* Employing the OMG standards to create a distributed metadata bus architecture, or Bringing standardization higher into the technology stack
The Semantics of Semantics
Business Language & Rules Consultant
Business Semantics Ltd.
Unicorn Solutions Inc.
Donald presented an outline of how Semantics can bridge the gap between business understanding and IT implementation. Zvi presented an outline of an approach to systems integration based on the existence of a common semantic representation and the ability to map existing systems to it. Key points included the need to use a structured approach to ensure that business meaning is captured in terms that the business users can understand, which at the same time is expressive enough to form the basis for information systems. Also the key to the economic advantage of applying semantics to integration is tying each current term to a single common semantic expression, which is agreed upon by users and IS.
- The only meaning there is in a work environment is the shared understanding of the people who are working together in a particular activity. In reality you can't talk about business semantics without putting the people who run the business at the center of the discussion.
- The business already has a great many business vocabulary resources, independent of IT, which should be gathered together, structured, and leveraged for maximum business benefit.
- If it isn't part of the corporate culture, it isn't a business vocabulary - it's only an IT model of (what IT thinks is) business meaning.
- Structured Business Vocabularies and Business Rules are inseparable -- together they comprise the business semantics -- and belong to, and should be the responsibility of, the people who run the business.
The speaker defined Data Semantics as capturing the meaning of physical data by mapping to agreed business terms. He proposed a Semantic Information Architecture for enterprises which includes (i) metadata (ii) an Information Model of the business (iii) Semantic mappings relating the physical metadata (schemas) to the Information Model. It was shown how such a Semantic Information Architecture may be used to automate tasks of data management, data integration and data quality. Finally Schreiber discussed business benefits of a Semantic Information Architecture which may include (a) Information Quality (b) Greater Business Agility (c) IT productivity.
Managing the XML Data Resource
Enterprise Data Architect
The requirement for managing tagged data in XML documents and specifications corresponds exactly to the requirement for managing the more traditional relational data resources. This presentation detailed the approach taken at Engelhard Corporation in response to this challenge. The approach calls for the definition of the XML objects under management as part of the Engelhard document standard, ecXML. It also includes techniques for the definition and enforcement of the ecXML metadata and process requirements.
Data Model and Integration Strategies for Real-Time Analytics
Vice President of Technology
Effective marketing requires judicious use of customer data and coordinated treatment logic across many different operational systems and touchpoints. Unique data & business rule challenges are introduced when distributed operational systems are coupled with the analytical & marketing processing requirements needed to support real-time customer dialogs across these systems. This session explored system architectures and new data model requirements for coordinating outbound and inbound customer treatment strategies based on batch, real-time & event-triggered marketing scenarios across different touch points, including:
- Extended customer data model requirements to support cross-channel dialog marketing.
- Architecture, system design and data integration strategies to support both synchronous and asynchronous real-time marketing application demands
- Techniques for integrating & coordinating real-time, scheduled & event triggered marketing dialogs across touchpoints
- Important lessons from recent deployments of real-time marketing in high-volume production environments.
- Guidelines for establishing & prioritizing cross-channel dialogue based on business value metrics and implementation readiness/complexity
Wednesday, April 30
11:10 am – 12:10 pm
Methods and Models in Data Architecture.
Michael K. Miller
For data management professionals, a major objective is aligning the database environment with the needs and goals of the business. Real-life methods exist that help to ensure a database architecture that is an implementation of the business objectives and strategies. This presentation combined data architecture methods/practices, the Zachman Framework, and use of an Enterprise Data Model. The Zachman Framework itself is methodology-neutral and it is up to the data management professional how to implement it. The presenter showed examples of models in each cell of the Zachman Framework data column and methods/procedures on how to build each of those models while at the same time incorporating an Enterprise Data Model.
The Data Analysis Political Toolkit
Quality Process & Data Systems Manager
Data Analysts need to have a set of political tools that enable them to effectively apply their data analysis tools and methods as appropriate. Three political tools that can be useful are:
1) Running Data Analysis as a Business - Use business acumen to improve your bottom line
2) Drive Data Quality by the Actual Quality of the Data - Leverage your core expertise to present a compelling business case
3) Data Analysis Diagnostics - Determine the solution based on the characteristics of your environment. Don't make it more sophisticated than it needs to be.
Banking on Metadata at Allstate
This presentation provided an overview of Allstate's metadata management practices. By leveraging the information in Allstate's repository, Doug and Pam demonstrated how data is gathered using a custom-built suite of tools, integrated, and then circulated throughout the enterprise. Attendees were shown the types of data that can be gathered, the means with which to gather it, and just how it could be leveraged for their enterprise. Through the management of Business Domains, the company's Enterprise Data Management team has achieved consistency in business definitions, documented and integrated the multiple sets of values and codes used throughout the enterprise, and provided the links between physical schemas and logical data models. By building a Domain Management set of tools, Allstate has created a solution for researching, managing, and standardizing both encoded and non-encoded data.
Business Intelligence in the Competitive Corporate World
Database Design Solutions, Inc.
Adrienne Tannenbaum provided a new focus on the role of metadata. She discussed a current project in the pharmaceutical industry where existing, in many cases "undiscovered", metadata was organized into a beneficial source of analytical statistics with respect to research study contents and results. Of particular interest was her focus on the cost/benefit of existing vs. new metadata.
Integration Starts with Business...Using collaboration with a Business-Centric Methodology for Enterprise Agility and Interoperability
Defense Finance and Accounting Service
The presentation described how to use a business integration methodology being applied at the Defense Finance and Accounting Service (DFAS). The journey begins with establishing and outlining your organization’s Vision, Goals, and Strategy for achieving precise communications among your primary stakeholders. Then the task is expanded to identify and manage your information assets, their associated business metadata, context, and ontology. These technology-neutral artifacts become the building blocks for assembling reusable components to be used in coupled communications between stakeholders. Once these artifacts are identified and documented, we can begin the work of obtaining the infrastructure layers to support either the existing “as is” method of doing business or a migration to technology-oriented or business-centric mechanisms that deliver business agility.
The journey is constrained by a Business-Centric Methodology that outlines management criteria to guide you through the myriad choices and trade-offs you will have to make in order to achieve your organization's tailored vision. The result is tailoring your business message communications to your business partners’ desired semantics and syntax. The integrated information architecture can enhance your organization’s performance and agility to deliver the ultimate business metric: “Customer Best Value”.
Physically Implementing Universal Data Models to Integrate Data
Universal Data Models, LLC
- The Universal Data Model for Parties, Roles, and Relationships provides a solid foundation for data integration, allowing data about people and organizations to be stored in one consistent place along with a complete profile of all the roles and relationships for each party.
- This model may be implemented in many different ways: for example, a single PARTY table or separate PEOPLE and ORGANIZATION tables; a single PARTY ROLE table or separate tables for roles such as CUSTOMER, EMPLOYEE, and PARTNER; and a single PARTY RELATIONSHIP table and/or separate relationship tables such as EMPLOYMENT and CUSTOMER CONTACT RELATIONSHIP.
- In order to populate the tables, a "system of record" strategy and "pattern matching" strategy is needed. The integrated data store identifiers (such as party_id) need to cross reference the application keys and there are three main database structures that can accommodate this: placing a foreign key in the application table, building a cross reference table between the enterprise key and the application, or a combination of both.
- The architecture for implementing these integrated structures may be virtual (for example common XML schemas) or physical (for example using an operational data store)
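The cross-reference-table option described above can be sketched as follows; the application codes, keys, and party names are invented for illustration, with SQLite standing in for the operational data store.

```python
# A cross-reference table ties the enterprise party_id to each source
# application's native key, so any application key resolves to the one
# integrated party record.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE party (party_id INTEGER PRIMARY KEY, party_name TEXT);
    CREATE TABLE party_xref (
        party_id INTEGER REFERENCES party,
        app_code TEXT,    -- which source application
        app_key  TEXT,    -- that application's native identifier
        PRIMARY KEY (app_code, app_key)
    );
    INSERT INTO party VALUES (1, 'ABC Corporation');
    INSERT INTO party_xref VALUES (1, 'CRM',     'C-0042');
    INSERT INTO party_xref VALUES (1, 'BILLING', '99317');
""")

# Resolve an application key to the integrated party record.
row = conn.execute("""
    SELECT p.party_name FROM party p
    JOIN party_xref x ON x.party_id = p.party_id
    WHERE x.app_code = 'BILLING' AND x.app_key = '99317'
""").fetchone()
print(row[0])
```

Placing the mapping in its own table (rather than a foreign key in each application table) keeps the source applications untouched, which is often why this structure is chosen.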
Wednesday, April 30
1:15 pm – 2:15 pm
OK, So What Exactly is a Data Model, Anyway?
Essential Strategies, Inc.
Our industry is awash with data models--of every kind and description. The problem is that we can't even agree on what we mean by a data model. Different people use the technique for quite different purposes. This presentation laid out the different kinds of data models, along with not only their purposes, but also the perspectives of the people who use each kind. The paper began with an introduction to the Architecture Framework, Dave Hay's version of John Zachman's "Framework for Information Systems Architecture". This is a very powerful way to sort out different approaches to data modeling. Specifically, data models are different as seen by business users, system architects, and system designers.
Business users view their world in very concrete terms. The business owner is looking at one of many "external schemas". They are concerned with the things they see and use. This makes any models produced for them very concrete and "divergent"--with many entity classes. The most useful thing to get from the business users is the vocabulary of the business--its technical terms and other jargon--along with the precise definition of each term.
The architect is looking at a "conceptual schema" as a representation of the fundamental, underlying structure of the organization. This is the structure that is behind the various--different--external schemas. This is a "convergent" model: it is a bit more abstract, with fewer entity classes. These entity classes tend to represent more abstract concepts, of which the business owners' entity classes are but examples. This is the "entity/relationship diagram", typically represented in Information Engineering, SSADM, or Object Role Modeling notation.
(A subset of UML can be used to produce an entity/relationship diagram, as long as it is understood that what is being represented are things of significance to the business, not computer artifacts.)
The designer sees the "logical schema"--data structures organized according to a particular data manipulation technology. In 2003, the most common structures for logical schemas are the relational model, managed by relational database management systems, and the object model, manipulated by object-oriented programs. This is still not "physical", since the "tables", "columns", and "classes" are still concepts. Both the database administrator's relational database and the object oriented designer's class models are derived from the same conceptual model, but variations in design often make it hard for them to communicate. The relational database design can be represented by IDEF1X, and the object design model can be represented by UML.
Who's on First - 'Data' or 'Process'?
VP - Technology Services Architecture
This presentation explained the drivers and effects of the movement towards a 'process-centric' view of technology solutions, and made the following recommendation to help data professionals maintain relevance and flexibility through this transition:
- Understand Process Modeling and its relationship to Data Modeling
- Know what part of the business model the problem is addressing
- Focus data efforts to the scope of the business problem in hand
- Emphasize Data Quality needs and the skills that support it
- Understand the model for delivering data to the 'operational front line' on the Internet
- Learn about XML - it won't replace RDBMSs, but it will interface to them
- Apply classification skills to capturing knowledge about Web Services as they proliferate
- Learn about objects, object oriented design and languages, and object modeling
- Understand the need for Object/Relational mapping, where and why it happens, and how to facilitate designing to support it
Case Study: Partnering with Business Process Reengineering to Improve Data Quality
Senior Data Administrator
Director, Data Administration
At MetLife, the Data Administration team has successfully partnered with Business Process Reengineering teams to focus on data requirements and data quality issues. We were able to introduce a data stewardship program to the Disability business. Taking advantage of opportunities by marketing the value of understanding data from a business perspective is a win-win for the business and for the process redesign efforts.
PANEL: Meta Data Convergence
Compiled and edited by Tony Shaw, Program Chair, Wilshire Conferences, Inc.