Cheryl Estep
Warren Selkow
John Zachman

Members of the Business Rules Group (BRG), a standards body organized in the late 1980s, held an open question-and-answer discussion on business rules and related standards. Highlighted were the BRG's recent publication of the Business Rules Manifesto and the BRG's joint activity with OMG's Business Rules Working Group (BRWG) and its RFI for business rules. In addition, the panel discussed its current work on structuring business semantics and vocabularies, and how that work relates to declarative rules and data models.

Why Meta Data Management is Pivotal to Enterprise Data Security
David Schlesinger
Enterprise Data Security Manager
Intel Corporation

As data administrators work to decrease data entropy among a multitude of disparate systems, programmers are creating stand-alone solutions to meet deadlines. Corporate auditors develop policies to protect data and obey laws, and security teams work to keep systems secure from outsiders without knowing how workers obtain their access. These separate interests, apparently at odds, can work together to drive better data management, protect sensitive information, and automate security tasks now done slowly, manually, and poorly. Meta Data Driven Access Control provides an infrastructure-based data security capability that leverages the power of meta data both to secure information and to define the data flowing through the enterprise.

Mapping Data Between XML and RDBMSs
Denise Draper
Chief Technology Officer
Nimble Technology

This talk presented different methods for mapping between relational and XML data models. The basic techniques, including those used by database vendors and third-party products, were introduced and compared. The focus was on fundamental issues that arise when mapping, such as the handling of data types, identity, order, nulls, and mixed text. Denise examined the ways in which XML data can be used, such as publication versus modification of XML data, and how that impacts the choice of mapping.
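One of the basic techniques in this space is "shredding": decomposing an XML document into relational rows. The following is a minimal, hypothetical sketch of that approach (the element names, table schema, and use of SQLite are illustrative assumptions, not taken from the talk); it touches two of the issues listed above, preserving document order and representing an absent element as a null.

    # A minimal "shredding" sketch: map repeating <order> elements from an
    # XML document into rows of a relational table. Element and table names
    # are hypothetical. Document order is preserved in an explicit column,
    # and a missing <shipped> element becomes a SQL NULL.
    import sqlite3
    import xml.etree.ElementTree as ET

    DOC = """
    <orders>
      <order id="A1"><total>19.99</total><shipped>2003-04-30</shipped></order>
      <order id="A2"><total>5.00</total></order>
    </orders>
    """

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE orders (
        order_id  TEXT PRIMARY KEY,
        seq       INTEGER,        -- preserves XML document order
        total     REAL,           -- XML text mapped to a numeric type
        shipped   TEXT)           -- NULL when the element is absent
    """)

    root = ET.fromstring(DOC)
    for seq, order in enumerate(root.findall("order")):
        shipped = order.findtext("shipped")   # None becomes SQL NULL
        conn.execute("INSERT INTO orders VALUES (?, ?, ?, ?)",
                     (order.get("id"), seq,
                      float(order.findtext("total")), shipped))

    # Publishing XML back out reverses the mapping; the seq column
    # re-imposes document order, which SQL alone would not guarantee.
    for row in conn.execute(
            "SELECT order_id, total, shipped FROM orders ORDER BY seq"):
        print(row)
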
Embarcadero User SIG
Karen Lopez
Principal Consultant
InfoAdvisors, Inc.

The attendees of the first Embarcadero User SIG voted to begin the work of forming an official ER/Studio User Group. The initial efforts will focus on polling existing ER/Studio users to determine what format the User Group will take and how it should be organized. The ER/Studio Discussion Groups will be used to coordinate the efforts. It was agreed that informal SIGs should continue to be held at data management events where possible.

CONFERENCE SESSIONS
Thursday, May 1
8:30 am – 9:30 am

Enterprise Shared Data -- A Road Map to Realization
David Grube
Information Architect
Merck Research Laboratories
Shiraz Kassam
Data Architect
Merck & Co., Inc.

The path to sharing data across an enterprise has been fraught with pitfalls, and more corporations have failed than have succeeded in this endeavor. This presentation explored the roots of past failures and focused on an approach used to implement an enterprise data sharing architecture at a large, multi-division, global pharmaceutical corporation. The presentation covered business processes, technical enterprise architecture, data models, and a process called "data planning", used to determine the portfolio of data projects and the related publication of data necessary for successful enterprise data sharing. Topics included:

- A "hub and spoke" data architecture that allows an enterprise data model to evolve from divisional models over time and remain a viable, relevant corporate data model.
- Definition of scalable data architecture business processes focused on defining data publication.
- The roles and responsibilities necessary to sustain a data sharing architecture.
- The creation and evolution of the enterprise data model and metadata for data sharing.
- A new business process called data planning, which drives the data portfolio and the publication of data and information assets.

Communicating the Value Proposition of Data Modeling: "Getting the business users excited about Data Modeling"
Melanie Shanks
Lead Data Analyst
Pactiv Corporation
Michael Voss
Lead Data Modeler
Pactiv Corporation

The speakers discussed a 4-step process to communicate and measure the value delivered by data models in a project environment.

1. Effective workshops - use the business resources wisely; don't expect them to read a data model; go the extra mile and produce a data model document that reads like a business document.
2. Model and source mapping - data models are a key communication tool between the business and the data modelers, but not all developers can read them. Additional deliverables such as source mapping can take the communication to the next level: a spec for the developers.
3. ROI - the speakers were able to measure the time and money saved by identifying changes to the data model and source mapping in design rather than post-production.
4. Vision and marketing - market your successes to the business and IT through newsletters, an intranet page, and "what we do" sessions.

Toward Legacy Architecture Recovery Measures
Peter Aiken
Founding Director
VCU/Institute for Data Research

It is difficult to estimate the cost of legacy architecture recovery, but recovering certain aspects of the architecture of both new and old systems can be crucial to successful systems reengineering. While all systems are different, there are certain characteristics that can be measured, and these can form the basis for useful estimates. Current urban myths relate stories of 3-5 hours of analysis per attribute; applying automation to the process indicates that this can be reduced significantly. More work still needs to be done to provide more formal measures.

Implementing a Web-based System from Scratch: A Data Management War Story
Richard Bates
Senior Technical Consultant
Hewlett-Packard

The speaker discussed the process used to implement an on-line account management and invoicing system, a process that can be applied to any effort that involves moving data from one system to another. He also presented several key lessons learned that should be considered when implementing a project of this type:

1. Profile your source data to understand its content and quality before you attempt to move it (a sketch of this step follows the list).
2. Don't let the limits of the tools drive you away from using them.
3. Identify all the players in the process; don't forget the system operations team.
4. Don't underestimate the effort required to provide "live" test data to the development team.
5. Be adaptable.
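Lesson 1 is the most mechanical of the five, so a small illustration may help. The following is a minimal, hypothetical profiling sketch (the rows, column names, and statistics chosen are illustrative assumptions, not from the talk) that computes per-column null rates, distinct counts, and value-length ranges over a source extract.

    # A minimal column-profiling sketch: for each column of a source
    # extract, report null rate, distinct-value count, and value lengths.
    # The sample rows and column names are hypothetical.
    from collections import defaultdict

    rows = [
        {"account_id": "1001", "name": "Acme Corp", "balance": "250.00"},
        {"account_id": "1002", "name": None,        "balance": "0.00"},
        {"account_id": "1003", "name": "Bolt Ltd",  "balance": None},
    ]

    stats = defaultdict(lambda: {"nulls": 0, "values": set(), "lengths": []})
    for row in rows:
        for col, val in row.items():
            if val is None:
                stats[col]["nulls"] += 1
            else:
                stats[col]["values"].add(val)
                stats[col]["lengths"].append(len(val))

    for col, s in sorted(stats.items()):
        print(f"{col}: {s['nulls']}/{len(rows)} null, "
              f"{len(s['values'])} distinct, "
              f"lengths {min(s['lengths'])}-{max(s['lengths'])}")
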
Data Management's Next Big Thing
Tony Shaw
Chairman
Wilshire Conferences
Karen Lopez
Principal Consultant
InfoAdvisors, Inc.
Michael Jennings
HRe Technology, Data Warehousing & Architecture
Hewitt Associates

Technology people always want to know "what's the next big thing?" This panel addressed that question specifically as it relates to data management: what new technologies and/or trends are going to change our lives in the near future? Each panelist had the opportunity to offer his or her version.

Don Soulsby, "Trust & Traceability" - Data management and metadata have typically been relegated to an IT function. With the advent of regulatory concerns represented by legislation in the US (HIPAA, the Patriot Act, Sarbanes-Oxley), there is a greater need for information accountability (stewardship/governance) in the boardroom, not the backroom. This will place greater accountability in the laps of DM professionals. Don asked whether data managers may some day be held liable for data problems that result in lawsuits against a company, in the same way that CEOs have been made liable.

Lisa Cash, "Managing Big Data" - What are the challenges of managing ever-growing databases? New government regulations and corporate mergers are compounding the problem, and companies need to address these challenges while working within current budget constraints. Lisa addressed some of the challenges of growing databases, their impact on the business, and the emerging technologies designed to address the problem.

Mike Jennings, "Data Security" - Mike's next big thing in data management is an increased focus on data security from a data protection perspective (versus availability, backup, or disaster recovery), because:

- The complexities and interdependencies of today's information technology environments, software, and infrastructure make securing data problematic (information leakage, parameter manipulation, cross-site scripting, SQL injection; see the sketch following this panel summary)
- The increasing prevalence of web-enabled applications compounds security threats around data management by spanning the enterprise infrastructure and the Internet
- Increased focus and spending on data security by corporations due to concerns around data continuity (heightened since 9/11), privacy regulations, competition, and publicity
- Publicity associated with news reports of data security exploits of widely used applications
- An increased focus by vendors on data security as a means to differentiate their products and increase market share
- An increased focus by companies on data security in the same context as performance and functionality
- Increased movement in some business areas toward outsourcing (ASP, BPO, BSP, ITO), bringing a greater focus on data security
- Increased use of geographically distributed data repositories utilizing network storage deployment strategies (NAS, SAN)
- Customer privacy concerns and legislation, which will create interest in and demand for products and service offerings

Karen Lopez, "Encroachment" - Karen's next big thing is the encroachment of others into data management's turf. This includes the agile/XP/buzzword methods, plus other professions and positions being assigned to do data management work. Some of these issues she sees as threats to DM; others create opportunities.
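Of the threats Mike listed, SQL injection is the easiest to show in a few lines. Below is a minimal, hypothetical sketch (table, data, and input invented for illustration; SQLite used only as a convenient stand-in) contrasting string-built SQL with a parameterized query, the standard mitigation.

    # SQL injection in miniature: building SQL by string concatenation lets
    # crafted input rewrite the query; a parameterized query does not.
    # Table, data, and input are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'staff')")

    user_input = "bob' OR '1'='1"  # malicious input

    # Unsafe: the input becomes part of the SQL text and matches every row.
    unsafe = f"SELECT name, role FROM users WHERE name = '{user_input}'"
    print(conn.execute(unsafe).fetchall())   # leaks both rows

    # Safe: the driver binds the value; it is treated as data, never as SQL.
    safe = "SELECT name, role FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())  # no rows
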
CENTRAL, Boeing's Registry and Repository
Kathryn Breininger
CENTRAL Project Manager
The Boeing Company

As XML becomes more widely used, the need for efficient management of XML-related assets becomes critical. This presentation described how repository and registry technology are being used to manage XML assets such as DTDs and schemas, providing discovery of, access to, and sharing of these assets. The Central Registration Authority and Locator (CENTRAL) is a Boeing enterprise-wide registry and repository designed to store and retrieve reusable eXtensible Markup Language (XML) assets such as Document Type Definitions (DTDs) and XML schemas. The CENTRAL Registry contains metadata and locations for XML assets and makes these assets available to the entire Boeing enterprise as reusable objects. The presentation provided an overview of the CENTRAL project, including its scope, the events that led up to the development of the system, the system's design, and the functions and roles of its users. The first production release of CENTRAL was presented, along with concepts for the architecture, services, and functions to be provided in future phases.

The Case for Globally Unique Surrogate Logical and Physical Keys
Chris Willoughby
Data Architecture Consultant
Data Architecture Consulting

Business identifiers, globally unique physical surrogate keys, and globally unique logical surrogate keys: all three identifiers are needed to protect the enterprise from changes to business information needs and from changes to the physical storage of information. Business identifiers may change, but logical keys and primary keys must be constant in format and value. A globally unique surrogate value in the primary key of relational tables meets all of the criteria for selecting primary keys; such values must be meaningless and invisible to the business. Globally unique logical surrogate keys are used to track entities through life cycles and to consolidate information across the enterprise; they must be meaningful and visible to the business.

CONFERENCE SESSIONS
Thursday, May 1
9:50 am – 10:50 am

Developing an Enterprise Object Class Hierarchy
Graham Witt
Author and Consultant
Consulting Insights

An Enterprise Object Class Hierarchy provides a viable starting point for an Enterprise Information Architecture. Rather than being a traditional E-R or object class model, in which relationships or associations set in stone a potentially volatile set of structures subject to obsolescence, it consists of business terms (which may be entity, relationship, or attribute classes, or even instances), each with a definition, organized hierarchically with inheritance of properties. It provides a high level of business buy-in and a genuinely reusable resource, since application projects retain confidence in the validity of the content.

Data Modeling Issues in Product Data Management
Jeff Davey
Manager, Integrated Engineering Toolset - Americas
Siemens Dematic
Darrell Raymond
Principal Consultant
Alternative Output Inc.

Product Data Management (PDM) systems are databases that manage parts, documents, drawings, and other information that manufacturing and engineering companies use to generate their products. The speakers discussed specific characteristics of PDM systems that make data modeling more complex, including the following (a sketch of the first point appears after the list):

* The underlying graph model. Product structures are most logically modeled as directed acyclic graphs, and capturing this graph effectively in a relational database is essential for good performance.
* Authorization. Management of change to data is a key functional aspect of a PDM system, and this requires a significant authorization structure. Certain aspects of authorization are susceptible to modeling, but others are not.
* Sources of attributes. Much of the data that enters PDM systems comes from external systems such as CAD packages, ERP systems, component databases, and other databases. Modeling must take into account the capabilities and schemas of these external systems.
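To make the first point concrete: a common relational encoding of a product-structure DAG is an adjacency list of parent-child "uses" edges, walked with a recursive query. The sketch below is minimal and hypothetical (part names, quantities, and the SQLite table are illustrative assumptions, not from the talk).

    # A product-structure DAG stored as an adjacency list: each row is one
    # parent-child "uses" edge with a quantity. A recursive query then
    # explodes the bill of materials for a given assembly.
    # Part names and quantities are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE part_usage (
        parent TEXT, child TEXT, qty INTEGER,
        PRIMARY KEY (parent, child))""")
    conn.executemany("INSERT INTO part_usage VALUES (?, ?, ?)", [
        ("wing", "spar", 2), ("wing", "rivet", 400),
        ("spar", "rivet", 50),          # shared child: a DAG, not a tree
    ])

    # Recursive walk from a root assembly, multiplying quantities down paths.
    bom = conn.execute("""
        WITH RECURSIVE explode(part, total_qty) AS (
            SELECT child, qty FROM part_usage WHERE parent = 'wing'
            UNION ALL
            SELECT pu.child, e.total_qty * pu.qty
            FROM explode e JOIN part_usage pu ON pu.parent = e.part
        )
        SELECT part, SUM(total_qty) FROM explode GROUP BY part
    """).fetchall()
    print(bom)   # rivets counted via direct use and via the spars

Because a product structure is acyclic, the recursion terminates; data that could not be trusted to be a DAG would additionally need cycle detection.
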
How Have Shell and Other Large Companies Approached Enterprise Information Integration?
Chris Worsley
Global VP Marketing
Kalido

Large companies are still striving to improve information integration at the global level to drive business benefits such as reduced procurement costs, improved customer service, and rationalised product lines. The information challenge grows with the level of diversity in the company and the degree of continual change. However, companies such as Shell, Unilever, Cadbury Schweppes, and Halifax Bank of Scotland have implemented projects with Kalido that have addressed these challenges and delivered significant business benefits. Flexible data warehousing solutions and management of master reference data have been fundamental in delivering these successful projects and in providing a platform for future evolution.

Building and Utilizing a Successful Meta Data Database: Fleet Bank Credit Card Services Case Study
Beth Cathcart
Consultant
Fair, Isaac
Athina Croom
Database Manager
Fleet Credit Card Services