SOURCES OF SOFTWARE BENCHMARKS


Version 19: March 14, 2012


Capers Jones & Associates LLC

Namcook Analytics LLC


INTRODUCTION


Number of benchmark sources currently: 20


Number of projects in all benchmark sources: 86,100


Quantitative software benchmark data is valuable for measuring process improvement programs, for calibrating software estimating tools, and for improving software quality levels. It is also useful for studies of industry and company progress over time.


There are a number of organizations that gather and report quantitative benchmark information. However, these organizations are independent, and some of them are in fact competitors.


This catalog of software benchmark data sources is produced as a public service for the software community.


There are many different kinds of benchmarks, including productivity and quality levels for specific projects; portfolio benchmarks for large numbers of projects; operational benchmarks for data center performance; security benchmarks; compensation and staffing benchmarks for human resource purposes; and software customer satisfaction benchmarks. SEI software assessment data is also included.


This catalog is intended to grow over time and to include all major sources of software benchmark and assessment information.


The information in this catalog is provided by the benchmark and assessment groups themselves and includes the topics that each group wishes to make available.


In this version the benchmark groups are listed in alphabetical order.


TABLE OF CONTENTS


Introduction to Software Benchmark Sources

1. 4SUM Partners Inc. (Finland)
2. Capers Jones & Associates LLC
3. CAST
4. COSMIC Consortium
5. David Consulting Group (DCG)
6. Galorath Incorporated
7. German Computer Society Interest Group (GI)
8. International Software Benchmark Standards Group (ISBSG)
9. Jerry Luftman (Stevens Institute of Technology)
10. Price Systems LLC
11. Process Fusion
12. Quantimetrics
13. Quantitative Software Management (QSM)
14. Q/P Management Group
15. RCBS, Inc.
16. Reifer Consultants LLC
17. Software Benchmarking Organization
18. Software Engineering Institute (SEI)
19. Software Improvement Group (SIG)
20. Test Maturity Model Integrated (TMMI) by Geoff Thompson

Appendix A: Books and web sites with quantitative data

Appendix B: Survey of Software Benchmark Usage and Interest

Appendix C: A New Form of Software Benchmark

INTRODUCTION TO SOFTWARE BENCHMARK SOURCES


The software industry does not have a good reputation for achieving acceptable levels of quality, nor for schedule adherence or cost control.


One reason for these problems is a chronic shortage of solid empirical data about quality, productivity, schedules, costs, and how these results vary based on development methods, tools, and programming languages.


A number of companies, non-profit groups, and universities are attempting to collect quantitative benchmark data and make it available to clients or through publication. This catalog of benchmark sources has been created to alert software engineers, software managers, and executives to the kinds of benchmark data that are currently available.


The information in this catalog is provided by the benchmark groups themselves, and shows what they wish to make available to clients.


This catalog is not copyrighted and can be distributed or reproduced at will. If any organization that creates benchmark data would like to be included, please write a description of your benchmark data using a format similar to the formats already in the catalog. Please submit new benchmark information (or changes to current information) to Capers Jones & Associates LLC via email. The email address is Capers.Jones3@gmail.com. The catalog can also be downloaded from several web sites including www.Namcook.com which is the editor’s web site.


The catalog is expected to grow as new sources of benchmark data provide inputs. Benchmark organizations from every country and every industry are invited to provide information about their benchmark data and services.


4SUM Partners Inc

Web site URL: www.4sumpartners.com

Email: pekka.forselius@4sumpartners.com


Sources of data: The 4SUM Partners database contains high-quality data from the completed projects of clients. Long-range client project data has been collected mainly in project benchmarking studies and scope management assignments. The data has been validated and represents actual project completions. The anonymity of source organizations is protected.

Data metrics: Project metrics are based on the functional size of applications, measured in FiSMA function points.

Analogy data consists of project delivery rates categorized by project classifiers. Project situation analysis data expresses a project's productivity factors.

Data usage: Data is used to make a first estimate of the project delivery rate (hours/fp) for project planning, and especially for estimating project cost and duration according to the northernSCOPE concept, which enhances the scope management of software projects.
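The first-estimate arithmetic described above can be sketched as follows. The 500 fp size, 12 hours/fp delivery rate, and staffing figures are illustrative assumptions, not values from the Experience database.

```python
# Sketch of a first-pass project estimate from a benchmarked delivery rate.
# All numeric values are illustrative, not taken from the Experience database.

def estimate_effort_hours(size_fp: float, delivery_rate: float) -> float:
    """Effort (hours) = functional size (fp) x delivery rate (hours/fp)."""
    return size_fp * delivery_rate

def estimate_duration_months(effort_hours: float, staff: int,
                             hours_per_person_month: float = 150.0) -> float:
    """Duration = effort / (staff x working hours per person-month)."""
    return effort_hours / (staff * hours_per_person_month)

effort = estimate_effort_hours(500, 12.0)        # 6000.0 hours
duration = estimate_duration_months(effort, 5)   # 8.0 months with 5 people
```

The same two-step calculation underlies most delivery-rate benchmarking: size drives effort, and effort plus staffing drives duration and cost.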

Data is also used in software project performance benchmarking studies. These comparison studies make it possible to position the productivity of project delivery in terms of operational efficiency. They also help to identify and prioritize improvement opportunities in software project delivery and to maintain a culture of continuous improvement.

Data availability: Data is provided to clients of benchmark studies and users of Experience Pro size measurement software.

Kinds of data: Productivity data is collected from all kinds of software projects. Several project and application classifiers are used to improve the comparability of projects.

The functional size of projects ranges from 20 fp to several thousand function points; the average size in the database is about 500 fp.

Project data represents several business areas and development tools.

Actual effort and cost data in the Experience database is activity-based, supporting costing of the software development life cycle from specification through installation preparation.

The quality of new project data is categorized when it is stored in the master repository.

Volume of data: Over 1,000 projects. The majority of the data represents tailored new-development projects; there is also data on enhancements and maintenance.

Industry data: Banking, insurance, public administration, manufacturing, telecom, wholesale & retail and other industries to some extent.

Methodology data: Project classifiers include CASE development tools, development model, project management tools, and various other techniques and methods.

Language data: Data includes COBOL, PL/I, Java, C++, Visual Basic, COOL:Gen, Oracle, C, SQL and other languages to some extent.

Country data: Data is mainly from Finland.

Future data: The volume of data is growing. The increase in 2011 was about 100 projects, and the volume is expected to grow by 300 projects in 2012.

Summary: The high-quality data in the 4SUM Partners database has been validated and represents actual project completions. The data is expressed in function points measured using standard measurement methods (IFPUG, FiSMA).

The data is highly useful in productivity analysis and in software project performance benchmarking studies, and it supports the scope management of software projects by estimating cost and duration.

The volume of data is growing rapidly due to large-scale productivity studies expected to be carried out in 2012 and 2014.


Capers Jones & Associates LLC

Namcook Analytics LLC

Web site URL: www.Namcook.com

Email: Capers.Jones3@gmail.com


Sources of data: Primarily on-site interviews of software projects. Much of the data is collected under non-disclosure agreements. Some self-reported data is included from Capers Jones studies while working at IBM and ITT corporations. Additional self-reported data comes from clients taught by Capers Jones and permitted to use the assessment and benchmark questionnaires.


Data metrics: Productivity data is expressed in terms of function point metrics as defined by the International Function Point Users Group (IFPUG). Quality data is expressed in terms of defects per function point.

Also collected is data on defect potentials, defect removal efficiency, delivered defects, and customer defect reports at 90-day and 12-month intervals.
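How these quality metrics relate to one another can be shown with a small sketch. The defect potential, removal counts, and project size below are hypothetical numbers, not values from the Jones data collection.

```python
# Illustrates the quality metrics named above: defect potential,
# defect removal efficiency (DRE), and delivered defects per function point.
# All numbers are hypothetical.

def removal_efficiency(removed: float, delivered: float) -> float:
    """DRE = defects removed before release / total defects injected."""
    return removed / (removed + delivered)

size_fp = 1000
potential = 5.0 * size_fp          # 5000 defects injected in total
removed = 4250.0                   # found and fixed before release
delivered = potential - removed    # 750 defects reach customers

dre = removal_efficiency(removed, delivered)   # 0.85
delivered_per_fp = delivered / size_fp         # 0.75 defects per fp
```

The point of tracking both numbers is that defect potential measures how many bugs a process injects, while DRE measures how well the process finds them before release.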


Long-range data over a period of years is collected from a small group of clients to study total cost of ownership (TCO) and cost of quality (COQ). Internal data from IBM is also used for long-range studies due to the author's 12-year period at IBM.

At the request of specific clients, some data is converted into COSMIC function points, use-case points, story points, or other metrics.


Data usage: Data is used to create software estimating tools and predictive models for risk analysis. Data is also published in a number of books including The Economics of Software Quality, Software Engineering Best Practices, Applied Software Measurement, Estimating Software Costs, and 12 others. Data has also been published in about 200 journal articles and monographs.

Data is provided to specific clients of assessment, baseline, and benchmark studies. These studies compare clients against similar companies in the same industry.

Data from Capers Jones is frequently cited in software litigation for breach-of-contract lawsuits or for suits alleging poor quality. Some data is also used in tax litigation dealing with the value of software assets.


Data availability: Data is provided to clients of assessment and benchmark studies. General data is published in books and journal articles. Samples of data and some reports are available upon request.

Some data and reports are made available through the library, webinars, and seminars offered by the Information Technology Metrics and Productivity Institute (ITMPI.org).


Kinds of data: Software productivity levels and software quality levels for projects ranging from 10 to 200,000 function points. Data is primarily for individual software projects, but some portfolio data is also collected. Data also supports activity-based costing down to the level of 40 activities for development and 25 activities for maintenance. Agile data is collected for individual sprints; unlike most Agile data collections, function points are used for both productivity and quality.

Some data comes from commissioned studies, such as an Air Force contract to evaluate the effectiveness of the CMMI and an AT&T study to identify the occupations employed within large software labs and development groups.


Volume of data: About 13,500 projects from 1978 through today. New data is added monthly, and old data is retained, which allows long-range studies at 5- and 10-year intervals. New data is received at between 5 and 10 projects per month from client interviews.


Industry data: Data from systems and embedded software, military software, commercial software, IT projects, civilian government projects, and outsourced projects.

Industries include banking, insurance, manufacturing, telecommunications, medical equipment, aerospace, defense, and government at both state and national levels. Data is collected primarily from large organizations with more than 500 software personnel; there is little data from small companies because data collection is on-site and fee-based.

There is little or no data from the computer game or entertainment industries, and little data from open-source organizations.


Methodology data: Data is collected for a variety of methodologies including Agile, waterfall, Rational Unified Process (RUP), Team Software Process (TSP), Extreme Programming (XP), and hybrid methods that combine features of several methods.

Some data is collected on the impact of Six Sigma, Quality Function Deployment (QFD), formal inspections, Joint Application Design (JAD), static analysis, and 40 kinds of testing.

Data is also collected for the five levels of the Capability Maturity Model Integrated (CMMI™) of the Software Engineering Institute.


Language data: As is usual with large collections of data, a variety of programming languages are included. The number of languages per application ranges from 1 to 15, with an average of about 2.5. The most common combinations include COBOL and SQL, and Java and HTML. Specific languages include Ada, Algol, APL, ASP.NET, BLISS, C, C++, C#, CHILL, CORAL, Jovial, PL/I and many derivatives, Objective-C, and Visual Basic. More than 150 languages out of a world total of 2,500 are included.


Country data: About 80% of the data is from the U.S. There is substantial data from Japan, the United Kingdom, Germany, France, Norway, Denmark, Belgium, and other major European countries, and some data from Australia, South Korea, Thailand, Spain, and Malaysia.

There is little or no data from Russia, South America, Central America, China, India, Southeast Asia, or the Middle East.


Unique data: Due to special studies, the Capers Jones data includes information on more than 90 software occupation groups and more than 100 kinds of documents produced for large software projects. The data also supports activity-based cost studies down to the levels of 40 development activities and 25 maintenance tasks, and it includes data on the defect removal efficiency levels of 65 kinds of inspection, static analysis, and test stages.

Some of the test data on unit testing and desk checking came from volunteers who agreed to record information that is normally invisible and unreported. When working as a programmer, Capers Jones was such a volunteer.


From longitudinal studies during development and after release, the Jones data also shows the rate at which software requirements grow and change. Monthly change rates exceed 1% per calendar month during development, and growth exceeds 8% per year after release.
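Compounding those reported rates gives a feel for total requirements growth. The 1,000 function point starting size and the 18-month schedule below are illustrative assumptions, not figures from the studies.

```python
# Compounds the growth rates reported above: about 1% per calendar month
# during development and about 8% per year after release.
# Starting size and schedule length are illustrative assumptions.

def grow(size_fp: float, rate_per_period: float, periods: int) -> float:
    """Apply compound growth: size x (1 + rate) ^ periods."""
    return size_fp * (1 + rate_per_period) ** periods

initial = 1000.0
at_release = grow(initial, 0.01, 18)        # ~1196 fp after an 18-month project
after_3_years = grow(at_release, 0.08, 3)   # ~1507 fp three years after release
```

Even at these modest-sounding rates, a hypothetical 1,000 fp project grows by roughly 50% within a few years of release, which is why estimates anchored to the original requirements tend to undershoot.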

From work as an expert witness in 15 lawsuits, some special data is available on litigation costs for plaintiffs and defendants.


From on-site data collection, interviews with project teams, and comparison of the results to corporate resource-tracking systems, it has been noted that "leakage," or missing data, is endemic and approximates 50% of actual software effort. Unpaid overtime and the work of managers and part-time specialists are the most common omissions.

Quality data also leaks, omitting more than 70% of internal defects. The most common omissions are those of desk checking, unit testing, static analysis, and all defect removal activities prior to release.

Leakage from both productivity and quality databases inside corporations makes it difficult to calibrate estimating tools, and it also causes alarm to higher executives when the gaps are revealed. The best solution for leakage is activity-based cost collection.
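A minimal sketch of correcting a baseline for leakage, assuming the approximate 50% figure above; the tracked-hours value is hypothetical.

```python
# Corrects tracked effort for the ~50% "leakage" described above: if a
# fraction of real effort never reaches the tracking system, then
# true effort = tracked / (1 - leakage). The tracked figure is hypothetical.

def adjust_for_leakage(tracked_hours: float, leakage: float) -> float:
    """Reconstruct true effort from tracked effort and a leakage fraction."""
    if not 0 <= leakage < 1:
        raise ValueError("leakage must be in [0, 1)")
    return tracked_hours / (1 - leakage)

tracked = 3000.0
true_effort = adjust_for_leakage(tracked, 0.50)   # 6000.0 hours
```

At 50% leakage the correction doubles the tracked figure, which is why calibrating an estimating tool against raw tracking-system data roughly halves its effort predictions.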
