SECTION 4
METHODS AND METRICS FOR RESEARCH ACTIVITY

4.1 Framework for Assessing Effectiveness


Identifying metrics for research outcomes poses significant challenges because research tools encompass a broad range of technologies including reagents, DNA sequences, databases, cell lines, animal models and laboratory techniques. Some of the discoveries may remain research tools, whereas others might ultimately prove to be therapeutic or diagnostic products that are marketed to consumers outside of the laboratory setting. Research tools overall can result in two sets of benefits that need to be distinguished (Cozzens et al., 1994):

  1. Research Advancement – research tools that result in new products or radical changes in production efficiency.

  2. Process Optimization – research tools that lead to incremental changes in production efficiencies.

Given the potentially different impact of research activity in these two categories, any attempt to study effectiveness requires methodologies that adequately capture both types of impact. In addition to the direct impact of a technology on the creation of research tools and clinical applications, research tools can also generate knowledge spillovers that result in the discovery of other tools and methods. Knowledge spillover occurs when knowledge created by one party produces external benefits by facilitating innovation by other parties (Jaffe, 2001). The technology can therefore provide both direct and indirect benefits to the research process.

As indicated in Figure 3, basic and applied research effectiveness can be measured at several steps, including the activities/outputs of an organization, specific technology transfer outcomes, and finally health impacts (Roessner et al., 1996; Collins, 1997). As discussed earlier, activities/output measures are inadequate because they capture neither outcomes nor public health impacts, and they will therefore not be included in the assessment framework.

Figure 3. Schematic model for assessing effectiveness of biomedical technology transfer

[Figure: INPUT (NIH technologies) → ACTIVITIES/OUTPUT (# of patents, # of licensees) → TECHNOLOGY TRANSFER OUTCOMES (# of citations, # of references in FDA submissions) → HEALTH IMPACTS (% reduction in mortality, improvement in quality of life). Some pathways yield no direct health impacts; a feedback process links later stages back to earlier ones.]

SOURCE: Adapted from Roessner et al., 1996

4.2 Systematic Review of Methods


Several approaches can be used to assess the effectiveness of NIH research tools based on our assessment of the literature (Hertsfeld, 1992; Cozzens et al., 1994; NSTC, 1996; Youtie et al., 1997; Youtie et al., 1998; Georghiou et al., 2000; NIST, 2003; Wang et al., 2003). These include

  • Historical Tracing

  • Bibliometrics/Aggregate Counts

  • Peer Review/Expert Review

  • Flow and Process Analysis

  • Performance Indicators/Metrics

  • Diffusion and Network Analysis

We have not included surveys, expert interviews, or focus groups as separate methods because these are discussed under data sources. For each of the above approaches we provide a detailed description, its use in past assessments, an overview of its advantages and disadvantages, and the data sources required for gathering the necessary information.

4.3 Historical Tracing


Description: Historical tracing involves tracing the flow of knowledge that results in specific innovations. This process helps identify linkages between technologies, whether basic or applied research, and the significant research breakthroughs or health outcomes that occur. It is often referred to as the historiographic method and involves the chronological tracking of interrelated developments. Historical tracing can involve either tracing forward from research to a future outcome or backward from an outcome to the precursor developments that contributed to it. Forward tracing begins when a technology is first developed and follows its diffusion pathway and its spawning of new technologies over the years. Backward tracing is retrospective; it begins with an outcome of interest and then traces back in time the technologies and research activities that resulted in that outcome. Backward tracing is attractive because the outcome is known, but because it is focused on particular outcomes, other impacts of the technologies identified may not be captured and documented.
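To make the forward and backward directions concrete, the minimal sketch below treats documented links between technologies as a directed graph and walks it in either direction. The technology names and links are hypothetical illustrations, not actual NIH licensing data; the sketch shows only the traversal logic.

```python
from collections import defaultdict, deque

# Hypothetical links: (earlier technology -> later technology it enabled).
links = [
    ("cell_line_A", "screening_assay_B"),
    ("screening_assay_B", "drug_candidate_C"),
    ("dna_probe_D", "diagnostic_E"),
    ("screening_assay_B", "diagnostic_E"),
]

forward = defaultdict(list)   # technology -> technologies it enabled
backward = defaultdict(list)  # technology -> technologies it built on
for earlier, later in links:
    forward[earlier].append(later)
    backward[later].append(earlier)

def trace(start, graph):
    """Breadth-first walk from `start`, returning every linked technology."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Forward tracing: everything the cell line eventually contributed to.
print(trace("cell_line_A", forward))
# Backward tracing: the precursors of a known outcome of interest.
print(trace("diagnostic_E", backward))
```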

Use in Past Assessments: Examples of early studies in the 1960s and 1970s include the National Science Foundation's TRACES studies, which performed "Backward Tracing of Selected Major Technological Innovations to Key Events in Agency R&D Funding History," and Project Hindsight, sponsored by the Department of Defense. These earlier versions were discredited by researchers because they used a highly qualitative case study format and were largely subjective (Faulkner, 1994). The availability of more quantitative approaches, including patent analysis, citation analysis and content analysis (discussed below), has made this process more transparent and easier to perform (Narin, 2000; Bozeman, 1999). In the 1990s, R&D Value Mapping, a semi-quantitative approach that combines case studies and indicators, was used to assess projects at the Department of Energy (Kingsley et al., 1996; Kingsley et al., 1997; Bozeman et al., 1995; Bozeman et al., 1992) as well as R&D projects sponsored by the New York State Energy Research and Development Authority (Bozeman, 1999). Additional tracing studies have been performed on the transfer of public science to patented technology in agricultural science (Perko et al., 1997) and on value mapping of social capital outcomes in state research and development programs (Kingsley et al., 1999).

Advantages:


  1. Historical tracing not only connects inputs to the final outcome but also provides information on the process. This could be very valuable in understanding the pathways for future technologies and identifying areas for improved efficiency.

  2. The activities involved in performing historical tracing could identify other outcomes and linkages previously unknown

Disadvantages:


  1. Forward tracing may be complex because there could be multiple interactions between organizations and several years of follow-up before the final outcome can be identified.

  2. Backward tracing can follow the incorrect pathway and result in dead-ends.

  3. Significant effort, time and cost are involved with both backward and forward tracing.

Data Sources: The identification of pathways and linkages required for historical tracing is usually accomplished through a combination of other approaches such as patent tree assessments, citation analysis, expert interviews and diffusion assessments. Interviews and surveys can also be performed to obtain the necessary information.

4.4 Bibliometrics/Aggregate Counts


Description: Counts of outcomes and actions can be performed to document both quantity and quality. Bibliometrics can therefore be used to assess the significance, dissemination, and intellectual linkages of research, as well as to measure the progress, dynamics, and evolution of scientific disciplines (Ruegg, 2003). The three approaches generally included are counting publications and patents, citation analysis and content analysis.

Counting Publications and Patents: A simple count of the number of patents and technology-related publications can give some understanding of productivity. These counts can be compiled on an annual basis to view trends, but even when quality adjustments are made they may provide an imprecise assessment of the value of the patented technology and its real-world impacts in terms of both research and health outcomes.
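A minimal sketch of how such annual counts might be tabulated, assuming a small hypothetical list of patent records rather than an actual NIH portfolio:

```python
from collections import Counter

# Hypothetical patent records: (patent_id, year granted).
patents = [("US001", 1998), ("US002", 1998), ("US003", 1999),
           ("US004", 2001), ("US005", 2001), ("US006", 2001)]

# Annual counts to view trends over time.
counts_by_year = Counter(year for _, year in patents)
for year in sorted(counts_by_year):
    print(year, counts_by_year[year])
```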

Citation Analysis: This involves tracking either publications or patents to identify the pathways of knowledge transfer and sharing. That is, the more often a patent is cited, the greater its assumed relevance, quality and impact. Citations can include publications citing other publications, patents citing publications, and patents citing other patents. Citation analysis can also be used in conjunction with other approaches, such as historical tracing, to identify the evolution of significant research concepts and knowledge spillovers.
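As an illustration of the basic counting step, the sketch below tallies forward citations per patent from a hypothetical list of (citing patent, cited patent) pairs; a real analysis would draw these pairs from a source such as the USPTO database discussed under Data Sources.

```python
from collections import Counter

# Hypothetical (citing_patent, cited_patent) pairs.
citations = [("US101", "US001"), ("US102", "US001"), ("US103", "US001"),
             ("US102", "US002"), ("US104", "US003")]

# Tally forward citations: how often each patent is cited by later patents.
forward_citations = Counter(cited for _, cited in citations)

# Rank patents by citation count; higher counts suggest greater assumed
# relevance, quality and impact.
for patent, n in forward_citations.most_common():
    print(patent, n)
```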

Content Analysis: This approach involves identifying and extracting specific information from the documents examined. For example, word analysis can be performed in which the frequency of occurrence of a specific word or words is documented. High frequency of occurrence in this context reflects a widely disseminated concept or technology. Increased computerization has allowed the development of new approaches, including database tomography and textual data mining, which allow searches to be performed and relationships to be identified without previously specified keywords. Special hardware and software programs, including SPIRE™ and Starlight, effectively illustrate the results of content analysis.
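A minimal word-frequency sketch of the kind described above, using a few hypothetical abstracts; tools such as SPIRE™ and Starlight operate on the same principle at a much larger scale.

```python
import re
from collections import Counter

# Hypothetical abstracts mentioning technologies of interest.
abstracts = [
    "A transgenic mouse model was used to screen kinase inhibitors.",
    "The kinase assay enabled high-throughput screening of compounds.",
    "We describe a new mouse model of metabolic disease.",
]

# Count word occurrences across all documents.
words = Counter()
for text in abstracts:
    words.update(re.findall(r"[a-z]+", text.lower()))

# Frequent terms indicate widely disseminated concepts or technologies.
for term in ("mouse", "kinase", "screening"):
    print(term, words[term])
```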

Use in Past Assessments: Patent counts are used in a wide range of assessments to evaluate the productivity of biotechnology firms, technology alliances, and academic and federal technology transfer organizations (Reitzig, 2004). The usefulness of citations for the purpose of studying spillover effects has been widely analyzed (Jaffe et al., 1996; Zucker, 1996). Patent citations are also widely used to assess the effectiveness of technology transfer activities (Bozeman, 1996). Forward citation analysis was introduced in the 1990s and has been validated as an indicator in numerous surveys (Harhoff et al., 2003; Lanjouw et al., 2001).

Advantages:


  1. The approach is very straightforward and therefore stakeholders can easily understand the results.

  2. Bibliometrics is relatively inexpensive and therefore can be applied to a large volume of technologies. There are secondary data sources that can be utilized to perform the assessment.

  3. This approach is also objective because there is very little room for interpretation and manipulation by the analyst performing the assessment.

Disadvantages:


  1. Because there is no direct relationship to final outcomes, this approach provides only an indirect measurement of benefits (Cozzens et al., 1998). Also, the counts give no indication of the quality of the bibliometric finding, as there could be quality differences among the citations.

  2. This approach cannot be used for immediate short-term evaluation because time is needed for publications and patents to be recorded and cited (Altman, 1994). In addition, when comparing technologies, there is potential bias if differing time periods from initial development are used, because more publications accumulate as time passes.

  3. Although databases are available to perform the measurement, they may be incomplete. For instance, patents may not be filed because of the high cost associated with the process (Zeckhauser, 1996; Hall, 2001).

Data Sources: A key source of information is the U.S. Patent and Trademark Office (http://www.uspto.gov), which offers online and offline search capability for patent citations. The USPTO has developed a highly elaborate classification system for the technologies to which patented inventions belong; this database has more than 400 main patent classes and over 12,000 patent subclasses. The Orange Book maintained by the Food and Drug Administration (FDA) also provides a listing of patents related to applications filed. We did not find any analysis that relied solely on patent counts based on FDA filings, although such an analysis could potentially supplement a larger assessment of patent citations using a comprehensive source. We also found several commercial tools, such as CLAIMS® and Minesoft, that can assist in performing patent citation analysis.

Additionally, the Institute for Scientific Information (ISI) in Philadelphia, PA, provides access to ISI citation databases covering thousands of international journals. ISI provides an integrated platform of networked resources for bibliometric research and offers desktop access to cited references (Ruegg, 2003). Specifically, ISI's Science Citation Index covers the whole spectrum of scientific disciplines from basic to applied and contains more than 3,500 periodicals and about 400,000 articles annually.

4.5 Peer Review/Expert Review


Description: A panel of experts can be convened to assess the scientific significance of the technology. The process for selecting reviewers and the types of individuals selected are key to ensuring successful execution of this approach. Participants should be free of any conflicts of interest and need to be knowledgeable in the subject area being assessed. In addition, the group dynamics of the individuals on the evaluation team need to be considered; it is important not to include individuals who will dominate discussions and thereby erode the consensus-making process. Typically, the experts are provided with information either in writing or in an oral presentation and are asked to render their decision either qualitatively or semi-quantitatively. That is, they may express their opinions in descriptive narratives, provide rankings (such as excellent/good/fair, high/medium/low, or satisfactory/unsatisfactory), or assign numerical scores (e.g., 0-3).
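The sketch below illustrates one simple way such semi-quantitative ratings might be aggregated, assuming a hypothetical 0-3 scale and a small panel; it shows the aggregation step only and is not a prescribed review procedure.

```python
from statistics import mean, median

# Hypothetical 0-3 scores from five reviewers for two technologies.
scores = {
    "technology_A": [3, 2, 3, 3, 2],
    "technology_B": [1, 2, 1, 0, 2],
}

for tech, panel in scores.items():
    spread = max(panel) - min(panel)  # crude check on reviewer agreement
    print(tech, round(mean(panel), 2), median(panel), spread)
```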

Use in Past Assessments: This method has been used extensively by federal agencies. For example, the Office of Science and Technology has a Peer Review Program for environmental technology development programs (NRC, 1999). Peer review is performed in other agencies including the Department of Agriculture, Department of Commerce, the National Institute of Standards and Technology (NIST), the Department of Energy and the Environmental Protection Agency (COSEUP, 1999).

Advantages:


  1. Allows for quick, low-cost feedback and can be applied to technologies at any stage

  2. It is a widely accepted approach to assessment (Ruegg, 2003). As mentioned, most federal agencies use expert review to evaluate their activities.

Disadvantages:


  1. It is often difficult to identify an unbiased sample of experts especially during the early stages of a new technology because those knowledgeable tend to be the key researchers in the field who have a vested interest in providing a positive outlook for the technology.

  2. Consensus may be difficult or impossible to achieve based on the group dynamics present.

  3. Significant effort needs to be placed on ensuring that information is collected in a consistent format and that reviewers have a common understanding of the technology that is being evaluated.

4.6 Flow and Process Analysis


Description: Process modeling facilitates the examination of any business system, no matter how small or large, from the perspective of the information that it gathers, stores and processes. Data flow diagrams can be used to provide a clear representation of a particular function, and logic models can be developed to diagrammatically represent the impact of a new research tool on the research process. This approach maps the process from activities to specific outcomes and has been shown to be a useful management analysis tool (Youtie et al., 1998). For example, the impact of a new screening technology on lead optimization can be presented in this manner. The individuals who work with the particular technology are the most familiar with the process and its impact on outcomes and would be ideal candidates to develop the flow analysis. Impacts identified through the flow analysis can be quantified through surveys and/or interviews with experts.
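As an illustration, the sketch below encodes a hypothetical logic model for the lead-optimization example as an ordered list of stages, each paired with the output measure that would later be quantified through surveys or interviews; the stages and measures are assumptions for illustration only.

```python
# Hypothetical logic model: (process stage, output measure to quantify later).
lead_optimization_flow = [
    ("compound library prepared", "number of compounds available"),
    ("high-throughput screen run", "compounds screened per week"),
    ("hits confirmed",             "confirmation rate of initial hits"),
    ("leads optimized",            "time from hit to optimized lead"),
]

# Mapping the flow makes explicit which step a new screening technology
# affects and which measure would need to be collected to quantify it.
for stage, measure in lead_optimization_flow:
    print(f"{stage:<30} -> {measure}")
```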

Use in Past Assessments: Good examples include the analyses performed by NIST to assess the effectiveness of technologies used in the research process or to develop other tools, including software error analysis (Peng, 1993; Wallace, 1996). Logic and data requirements are transformed from text into graphic flows, which are easier to analyze. This method has also been used to diagrammatically present the effectiveness of biomedical research tools; an excellent example is the schematic representation by Anderson et al. (2004) of the AMP-activated protein kinase (AMPK) assay in microarrayed compound screening.

Advantages:


  • Charting the flow of non-complex activities does not require specific professional expertise and can be produced easily by individuals familiar with the processes.

  • Modeling the flow process can identify potential benefits and drawbacks of a new technology that were not considered initially.

Disadvantages:


  • This approach cannot by itself quantify potential impacts; the types of impacts identified by this mapping approach have to be quantified through additional data collection. For instance, a flow diagram may indicate that a research tool could significantly reduce the time required to perform a particular activity, but other studies or data sources are required to quantify the time savings.

4.7 Performance Indicators/Metrics


Description: A set of critical indicators or metrics can be developed to assess the effectiveness of technologies or programs. This information can be critical to understanding the process and outcomes of activities (Shapira et al., 1996). Such indicators provide a quantitative basis for evaluating the activities. Measurement and assessment of key performance indicators are meaningful only when they are easy to understand and timely.

Use in Past Assessments: Performance indicators have been used to assess the benefits of both federal and university-based research centers and initiatives. To assess the value of the National Science Foundation's engineering research centers (ERCs), Feller et al. (2002) evaluated outcomes by measuring, among other metrics, the number of centers that reported that they developed patented technology, commercialized a product line, maintained or increased employment, and/or established a new company. Indicators have also been developed to value patents; a review of 23 empirical studies identified many indicators, including patent age, market value of the corporation, family size, key inventor and patenting strategy (Reitzig, 2004). Rogers et al. (2000) studied the effectiveness of technology transfer at academic institutions using metrics such as US patent applications filed, licenses executed, number of start-ups, and license income received. Many of these metrics and performance indicators are not directly applicable to products made using NIH technologies because they do not focus on intermediate or final health impacts; much of their focus is on ascertaining the economic impact of technologies (Link, 1999; Shapira, 1996).
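A minimal sketch of how indicators of the kind listed above might be tabulated and compared across programs; the program names, field names, and figures are hypothetical and do not reflect any actual reporting schema.

```python
# Hypothetical performance indicators for two technology transfer programs.
programs = {
    "program_A": {"patent_applications": 12, "licenses_executed": 5,
                  "start_ups": 1, "license_income_usd": 250_000},
    "program_B": {"patent_applications": 7, "licenses_executed": 9,
                  "start_ups": 3, "license_income_usd": 410_000},
}

# Simple cross-program comparison on each predefined metric.
metrics = ["patent_applications", "licenses_executed",
           "start_ups", "license_income_usd"]
for metric in metrics:
    leader = max(programs, key=lambda p: programs[p][metric])
    print(f"{metric}: highest value is {programs[leader][metric]} ({leader})")
```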

Advantages:


  1. Metrics provide the ability to quantify benefits that allow for the comparison across technologies or programs.

  2. Metrics provide clear, predefined measures to evaluate impacts and therefore the results are accepted as objective.

Disadvantages:


  1. Significant effort may be required to identify metrics that are valid and reproducible over time.

  2. Although the metrics allow benefits to be quantified, potential benchmarks for assessing the value of these benefits may not be easily identifiable.

  3. Data sources required to quantify the metrics may not be readily available and primary data collection may be expensive.

4.8 Diffusion and Network Analysis


Description: Charting the diffusion of a technology can provide valuable information on the network of users and thereby demonstrate the reach or impact of the technology. The required information can be obtained by interviewing users or surveying them on their pattern of resource sharing. Researchers can be asked to nominate individuals with whom they share information and also those whose work influences their research. Alternatively, coauthorship of manuscripts can be reviewed to identify research ties between individuals or institutions. Network analysis has over the years been transformed from an exclusively qualitative assessment to one that includes some quantitative measures (Rogers, 2001). Citation analysis is often used as a quantitative metric to assess knowledge diffusion (Jaffe, 1996; Tijssen, 2001; Tijssen, 2002; Autant-Bernard, 2001; Zoltan, 2002). Diffusion of innovation has also recently been studied through the formation of social networks, based on the theory that the structure of the social system can favor or impede the diffusion of innovation within that system (Decorian, 2002).
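The sketch below illustrates the coauthorship variant using the networkx library with hypothetical author lists; degree centrality is used here simply as one common quantitative indicator of how widely a researcher is connected in the sharing network.

```python
import itertools
import networkx as nx

# Hypothetical author lists from publications citing a shared research tool.
papers = [
    ["Smith", "Chen", "Gupta"],
    ["Chen", "Okafor"],
    ["Smith", "Okafor", "Lee"],
]

G = nx.Graph()
for authors in papers:
    # Every pair of coauthors on a paper forms a tie in the network.
    G.add_edges_from(itertools.combinations(authors, 2))

# Degree centrality highlights the most connected researchers, a rough
# indicator of how widely the tool has diffused through the community.
for author, score in sorted(nx.degree_centrality(G).items(),
                            key=lambda kv: -kv[1]):
    print(author, round(score, 2))
```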

Use in Past Assessments: Network analyses have been used since as early as the 1970s, and there is wide variation in the types of studies performed (Rogers, 2000). Recent studies include an assessment of clustering in energy research (Tijssen, 2002) and an examination of how new biotechnology firms use boundary-spanning social networks to increase both their learning and their flexibility (Liebeskind, 1995). In analyses of federal programs, examples of network analysis include the evaluation of bioremediation technologies via technology transfer from government and industry (Day, 1992) and the U.S. Department of Defense study on Crew System Ergonomics (Boff et al., 1990).

Advantages:


  1. Provides detailed information on the process by which a technology produces benefits.

  2. A semiquantitative approach that is viewed as being largely objective.

  3. Can provide previously overlooked information about knowledge spillovers.

Disadvantages:


  1. The qualitative information obtained may not be very informative about the final health impacts of the technology.

  2. A thorough assessment may only be possible during the early stages of a technology, because large-scale worldwide diffusion can make it difficult to chart complex interactions.

  3. Network analysis provides information on linkages or correlations but does not provide the direction of the causality. As Bozeman et al. (2001) have argued “R&D evaluations require something more than identification and description of networks; one must have a criterion by which networks can be said to have improved or declined.”