
Session Title: International Perspectives on Evaluating Scientific Policies, Programs, and Collaborations
Panel Session 305 to be held in Sebastian Section L2 on Thursday, Nov 12, 1:40 PM to 3:10 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Krishna Kumar, United States Department of State, kumark@state.gov
Abstract: This panel examines how evaluation procedures, practices, and techniques are applied to scientific programs, policies, and collaborations in the United States, Russia, selected Eurasian countries, and New Zealand. The presenters highlight the complex nature of these evaluations and discuss critical issues in developing core metrics that can be applied across borders and socio-cultural contexts to measure the long-term outcomes of science-related programs. The panelists are experienced professionals with international, regional, and cross-cultural expertise in program management and program evaluation. They discuss best practices and shortcomings in the evaluation studies they conducted on programs related to science and scientific collaboration in their respective countries, and they address issues of cross-cultural sensitivity and the contexts in which these programs operate. An exchange of ideas will be encouraged about evaluation methodologies and techniques that are applied internationally to measure science and engineering programs.
Using National Data to Evaluate the International Collaborations of United States Scientists and Engineers
John Tsapogas, National Science Foundation, jtsapogas@nsf.gov
Science and engineering doctoral degree holders are of special interest to many decision makers because they are among the most highly educated individuals in the U.S. workforce. This presentation will discuss the process of using a national survey of doctoral scientists and engineers, the Survey of Doctorate Recipients (SDR), to evaluate the extent and character of international collaboration among U.S. doctoral degree holders. It will describe the procedures used to develop a set of questions on international collaborations and append them to the existing SDR questionnaire. Data from the survey were used to evaluate how international collaborations affect the U.S. doctoral workforce. SDR data on international collaborations will be presented by sex, research/teaching faculty status, sector of employment (industry, government, and academia), minority status, citizenship, presence of children in the household, field of doctorate, and year of doctorate receipt.
Evaluating Micro-Ventures Development Through Science-Business Partnerships in Eurasia
Liudmila Mikhailova, United States Civilian Research and Development Foundation, lmikhailova@crdf.org
Fostering micro-ventures and facilitating new technology transfer through science-business partnerships is critical to building a market economy in Eurasia. This presentation will discuss the Science and Technology Entrepreneurship Program (STEP) and the findings from its impact evaluation, conducted in Armenia, Azerbaijan, Georgia, and Moldova. STEP was launched by the U.S. Civilian Research and Development Foundation (CRDF) in 2004 in several Eurasian countries, with State Department funding, to nurture the creation of micro-ventures through science-business collaboration and to facilitate new technology transfer. Testimonials from officials in each country emphasized that STEP was recognized as an innovative development model for building entrepreneurial clusters and promoting a new generation of scientists with entrepreneurial thinking. Data from interviews and surveys will also address the participation of young scientists and students in STEP grant projects, job creation opportunities, and the engagement of former weapons scientists in civilian research.
Evaluating Institutional Programs: Linking Research and Education at Russian Universities
Irina Dezhina, Russian Academy of Sciences, dezhina@crdf.ru
This presentation elaborates on the evaluation results of the "Basic Research and Higher Education" (BRHE) Program, a large institutional initiative aimed at strengthening linkages between research and education at Russian universities. Twenty "Research and Educational Centers" (RECs) were established under the BRHE Program at geographically dispersed Russian universities. Data from a longitudinal evaluation study show the RECs' performance and their impact on the Russian university system. The BRHE Program started in 1998 with funding from the MacArthur Foundation and is cost-shared by the Russian government. The program was very timely in the Russian context, which is characterized by the separation of research from education, limited research funding at universities, and weak linkages between universities, Academy of Sciences institutes, and other research and development (R&D) organizations. The evaluation results show that the BRHE Program is an effective reform effort in Russia and demonstrate that integrating research into higher education institutions benefits both areas.
Evaluation Practices in the New Zealand Government: A Case Study of the Ministry of Research, Science and Technology
Yelena Thomas, Ministry of Research, Science and Technology, New Zealand, yelena.thomas@morst.govt.nz
Best evaluation practices in the area of science policy are of interest to many policy-makers. This presentation will focus on how evaluation practices and procedures are implemented at the policy level in the New Zealand government and discuss the shortcomings and stumbling blocks of this process. Many government agencies have research or performance-and-evaluation units that are separate from policy teams, a practice that allows evaluators to remain independent and objective. The presenter will showcase a policy cycle and how it incorporates the development of a comprehensive evaluation framework, with evaluation specialists working closely with policy advisors and contract managers as a team. Most policies are introduced to the Cabinet with a high-level evaluation framework and timelines for reporting. The presentation will conclude with lessons learned and recommendations for practitioners who work in policy-level assessment.
