| Session Title: Unique Challenges With International Monitoring and Evaluation Activities: Examples From the Centers for Disease Control and Prevention (CDC) |
| Multipaper Session 277 to be held in Wekiwa 4 on Thursday, Nov 12, 10:55 AM to 12:25 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Chair(s): |
| Thomas Chapel, Centers for Disease Control and Prevention, tchapel@cdc.gov |
| Abstract: Monitoring and evaluation (M & E) activities conducted in international settings often present unique challenges and opportunities due to multiple factors, including a lack of financial and personnel resources, limited capacity to monitor and evaluate activities, and limited understanding of the importance and use of evaluation. Further, logistical challenges such as limited personal interaction due to distance and time-zone differences can create additional complexities. This panel will describe how two programs at the Centers for Disease Control and Prevention have implemented M & E activities in international settings. The first presentation will focus on the development and implementation of an M & E framework in six international locations, with discussion of the challenges, successes, and lessons learned during this process. The next two presentations will describe specific strategies and solutions that have been developed, implemented, and in some cases replicated, to address particular M & E issues. |
| Multi-site Evaluations: Lessons Learned in Implementing a Monitoring and Evaluation Framework for the Centers for Disease Control and Prevention's (CDC's) Global Disease Detection Program |
| Rachel Nelson, Centers for Disease Control and Prevention, rqk0@cdc.gov |
| Suzanne Elbon, Ciber Inc, sge4@cdc.gov |
| Namita Joshi, Centers for Disease Control and Prevention, ngs5@cdc.gov |
| Alison Kelly, Centers for Disease Control and Prevention, ayk7@cdc.gov |
| Douan Kirivong, Centers for Disease Control and Prevention, bpq7@cdc.gov |
| Naheed Lakhani, Centers for Disease Control and Prevention, bqv8@cdc.gov |
| Dana Pitts, Centers for Disease Control and Prevention, gqo1@cdc.gov |
| Scott F Dowell, Centers for Disease Control and Prevention, sfd2@cdc.gov |
| The Centers for Disease Control and Prevention (CDC) supports Global Disease Detection (GDD) Regional Centers in six countries that focus on early detection and rapid containment of emerging infectious diseases. The GDD Program uses a comprehensive monitoring and evaluation (M & E) framework to document the accomplishments of the GDD Centers. The M & E process began in 2006 with an open-ended questionnaire and has since developed into a more formalized approach that includes standardized indicator definitions, a database collection tool, and a quarterly reporting schedule. In this presentation, we will share challenges and lessons learned from the past three years, including a step-by-step description of how we developed, finalized, and implemented the M & E framework across six international Centers. |
| Monitoring and Evaluation From Afar - How the Centers for Disease Control and Prevention's (CDC's) Sexually Transmitted Diseases (STD) Program Approaches Time and Distance Constraints in Evaluation |
| Sonal Doshi, Centers for Disease Control and Prevention, sdoshi@cdc.gov |
| Since 1990, the Centers for Disease Control and Prevention, Division of Sexually Transmitted Disease Prevention (CDC/DSTDP) has provided categorical funding for sexually transmitted disease (STD) prevention activities through grants to 65 city, state, and territorial public health agencies, including six United States Associated Pacific Island Jurisdictions (USAPIJs). These six jurisdictions are: American Samoa, Commonwealth of the Northern Mariana Islands (CNMI), Federated States of Micronesia (FSM), Guam, Republic of the Marshall Islands, and Republic of Palau. Although CDC funds STD prevention activities in these areas, our partners in the USAPIJs lack the resources, infrastructure, and access to consistent technical assistance needed to fully implement comprehensive and sustainable STD programs, compared with their mainland health department counterparts. This presentation will discuss the process, challenges, and lessons that CDC/DSTDP headquarters learned while implementing an evaluation of an adolescent health center in Saipan, CNMI, from its base in Atlanta, GA. |
| The Value of an Electronic Database in Standardizing and Enhancing Evaluation Activities for the Centers for Disease Control and Prevention's (CDC's) Global Disease Detection Program |
| Suzanne Elbon, Ciber Inc, sge4@cdc.gov |
| Rachel Nelson, Centers for Disease Control and Prevention, rqk0@cdc.gov |
| Douan Kirivong, Centers for Disease Control and Prevention, bpq7@cdc.gov |
| Dana Pitts, Centers for Disease Control and Prevention, gqo1@cdc.gov |
| Naheed Lakhani, Centers for Disease Control and Prevention, bqv8@cdc.gov |
| Scott F Dowell, Centers for Disease Control and Prevention, sfd2@cdc.gov |
| The Centers for Disease Control and Prevention (CDC) requires that its six Global Disease Detection (GDD) Regional Centers submit annual reports as part of its monitoring and evaluation process. During the first two years of data collection, GDD Centers reported accomplishments on paper-based forms, which made reporting cumbersome and data difficult to summarize. To standardize and simplify data collection, as well as enhance reporting, a database was developed for the GDD Centers. Presenters will discuss the factors that influenced the design of the database, the pilot-testing process, and implementation across all GDD Centers, and will demonstrate how the database has enhanced the overall GDD monitoring and evaluation process. Limitations of the current design and plans for revisions will also be addressed. |