|
Session Title: National HIV Prevention Program Monitoring and Evaluation: Lessons Learned and Future Directions for Data Collection, Reporting and Use
|
|
Panel Session 720 to be held in El Capitan A on Friday, Nov 4, 2:50 PM to 4:20 PM
|
|
Sponsored by the Health Evaluation TIG
|
| Chair(s): |
| Mesfin Mulatu, Centers for Disease Control and Prevention, mmulatu@cdc.gov
|
| Discussant(s):
|
| Dale Stratford, Centers for Disease Control and Prevention, bstratford@cdc.gov
|
| Abstract:
As part of its national HIV prevention monitoring and evaluation effort, CDC has implemented the National HIV Prevention Program Monitoring and Evaluation (NHM&E) approach to collecting program data from over 200 funded health departments and community-based organizations. NHM&E consists of standardized variables, program performance indicators, and an optional data management and reporting system. CDC provides technical assistance to grantees to enhance their data collection and reporting, with the goal of using data for program improvement and accountability to stakeholders. This panel discusses critical steps taken by CDC and grantees to ensure the collection and reporting of NHM&E data; highlights the use of NHM&E and other data for local program improvement and national monitoring; and examines lessons learned in these processes. It also addresses the future direction of national-level monitoring and evaluation of HIV prevention programs in the context of limited resources, innovations in HIV prevention, and changes in the epidemic.
|
|
The National HIV Prevention Program Monitoring and Evaluation (NHM&E): An Overview
|
| Antonya Rakestraw, Centers for Disease Control and Prevention, apiercerakestraw@cdc.gov
|
| David Davis, Centers for Disease Control and Prevention, ddavis1@cdc.gov
|
|
CDC funds over 200 health departments and community-based organizations to conduct HIV prevention programs in the United States. In addition to the challenge of providing evaluative support across all funded agencies, CDC must address the data use needs of stakeholders, including Congress, national interest groups, and local HIV program directors and evaluators. This presentation highlights the approach taken by the Program Evaluation Branch in the Division of HIV/AIDS Prevention at CDC to implement a national-level monitoring and evaluation system that meets the accountability, program monitoring, and program improvement needs of stakeholders. It briefly introduces the National HIV Prevention Program Monitoring and Evaluation (NHM&E) approach, its history, and its components. NHM&E includes standardized variables to capture information about HIV prevention programs, a set of program performance indicators, and an optional web-based data management and reporting system. This presentation is intended to provide the context for the subsequent discussions.
|
|
|
Supporting the Collection and Reporting of Standardized NHM&E Data: Capacity Building, Technical Assistance and Quality Assurance
|
| Elin Begley, Centers for Disease Control and Prevention, ebegley@cdc.gov
|
| Michele Rorie, Centers for Disease Control and Prevention, mrorie@cdc.gov
|
|
This presentation identifies the capacity building, technical assistance, and quality assurance mechanisms used by CDC to ensure standardized National HIV Prevention Program Monitoring and Evaluation (NHM&E) data reporting from health department and community-based organization grantees. Providing these mechanisms is a challenge because grantees have unique epidemic profiles, are funded under different announcements, and use diverse data collection and reporting systems. To ensure standardization, CDC conducts webinars and workshops on data requirements and offers a fully functional service center to answer grantee questions. CDC also communicates quarterly with grantees about the quality of submitted data. For grantees using non-CDC-supported data reporting systems, crosswalks are conducted to document the degree to which each submitted variable meets CDC requirements. Using partner services data, this presentation shares examples and lessons learned from the development of variable requirements, the provision of webinars and workshops, and the execution of variable crosswalks.
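As a rough illustration of the variable crosswalk step described above, the sketch below flags required variables that a grantee's reporting system does not yet map. The variable names and the grantee mapping are hypothetical stand-ins, not the actual NHM&E specification.

```python
# Hypothetical crosswalk check: the required NHM&E variable names and the grantee's
# field mapping below are illustrative assumptions, not the actual specification.

REQUIRED_NHME_VARIABLES = {"site_id", "test_date", "test_result", "race_ethnicity"}

# Mapping from a grantee's local field names to NHM&E variable names.
grantee_crosswalk = {
    "clinic_code": "site_id",
    "date_of_test": "test_date",
    "hiv_result": "test_result",
}

def crosswalk_gaps(crosswalk, required):
    """Return required NHM&E variables that have no mapped local field."""
    return required - set(crosswalk.values())

if __name__ == "__main__":
    missing = crosswalk_gaps(grantee_crosswalk, REQUIRED_NHME_VARIABLES)
    print("Variables needing follow-up:", sorted(missing))  # ['race_ethnicity']
```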
| |
|
Using NHM&E Data for Program Improvement: Los Angeles County Department of Public Health's Experience
|
| Mike Janson, Los Angeles County Department of Public Health, mjanson@ph.lacounty.gov
|
| Sophia Rumanes, Los Angeles County Department of Public Health, srumanes@ph.lacounty.gov
|
| Mario Perez, Los Angeles County Department of Public Health, mjperez@ph.lacounty.gov
|
|
This presentation highlights how the Los Angeles County Department of Public Health (LACDPH) uses NHM&E and other data sources for program improvement. Using NHM&E data collected from HIV testing services (HTS) and HIV/AIDS surveillance data, LACDPH tracks program performance measures including HIV testing volume, HIV positivity rates, test location, and linkages to care and partner services. Testing volume and positivity rates are measured at each site to ensure that the testing investment is being maximized. Test location is measured to track geographic coverage of testing services. These data are matched to ensure that the volume of testing within specific geographic areas is commensurate with HIV disease burden. NHM&E data are matched with surveillance data to track linkage to care, verified by CD4/viral load laboratory results. As a result of monitoring linkage-to-care rates, LACDPH has moved to incentivize linkage to care and partner services activities.
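The per-site positivity and linkage-to-care measures described above can be sketched roughly as follows. The field names, client identifiers, and matching rule are hypothetical assumptions for illustration, not LACDPH's actual data structures or matching procedure.

```python
from collections import defaultdict

# Hypothetical HTS extract and surveillance match list; field names and IDs are
# illustrative assumptions, not LACDPH's actual data structures.
hts_records = [
    {"site": "Site A", "client_id": "c1", "result": "positive"},
    {"site": "Site A", "client_id": "c2", "result": "negative"},
    {"site": "Site B", "client_id": "c3", "result": "positive"},
]
surveillance_labs = {"c1"}  # client IDs with a verified CD4/viral load lab result

def positivity_by_site(records):
    """Positive tests divided by total tests, per site."""
    totals, positives = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["site"]] += 1
        positives[r["site"]] += r["result"] == "positive"
    return {site: positives[site] / totals[site] for site in totals}

def linkage_rate(records, labs):
    """Share of positive tests matched to a verified lab result."""
    positive_ids = {r["client_id"] for r in records if r["result"] == "positive"}
    return len(positive_ids & labs) / len(positive_ids) if positive_ids else 0.0

print(positivity_by_site(hts_records))               # {'Site A': 0.5, 'Site B': 1.0}
print(linkage_rate(hts_records, surveillance_labs))  # 0.5
```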
| |
|
HIV Prevention Program Performance Indicators: A Tool for National Program Monitoring and Improvement
|
| Barbara Maciak, Centers for Disease Control and Prevention, bmaciak@cdc.gov
|
| Mesfin Mulatu, Centers for Disease Control and Prevention, mmulatu@cdc.gov
|
|
HIV Prevention Program Performance Indicators are standardized measures reported by CDC-funded prevention grantees that capture key components of prevention planning, service delivery, and evaluation. At the national level, CDC uses indicator data in combination with other data sources (e.g., surveillance, program context) to assess progress towards national prevention goals. In this presentation, we describe the use of standardized National HIV Prevention Program Monitoring and Evaluation (NHM&E) variables for indicator calculation, analysis, and reporting. We provide specific examples to illustrate the process used to align operational definitions for key indicator terms with NHM&E variables; develop calculation algorithms; build quality assurance checks; and manage and analyze indicator data. We highlight key facilitators and challenges associated with this process and describe ongoing efforts aimed at interpreting trends in indicator data in the context of multiple data sources; developing national indicator reports; and engaging stakeholders in using indicator data for program monitoring and improvement.
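A minimal sketch of the indicator calculation and quality assurance steps described above might look like the following. The variable names and the completeness rule are assumptions for illustration, not the actual NHM&E indicator specification; records failing the check are excluded and counted rather than silently dropped.

```python
from datetime import date

# Hypothetical NHM&E-style test records; variable names and the completeness rule
# are assumptions for illustration, not the actual indicator specification.
records = [
    {"test_date": date(2011, 3, 2), "result": "positive", "linked_to_care": True},
    {"test_date": date(2011, 4, 9), "result": "positive", "linked_to_care": False},
    {"test_date": None,             "result": "positive", "linked_to_care": True},
]

def passes_qa(record):
    """Quality assurance check: required fields must be present and valid."""
    return record["test_date"] is not None and record["result"] in {"positive", "negative"}

def linkage_indicator(records):
    """Proportion of QA-valid positive tests linked to care, with an exclusion count."""
    valid = [r for r in records if passes_qa(r)]
    positives = [r for r in valid if r["result"] == "positive"]
    linked = sum(r["linked_to_care"] for r in positives)
    return {
        "indicator": linked / len(positives) if positives else None,
        "excluded_by_qa": len(records) - len(valid),
    }

print(linkage_indicator(records))  # {'indicator': 0.5, 'excluded_by_qa': 1}
```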
| |
|
Evaluating in the Midst of a Paradigm Shift: New Directions for the NHM&E
|
| Dale Stratford, Centers for Disease Control and Prevention, bstratford@cdc.gov
|
| Kimberly Thomas, Centers for Disease Control and Prevention, krthomas@cdc.gov
|
| Romel Lacson, Centers for Disease Control and Prevention, rlacson@cdc.gov
|
|
This presentation focuses on 1) the shift in the nation's approach to HIV prevention and, concomitantly, to national-level HIV prevention program evaluation; 2) the challenges this paradigm shift brings in an era of accountability; and 3) how lessons learned will contribute to strategies moving forward. The National HIV/AIDS Strategy calls for a comprehensive, coordinated federal effort with measurable goals and program evaluation that includes impact-driven measures. With this new strategy come challenges at the national and local levels, including evaluation questions that will have to be answered by synthesizing data across federal agencies and through cross-agency collaboration at the local level. Developing relationships across funders to respond to nationally coordinated efforts takes time, and implementing methods that will enable impact evaluation may take years. Finally, this presentation highlights proposed approaches for moving forward, including consistent engagement with stakeholders and the use of innovative methods to synthesize cross-agency data.
| |