Evaluation 2011



Session Title: Issues in Generating Evaluative Data
Multipaper Session 767 to be held in Coronado on Friday, Nov 4, 4:30 PM to 6:00 PM
Sponsored by the AEA Conference Committee
Baseline Data and Research Design
Presenter(s):
Ximena Burgin, Northern Illinois University, xrecald1@niu.edu
Marcella Zipp, Northern Illinois University, mreca@niu.edu
Abstract: The planning stage of a grant proposal includes developing an understanding of the problem to be studied through current data. Current data indicate the problem(s) to be addressed, the intervention(s), the methodological approaches, and the stakeholders in the issue(s). It is therefore important to identify meaningful baseline data that can demonstrate achievement of the project's outcomes. Baseline data can be obtained in a variety of ways, such as from restricted, public, or commercial databases, or from the researcher's own data collection. The researcher should know the validity and reliability of the instruments with which a database's information was gathered. Baseline information is meaningful only if the instrument measures the desired domains and its content is appropriate, correct, meaningful, and useful for the specific inferences made from the data (validity). Moreover, the consistency of scores and the repeatability of results from one administration of the instrument to another (reliability) should be considered as part of the evaluation design.
Beyond 'Agree' and 'Somewhat Disagree': Using Q Methodology to Reveal Values and Opinions of Evaluation Participants
Presenter(s):
Ricardo Gomez, National Collegiate Inventors and Innovators Alliance, rgomez@nciia.org
Angela Shartrand, National Collegiate Inventors and Innovators Alliance, ashartrand@nciia.org
Abstract: In this paper we introduce Q methodology as an alternative to survey-based research methods. Whereas the typical outcome of a survey-based study is a descriptive statistical analysis of pre-specified independent categories deemed relevant by the researcher(s), the outcome of a Q study is a more authentic set of factors that capture people's attitudes and perspectives on an issue. Q can also reveal underlying or unrecognized social discourses that represent other agendas connected to an issue. Q methodology statistically identifies different points of view on a given topic based on how individuals sort a set of statements about that topic. Because participants rank the statements through a forced sorting procedure, they must make choices that reflect their underlying values. The sorted statements are then statistically analyzed, and the resulting factors are qualitatively interpreted, thus bridging the gap between qualitative and quantitative inquiry.
An Evaluation Framework for a Smart Parking System
Presenter(s):
Tayo Fabusuyi, Numeritics, tayo.fabusuyi@numeritics.com
Victoria Hill, Numeritics, tori.hill@numeritics.com
Robert Hampshire, Carnegie Mellon University, hamp@cmu.edu
Abstract: We present an evaluation of ParkPGH, a smart parking system that provides real-time information on the availability of parking spaces within the Pittsburgh Cultural District. The initiative responds to increased demand for parking spaces and the desire to improve the parking experience through the provision of real-time availability information. Primary data, obtained through both in-person and online surveys of patrons of Pittsburgh Cultural District events, were used for the baseline data analysis, process evaluation, and outcome evaluation phases. Secondary data, consisting of count data obtained from website use logs, were used for the output evaluation phase. The contributions of the evaluation framework are the insights it provides into how the key challenges posed by the system's unique deployment environment were addressed, and how the framework was used to track respondents longitudinally using a binary system that identifies distinct cohorts of respondents.
