Measurement Matters: Estimating Students' Community Engagement Participation

Many community engagement and service-learning studies require the researcher to identify whether or not a particular student has participated in an engagement activity. In this article, the author explores the importance and challenge of measuring college students’ community engagement by detailing one institution’s rigorous effort to answer the question, “What percentage of students participate in community engagement during their time at college?” The article illustrates the results of a study in which an institution supplemented an existing database of student participation with several other nontraditional sources of student participation data to construct an expanded measure of engagement. Results indicated that, compared with the expanded measure, the existing database produced a biased estimate of engagement with regard to gender, race and ethnicity, financial aid, and athletic status. Implications for future research, assessment, and practice are discussed.

As service-learning and community engagement initiatives on college and university campuses have grown over the past two decades, interest in understanding the impact of this work has also increased (Bringle, Clayton, & Hatcher, 2013). Community engagement and service-learning are designed to benefit local communities and students (Eyler & Giles, 1999; Janke, Medlin, Holland, & MacInnes, 2014), and research examining how participation in these experiences impacts students has become particularly prevalent in the literature. Service-learning and community engagement are associated with a number of desirable learning outcomes (Astin & Sax, 1998; Eyler & Giles, 1999; Pascarella & Terenzini, 2005), as well as other student outcomes, including political and social involvement (Kilgo, Pasquesi, Sheets, & Pascarella, 2014), well-being (Nicotera, Brewer, & Veeh, 2015), and post-graduation employment (Matthews, Dorfman, & Wu, 2015). Whether college and university administrators wish to document or evaluate community engagement programs or to assess student learning in these activities, or whether higher education researchers seek to learn about how these efforts may affect students and communities, one piece of information is essential: the participation of individual students in service-learning or community engagement activities. Unfortunately, because of the various modes of student engagement (e.g., through formal programs, community-based learning courses, student-led groups, and student initiatives), many community engagement activities are not included in formal institutional records, posing a challenge to accurately measuring participation. I argue that obtaining comprehensive, high-quality participation data is important for both institutions and community engagement scholars, and I highlight the challenges of doing so by detailing one institution's rigorous effort to answer the question, "What percentage of students participate in community engagement during their time at college?"

Why Measure Community Engagement Participation?
There are numerous reasons to estimate the number or percentage of students who participate in community engagement. For over a decade, scholars have emphasized the need for rigorous assessment of and research around student involvement in service-learning and community engagement activities (e.g., Bringle et al., 2013; Gelmon, 2000; Steinberg, Hatcher, & Bringle, 2011). This priority is reflected in the application for the Carnegie Foundation's (n.d.) elective Community Engagement Classification, which requires applicants to report how such participation is tracked. Moreover, administrators at many colleges and universities seek to demonstrate the benefit of their institutions to local communities by highlighting community engagement participation data (Janke et al., 2014). At times, these data are used as proxies for community impact; for example, Campus Compact (2013) has assigned a dollar value to students' volunteer work as an indicator of community benefit. These data also provide important insights for departments that administer community engagement programs as they engage in program-level decision making. For instance, participation data can be used to conduct equity analyses to understand who is and is not engaging, an important focus given findings suggesting that women are more likely than men to participate in community engagement (Chesbrough, 2011; Cruce & Moore, 2007; Eyler & Giles, 1999).
It stands to reason, particularly in the larger context of accountability in higher education (Shavelson, 2010), that examining what students learn through community engagement would be part of an institution's assessment efforts. Notably, the higher education assessment literature articulates clear distinctions between assessment and research (Schuh & Upcraft, 2001; Suskie, 2009), arguing generally that the precision required of scholarly scientific research is neither possible nor particularly important for assessment work. Because of the limited resources available for many assessment efforts, and because institutions are required to generate assessment data for accreditors and other stakeholders, the assessment practice literature emphasizes the importance of continually collecting and analyzing data despite likely limitations in design and measurement. For example, Suskie (2009) suggested that "if you take the time and effort to design assessments reasonably carefully and collect corroborating evidence, your assessment results may be imperfect but will nonetheless give you information that you will be able to use with confidence" (p. 14). Unfortunately, in the absence of preexisting data to which assessment data might be compared, it is nearly impossible to know whether the careful assessment work described by Suskie actually yields valid data.
In addition to their utility for institutional assessment, community engagement participation data comprise essential variables in numerous research studies, serving, for instance, as independent variables in research examining factors related to student learning and development (e.g., Astin & Sax, 1998), and as dependent variables in studies examining motivations for participation (e.g., Chesbrough, 2011). One of the most robust areas of research on service-learning and community engagement centers on investigating the influence of these activities on student outcomes and development. Part of the field's intellectual history has involved justifying service-learning and community engagement by exploring the extent to which improved student learning or beneficial student outcomes are associated with these efforts (Giles & Eyler, 2013). Indeed, measuring participation is critical, particularly as the field moves from investigating the potential benefits of service-learning at the individual course level (for which valid participation data are typically readily available) toward examining participation across a wider range of modalities and institutions.

Ways to Calculate Community Engagement Participation
Several methods are currently used to estimate the percentage of students who have participated in community engagement. In an administrative context at a single institution, some program directors might make informed guesses, inferring from a combination of tracked data (e.g., the number of students who participated in service-learning courses) and a best estimate of the number of students who engage in ways that are not tracked. Though such an approach does not tax resources, it poses obvious problems of reliability and validity; moreover, participation in community engagement cannot be attributed to a specific student.
Another, more rigorous method for estimating engagement is surveying students. One strength of this approach is that students can indicate their participation in engagement activities that are not managed directly by an engagement office or do not fall under the direct purview of an institution. However, surveys also have drawbacks, including the potential for measurement error and non-response bias (Groves et al., 2009). For example, Kolek (2013) found that students at one institution could not reliably report whether or not they had taken a course with a service-learning component. Moreover, response rates to surveys of college students have fallen considerably over the past 15 years (Kolek, 2012; Tschepikow, 2012), and surveys sent by community engagement centers, or about community engagement specifically, may suffer from sponsor or topic effects (Groves et al., 2009), with students who have participated in these activities being more likely to complete the survey. Furthermore, the Pew Research Center for the People and the Press (2012) has found that people who are more civically engaged are overrepresented among survey respondents, even when controlling for a host of demographic characteristics.
A third common approach to measuring community engagement participation is to draw from a database of engagement activities. Institutions can gather high-quality tracking data about activities for which there are official records (e.g., service-learning courses) or that are directed by a center (e.g., a leadership program). Though it requires more resources than other approaches, this method has the advantage of providing measures of engagement for an entire population (including both participants and nonparticipants). However, unlike a survey approach, this method is not well suited to documenting participation in activities outside the direct purview of the department or institution, such as engagement through student-run groups.
Given the limitations of calculating accurate estimates of engagement via the methods described earlier, the institution upon which this study focused sought to supplement an existing community engagement database with numerous additional sources to obtain a more comprehensive view of student participation in an effort to better answer the question, "What percentage of students participate in community engagement during their time at college?"

Method
This study was conducted at a four-year, highly selective, residential liberal arts college located in the northeastern United States. The community engagement center at this institution maintained comprehensive student participation data for the programs it ran, the activities it funded, and the community-based learning courses at the institution. For the purposes of this study, community engagement was defined as political work, volunteering, public service, activism, or other social change work that included interaction with others who were not affiliated with the college.
In the first stage of the study, previously collected unit-record participation data (i.e., community-based learning courses, public service internships, structured tutoring and mentoring volunteer programs, leadership programs, orientation trips, non-credit community-based learning courses, and other miscellaneous center-run activities) were compiled for all members of the senior Class of 2012. I focused on a single class year of seniors for two reasons. First, compared to students of other class years, seniors would offer the most comprehensive picture of engagement through the student "life cycle." Second, from a pragmatic perspective, focusing on a single class of 451 students allowed me to collect data on the entire population of seniors. These engagement data were matched with demographic information from the college's database, including class year, race and ethnicity, gender, financial aid, and participation in varsity athletics. At the beginning of the study, I determined the population by selecting in the college's database all students designated to graduate in 2012. Because the purpose of the study was to understand whether students had participated in community engagement at some point during their college enrollment, I considered members of the Class of 2012 at this institution as a population rather than as a convenience sample (see Sudman, 1976).
In the second stage of the study, I identified potential areas of community engagement that had not been tracked; for this, I used three sources: open-ended survey data from (a) previously conducted community engagement surveys, (b) student leaders, and (c) community engagement staff members. I began by reviewing open-ended data from an online community engagement survey of the undergraduate population conducted in 2011 (response rate = 21.2%; n = 378). The survey had asked students to describe any community engagement in which they were involved at the time. Next, I queried 12 students affiliated with the institution's center for community engagement about the different kinds of engagement in which they themselves or their peers were involved. Finally, I asked six staff members to generate lists of any community engagement activities that had been occurring on campus but that were not directly affiliated with the center. I then attempted to collect names of students who were involved in these previously untracked activities. I asked the leaders of approximately 20 student groups with a community engagement focus (e.g., public health, public policy, working with adults with disabilities) to provide the names of students who had been members. In about one half of the instances, student groups did not keep membership records but were able to provide one or two names of members from the Class of 2012; five groups provided complete rosters; and several groups did not respond to the request. I also received lists of volunteers who had participated in several short-term projects (e.g., Habitat for Humanity builds). I asked staff at three community organizations that were not formally affiliated with the center to provide the names of students who had volunteered. I examined all of the then-current student activity websites for other engagement-related student groups (approximately 30) and coded officers who were listed as having participated in community engagement. I also interviewed staff and student informants: three residential-life staff members and six student leaders who worked with athletic teams and other student groups. I asked these informants to review a list of members of the Class of 2012 for whom no record of community engagement existed in the database, identify those whom they knew had participated in community engagement, and describe the nature of the activity or activities. I asked informants to name the type of activity in part to increase the accuracy of reporting by prompting them to recall particular activities or events, rather than simply identifying a given student.

I created fields for each type of additional data, then computed two dichotomous variables indicating whether or not a student had participated in community engagement: (a) using the traditional participation measures maintained in the center's database (standard measure), and (b) using both standard measures and the additional data sources described earlier (expanded measure). I used SPSS statistical software to compute frequencies of the standard and expanded measures of engagement, and computed cross-tabulations to examine differences in engagement between demographic groups. I did not employ tests of statistical significance because I analyzed an entire population of students and had no sampling error (Cowger, 1984).
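The construction of the two dichotomous measures can be illustrated with a small sketch. The study's analysis was conducted in SPSS; the fragment below is a hypothetical Python/pandas reconstruction with invented column names and data, showing how tracked and supplemental sources combine into standard and expanded measures and how cross-tabulations by demographic group can then be computed.

```python
import pandas as pd

# Hypothetical unit-record data: one row per senior, one 0/1 flag per source.
# Column names and values are illustrative, not from the study.
roster = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "gender": ["M", "F", "M", "F"],
    "cbl_course": [1, 0, 0, 1],      # tracked in the center's database
    "center_program": [0, 0, 0, 1],  # tracked in the center's database
    "team_project": [0, 1, 1, 0],    # supplemental source (e.g., athletics)
    "student_group": [0, 0, 1, 0],   # supplemental source (group rosters)
})

tracked = ["cbl_course", "center_program"]
supplemental = ["team_project", "student_group"]

# Dichotomous measures: engaged if any source in the set records participation.
roster["standard"] = roster[tracked].max(axis=1)
roster["expanded"] = roster[tracked + supplemental].max(axis=1)

# Cross-tabulate engagement rates by demographic group for each measure.
print(pd.crosstab(roster["gender"], roster["standard"], normalize="index"))
print(pd.crosstab(roster["gender"], roster["expanded"], normalize="index"))
```

Because each measure is the row-wise maximum over its set of source flags, a student counts as engaged under the expanded measure whenever any tracked or supplemental source records participation, which is the aggregation the study describes.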
It is important to note that this study was conducted as part of an effort to improve student learning through community engagement and to evaluate the effectiveness of various community engagement experiences. Because the project was originally conceived and conducted for internal evaluation and assessment efforts, rather than as a scholarly research project, I was able to link the various data sources to unique student identifiers in the institution's database.

Results
The standard measure of engagement yielded an estimate that 61% of the senior class participated in at least one community engagement activity, whereas the expanded measure yielded an estimate of 75%. This 14 percentage-point difference was not evenly distributed across demographic groups (see Table 1). With regard to gender, the estimate of men's engagement increased by 20 percentage points, compared to seven percentage points for women. With regard to race and ethnicity, the estimates for White students and for students of unknown race increased more substantially than those for Asian, Black, and multiracial students, while the estimates for Hispanic students and international students increased by comparatively small amounts. The expanded measure also produced a higher estimate of engagement among students who never received a Pell Grant than the standard measure did. The single greatest difference between the two measures was the increase in the estimated engagement of varsity athletes, from 51% under the standard measure to 82% under the expanded measure. Overall, including these supplemental data had the effect of minimizing demographic differences in engagement, resulting in greater parity in participation across demographic groups.
Most of the additional sources included in the expanded measure yielded very small changes in the engagement estimate (i.e., zero to three students). For example, all seniors who had been leaders of student groups with a community engagement focus had also participated in activities already included in the standard measure. However, there was an 11 percentage-point increase in the engagement estimate after adding the varsity athletes who had participated in community engagement through projects with their athletic teams. Moreover, the addition of these student-athletes substantially changed the demographic profile of engaged students, increasing the estimated percentage of men who participated by 17 percentage points and thereby accounting for most of the increase in participation among men.

Limitations
Several limitations of this study should be noted. First, the research was conducted at one liberal arts institution in the northeastern United States; it is quite possible that biases in estimates of engagement would differ at other institutions, in both degree and "direction." Second, though I attempted to be exhaustive in my supplemental data collection, I likely missed some instances of engagement; however, the breadth of the data-collection efforts suggests that any uncounted engagement would have had limited effect on the new estimates. Third, though informants had no apparent motivation to misrepresent students' engagement, it is possible that they made some errors in reporting. Finally, it is important to emphasize that this study attempted to measure individual students' participation in any type of community engagement; this does not imply that students or the community benefitted from all short-term, low-intensity engagement activities.

Discussion
The study findings suggest substantial undercounting of student participation in engagement activities by institutions using traditional tracking measures. This is not particularly surprising, given that additional data sources are likely to yield at least some increase in participation estimates. More importantly, this analysis revealed notable underestimates of the engagement of men, White students, students who did not report race and/or ethnicity, and athletes. In this case, understanding students' community engagement based on the standard measure would suggest problematically low levels of participation among men and White students, whereas the expanded measure reveals a different picture, with a much smaller gender gap and relative parity in engagement by race and ethnicity. Perhaps some of the differences in men's and women's community engagement found in other studies (e.g., Eyler & Giles, 1999) are attributable, in part, to imperfect measurement.
Other campuses might discover biases in their community engagement estimates if they were to conduct similar audits. Larger institutions, however, need not audit an entire population for this technique to be effective. For example, a class of 4,000 students could be audited using a sample of approximately 350 students with a margin of error of plus or minus five percentage points (at a 95% confidence level). Such future research could also employ multivariate analyses to explain differences in community engagement, in addition to the bivariate approach used in this study.
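The sample-size figure above follows the standard formula for estimating a proportion, combined with a finite population correction for a class of known size. A sketch of the arithmetic, assuming the most conservative case (p = 0.5):

```python
import math

def sample_size(N, moe=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion within the given
    margin of error (moe) at confidence level z, with a finite
    population correction for a class of N students."""
    n0 = (z**2 * p * (1 - p)) / moe**2   # infinite-population size, ~384
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(sample_size(4000))  # 351: close to the "approximately 350" cited above
```

The finite population correction is what brings the uncorrected requirement of roughly 384 respondents down to about 350 for a class of 4,000; for much larger populations the correction matters little.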
For most institutions, an empirical investigation of potential biases in community engagement measures is likely to be too resource-intensive to conduct each year. However, a one-time audit of this kind can suggest an appropriate statistical weight to apply to an institution's typical measure. Of course, some institutions may be unable to conduct an analysis that requires linking various data sources to student records. At such campuses, researchers could begin by identifying student groups or initiatives related to community engagement for which participation data are not collected, then interview students or ask them to complete short questionnaires in order to uncover potentially significant pockets of students whose only community engagement is not tracked by the institution. Where institutions rely on survey data to estimate community engagement, work to improve the questionnaire's measurement properties might be appropriate. For example, informal pre-testing or cognitive interviews have the potential to illuminate areas of community engagement that are underreported or excluded entirely from a given survey. A third approach might be to employ open-ended survey questions that ask respondents to describe the actions, activities, projects, and initiatives in which they are involved, in addition to closed-ended questions about specific types of service-learning and community engagement. These open-ended questions should be crafted in ways that avoid jargon, such as "community engagement" and "service-learning," and that invite respondents to report on a wide range of activities. Analyses of these data can exclude responses that do not fit the researcher's conception of community engagement. Though these approaches will not yield a quantitative measure of bias in a community engagement estimate, they have the potential to improve measurement of students' community engagement participation.
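One way such a statistical weight might be operationalized is to compute, within each demographic group, the ratio of the expanded to the standard engagement rate from a one-time audit, then apply those ratios to later database-only estimates. The sketch below is a hypothetical illustration; only the athletes' 51% and 82% figures come from this study, and the other rates are invented for the example.

```python
# Audit results per group: (standard_rate, expanded_rate).
# Only "athletes" reflects the study; other numbers are illustrative.
audit = {
    "women": (0.68, 0.75),
    "men": (0.52, 0.72),
    "athletes": (0.51, 0.82),
}

# Group-specific correction factor: how much the database undercounts.
weights = {g: exp / std for g, (std, exp) in audit.items()}

# In a later year, adjust database-only estimates by group, capped at 100%.
new_standard = {"women": 0.66, "men": 0.50, "athletes": 0.49}
adjusted = {g: min(1.0, new_standard[g] * weights[g]) for g in new_standard}
print(adjusted)
```

A ratio weight of this form assumes the undercount within each group is roughly stable from year to year, which is exactly the assumption an occasional re-audit would need to check.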
At the program level, these findings may help to identify groups of students who would benefit from additional support or advising. For instance, at this particular college, the study results spurred the center to strengthen its work with athletic teams, changing from a model in which two or three student leaders supported athletic team engagement to one in which members from every team applied to be an "athletic team engagement leader." Those who were selected participated in leadership trainings and workshops, worked closely with center staff, and made efforts to move teams from a community service to a community engagement perspective.
These results suggest that community engagement researchers should critically examine how engagement measures are constructed and consider the extent to which particular student subpopulations may be underrepresented. Different biases may exist at other institutions, depending on the design of engagement programs and the characteristics of the student body. Though the bias in this study was due to undercounting students' engagement through athletics, other studies may undercount engagement through other avenues, such as progressive or conservative political groups or religious organizations.
The study also raises questions about what researchers, assessment specialists, and administrators intend to measure when examining students' participation in community engagement. With regard to assessment, in some instances it may not be important for institutions to enumerate students' community engagement in activities that draw on minimal or no institutional resources. For analyses in which institutionally sponsored or supported community engagement is the key variable of interest, using a database to measure participation is not likely to be problematic. For studies seeking to understand students' community engagement in general, however, using a database has the potential to substantially bias results by undercounting engagement for particular segments of the student body. This may be particularly problematic for institutions that adopt decentralized approaches to engagement, including "train the trainer" models whereby significant community engagement may be undertaken by students who are organized by other students, or institutions that facilitate students' community engagement by posting opportunities via bulletin boards, databases, or other electronic means.
From the standpoint of measurement, these results suggest potential problems associated with comparing students who participate in community engagement with those who do not, particularly when analyzing or including low-level engagement. In this study, 35% of students who were classified, according to the standard measure, as "not engaged" (n = 61) had actually participated in community engagement, as coded within the expanded measure. Accurate measurement of participation in non-institutionally supported work may be so difficult that comparisons between low-level engagers and non-engagers do not adequately differentiate between the two groups. In other words, the results of this study raise the question of whether sufficient numbers of "low-level engagers" are coded as non-engagers in datasets, thus confounding the interpretations of analyses. With improved measurement, might low levels of community engagement be more strongly associated with desirable student outcomes?

Giles and Eyler (2013) provided important context for the concerns about increased rigor in service-learning research, noting that this call has been ongoing since the inception of the field nearly four decades ago. Scholars have argued that service-learning research could be strengthened significantly by grounding studies more firmly in appropriate theory (e.g., Bringle et al., 2013; Whitley, 2014) and by using more rigorous research designs (e.g., Bringle et al., 2013). The results of this study suggest that, in addition to theory and design, more careful attention to basic measurement is important for advancing scholarship in service-learning and community engagement. Moreover, the findings suggest that individuals engaged in the work of assessment at colleges and universities might benefit from ignoring the reassurances of many assessment texts and instead interrogating the worth of important measures related to student learning on their campuses.

Table 1. Community Engagement Participation