Senior Director, Strategy & Assessment, Swearer Center for Public Service, Brown University, Providence, Rhode Island, United States
Abstract: If you recently asked, "Don't we have this data somewhere?", this session might be for you! In this lightning talk, we provide an overview of the methodology used at the Swearer Center for Public Service at Brown University to inventory and query institutional survey data. A collaboration among the Swearer Center, the campus Teaching and Learning Center, and our graduate student researchers, this project activated institutional data for SLCE research and assessment purposes. We coded specific questions (such as how financial considerations restricted our students' ability to participate in community engagement, or how often they participated in SLCE activities) to gain a broader understanding of student experience across our programs and their time at Brown, with specific attention to barriers to full inclusion. If you have ever suspected that key data points that could improve your work are hiding from you in plain sight, come hear about one approach to finding that data. You'll walk away with a clear action plan and solid examples of how you can replicate this in your own institutional context.
Narrative: If you recently asked, "Don't we have this data somewhere?", this session might be for you! In this lightning talk, we provide an overview of the methodology used at the Swearer Center for Public Service at Brown University to inventory and query institutional survey data, with specific attention to barriers to students' full inclusion. A collaboration among the Swearer Center, the campus Teaching and Learning Center, and our graduate student researchers, this project activated institutional data for SLCE research and assessment purposes.
Institutional and national surveys within higher education are key repositories that contain a wealth of data on current and former students, including their preferences, demand for services, learning experiences, backgrounds, barriers to engagement and learning, and, in many survey questions, specific information about community engagement, community-based learning, and community-engaged research. These student surveys are one of the largest and most frequently used data sources for quality assessment in higher education [1]. Institutions of higher education administer their own home-grown surveys independently and participate in national and international surveys through networks and consortia, collecting data from their students and other constituencies on a regular basis to inform senior administrators' policy and decision-making. For example, Brown's Office of Institutional Research notes on its website that the office "supports the evaluation and planning efforts of Brown University's senior administration by initiating and conducting studies on the University's policies, academic programs, and environment" [2].
While some of these data points are used and cited regularly, there is a wealth of additional information that can only be surfaced with a deep knowledge of the survey questions. Too often, the lack of this knowledge leads us to re-create surveys or devise other tools to collect data that already exist. Spending resources re-creating survey questions we already have answers to is wasteful; it lowers survey response rates (a major concern facing survey research methods broadly, and in higher education in particular [3]); and it fatigues participants, especially the minoritized students we most want to hear from about barriers to access and inclusivity. It is also a missed opportunity to fully tell our stories, understand our impact, and utilize data that helps us meet our research, programmatic, and institutional assessment goals. This project sheds light on the value that colleagues who administer surveys, and institutional research offices, create for our work. These survey data are often collected on a regular cadence and offer robust longitudinal datasets whose analysis could inform the effectiveness of our research, assessment, and programming efforts. Some of the datasets come from national or international surveys that also allow cross-institutional and cross-country comparisons.
The project we will highlight in this lightning talk consolidated and coded institutional survey questions at Brown University across several areas of interest to teaching, learning, and community engagement. In our institutional context, we requested copies of all survey questions from the Office of Institutional Research (some institutions publish these survey instruments on their websites; see Cornell, for example [4]). We created a codebook with data categories of mutual interest to our Center and to the Teaching & Learning Center. We coded specific questions (such as how financial considerations restricted our students' ability to participate in community engagement, or how often they participated in SLCE activities) to gain a broader understanding of student experience across our programs and their time at Brown. This research project does not require Institutional Review Board approval or access to FERPA-protected data. As you identify questions of interest, you can request aggregated responses to those questions and, when available, responses for specific populations of students or respondents whose data trends you want to track and understand over time (in our case, students who participate in Swearer Center programming). This project methodology is accessible and is an excellent way to engage graduate student researchers in your research and assessment efforts. The students learned about survey design, became familiar with hundreds of survey question formats and best practices, created and implemented a codebook, and became fluent in inter-rater reliability methods.
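To make this concrete, a coded question inventory of this kind can live in a simple spreadsheet and be queried with a few lines of Python. The sketch below is illustrative only; the file name, column names, and category code are hypothetical placeholders rather than the actual codebook we used.

    # Minimal sketch: query a coded survey-question inventory.
    # Assumes a hypothetical CSV with columns: survey, year, question_text, category.
    import csv

    def load_inventory(path="question_inventory.csv"):
        """Read the coded inventory into a list of row dictionaries."""
        with open(path, newline="", encoding="utf-8") as f:
            return list(csv.DictReader(f))

    def questions_by_category(rows, category):
        """Return all questions tagged with a given codebook category."""
        return [r for r in rows if r["category"] == category]

    inventory = load_inventory()
    for row in questions_by_category(inventory, "financial_barriers"):
        print(row["survey"], row["year"], row["question_text"])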
We are now able to use the inventory to filter for questions that serve specific assessment goals (e.g., what should we know about the incoming class that could help us design or re-design programming to increase accessibility for first-year students?), including institutional accreditations and classifications (e.g., assessment questions in the Carnegie Community Engagement Classification). This project is also influencing the design of our institutional survey instruments: ensuring that questions include community engagement, re-wording questions to better align with our emerging institutional work, approach, and language, and identifying questions that are asked only every two to three years so that we can incorporate them into our center's own survey plans for more frequent data points.
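As one more hedged illustration of this kind of filtering (again assuming the same hypothetical inventory file and column names as above), a short script can flag coded questions that appear in only a few survey years, the items we might want to ask more frequently through our own instruments:

    # Sketch: flag coded questions fielded infrequently (e.g., every two to three years).
    import csv
    from collections import defaultdict

    with open("question_inventory.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    years_per_question = defaultdict(set)
    for r in rows:
        years_per_question[r["question_text"]].add(r["year"])

    for question, years in sorted(years_per_question.items()):
        if len(years) <= 2:  # appears in two or fewer survey years
            print(f"Infrequent item {sorted(years)}: {question}")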
If you've ever suspected that key data points that could improve your student programs and outcomes, especially in relation to access and inclusivity, are hiding from you in plain sight, come hear about one approach to finding that data. You'll walk away with a clear action plan and solid examples of how you can replicate this in your own institutional context.
References:
1. Williams, J. (2014). Student feedback on the experience of higher education: A significant component of institutional research data. In M. E. Menon et al. (Eds.), Using data to improve higher education (pp. 67–80). Dordrecht: Sense Publishers.
2. Brown University Office of Institutional Research. https://oir.brown.edu/
3. Fosnacht, K., Sarraf, S., Howe, E., & Peck, L. K. (2017). How important are high response rates for college surveys? The Review of Higher Education, 40(2), 245–265. https://doi.org/10.1353/rhe.2017.0003
4. Cornell University Institutional Research & Planning, Senior Survey. https://irp.dpb.cornell.edu/surveys/senior-survey