Missed Exams and Lost Opportunities: Who is Absent from College Admission Testing in Virginia?

Author: Tod Massa
December 22, 2018

Authors: Emily E. Cook and Sarah Turner
October 22, 2018

University of Virginia, Department of Economics and EdPolicyWorks

Summary

In Virginia, there are substantial differences across school districts in enrollment at four-year colleges and universities, with students from districts in low-income and relatively rural areas enrolling at the lowest rates (Cook et al., 2017).  Similarly, there are differences in completion of college admissions tests, which are an important application requirement at most four-year universities.  Participation in these exams is near-universal in some Virginia districts, while in others only a minority of students participate.

While about half of U.S. states now require all high school students to take a college admissions test, in Virginia, decisions about student participation in college testing are left to district policy and parental discretion.  Using data from the Virginia Longitudinal Data System (VLDS), we estimate the likely performance of students who did not take the SAT or the PSAT to understand whether these students were likely to score at levels that would permit enrollment at four-year colleges and universities.  In turn, we examine how these missing test-takers are distributed by race, geography, and expected socio-economic status.

The central result of this analysis is that missed college admission tests substantially reduce the pool of students positioned to apply to four-year post-secondary institutions in states like Virginia.  The potential pool of applicants could be enlarged by over 40 percent at broad-access four-year institutions and nearly 20 percent at the most selective public and private colleges and universities. College-ready students who miss college entrance exams are disproportionately economically disadvantaged and often attend high school in relatively small districts in more rural parts of the state.  Targeted outreach based on prior academic performance may be an effective policy tool, as it would encourage students who are likely college-ready to take an admissions exam, without incurring the financial and administrative burden of statewide testing mandates.

Background

Over the course of the last 15 years, a number of states have entered agreements with test providers to offer either the ACT or the SAT to all students in the state.  By the spring of 2017, 25 states required students to take either the SAT or the ACT, with 12 of these states using the college admissions exams to satisfy federal accountability guidelines.  Many of these mandatory testing policies have been implemented within the last five years.

Several researchers have assessed the testing and college enrollment effects of these policies.  In one of the earliest studies, Klasik (2013) examined the introduction of policies in Illinois, Colorado, and Maine and found evidence that the policies often shifted students from two-year institutions to four-year institutions.  Goodman (2016) assessed adoption in five states relative to adjacent states, examining test-taking, test performance, and college-going.  In Colorado and Illinois, the mandates induced between one-third and one-half of students to sit for the exams (“compliers”), while 40-45% of new test-takers earned scores sufficient to attend competitive colleges.  A particularly striking result is the finding that new test-takers came disproportionately from disadvantaged backgrounds.  Focusing on the state of Michigan, analysis by Hyman (2017) shows that universal testing would produce a 22.7% increase in the number of Michigan students scoring above a college-ready standard of 20 on the ACT, with these students accounting for 21.3% of the non-taker pool.

Taking the SAT remains optional in nearly all Virginia districts, while there is some district-level provision of the PSAT; Johnson (2018) provides a preliminary inventory of districts with PSAT mandates.  PSAT-taking varies across districts, from near-universal in districts providing access in multiple grades to levels often below 50% in districts where registration is student-initiated.  Overall, Virginia ranked 10th in SAT-taking among the 35 states that had no mandatory testing policy (either SAT or ACT) for the 2014 graduating cohort.

Methods

Data on prior academic achievement, in the form of Standards of Learning (SOL) scores, allow us to predict testing outcomes for students who do not take the PSAT or the SAT.  The problem of predicting SAT and PSAT scores for non-takers is a missing data issue.  To preserve the variation in the data, we employ a method that uses randomly selected records matched on observed characteristics to fill in missing data (in the spirit of various “hot deck” imputation methods).  This technique is frequently applied when variation within the data is a key outcome, or when the tails of a distribution are of interest, such as for income measures in the Current Population Survey.

Our strategy uses records from “similar” students in strata defined using SOL scores, race, and a proxy for income.  In the baseline, we define strata based on a black/non-black indicator, an indicator for disadvantaged status (which includes eligibility for free and reduced-price lunch, TANF, or Medicaid, or homeless or migrant status), and average SOL scores computed across all four subjects (Reading, Writing, Algebra, and Science) and grouped into 25 quantiles.
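As a concrete illustration, a stratified hot-deck imputation along these lines can be sketched in a few lines of code.  The snippet below is a minimal sketch rather than the authors' implementation; the column names (sol_avg, black, disadvantaged, psat_score) are hypothetical stand-ins for the corresponding VLDS fields.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def hot_deck_impute(df, score_col="psat_score",
                    strata_cols=("black", "disadvantaged", "sol_quantile")):
    """Fill missing test scores by drawing a random donor record
    from the same stratum (a simple hot-deck imputation)."""
    df = df.copy()
    # Average SOL score across the four subjects, grouped into 25 quantiles
    df["sol_quantile"] = pd.qcut(df["sol_avg"], 25, labels=False, duplicates="drop")
    imputed = df[score_col].copy()
    for _, idx in df.groupby(list(strata_cols)).groups.items():
        cell = df.loc[idx, score_col]
        donors = cell.dropna().to_numpy()
        missing = cell.index[cell.isna()]
        if len(donors) and len(missing):
            # Sampling donor scores (rather than assigning the cell mean)
            # preserves the within-stratum variation in scores
            imputed.loc[missing] = rng.choice(donors, size=len(missing))
    return imputed
```

Under this sketch, records in strata with no observed takers would remain missing; in practice such cells could be handled by coarsening the strata.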

A key assumption is that the data are Missing at Random (MAR), which implies that determinants of test-taking, conditional on the factors used in imputation (SOL scores, race, and disadvantaged status), are not correlated with test performance.  While we pursue several additional imputation schemes, we emphasize that the availability of measures of prior academic achievement (the SOL scores) dramatically reduces the likely impact of selection into test-taking on the estimates (Garlick and Hyman, 2017).  One interpretation of our approach is that the estimated test scores for non-takers represent the potential performance that would follow from having school and family supports similar to those received by takers.
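In notation (ours, not the paper's), let $D$ indicate that a student took the exam, let $Y$ be the (possibly unobserved) test score, and let $X$ collect the stratum variables (SOL quantile, the race indicator, and disadvantaged status).  The MAR assumption can then be written as

\[
\Pr(D = 1 \mid Y, X) \;=\; \Pr(D = 1 \mid X), \qquad \text{equivalently} \qquad Y \perp D \mid X .
\]

Under this condition, the score distribution among takers within a stratum is a valid stand-in for the distribution among non-takers in the same stratum.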

Findings

Starting with the case of the PSAT, a substantial number of students who did not take the test (“non-takers”) would have been expected to score at a level indicative of college readiness, even though the mean predicted score for non-takers is lower than the mean for takers (83 and 93, respectively).  Using counts from 2014 as a reference, a universal testing mandate in Virginia would induce 21,656 students to take the PSAT, with an estimated 3,463 additional students scoring at or above 100 (the 65th percentile in Virginia).  While this represents an 18% increase in the number of students with scores above 100, more than 18,000 induced test-takers would score below this level.
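The headline counts fit together with a quick back-of-the-envelope calculation; the 2014 figures below come from the text, while the implied count of current high scorers is our rough inference rather than a reported number.

```python
induced_takers = 21_656   # non-takers induced to test under a universal mandate (2014 cohort)
new_above_100 = 3_463     # induced takers predicted to score at or above 100

print(induced_takers - new_above_100)   # 18,193 -> "more than 18,000" below the benchmark

# The reported 18% increase implies roughly 3_463 / 0.18, about 19,200 current
# test-takers already at or above 100 (a rough inference, not a reported figure).
```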

Student-level policies that encourage testing for those likely to achieve college-level academic proficiency would be more efficient in maximizing the number of college-ready students taking the test while minimizing the costs of testing.  Had all students in the state who scored above the 12th quantile (out of 25) on the SOLs taken the PSAT, we estimate that there would be an additional 3,185 students with scores above 100, very close to the 3,463 additional scores above 100 that might be achieved under a statewide universal testing policy.
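A targeted-outreach rule of this kind is straightforward to express against the imputed data.  The sketch below reuses the hypothetical columns from the earlier snippet; took_psat and psat_imputed are likewise illustrative names, and the quantile indexing is illustrative rather than taken from the paper.

```python
def targeted_yield(df, quantile_cutoff=12, score_cutoff=100):
    """Count non-takers above the SOL quantile cutoff whose imputed PSAT
    score clears the college-ready benchmark."""
    targeted = df[(df["took_psat"] == 0) & (df["sol_quantile"] > quantile_cutoff)]
    return (targeted["psat_imputed"] >= score_cutoff).sum()

# Compare targeted_yield(df) with the roughly 3,463 additional high scores
# expected under a statewide universal testing policy.
```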

Turning to the SAT, we consider how the inclusion of the missing SAT test-takers would change the potential pool of college-ready students by computing the ratio of non-takers to takers at a rough approximation of each institution's admissions standard (the sum of the 25th-percentile Math and 25th-percentile Verbal scores among enrolled students) for different colleges in the state.  Figure 1 shows these results.  Overall, institutions like Liberty University and Old Dominion University could expect their pool of potential applicants to increase by over 40%, while the University of Virginia and the College of William & Mary might gain just under 20% in the number of potential in-state applicants.  Among demographic subgroups, it is non-black, economically disadvantaged students who are most likely to achieve college-ready scores without currently taking the test, as shown in the bottom panel of Figure 1.
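The Figure 1 calculation can be expressed compactly.  The snippet below is a sketch under the same hypothetical column names, where sat_total holds realized scores for takers and imputed scores for non-takers, and the per-institution cutoffs are supplied by the analyst.

```python
import pandas as pd

def pool_increase_by_school(df, thresholds):
    """Ratio of college-ready non-takers to takers at each institution's
    benchmark (25th-percentile Math + Verbal among enrolled students)."""
    rows = []
    for school, cutoff in thresholds.items():
        eligible = df[df["sat_total"] >= cutoff]
        takers = (eligible["took_sat"] == 1).sum()
        non_takers = (eligible["took_sat"] == 0).sum()
        rows.append({"school": school,
                     "nontaker_to_taker_ratio": non_takers / takers})
    return pd.DataFrame(rows)

# thresholds = {"LU": ..., "ODU": ..., "UVA": ..., "W&M": ...}  # per-institution cutoffs
```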

Policy Implications

The pool of potential four-year college applicants in Virginia is markedly smaller than it would be if a broader group of students took the college admissions tests. Students who are likely college-ready but miss the entrance exams are disproportionately economically disadvantaged and often attend high school in relatively small districts in more rural parts of the state.

With large predicted gains in the college-eligible pool from increased test-taking, what are the barriers to greater testing participation?  Some students, particularly first-generation and low-income students, may not take the tests simply because they are unaware of the potential benefits of four-year college attendance.  While some districts resolve this problem with universal or “opt-out” testing, other districts lack the financial and time resources to implement such policies.  District-wide testing requires not only financial resources but also considerable professional time and coordination to administer the tests.  Universal testing at either the district or state level may place the highest burden on students who are not college-ready and on districts with the most limited resources for test administration and student guidance.

Our findings suggest that it is possible to achieve significant gains in college admission test-taking among college-ready students without incurring the full costs of statewide mandatory testing. Individual guidance based on prior statewide assessments (the SOL exams in Virginia) may be an effective tool for encouraging test-taking among potential high-scorers.  Students below the 20th percentile on the SOLs are highly unlikely to score in a college-admission eligible range, while students at the very top of the distribution are likely to achieve an admission-eligible score.  Targeted outreach, combined with interventions designed to reduce financial and administrative barriers for students and schools, may generate a significant increase in test-taking among the college-ready without incurring the financial and administrative burden of statewide testing mandates. 

The VLDS provides an important tool for the identification of students most likely to benefit from greater access to resources throughout the college admissions and college choice process.  The results obtained here provide motivation for data-driven policy experimentation aimed at expanding access to higher education among college-ready students.

References

Cook, E. E., Turner, S., & Romero, J. (2017). Transitioning from High School to College: Differences across Virginia. FRB Richmond Economic Brief, 17(12).

Garlick, R., & Hyman, J. (2017). Quasi-Experimental Evaluation of Alternative Sample Selection Corrections.

Goodman, S. (2016). Learning from the Test: Raising Selective College Enrollment by Providing Information. The Review of Economics and Statistics, 98(October), 671–684.

Hyman, J. (2017). ACT for All: The Effect of Mandatory College Entrance Exams on Postsecondary Attainment and Choice. Education Finance and Policy, 12(3), 281–311.

Johnson, K. (2018). In or Out? The Effect of PSAT Testing Practices on College-Going Behavior. University of Virginia Senior Thesis.

Klasik, D. (2013). The ACT of Enrollment: The College Enrollment Effects of State-Required College Entrance Exam Testing. Educational Researcher, 42(April), 151–160.

Figure 1: Share Missing by College Eligibility

Panel A. All students

Panel B. By demographic group

Note: Authors’ analysis of VLDS data on students in the 2014 graduating cohort with average SOL scores greater than or equal to 400.  Using realized (for takers) and predicted (for non-takers) SAT scores, we compute the ratio of non-takers to takers for the population with SAT scores above the sum of the 25th percentile Math and Verbal (Critical Reading) SAT scores among enrolled students at several universities in Virginia.  College codes identify Liberty University (LU), Old Dominion University (ODU), Virginia Commonwealth University (VCU), James Madison University (JMU), Virginia Tech (VT), University of Richmond (UR), the University of Virginia (UVA), and the College of William and Mary (W&M).

Tags: Entrance exams, college testing, college participation, postsecondary pipeline
