ABC News
What We’re Missing In Measuring Who’s Ready For College

Who’s ready for college? At first blush, not a lot of us. By most measures, far fewer than half of those taking college entrance exams could tread water at our nation’s colleges and universities.

That’s not too surprising. College enrollment rates are higher than completion rates, suggesting that while the draw of a post-secondary degree is strong and growing, something goes awry once a student kicks off her college career.

At the same time, the pressure is on to send even more students to college, with President Obama challenging the nation to reclaim, by 2020, the title of the country with the highest percentage of adults holding a post-secondary degree. The Lumina Foundation, an influential nonprofit that supports efforts to increase the number of college graduates in the U.S., is committed to raising the portion of the U.S. workforce that has a post-secondary degree to 60 percent by 2025. Today that figure hovers at 42 percent for those age 25 and over, according to U.S. Census data released this year.

A more educated workforce would be a great boon to the economy: Scholars say more employees would earn higher wages, which leads to more taxes being collected and fewer Americans grappling with the challenges of poverty, among other benefits.

Before we can implement policies designed to shepherd more of this country’s residents toward a college degree, we must actually know what makes a student college-ready. But what if our definitions of college readiness are incomplete, or worse, painting an unreasonably dour picture of how prepared U.S. students are for the rigors of college?

“Everyone has their own definition of college readiness, which makes it a little tricky,” said Jack Buckley, the head of research at the College Board, who previously led the Department of Education’s research arm.

So tricky, in fact, that there’s sharp disagreement over whether test scores or high school grades are better predictors of college readiness.

Resolving that disagreement may help underprepared students avoid financial distress: Enticing high schoolers and adults to enroll in college without gauging their ability to complete a degree can saddle them with debt they’ll struggle to repay if they don’t graduate. Identifying the factors that make someone ready for college can also steer nontraditional students — those who otherwise wouldn’t attend college or would enroll in a weak institution — toward a degree.

That U.S. colleges could do a better job of finding talent is hardly a secret. Shaun Harper, a researcher at the University of Pennsylvania, found that in numerous instances, college recruiters tend to concentrate on high schools with an established track record of talented students, overlooking schools with predominantly black and Latino students, regardless of whether those students are ready for the rigors of college.

A 2013 report issued by the Center on Education and the Workforce at Georgetown University found that white students tend to enroll at the nation’s leading colleges while black and Latino students largely attend schools with lower admissions standards. Though the number of black and Latino freshmen enrolling at colleges greatly increased (by 73 and 107 percent, respectively) between 1995 and 2009, the majority of those students have entered open-enrollment institutions — schools that are considered academically weaker than flagship public universities and elite private colleges. A small slice — about a tenth — enrolled at one of the 468 most-selective colleges in the U.S.; of the freshmen representing the increased enrollment by white students in the same period, 82 percent wound up at one of those elite schools.

In many cases, black and Latino students who have the potential to enroll at better schools aren’t doing so. Black and Latino students with an A average in high school matriculated at a community college 30 percent of the time, while the same was true for 22 percent of white students with A averages, the report’s authors found. They note that “each year, there are 111,000 high-scoring African-American and Hispanic students who either do not attend college or don’t graduate.”

Think about that: Since 1995, the U.S. has missed out on about 2 million black and Latino students completing a post-secondary degree.1

The College Board argues that college readiness can be measured by how well a student scores on the SAT, one of the many standardized tests it produces. A student who earns a 1550 on the SAT out of a possible 2400, the College Board says, has a 65 percent chance of achieving a B- average in her first year of college. Students who clear this threshold graduate from college after six years 69 percent of the time, while those who score below 1550 graduate in six years just 45 percent of the time, according to the College Board. In 2014, more than half of SAT test-takers earned scores lower than 1550, a sign to the College Board that they’re unlikely to be college-ready. By this metric, only 16 percent of black students and 23 percent of Hispanic students were considered ready for college.

The other major player in college entrance exams is ACT, which is the name of both the organization and its flagship test. ACT maintains that students who meet its college-readiness benchmarks have a 50 percent chance of earning a B average in select courses during their first year of college. So a student scoring a 22 out of a possible 36 on the math section has a 50 percent chance of earning a B in college-level algebra. Close to 40 percent of ACT test-takers in 2014 hit the college-ready benchmarks in three or all four of the tested subjects. One-third of test-takers didn’t meet the benchmarks in any subject.

The makers of the nation’s gold standard for measuring how much K-12 students know, the National Assessment of Educational Progress (NAEP), seem to find value in the college entrance exams. In a bid to transform its 12th-grade assessments into a national benchmark for college readiness, the governing board overseeing the NAEP linked its scores to the SAT to create its own college-readiness indicator. The latest version shows that nearly 4 in 10 seniors have the math and reading skills needed to perform reasonably well in college — quite similar to the college readiness findings from the ACT and the College Board.2

Not so fast, says William Hiss, a former admissions dean at Bates College in Maine, who argues that college entrance exams do not accurately foretell student performance in post-secondary settings. He and a co-author made the case in a 2014 paper that high school GPAs are much better at predicting a student’s college grades and likelihood of graduating.

“What we have found is that in a significant number of cases, the students who have perfectly sound high school records, but much less impressive SAT scores, do fine in college,” Hiss said in a phone interview.

In the study, Hiss and his co-author broke down the high school transcripts and college performances of 123,000 students in 33 colleges and universities of various sizes and statures that did not require test scores as part of the admissions process. The authors compared students who did submit ACT or SAT scores to those who did not, granting the institutions anonymity in exchange for access to student admissions data.

Overall, students who didn’t submit their ACT or SAT scores posted high school GPAs that were similar to students who did. The report also found that among the accepted students, those with strong GPAs in high school performed reasonably well in college, while students with relatively strong ACT or SAT scores but lower high school GPAs finished with slightly lower college GPAs and graduated less frequently.3

To Buckley, tests like the SAT complement GPAs and provide a standard by which to judge the millions of students who submit report cards from tens of thousands of high schools, each with its own definition of what makes an A or a B.

For Hiss, “the SAT and ACT are acting as what any statistician would call a false negative,” identifying students as lacking the qualities to perform well in college when in fact they can graduate with solid GPAs.

The debate over how much to value test scores versus GPAs gives way to a larger question of what high school transcripts and SAT results are telling us about students. In many ways, these numbers serve as proxies for other qualities that contribute to college-readiness but are much harder to measure.

An increasing number of scholars have begun looking at the nonacademic factors that enable students to work their way through the challenges of college and careers. David Conley at the University of Oregon created the Four Keys to College and Career Readiness, which include not only testing acumen but also thinking and reasoning skills, familiarity with the college-attendance process, the ability to self-motivate and assess one’s progress, and other attributes.

Other researchers have set their sights on “noncognitive” attributes like grit, motivation and perseverance. Decades of research in psychology suggest that there’s power in helping students develop growth mind-sets — in other words, if educators change the way disadvantaged students think about their own abilities, the students might perform better academically.

There’s also something called the “hidden curriculum”: those non-academic skills, like scheduling meetings with academic advisers and forging relationships with tutors, that are less familiar to low-income and first-generation college students. Missing out on the hidden curriculum can be the difference between completing one’s studies and dropping out entirely.

Even Buckley acknowledged, “There are other things that you need to know to be prepared for college. You’ve got to have actual knowledge of the college process itself; you need to have the study skills and other noncognitive skills that are necessary.”

“It’s not enough to look at anyone’s test scores,” he added.

Maybe a longitudinal data system that tracked every student’s K-12 and college performance — something the Obama administration has encouraged states to develop — would lend more clarity to how we measure college readiness. With that data in hand, would we need college entrance exams like the ACT or SAT?

“Even that data system is not going to have all the key elements of readiness,” Buckley said. “I need assessments of their noncognitive skills; I need to know about their maturity. To really have a very strong estimate, broadly defined, you just need data that we don’t have, or don’t have on everyone.”


  1. This is according to a calculation provided by the Center on Education and the Workforce.

  2. Unlike the other standardized tests, however, NAEP is a statistically representative assessment of all U.S. students.

  3. At many private colleges with test-optional admissions policies participating in this study, students who didn’t submit their SAT scores during the admissions process shared their scores once they were granted acceptance (or the college knew of the scores, but didn’t factor them into the admissions process). Using those records, Hiss was able to calculate that while the difference between the SAT scores of submitters and non-submitters was 149 points in favor of submitters, the high school and college GPAs were nearly identical, and so were the college completion rates.

Mikhail Zinshteyn is a program manager at the Education Writers Association.