College Admissions, Selectivity, and Grit

By Watson Scott Swail, President & Senior Research Scholar, Educational Policy Institute

In 2013, Angela Duckworth became a bit of a phenomenon for her research on grit, later popularized in her book “Grit: The Power of Passion and Perseverance.” She toured the talk shows and became the flag-bearer for the idea that individual passion and perseverance have more to do with future success than academics alone. Five years later, the “grit” concept has many critics. Empirical research has shown that grit has a limited and “weak” impact on future success.[1]

For the past 30 years, there has been a push toward non-cognitive factors in student success. I have written and spoken prominently on the balance between academic data and the non-cognitive and social attributes of students in the Geometric Framework for Student Retention. William Sedlacek of the University of Maryland was one of the first researchers to focus on non-cognitive variables in predicting future success for non-traditional students. His basic point remains that grades and academic testing data are not, in isolation, an appropriate way to measure the future success of students. Thus, the game is about finding variables beyond traditional test scores to provide other indicators of success. Duckworth’s research on grit and other variables was largely based on this belief in not over-relying on traditional measures of academic achievement and potential.

In the end, we must always note that these issues are mostly of interest to selective institutions of higher education, which account for less than half of the students who begin their college experience each fall. However, if we focus exclusively on the four-year level, then 81 percent of students attend institutions that are at least marginally selective. It can be argued that only the very selective institutions focus discretely on these numbers, but certainly the moderately selective schools do as well.

As illustrated in the graphic below, 26 percent of public four-year students attend very selective institutions, compared to 41 percent of students at private, not-for-profit four-year institutions. In fact, only five percent of students at the private level attend open-admissions institutions, compared to 18 percent at publics. But the selectivity of four-year institutions has increased over the years. Once thought of primarily as a private-institution issue, sheer demand has forced public institutions to be much more selective in admitting students, especially at land-grant institutions.

Exhibit 1. Selectivity of US Institutions of Higher Education, by Control and Level, 2011-12


SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:12/14). Data downloaded and analyzed by the Educational Policy Institute, June 6, 2018.

But measuring non-cognitive indicators is difficult. Perhaps tricky is a better term. Just as with academic measures, such as the ACT and SAT tests, there are issues of bias and validity with any type of test. The fact that these two measures, the ACT and SAT, still have bias and validity issues after more than a century of research and development by arguably the best psychometricians in the world should put this in perspective: if these tests still carry questions of internal and external validity, then what about non-cognitive tests and measures that have had nowhere near the same research and development? Would they be more biased and less conclusive? The answer is yes, most certainly.

The task of admissions personnel at institutions that admit most of their applicants is not terribly difficult. But as an institution becomes more selective, the process gets much more complex and difficult. The most selective institutions, many of which accept less than 15 percent of their applicants (e.g., Stanford at 5.1 percent, Harvard at 6 percent, Princeton at 7.4 percent), need a process for sifting through students who are largely very talented. For instance, only 1 percent of students who take the SAT earn a perfect 1600, and only 5 percent score over 1400.[2] Yet many of these top schools have hundreds of applicants with perfect or near-perfect test scores.

To put this in perspective, Stanford University had 44,073 applicants in 2017, of which 2,085 were admitted and 1,708 enrolled. Reported SAT scores ranged from 690 to 760 in reading and writing and from 700 to 780 in math, and ACT scores fell between 32 and 35. Thus, almost everyone who applied to Stanford had an exceptionally high test score. Harvard has over 200 students applying each fall with perfect SAT scores.[3] UC Berkeley received over 89,000 freshman applications plus another 19,000 transfer applications in fall 2017, with an average SAT of 1337 and an average ACT of 29, and over 10,000 applicants scored above 1400. That translates to 108,000 highly qualified applications for 14,000 freshman positions.

With so many applicants and so few spaces, relatively speaking, how can these institutions possibly decide who gets in? Non-cognitive testing? Not likely. It has been noted that at some point, once applicants reach the highest 10 percent of test scores, the scores cease to matter at these institutions.

Most institutions say that GPA and test scores are useful but not decisive indicators in selection. Other factors, such as academic coursework (read: rigor), extracurriculars, essays, recommendations, and the “interview” are also considered. But let’s face it: the extracurriculars, essays, and recommendations are all fodder; they will all be good. The interview is important, but how many of the 108,000 applying to UC Berkeley get the chance to interview? In the end, the prime factor that still must be addressed is academic ability. The standardized test score and course rigor are the two items that should matter most in admissions, I am afraid to say. GPA is not statistically useful because of the rampant grade inflation of recent decades. The average GPA at Berkeley is 4.10. So, big deal. What matters at these highly selective institutions is how students score on a standardized test and what courses they completed.

This isn’t the message a lot of people want to hear, but at selective institutions, this is the way it is and the way it should be. For the other institutions, the moderately and minimally selective, admissions processes are even more challenging: the rigor of their applicants is still high, even if a step down from the top tier, and there are more of them. Hundreds of thousands of students with decent test scores and academics are trying to get into these hundreds of institutions. Now we get into a numbers issue: simply put, crafting a fair admissions strategy with so many applicants. In the end, the numbers have to rule, because everything else would seemingly be unfair. We have arguments when legacies get into schools ahead of others; when sports and other factors are weighed; and, of course, when race/ethnicity is considered, as attested by the numerous court cases over the decades brought by plaintiffs who have mostly argued that affirmative action should not be the law of admissions.

Thus, in the end, grit is a wonderful four-letter word that doesn’t necessarily matter that much unless you believe that grit is what gets you good test scores and the completion of high-level coursework in high school. Otherwise, what is grit really measuring? Perseverance? I’d say getting top test scores and taking the right courses also indicates perseverance. And, no argument here, perhaps also preference, opportunity, and social status.

The argument against, of course, is that not all students are in a position to take and complete high-level courses, due in large part to the dismal teaching and learning environments where they are raised. This is true. But can we really dismiss the actual academic performance of top-tier students at top-tier institutions? This isn’t an anti-affirmative-action rant. Not at all. But if we really want to level the playing field at the college level, we need to level education at the K-12 level first.

To do this, we have to convince policymakers to invest heavily in public schools, not charter schools, and change how we educate students, especially those who do not typically go to college or who go and do not succeed. I’m talking about low-income students, first-generation students, and students of color. The best affirmative action we can undertake is to change the process of teaching and learning. Sure, we can increase the expectations and requirements for high school graduation, but we’ve done that before with dreadful outcomes. You can only change those expectations once you retool the system. This is not a simple effort, of course. We’ve been retooling for decades. We have lived through open-space schools, gender-only schools, and even experiments with the curriculum, as we did with algebra and geometry in the EQUITY 2000 project when I worked at the College Board. That didn’t really work, either, because school districts failed to provide the necessary supports for students to succeed after years and years of bad mathematics instruction. In the age of educational technologies, we have still largely failed to harness their potential for altering the teaching and learning environment, arguably because we have only placed those systems on top of traditional curricula and standards maps.

The challenge, of course, is that we cannot create an equitable system overnight, and perhaps never will. And this will remain a long, frustrating issue for higher education: how do we equitably admit students to our selective institutions?

It will still come down to test scores. That is the truest form of grit we have.

[1] Credé, M., Tynan, M. C., & Harms, P. D. (2017). Much ado about grit: A meta-analytic synthesis of the grit literature. Journal of Personality and Social Psychology, 113(3).


