Rankings, Rankings, and More Rankings

By Watson Scott Swail, President & Senior Research Scholar, Educational Policy Institute

In this week’s news, Kevin Carey and Thomas Toch (see “College Rankings: Higher Education’s Battle Royal”) discuss the upcoming US News and World Report Rankings Guide, which will hit the newsstands in a few weeks. Unfortunately, Carey and Toch take a purely populist view of the rankings and related issues. And this has set me off.

It’s certainly true that institutions are walking away from rankings reports. In Canada, Maclean’s magazine, which produces Canadian rankings, has seen an unprecedented downturn in institutional participation. Last year, 19 major Canadian institutions (almost half of the four-year institutions across the country) denounced the rankings and declined to participate. But in the US, with the exception of a very few institutions, higher education presidents and CEOs are relatively mercenary about the rankings: they hate ’em when they don’t rank; love ’em when they do. If you attended a halfway decent institution, you probably know I’m right. I’ve read alumni letters from William & Mary (my wife’s alma mater) and other reputable institutions that in the first paragraph downplay the importance of the rankings, followed by a quick acknowledgement of their No. 25 ranking in the liberal arts category. Then, of course, they ask for money.

I don’t blame the universities. It’s part of the game. But only because we have let it become so. And I’m not so inclined to buy the authors’ comment that “presidents have a right to be angry.” They get what they want and they use it to fundraise if they can.

Carey and Toch say that the tides are turning in this ranking business. Maybe they are. But their argument is weak, at best:

“New research and advances in technology in the last several years have led to a host of new ways to measure the performance of colleges and universities—and the new metrics are yielding some surprising results.”

As a researcher, I find it insulting when writers throw around terms like “surprising results” without telling us what the surprising results are. It’s so gimmicky. In talking about the CLA (Collegiate Learning Assessment), which was produced by a consortium led by the very talented RAND Corp (and I mean that), the authors state that the “results have been eye-opening.” But Carey and Toch don’t bother to open our eyes with any of these results. It’s kind of like believing that things in Iraq are going “really, really well” and that we’ve made “significant progress” because our President says so (EDITOR’S NOTE: 8 out of 18 isn’t very “significant”). The power of the pulpit and of the press.

In reality, the outcomes of the CLA have been mixed at best. How does one truly measure the diversity of college learning in the most diverse education system in the world? The CLA is an interesting experiment, but to think it is the Holy Grail of campus quality is nearsighted at best. I’m sure RAND wouldn’t disagree. But it’s a start, for sure.

Carey and Toch then suggest that the National Survey of Student Engagement, better known as the NSSE, is an important tool for documenting how “good” schools are. The NSSE is a decent tool for institutions to gauge what students think of their institution. Built off the backs of decades of IR surveys, its developers have created a trademarked tool (yes, as we’ve found, you apparently can trademark “how many hours do you study a week?”) to hopefully help institutions plan for better student services. The authors state that institutions aren’t willing to provide their NSSE information for public use. In our experience, that isn’t altogether true, but still, institutions are never overly keen on providing insights into their sausage factory. As well, NSSE doesn’t publish that information. So I’ll give them that point. But they do comment that high-scoring NSSE schools rate low in the US News and World Report rankings. I’m not sure if this is true, but if it is, it suggests one of two things: either the US News ranking system is flawed (which surely it is to some degree), or the NSSE is, which I am unable to comment on. It’s important to remember that the NSSE wasn’t designed to be used in rankings, which makes it very interesting that the NSSE people are now working with USA Today to provide college-based NSSE information for the public. We’ll see how well that goes.

Carey and Toch are absolutely correct in suggesting that more information needs to be made public on how well our schools are doing. We’ve built a system called University Navigator in Canada to do just that, and we hope to bring it to the US soon if we can secure the funding and legal clearances, both of which we are working on at present. Regardless of what we are doing, hopefully the NSSE, the CLA, and the other discussions are opening the door so that students, parents, and policymakers can see behind the Wizard’s (or Wizards’) curtain. But we have a long way to go in terms of assessing institutional “effectiveness” and “learning.” In the meantime, here’s to you and your alma mater’s 2007 ranking!
