Admit This

Are College Rankings Good?

My posts of late have been centered on college rankings. It seems as though almost anyone who has even the most rudimentary knowledge of the college world is issuing rankings these days. I'm waiting for a new TV reality show where the participants visit colleges and then rank them. What might that show be called? Rank This! (?) You're Ranked! (?) America's Rankest Colleges (?)

Anyway, I stumbled across an interesting article on Consumerist.com the other day: 5 Reasons Why Every Single College Ranking Is a Pile of Crap. Wow. With a title like that, the author, Zac Bissonnette, could be a producer for America's Rankest Colleges.


As you no doubt know if you're a regular reader of Admit This!, I am not a fan of college rankings. I do realize, though, that in the face of such a huge number of choices out there, some sort of sorting system can be helpful. But who do you believe? Whom do you trust?

Let's take a look at what Zac has to say:

Almost weekly, it seems, consumers are introduced to yet another annual ranking of the "best," "worst" or "biggest party" colleges in America.

US News & World Report produces the most widely read college rankings, but Forbes, The Princeton Review, and several other publications have produced their own. The problem is that every single one of these rankings is just absolutely, completely, and totally full of crap.

This probably isn't the smartest thing to say for my career as someone who writes about colleges, but whatever. Let's look at just a few examples of the problems with these rankings:

1. College rankings are often based on opinion and not actual data

US News & World Report is now producing a list of schools that have the "best undergraduate teaching." How did they do that? According to their description of their methodology, the magazine "asked top academics as part of the regular U.S. News peer assessment survey to name the schools that they think have faculty with an unusually strong 'commitment to undergraduate teaching.'" Got that? They measured the quality of teaching at one school by asking people who work at other schools how good the teaching is. It would be like basing the Fortune 500 on just the opinions of other CEOs instead of things like revenue and profit.

2. Rankings can be based on factors that have no demonstrated impact on academic results

Many college rankings - including US News & World Report - rank colleges based on the proportion of faculty who are full-time. The problem? According to a report from The Delta Cost Project [PDF], "in higher education, in contrast to K-12, there is no consistent research showing that access to full-time faculty pays off in greater student learning, student retention, or degree attainment." Oops.

3. The benefits of attending a more selective college might very well be canceled out by the benefits of attending a less selective college

Most of the major college rankings are based in part on selectivity: either by looking at the acceptance rate or by looking at the high school GPAs and SAT scores of students. But a savvy student might be better off attending a school with a bunch of students who are dumber than he is. Why? A recent study of law school grads found that the correlation between class rank and salary is stronger than the correlation between school prestige and salary. "Under-matching" - that is, attending a law school where you're smarter than many of your classmates - is likely to result in better grades, a better class rank, and a higher salary. Princeton economist Alan Krueger has theorized that this phenomenon may explain why students who get into elite colleges but attend less elite ones earn as much money as students who attend the elite colleges. Krueger found that students who graduate seven percentile ranks higher in their class tend to earn about 3.5 percent more money.

4. Rankings that look at career earnings fail to consider the aptitudes of students

Payscale Inc. gained a lot of press when it published a ranking of colleges based on a return on investment calculation: taking the earnings of graduates and comparing them to the sticker price of the colleges. The problem with this ranking is that it assumes the only difference between an MIT grad (MIT being the #1 ranked school by return on investment) and a Black Hills State grad (Black Hills State ranking dead last at #852) is that one went to MIT and the other went to Black Hills State, and that the return on investment has nothing to do with the talents or intellect of the students.

5. Real experts - as opposed to people trying to sell magazines full of car ads - who look into this stuff have realized you can't compare colleges in any meaningful way

In September 2005, Secretary of Education Margaret Spellings convened a commission of 19 top education experts to study college accountability and outcomes. The resulting report found that colleges provided "no solid evidence" of their value that consumers could use in comparing one college to another.

But the largest problem with all these college rankings and guides is this: A student's success or failure in college and in life will ultimately be determined by who they are, not which college they attend. Successful people attended all kinds of colleges - only three CEOs of the top 20 Fortune 500 companies attended "elite" colleges, and 12 of the top 20 attended public colleges.

***

As always, there is a spirited discussion on the College Confidential discussion forum about Zac's contentions. Check it out and join in. Post your thoughts here, too.

**********

Be sure to check out all my admissions-related articles and book reviews at College Confidential.