Preparing for College

Jockeying for Position

One of the hottest areas for discussion on the College Confidential forum is rankings.  As you may already know, I am not a fan of rankings. In my opinion, there are just far too many fine colleges that get routinely overlooked every year simply because they haven't played the rankings game or, for whatever reason, end up in the murky backwaters of the big rankings' listings, most prominently the U.S. News America's Best Colleges annual publication.

Not that my opinion matters one iota about this, but people often ask me why colleges sweat the rankings so much every year and why they play their games. There's an easy answer: money. Think I'm being too cynical? Well, maybe I can quantify my contention. Take a look at this:


U.S. News blogger Bob Morse, in his Inside the College Rankings blog, asks what seems to me a highly rhetorical question: "Does Being at the Top of the Rankings Help Colleges?" Check out the rationale that forms at least one answer:

Is there an impact on a college's admissions indicators as a result of its position in the U.S. News America's Best Colleges rankings? Is the influence of the rankings different depending on whether the school is a large research university or a smaller liberal arts college? How big could these effects be, and are they statistically significant?

In "Getting on the Front Page: Organizational Reputation, Status Signals, and the Impact of U.S. News and World Report on Student Decisions," recently published in Research in Higher Education , Nicholas A. Bowman of the University of Notre Dame and Michael N. Bastedo of the University of Michigan-Ann Arbor analyzed the U.S. News college rankings to try to answer those questions for top college administrators. Their article joins a rapidly expanding body of literature on college rankings and the impact the rankings might have on colleges and universities.

Their article's three key points are:

"First, moving onto the front page (the Top 50) of the U.S. News and World Report rankings results in a substantial improvement in admissions indicators in the following year, and these effects are apparent for both national universities and liberal arts colleges.

Second, once institutions have reached the top tier, moving up in the rankings provides noteworthy benefits for institutions in the top 25 and among national universities, but this impact is weaker or non-existent among liberal arts colleges and the bottom half of the top tier. Consumers of liberal arts colleges may not share the general perceptions of the overall population. One hypothesis is that these families are far more knowledgeable about higher education than are general consumers of higher education and therefore less sensitive to magazine rankings.

Third, tuition costs and instructional expenditures also serve as markers of institutional quality and prestige that yield improvements in subsequent admissions outcomes. These markers are influential primarily among liberal arts colleges and the lower half of the top tier. Consistent with the notion that potential consumers of liberal arts colleges are savvier in their decision-making, liberal arts colleges are the only type of institution in which admissions indicators are responsive to a proxy for institutional quality: expenditures on student instruction."

What is the nature of these benefits to colleges, and are they significant? According to the paper, appearing on the "front page" decreases a school's acceptance rate by 3.6 percentage points and results in a 2.3 percentage point increase in the proportion of students in the top 10 percent of their high school class. The "front page" effect was not significant for the average SAT scores, amounting to a 1.2 percentage point increase. From our end, all these changes are very small and would not have any impact on a school's standing in the Best Colleges rankings. I do wonder about the reliability of a statistical analysis that says it can accurately take into account all the factors that affect year-to-year admissions and can isolate the effect of the Best Colleges rankings.

The paper's conclusion that liberal arts colleges are not benefiting from their top-tier rankings and that prospective students and their parents are more influenced by factors other than the rankings is 100 percent counter to the statements of the presidents and admission deans from some liberal arts colleges. They have criticized the U.S. News college rankings as too influential in admissions decisions. I hope they read this paper and reconsider their criticism.

This final paragraph, with its underlying assumption that the U.S. News rankings exert more than a little influence on high school students' application and admissions choices, is telling. One superb national-caliber liberal arts college has stood up against this pressure to participate: Reed College in Portland, Oregon. Reed's independence from the rankings hysteria has always heartened me. Check out their official statement regarding freedom from rankings and my highlighted sentence about the "consequences":

Since 1995 Reed College has refused to participate in the U.S. News and World Report "best colleges" rankings. Several times Reed's stance on the rankings has put the college in the national spotlight, most prominently in a Rolling Stone magazine article that raised serious concerns about the U.S. News best colleges issue.

Reed does participate in several other well-established college guides that do not assign numerical rankings to institutions, including Barron's, the Fiske Guide to Colleges, Peterson's, Colleges that Change Lives, Newsweek's College Guide, and the College Board's College Handbook. Each of these guides attempts to describe more fully the experience, student culture, and academic environment at different schools. Consistent with Reed's non-participation in U.S. News rankings, the college also does not participate in Money magazine's college-ranking issue.

Reed College has actively questioned the methodology and usefulness of college rankings ever since the magazine's best-colleges list first appeared in 1983, despite the fact that the issue ranked Reed among the top ten national liberal arts colleges. Reed's concern intensified with disclosures in 1994 by the Wall Street Journal about institutions flagrantly manipulating data in order to move up in the rankings in U.S. News and other popular college guides. This led Reed's then-president Steven Koblik to inform the editors of U.S. News that he didn't find their project credible, and that the college would not be returning any of their surveys. In 1996 an op-ed in the Los Angeles Times by a leader of the student government at Stanford University praised Reed for refusing to provide information to U.S. News. The editorial advised prospective students to choose Reed if they "want to go to a school that isn't interested in selling out its education."

The college has repeatedly asked U.S. News simply to drop it from the best-colleges issue, yet the magazine continues to include Reed and to harvest data from non-Reed sources. Reed's subsequent yo-yo relationship with U.S. News has turned into quite a spectator sport. The year the college refused to submit data, the magazine arbitrarily assigned Reed the lowest possible score in several categories and relegated the college to the lowest tier in its category, the most precipitous decline in the history of its ratings. The following year, responding to widespread criticism of its retribution, the magazine trumpeted Reed in its "best colleges" press release as being new to the "top tier" of national liberal arts schools. After that Reed was relegated to the "second tier" until this year, when it was returned to the top tier in a tie for 47th place, even though the magazine's sources rate the college's academic reputation as high as or higher than that of half of the top-ranked schools.

The college's decision was not without risk, especially with respect to admission. To date, however, the action has received widespread enthusiastic support from parents, students, faculty members, high school college counselors, and other college and university presidents--several of whom have even confided that they wish they could refuse to participate. In the years since Reed stopped participating, two measures of institutional vigor—admission and fundraising—have been robust. This past year Reed received a record number of applications for admission and exceeded goals for its annual fund.

Reed's president, Colin Diver, cautions prospective students and parents against relying on rankings. Rankings, he says, are grounded in a "one-size-fits-all" mentality. "They are primarily measures of institutional wealth, reputation, influence, and pedigree. They do not attempt, nor claim, to measure the extent to which knowledge is valued and cultivated" on each campus. Reed doesn't rank its students. "Why should we participate in a survey that ranks colleges?" he asks.

Reed continues to stand apart from ephemeral trends, resisting pressures to abandon its core principles and its clear focus on academics. Studies continue to show Reed graduates earning doctorates or winning postgraduate fellowships and scholarships (such as Rhodes, Fulbright, Watson, and Mellon) at rates higher than all but a handful of other colleges. Says President Diver, "Reed is a paradigmatic example of a college committed--and committed solely--to the cultivation of a thirst for knowledge. Reed illustrates a relatively small, but robust, segment of higher education whose virtues may not always be celebrated by the popular press, but can still be found by those who truly seek them."

If I were wearing a hat, it would be off in a second to Reed. BRAVO!

Bottom line for all you parents and future college applicants: Don't get caught up in rankings! There are many other avenues of research that can give you the information you need to make the best enrollment decision. (More about that later.) I would love for someone to do a survey of enrollment motivations versus satisfactory outcomes. In other words, I would like to see the statistics on how many college students who are unhappy with their college choice made that choice based on some form of rankings, especially the U.S. News rankings.

I have counseled high schoolers who have gone on to Harvard, for example, and who have later come back to me for transfer advice after finding themselves perfectly miserable with their collegiate situation. I'm just using Harvard as an example. However, these unhappy young people have, to a person, offered some version of "Well, [college name here] is #[ranking]!" when asked why they thought their choice was the right one. That's unfortunate. Lots of time and money wasted.

The very nature of rankings always reminds me of Macbeth's pronouncement: ". . . full of sound and fury, signifying nothing." People, please . . . Think for yourself!

Don’t forget to check out all my admissions-related articles and book reviews at College Confidential.