On the morning of September 8, all was well. Students shuffled off to classes, yawning, rubbing their eyes and sipping coffee—oat milk-infused, naturally. Professors assumed their lecterns, delivering learned wisdom to uninterested youths. Most were on Facebook, as is customary. But for the usual disruptions—the noisy tabling on the BC plaza, the crazed gaggle around dear Nugget, the occasional spasm of student activism—Duke was at peace. Then, everything changed when the rankings dropped.
“I should’ve ED’ed to Northwestern,” a first-year spat. “Has my time here been squandered?” a senior mused. “That transfer application deserves a once-over,” a sophomore muttered. Duke had dropped two spots in the annual U.S. News & World Report rankings, from an acceptable eighth to a precarious tie for tenth. Across campus, students vented their frustration at nefarious forces outside their control. One friend insisted that “heads are gonna roll in the administration, trust me.”
Reactions to the rankings vary. Predictably, that variance depends on school placement. If a school ascends, then students are content with their fat, happy and thoroughly confirmed priors. If a school plummets, then students become ranking reactionaries prone to fatalism and conspiracy. Year after year, the pattern stubbornly persists. It’s our own fault. We assume the rankings are meaningful, but that assumption should be challenged.
We rarely question the value of more information. Most people assume that more data about anything, from national economic output to kombucha consumption per capita, is an unalloyed Good Thing. No doubt, data, statistics and figures often enable us to better understand our world and more effectively solve the problems we face. But where data analysis was once one means, among many, of understanding our world, it is now regarded as a royal road to understanding. Consequently, our society has developed a quantification bias, a sense that numbers must hold the answers. We acknowledge there are other considerations, but if a solution is produced by something other than “data-driven strategy,” then it is lacking in the “legitimacy department.”
This faith in data is unreasonable. The information we manipulate in Excel sheets, run through regressions and report on at length is a human creation and thus imperfect. Our data is defined, framed, collected and interpreted according to our biases, prejudices and failures. That fact is obvious, but it bears emphasis regardless. Too often, we treat data abstractly, as if it were a fruit of the tree of observed reality plucked by some means other than our own fallible fingers.
We should all take a page from John J. Cowperthwaite’s book. As British financial secretary of Hong Kong during the 1960s, Cowperthwaite “refused to compile and distribute official data for economic output,” responding “absolutely not” to all requests for data during his tenure. According to economic historian Neil Monnery, Cowperthwaite did this because he believed that publishing data would create a demand “for government intervention in the economy,” which he, as a free marketeer, strongly opposed. He feared that people would claim license for action based on poorly measured, highly inaccurate and easily distortable data. Cowperthwaite realized that the mere existence of data has “the second order effect of people therefore wanting to manage it and influence it and do stuff with it.”
Cowperthwaite’s realization has utility beyond the domain of economics. Among other things, it applies to our relationship with the U.S. News & World Report rankings.
In technical terms, the U.S. News & World Report rankings suck. They are based on six metrics, all of which are, according to researcher George Leef, “either input measures or subjective evaluations,” and the “weights assigned to these factors are also entirely subjective.” A particularly nebulous measurement, “academic reputation,” accounts for 20% of a school’s ranking and is formulated by polling each college’s provosts, presidents and admissions officers. These administrators are expected to rank all 4,000 other schools in the report according to “academic reputation.”
What? How can anyone reasonably expect the president of any university to know the relative reputations of more than 4,000 schools? Does U.S. News expect the president of Boston College to have a firm belief as to why San Diego State should be ranked 147th as opposed to 151st? These ridiculous expectations strain credulity.
Why do the rankings measure input factors instead of output factors such as student achievement? Would that not tell us more about the quality of schooling? Why is school selectivity weighted at 15% specifically? Should academic reputation be a consideration at all? These questions, among many others, deserve consideration and should inspire harsh scrutiny.
“OK, the rankings are flawed, so what? No statistic is perfect!” Fair enough. The rankings are flawed and we live in a fallen world. But the problem goes beyond the flaws themselves: the rankings are also easily manipulated. Even U.S. News acknowledges this, which is why it dropped the University of Oklahoma and UC Berkeley for submitting false information. Those are only the most blatant offenders. Plenty of quantitative shenanigans remain fair game, such as gaming class sizes to create the illusion of selectivity or considering financial donations during admissions to incentivize alumni giving.
Although the rankings are utter dreck, they still influence our behavior. As Cowperthwaite predicted, their mere existence influences where high schoolers apply and whether administrators are fired or given a raise. Amazingly, significant decisions are justified by these nonsense numbers.
The quantification of academic standing grants us an illusion of comprehension and control. The thinking goes: we understand how to objectively assess a school’s quality; therefore, we can establish a ranking of every school. But we cannot. And we probably should not. If you want to learn about a school, go visit it or contact alumni. For God’s sake, don’t consult the rankings.
Reiss Becker is a Trinity junior. His column, “roused rabble,” runs on alternate Wednesdays.