In his “Common Sense” column in the Times’ Oct. 2 Business Day section, writer James B. Stewart sought to unravel the tangled web woven by the dozens of college rankings now in circulation.
His goal? To come up with a simple, common-sense method that would circumvent all “the same flaws that afflict nearly every other college ranking system: There is no way to know what, if any, impact a particular college has on its graduates’ earnings, or life for that matter.”
Working with Brookings Institution fellow Jonathan Rothwell, who had developed an innovative scale measuring the “value added” by a college to what its students bring with them to campus, Stewart put together a ranking that zeroed out the impact of the students’ majors.
They called it “the Brookings-Common Sense ranking.”
All of the top 10 were liberal arts schools.
And right in the middle of that top 10 was Wagner College, at No. 6 in the nation.
Watch the CNBC interview with James Stewart:
And read the complete column by James B. Stewart below.
College rankings fail to measure the influence of the institution
Students, parents and educators increasingly obsessed with college rankings have a new tool: the Obama administration’s College Scorecard. The new database focuses on a college’s graduation rate, graduates’ median earnings 10 years after graduation and the percentage of students paying back their college loans.
While Scorecard adds potentially valuable information to the dizzying array that is already available, it suffers from many of the same flaws that afflict nearly every other college ranking system: There is no way to know what, if any, impact a particular college has on its graduates’ earnings, or life for that matter.
“It’s a classic example of confusing causation and correlation,” said Frank Bruni, the author of “Where You Go Is Not Who You’ll Be,” a book about the college admissions process, and an op-ed columnist for The New York Times. “Anyone who has taken statistics should know better, but when it comes to colleges, that’s what people do. They throw common sense out the window.”
Of course graduates of the Massachusetts Institute of Technology (average postgraduate earnings $91,600, according to the Scorecard) and Harvard ($87,200) do well. That’s because the students they admit have some of the highest test scores and high school grade point averages in the country, reflecting high intelligence and a strong work ethic — two factors that cause high future earnings. That is generally true regardless of where such students attend college, as long as they go to a reputable four-year institution, various studies have shown.
“It’s absurd,” said Jerry Z. Muller, a professor of history at Catholic University of America and the author of “The Costs of Accountability,” a study of misplaced and misunderstood metrics. “Their graduates have high earnings because they’re incredibly selective about who they let in. And many of them come from privileged backgrounds, which also correlates with high earnings.”
The College Scorecard does not rank colleges, but anyone can use the data to do so. M.I.T. (No. 6 on Scorecard earnings) and Harvard (No. 8) are the only universities in the Scorecard’s top 10 that are also highly ranked by the influential U.S. News and World Report. The other schools have a narrow focus on highly paid skills. The No. 1 school on Scorecard is MCPHS University, whose graduates earn, on average, $116,400. (MCPHS stands for Massachusetts College of Pharmacy and Health Sciences, which is not even ranked by U.S. News.)
But pay, of course, says nothing about the relative quality of different colleges. “If you go to M.I.T. and earn a degree in engineering, you’re going to make more than if you go to Oberlin and major in music performance,” Professor Muller said. “But you already know this. To rank the value of colleges based on the ultimate earnings of their graduates radically narrows the concept of what college is supposed to be for.”
Andrew Delbanco, a professor at Columbia University and author of the book “College: What It Was, Is, and Should Be,” agreed. “Holding colleges accountable for how well they prepare students for postcollege life is a good thing in principle,” he said. “But measuring that preparation in purely monetary terms raises many dangers. Should colleges be encouraged first and foremost to maximize the net worth of their graduates? I don’t think so.”
And that is assuming the earnings data is reliable. Scorecard draws from a substantial database of tax returns, but measures the postgraduate incomes only of students who received federal loans or grants, which excludes most students from high-income families. And high family income is a factor that correlates strongly with postgraduate earnings.
PayScale, which ranks colleges based on postgraduate earnings reported by users of its web services, produces numbers that in many cases are substantially different from Scorecard’s. PayScale’s “midcareer” earnings for graduates of Harvard (ranked third at $126,000) and M.I.T. (No. 6, at $124,000) are much higher than Scorecard’s figures.
As with Scorecard, PayScale’s top-ranked institutions, SUNY-Maritime College in the Bronx ($134,000) and Harvey Mudd College in Claremont, Calif. ($133,000), train students for specialized, high-paying fields.
U.S. News does not even include earnings data in its ranking formula, although it said it might do so. “The federal data is a large and new data set, and we’re studying it,” said Brian Kelly, editor and chief content officer for U.S. News. “It represents a subset of students, and we’re looking closely to determine if it in fact tells us what it claims to.”
Some schools highly ranked by U.S. News — Grinnell, Smith and Wellesley, for example — have low rankings on PayScale and low earnings results on Scorecard. Mr. Kelly said U.S. News was examining these “anomalies.”
This year, the Brookings Institution published its own ambitious college rankings that try to improve upon what it sees as flaws in the other lists. It calculates the “value added” of each college by comparing what graduates would be expected to earn given their entering characteristics to what they do earn after graduating.
Because of their high test scores and other factors, students entering Harvard would be expected to do well in postgraduate earnings (a projected $85,950, according to Brookings). That they actually earned $118,200 is a measure of what a Harvard education added to their potential earnings.
The Brookings rankings factor in the nature of a college’s curriculum, the career choices of its graduates and the percentage of graduates prepared for so-called STEM occupations (science, technology, engineering and math), so like Scorecard and PayScale results, its rankings are dominated by schools with narrow focuses on those high-paying areas.
Of the eight schools earning perfect scores of 100 in its rankings, five have technology-focused curriculums: California Institute of Technology; M.I.T.; Rose-Hulman Institute of Technology in Terre Haute, Ind.; SUNY-Maritime; and Clarkson University in Potsdam, N.Y. (Brookings draws its data from PayScale, LinkedIn and the Bureau of Labor Statistics.)
Jonathan Rothwell, a fellow at Brookings and an author of the study, said that many educators applauded this approach but it had drawn criticism from the liberal arts community, which says it unduly weights a narrow focus on high-paying STEM fields. Mr. Rothwell defended that approach, noting that a college’s curriculum and what field a student studies were “hugely relevant to graduate success.”
But he acknowledged that liberal arts programs and programs that train students for lower-paying fields were valuable to both individuals and society. “If your only goal is to make as much money as possible, you should study engineering, computer science, biology or business,” he said. “But most people are interested in more than just making money.”
So, for the benefit of those people, I asked Mr. Rothwell to do a ranking that deleted the curriculum component and identified the highest “value added” colleges regardless of major. I’m calling this the Brookings-Common Sense ranking. Here’s the top 10:
1. Colgate University
2. Carleton College
3. Washington and Lee University
4. Westmont College
5. Kenyon College
6. Wagner College
7. Marietta College
8. Manhattan College
9. St. Mary’s University
10. Pacific Lutheran University
Under this methodology, liberal arts schools like Colgate and Carleton shot up the rankings. No Ivy League schools made the top 20 on this list, suggesting that many of those students have an edge heading into college. The highest-ranked Ivy was Brown, at No. 45. And most of the engineering and technical schools, even M.I.T. and Caltech, stripped of their curricular weighting, plummeted. (I studied history and French at DePauw University, a liberal arts college, which ranked No. 19.)
The bottom line is that no ranking system or formula can really answer the question of what college a student should attend. Getting into a highly selective, top-ranked college may confer bragging rights, status and connections, but it doesn’t necessarily contribute to a good education or lifelong success, financial or otherwise.
The obsession with college rankings and graduates’ earnings “is just the most recent example of a larger phenomenon, which is that the gathering of numerical information acts as a kind of wish fulfillment,” Professor Muller said. “If you have enough metrics and benchmarks, somehow people believe that’s going to solve a major problem. It rarely does.”