All Hail the Social Mobility Index
The social mobility index is a ranking of U.S. colleges that you can find here. Scroll down to about the middle of that page to see the actual rankings. The idea is that colleges are ranked not according to prestige, or the average performance of graduates, or the amount of money spent, but rather according to how many students have been lifted how far. Blurring over the methodological details, a college does better in the SMI rankings by taking a struggling student and helping them a lot than by accepting a wealthy prodigy and helping them network. This contrasts with other rankings, which might, for instance, penalize colleges for accepting too high a proportion of applicants.
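To make the contrast concrete, here is a toy score in the same spirit. This is not the SMI's actual formula (those are the methodological details I'm blurring over); the inputs and weights are entirely made up, purely to show how a ranking can reward lift given to students who start with less rather than selectivity or prestige.

```python
# Toy illustration only: NOT the actual SMI formula. The inputs and weights
# here are hypothetical, chosen purely to show the shape of the idea:
# reward colleges for enrolling students who start with less, charging them
# less, and improving their outcomes, rather than for being selective.

def toy_mobility_score(median_family_income, net_tuition,
                       graduation_rate, median_early_salary):
    """Higher is better: how much lift do students get, and at what cost?"""
    # How far graduates end up above where their families started.
    economic_lift = median_early_salary - median_family_income
    # Discount the lift by how much students had to pay to get it.
    affordability = 1.0 / (1.0 + net_tuition / 10_000)
    # Only students who actually graduate receive the lift.
    return graduation_rate * economic_lift * affordability

# Two made-up colleges: one lifts struggling students a lot, the other
# accepts wealthy prodigies who were likely to do well anyway.
lifts_strugglers = toy_mobility_score(35_000, 6_000, 0.60, 55_000)
polishes_prodigies = toy_mobility_score(150_000, 25_000, 0.95, 120_000)

print(lifts_strugglers > polishes_prodigies)  # True under this toy metric
```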
The rankings themselves are fun to read, if only for the glorious feeling of seeing all your least favorite schools at the bottom of the list where they belong. The top fifteen are all California State campuses (certainly not to be confused with the UC schools) and CUNY campuses. The highest-ranked "prestigious" schools are, according to my rather poor understanding of which schools have prestige:
- Rutgers (Newark campus — this is R2, not the R1 campus)
- Florida International University
- Texas A&M
- Stony Brook
- Rutgers (Camden campus — also not the famous campus)
- UCSB
For some sense of scale, Stony Brook is ranked 37. My undergraduate and graduate school, UMD-College Park, is ranked 455. MIT is 989, but at least they’re doing better than Harvard: 1264.
Ranking colleges this way is glorious. Reading the copy on that page, I’m not even sure the people who implemented the social mobility index realize how wonderful their invention is — they seem to be concerned solely with attacking economic inequality (and perhaps scoring the occasional political point). But this way of ranking schools attacks more problems than just economic inequality.
Let's start by looking at incentives. The traditional rankings often explicitly penalize colleges for accepting too many people, but even leaving that aside, the general emphasis on prestige (asking professors to rank universities off the top of their heads, for instance) is always going to create an incentive to be selective. Moreover, the idea that we should care about the performance of students some years after graduation, without reference to where they started from, means that colleges are incentivized to select students who would be successful anyway. This ties in to Caplan's signaling theory of education, but you can be skeptical of that while still acknowledging that colleges are being encouraged to find the best students, not the students they can help the most.
And, of course, when I say “best” students, I really mean “students who were doing very well before college”. Seeking such students is not the path to obtaining a racially, economically, or culturally diverse student body. Nor is it a good way to maximize utilitarian good. It’s pretty much the equivalent of a charity that only gives money to wealthy people, on the grounds that then the average recipient will be wealthier.
The SMI, made sufficiently influential, incentivizes colleges to find the students they can help the most. Colleges are rewarded for taking poorer students, charging them less, and raising their post-graduation salaries. That's good. That's what colleges should be doing. Colleges should be preferentially finding ways to help those who are disadvantaged. Well, the college that figures out how to do that will rocket to the top of the SMI.
So, the SMI moves us toward greater social good. Maybe I don’t care. Maybe I just want to know which college is best. Well, the SMI is better at that! When I ask “which college is best”, what I’m interested in is the causal relationship between the college and how well I do afterward. I’m not interested in the fact that being accepted to Stanford is a bigger honor than being accepted to Oklahoma State. I don’t care that the average graduate of Princeton is wealthier, unless Princeton caused them to be wealthier.
Most rankings make really quite a poor effort to determine a causal relationship. The SMI, perhaps inadvertently, does much better. A college that has effective teachers, and an environment conducive to learning, and all the other things you might want — that college is going to have a very easy time rising to the top of the SMI rankings. A college that relies exclusively on determining who will already do well, and attracting them via prestige, will suffer horribly. So, if I want to look for the college that will teach me the most (or any similar goal), SMI should be much better than traditional rankings!
So, I’m in love with these rankings. They aid the selfish interests of the million or so people looking at colleges each year; they create incentives for colleges to pursue the utilitarian good. They work towards the goals of the left (social justice and racial equality) while pandering beautifully to the right (Harvard’s ranking is just what it should be, and as far as I can tell, none of the top-ranked schools are currently targets of discrimination lawsuits). Everybody’s happy.
It's worth noting that debates over methodology are entirely reasonable. (I can't find a detailed description of SMI's methodology.) With the more prestige-oriented rankings (US News and World Report comes to mind), it doesn't make sense for two of them to disagree by much, because, well, prestige is prestige. We're not going to have a debate about whether a school is prestigious or not — the whole point is that everybody already agrees. But the social mobility index is trying to measure something real, and there can be legitimate disagreement over how to measure that thing. For instance, "Education Reform Now" pays much more attention to Pell grants, among other differences, and some of its rankings are wildly different. In particular, BYU ranks in the top ten with ERN, while not making it into the top 500 under the social mobility index!
Another similar ranking is from the “Equality of Opportunity Project”. I’m sure there are others I missed, and hopefully still more will be created.
Actually, I should expand on that last part a bit more. The SMI, while great, is not free of flaws. As it is, it uses a small number of aggregate variables (like tuition and graduation rate) as a proxy for aiding social mobility. Of course, collecting data is hard, but if I had my druthers, the ranking would be calculated directly from the average difference between a student's salary post-graduation and that student's parents' salary. I'm sure there are more easily accomplished improvements available.
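To make that preferred calculation concrete, here is a minimal sketch, assuming (hypothetically) that we had student-level records pairing each graduate's salary with their parents' salary. No such public dataset exists as far as I know; every name and number below is invented for illustration.

```python
# Sketch of the ranking I'd prefer: rank colleges directly by the average
# difference between a student's post-graduation salary and that student's
# parents' salary. The records here are hypothetical placeholders.
from collections import defaultdict
from statistics import mean

# Each record: (college, parents_salary, graduate_salary_five_years_out)
records = [
    ("Example State", 30_000, 52_000),
    ("Example State", 28_000, 47_000),
    ("Prestige U", 180_000, 95_000),
    ("Prestige U", 160_000, 110_000),
]

lifts_by_college = defaultdict(list)
for college, parents_salary, grad_salary in records:
    # The quantity I care about: how far each student moved relative to
    # where their family started.
    lifts_by_college[college].append(grad_salary - parents_salary)

# Rank colleges by the average lift they deliver.
ranking = sorted(lifts_by_college.items(),
                 key=lambda item: mean(item[1]),
                 reverse=True)

for college, lifts in ranking:
    print(f"{college}: average lift {mean(lifts):+,.0f}")
```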