Caroline Hoxby, a Stanford professor whose work I’ve mentioned before, has released another report, this time entitled “The Changing Selectivity of American Colleges,” on how the landscape of college admissions has changed over the past 55 years. A writer for the *New York Times* blog The Choice, Rebecca Ruiz, has written about the report and responded to some of its interesting findings and hypotheses, particularly about the more selective colleges. For one thing, Hoxby has found that:
Since 1955…the number of high school graduates has grown by 131 percent while the number of college spots has risen by 297 percent.
This fact, combined with falling transportation and communication costs, leads her to find that 90 percent of colleges are in fact less selective than they were in 1955. So far, so good: new schools and expansions of old ones have created more college slots, and it’s easier for kids to go farther from home. More kids from, say, NYC end up across the country rather than clogging every college in the city (not that they don’t try to go to schools in the city, but having the option to go elsewhere eliminates the potential bottleneck there), so many schools end up less selective than they might have been in 1955.
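A quick back-of-the-envelope check shows why those two growth figures alone point toward falling selectivity in aggregate. This is just illustrative arithmetic on the percentages Hoxby reports; the 1955 baselines are normalized to 1, since the report excerpt gives growth rates rather than absolute counts.

```python
# Normalize both 1955 populations to 1.0 and apply the reported growth.
grads_1955, spots_1955 = 1.0, 1.0
grads_now = grads_1955 * (1 + 1.31)   # high school graduates up 131%
spots_now = spots_1955 * (1 + 2.97)   # college spots up 297%

# College seats available per graduate, relative to 1955:
seats_per_grad = (spots_now / grads_now) / (spots_1955 / grads_1955)
print(f"Seats per graduate vs. 1955: {seats_per_grad:.2f}x")
# → Seats per graduate vs. 1955: 1.72x
```

With roughly 72 percent more seats per graduate than in 1955, the typical college has to compete harder for students, which is consistent with 90 percent of schools becoming less selective even as the top tier tightens.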
However, it is Hoxby’s findings about the top 75 to 100 institutions (the uppermost 10 percent) that are the most interesting. As we all know, acceptance rates at Ivy League institutions and other elite universities across the country have declined dramatically over the years, a trend that continues today. At the very top, acceptance rates have dipped below 10 percent, down from as high as 30-40 percent in the 1970s. Why is this?
The conventional explanation is basic supply-and-demand: for the most part, increases in demand for slots at these universities have outpaced increases in supply. There is a fixation on these elite schools, so they continue to receive more and more applications. Hoxby proposes an alternative:
I think it has to do with information. If you think of the typical college student in 1955, this college student had no idea where he or she was in the national distribution. He didn’t know if he was in the 90th percentile or the 10th percentile. And the colleges didn’t know, either…Once there was more information due to national testing and other things, students figured out who they were. Before that, they’d been very dependent on high school counselors and the local feeder pools…Localness has just become less important. It’s really all about re-sorting of students.
In effect, she is arguing that in the past, colleges relied on local high schools, historic feeders, and a healthy dose of legacy to get applicants. Without any strong objective standard, elite universities accepted a wider array of students rather than simply the highest-achieving ones. As information (such as SAT scores and AP courses) appeared that allowed universities to make cross-candidate comparisons, they became able to select for stronger candidates; meanwhile, as college rankings and other cross-school measurements appeared, the best schools became apparent to students, allowing those schools to attract the best candidates. In 1955, a great student in Alabama might have been looking at the University of Alabama, or perhaps an even more local school, but now that student is also considering Georgetown or Stanford.
This all seems to make sense, but Hoxby pushes this theory to an almost bizarre extreme, claiming to Ms. Ruiz that “[t]hese colleges have expanded enough for extra students to go. Change in selectivity doesn’t come from the fact that they have not opened enough seats.” I find this incredibly hard to believe. Sure, there may be better applicants applying to Stanford now than in 1955, when the school was even more overwhelmingly Californian (and West Coast in general), but that doesn’t change the fact that Stanford is receiving record numbers of applications every year. It’s not just that the caliber of the competition has increased: there are a lot more competitors too.