Legacy Admissions Policies Were Originally Created To Keep Jewish Students Out Of Elite Colleges
It's no secret that legacy applicants receive a significant admissions bump - about a 45% higher chance of getting into a school - and many of the top universities seem to openly endorse the practice, though they say it only becomes a factor in "tie-break" situations.
The president of Princeton University recently described legacy preferences as "a recognition of a special bond that Princeton has with its alumni and it matters so much to the University."
The former president of George Washington University invoked similar logic when defending the practice in The New York Times.
Of course, this is not the case for all schools. A Massachusetts Institute of Technology admissions officer wrote on his blog that "if anyone in our office ever advocated for a mediocre applicant on the basis of their 'excellent pedigree' they would be kicked out of the committee room."
We were curious why this policy started and how it evolved into the standard at so many top universities. We turned to UC Berkeley Professor Jerome Karabel's "The Chosen: The Hidden History of Admission and Exclusion at Harvard, Yale, and Princeton," which offers a detailed account of the social and cultural changes that drove these colleges to radically change their standards for new students.
He writes that the decision to boost acceptance rates for children of alumni was spurred by the rapidly growing immigrant population entering the U.S. at the beginning of the 20th century, a shift he lays out in the book's introduction.
In response to this influx of outsiders, Karabel writes, the Big Three - Harvard, Yale, and Princeton - created an admissions system that still exists in some form today.
For more information on the legacy of legacy admissions, you can buy Karabel's book here.