Every year, university admissions officers read and sort through tens of thousands of essays. It’s a long, arduous process.
Now, some researchers say an artificial intelligence tool may be able to help admissions officers sort through essays and identify promising applicants who might otherwise be overlooked.
The tool is a long way from actual use in the admissions process, but the group behind it, which includes researchers from the University of Colorado Boulder, says it can pull out key traits of students, such as leadership qualities or the ability to persevere.
The possible use of AI in admissions, however, raises questions about how universities would responsibly use it, especially because college admissions officers have said essays might carry more weight in the wake of the Supreme Court decision eliminating the use of race-based admissions.
Sidney D’Mello, a CU Boulder professor in the Institute of Cognitive Science and Department of Computer Science who helped develop the system, said he and fellow researchers want to emphasize the responsible use of AI, including calling for transparency in how admissions decisions would be made.
“We’re certainly very, very firm on the fact that it’s really what we call human-centered AI,” he said, “where the human is really the one making the decisions” and the AI acts as a tool.
To develop the AI tool, D’Mello and researchers from the University of Pennsylvania used more than 300,000 anonymous, 150-word essays submitted to colleges in 2008 and 2009. Those essays focused on extracurricular activities and work experiences.
A group of admissions officers then read those essays and scored them based on seven characteristics. The researchers trained the AI system based on how admissions officers evaluated those characteristics within the essays.
The AI platform was able to identify those characteristics in new essays and assign qualities to applicants across different student backgrounds, including whether students demonstrated teamwork or intrinsic motivation.
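The setup described above, human raters scoring essays on traits and a model learning to predict those scores for new essays, is a standard supervised-learning pattern. The sketch below is purely illustrative and assumes nothing about the researchers' actual methods; the essays, scores, and model choice (TF-IDF features with ridge regression) are hypothetical stand-ins.

```python
# Illustrative sketch only (not the researchers' code): predicting a
# human-assigned trait score, such as "teamwork," from essay text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Toy stand-ins for scored training essays; the real study used more
# than 300,000 essays, each rated by admissions officers on seven traits.
essays = [
    "I captained the robotics team and organized weekly practices.",
    "I worked alone on my art portfolio every evening after school.",
    "Our debate club won regionals because we rehearsed together daily.",
    "I taught myself programming through online courses.",
]
teamwork_scores = [0.9, 0.1, 0.8, 0.2]  # hypothetical officer ratings

# One regression model per trait: raw text -> TF-IDF features -> score.
model = make_pipeline(TfidfVectorizer(), Ridge())
model.fit(essays, teamwork_scores)

# Score a previously unseen essay for the "teamwork" trait.
new_essay = ["We built the project as a team, dividing tasks among us."]
print(round(float(model.predict(new_essay)[0]), 2))
```

In practice, a system like the one described would train one such predictor per trait and, as the researchers emphasize, surface the scores to a human decision-maker rather than deciding anything itself.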
D’Mello said the model also showed potential to avoid bias: it was designed not to favor any particular racial, gender, or socioeconomic background.
“This is really kind of blending what computers do best — they can find patterns in large volumes of data — with what humans do best and that’s finding the best in each other,” D’Mello said. “This is the core of how we’ve been trying to approach this.”
Many universities across the country are evaluating their admissions processes after the Supreme Court’s affirmative action decision banning race-based admissions. They want to ensure they build diverse classes while still complying with the law.
U.S. Department of Education guidelines encourage colleges to use materials such as essays to get a fuller picture of who students are, the communities they come from, and any adversity — including discrimination — they might have dealt with.
At the same time, Melissa Clinedinst, director of research initiatives and partnerships with the National Association for College Admission Counseling, said schools still weigh essays less heavily than a student’s grades or test scores in admissions decisions. Colorado has made test scores optional for students applying to public universities.
Clinedinst said colleges are trying harder than ever to find ways to improve their admissions processes. She could see how AI systems might appeal to school officials who must sort through thousands or tens of thousands of essays with only limited staff.
AJ Alvero is a computational sociologist at the University of Florida who focuses on language, ethnicity, culture, and education. He wasn’t involved in the study but reviewed it at Chalkbeat’s request, and he said the researchers do a great job keeping the ethical issues of bias at the forefront of their work.
Getting to a point where universities could use AI systems might be a long way away, he said.
“A technical concern here could be, if and when universities adopt these tools, are they considering how student language is changing?” he said.
He also said universities would need to put accountability measures in place if there are errors and have staff on hand, such as a computer scientist, to handle any potential problems.
Alvero said schools would also benefit students by allowing more transparency in the application process. Transparency could also give researchers a better look at how to evaluate bias within school decisions and how to train the AI systems.
D’Mello and his fellow researchers hope to continue developing the AI, including small-scale testing in cooperation with universities.
“We really want to take a measure twice, cut once approach when it comes to high-stakes things such as this,” he said.
Jason Gonzales is a reporter covering higher education and the Colorado legislature. Chalkbeat Colorado partners with Open Campus on higher education coverage. Contact Jason at jgonzales@chalkbeat.org.