Can AI-powered college counsellors ‘democratise’ admissions?

Proponents of the new technology say it can alleviate stress on advisers and make expertise more accessible, but critics say this work needs a human touch

April 9, 2024

The College Guidance Network (CGN), an informational resource hub for counsellors and students, held a virtual round table last week to introduce AVA, the newest artificial intelligence-powered college counselling assistant.

Thousands showed up to the demonstration, many identifying themselves in the sidebar chat as high school counsellors and independent admissions consultants. Before long they were flooding the chat with questions and concerns.

Brennan Barnard, director of college counselling at the Khan Lab School and the round table’s moderator, seemed both pleased and slightly uneasy about the turnout for the demo.

“This speaks to a moment of significant potential,” he said. “And also, I’m sure, some trepidation. But I for one am really hopeful.”

Students have been using ChatGPT and other generative AI tools to write essays for more than a year now, a trend that has raised concern but seems largely unstoppable. Even some college admissions offices have begun using AI to ease their workloads, however begrudgingly.

AVA, which will launch in pilot this autumn, is the latest AI counselling tool meant to replicate the work of a high school counsellor or private admissions consultant. Proponents of the technology argue it could reduce the burden on overworked counsellors and give students access to expertise and information 24/7 during the stressful application cycle. Critics worry it could be seen as a cheap alternative to high-impact counselling for students who most need a human touch.

Angel Pérez, president of the National Association for College Admission Counseling (NACAC), lent his organisation’s considerable heft to the project by partnering with CGN on AVA’s launch. He spoke at the round table about NACAC’s role in engaging admissions professionals to help train the bots.

“I think a lot of our members are kind of putting their heads in the sand about this issue. The truth is, we have to engage with this technology; it’s already here,” he told Inside Higher Ed. “It’s true we’re stepping into the unknown, but I would rather our profession be involved in informing this technology as it evolves. If we don’t, someone else with profit-driven, less-than-ideal motives will be the one doing this work.”

Katie Cameron, a high school counsellor and assistant executive director of the Nebraska School Counselor Association, attended the round table out of curiosity. She has a 300-student caseload and was intrigued by the idea of using AVA to help her serve them better.

“As counsellors, we do way more than just college prep,” she said. “I like the idea of it, especially if it saves us time on the simple tasks.”

Equity hopes and ethical concerns

Jon Carson, CGN’s chief executive, first started building what would become an informational college-advising resource for students and families in 2019, when he went through what he called the “terrible experience” of helping his son apply to colleges.

“We were flying without instruments,” he told attendees at the virtual round table. “There were grotesque inequities: side doors, back doors, expensive consultants; it was hard to get the advice we needed…the expectation seemed to be that our 17-year-old was going to navigate this solo.”

The purpose of AVA, he said, is to “democratise advising”. Currently, students at well-resourced public high schools and private boarding schools get frequent help from counsellors with only a few dozen students on their caseloads, while students at cash-strapped public schools are lucky to schedule one meeting a year. AI chatbots can serve families with limited access to counselling services, Mr Carson argued, which will help close the massive equity gap in college counselling.

AVA is also trained in multiple languages – Mr Carson said it would launch with one or two dozen options – making it a potential game changer for immigrant families who may struggle with language barriers in the application process.

Royel Johnson, associate professor at the University of Southern California’s Rossier School of Education, said he sees the potential upside of introducing a tool like AVA to under-resourced high schools, especially if it’s offered by districts as a free resource for families.

But he also cautioned against trusting AI to give inclusive advice that is sensitive to students’ lived identities.

“These tools, when you try to make them colourblind, often end up exhibiting some form of racial bias,” he said. “They need to be trained for racial sensitivity, which is a very difficult undertaking.”

More than anything, Dr Johnson worries that AI chatbots could exacerbate existing disparities in college counselling, especially if students who are less engaged in the admissions process are routinely redirected to the bot while higher-income, highly motivated students are granted access to more intensive human advising.

“The students who AI counsellors are aimed at serving are also the ones in most need of contextualised, high-contact advising,” he said. “There are so many perils and promises here.”

Ethical concerns also hang over the enterprise. A frequently asked question at CGN’s round table concerned student data privacy, an issue the hosts seemed to be actively working on. The basic notion of AI’s place in the admissions process is hotly debated as well.

NACAC is partnering with several universities to form an AI ethics committee, which Dr Pérez said would address ethical questions including, “Should counsellors use AI to write recommendation letters?” (his answer: “They already are!”) and “Should students use AI to help outline their admissions essays?”

Mr Carson said CGN is recruiting counsellors to help develop AVA further, building a “community of practice” of volunteers who will test the tool and send feedback.

“We don’t have all the answers, and this is the best way to build something folks feel they can trust,” he said at the round table. “Because that’s who this is for.”

More than just a chatbot?

AVA, Mr Carson noted, is not a replacement for counsellors. It is not intended for curating individualised college lists or reassuring first-generation students that they belong in a lecture hall. That’s human work, he said, and can’t be replicated by any AI, no matter how well trained.

But AVA can answer foundational questions about financial aid and application requirements, or help a student find the right framing device for their essay. In that way, AVA is more like a streamlined, reliable resource for frequently asked questions, Mr Barnard said – a way to get students on the path toward college and free up counsellors’ time.

On CGN’s website, AVA is referred to as “the first and only AI counselling assistant for students and families”. But there’s also Ivy, a comprehensive, generative AI counsellor from the educational consulting and technology company CollegeVine.

CollegeVine co-founder Vinay Bhaskara drew a fine but important distinction between AVA and Ivy: the former, he said, is essentially a “chatbot with expertise”, a characterisation he said was not meant to be belittling, whereas Ivy is a “personalised counselling system”.

At the round table, Mr Carson said AVA was trained on the knowledge of hundreds of experts across 110 topics in college admissions. Ivy was developed with input from admissions experts as well, but takes its cues primarily from student members’ individual CollegeVine profiles, Mr Bhaskara said, which record their interests and aspirations while keeping track of deadlines and to-do lists during the application cycle. Ivy is also trained to be conversationally intelligent; it will remember previous discussions with students, and bring things up as necessary.

Inside Higher Ed was given a private demonstration of Ivy last autumn, and this reporter can say that, based on demos of both AVA and Ivy, the distinction seems accurate.

“Because [Ivy] is integrated into the network, it knows you better,” Mr Bhaskara said. “It’s totally different than ChatGPT. It’s offering something unique.”

That something sounds an awful lot like what a human counsellor offers: personalised service, emotionally intelligent advising, a rapport that deepens with time. Mr Bhaskara, like Mr Carson, insists that his tool is meant to help counsellors, not replace them. But he said it’s not a bad thing that AI can replicate the most essential parts of the job.

“AI has to be part of the future of this field,” he said. “The system has been calling out for more capacity for 20 years. But that’s not going to be solved with chatbots; it will be with comprehensive tools like ours.”

Ms Cameron, the counsellor from Nebraska, tried Ivy out after she received an ad in her inbox last autumn, in the midst of a particularly hectic application season. She was faced with dozens of requests for recommendation letters, which she said often took an hour each to write; Ivy, she said, cut that down to mere minutes.

But Ms Cameron isn’t too worried about AI taking her job, and neither are the members of the Nebraska School Counselor Association she helps lead. Anything that reduces counsellors’ workloads and helps her students, she said, is worth trying. The rest is just static.

This is an edited version of a story that first appeared on Inside Higher Ed.
