
OpenAI Wants to Get College Kids Hooked on AI


AI chatbots like OpenAI’s ChatGPT have been shown repeatedly to provide false information, hallucinate completely made-up sources and facts, and lead people astray with their confidently wrong answers to questions. For that reason, AI tools are viewed with skepticism by many educators. So, of course, OpenAI and its competitors are targeting colleges and pushing their services on students—concerns be damned.

According to the New York Times, OpenAI is in the midst of a major push to make ChatGPT a fixture on college campuses, replacing many aspects of the college experience with AI alternatives. According to the report, the company wants college students to have a “personalized AI account” as soon as they step on campus, same as how they receive a school email address. It envisions ChatGPT serving as everything from a personal tutor to a teacher’s aide to a career assistant that helps students find work after graduation.

Some schools are already buying in, despite the educational world initially greeting AI with distrust and outright bans. Per the Times, schools like the University of Maryland, Duke University, and California State University have all signed up for OpenAI’s premium service, ChatGPT Edu, and have started to integrate the chatbot into different parts of the educational experience.

OpenAI is not alone in setting its sights on higher education, either. Elon Musk’s xAI offered students free access to its chatbot Grok during exam season, and Google is currently offering its Gemini AI suite to students for free through the end of the 2025–26 academic year. But those offers sit outside the actual infrastructure of higher education, which is where OpenAI is attempting to operate.

It is unfortunate that universities are opting to embrace AI after initially taking hardline positions against it over fears of cheating. There is already a fair amount of evidence piling up that AI is not all that beneficial if your goal is to learn and retain accurate information. A study published earlier this year found that reliance on AI can erode critical thinking skills. Others have similarly found that people will “offload” the more difficult cognitive work and rely on AI as a shortcut. If the point of a university is to help students learn how to think, AI undermines it.

And that’s before you get into the misinformation of it all. In an attempt to see how AI could serve in a focused education setting, researchers tried training different models on a patent law casebook to see how they performed when asked questions about the material. They all produced false information, hallucinated cases that did not exist, and made errors. The researchers reported that OpenAI’s GPT model offered answers that were “unacceptable” and “harmful for learning” about a quarter of the time. That’s not ideal.

Considering that OpenAI and other companies want to get their chatbots ingrained not just in the classroom, but in every aspect of student life, there are other harms to consider, too. Reliance on AI chatbots can have a negative impact on social skills. And the simple fact that universities are investing in AI means they aren’t investing in areas that would create more human interactions. A student going to see a tutor, for example, creates a social interaction that requires using emotional intelligence and establishing trust and connection, ultimately adding to a sense of community and belonging. A chatbot just spits out an answer, which may or may not be correct.

gizmodo
