What Higher Ed Gets Wrong About AI Chatbots — From the Student Perspective

Mary Jo Madda (Columnist)
May 15, 2023
As a doctoral student at the University of California, Los Angeles, I was among those who got a recent campus-wide email with an urgent directive: Don't use AI chatbots like ChatGPT or Bard or Bing, as doing so "is equivalent to receiving assistance from another person."
Upon reading it, I paused. I'm a former educator in the process of writing my dissertation for my Doctor of Education degree, part of a part-time program I'm pursuing while working a full-time job at Google. And as a former journalist and editor for EdSurge, I recognize that we should never plagiarize, and that artificially intelligent chatbots are very capable of responding to prompts like "Write me a 500-word essay on Shakespeare's Twelfth Night."
But as a student, and frankly, as a former teacher, my university’s approach struck me as incredibly short-sighted.
Oftentimes, when it comes to new technology, folks fall into a "good" or "bad" binary: this tool is "good," while that one is "bad." But in this case, AI chatbots actually fulfill a really important role on college campuses. If I'm in need of a tutor, or an editor, or a professor's help, is that not "receiving assistance from another person"? And if those folks aren't willing or available to help me, why not have a chatbot fulfill that role?
Perhaps we need to reframe the idea of what AI chatbots can do. As such, here are three examples of use cases I’ve heard from fellow students—and how higher education can do a better job of incorporating the student perspective into these policies.