No. 22, July-August 2023. Before each class, the professor would ask students to submit brief answers to questions about the case of the day. He usually received perfunctory answers until this year, when the quality took a surprising leap. The reason: ChatGPT, OpenAI's product capable of passing law exams, which is already making its way through higher education. To ban or not to ban generative AI in classes? asks the headline in El Financiero (Costa Rica).

In search of an answer, I asked Florian Federspiel, a colleague and expert in quantitative methods and decision making who teaches with cases, the following question: What has been your experience in teaching and grading exams now that students have access to ChatGPT?

"That's a great question," he replied. "In short, I haven't noticed any major impact in any of my classes. I've run my exams through ChatGPT and, to my surprise, it didn't answer any questions correctly... after testing it a bit, I can see how easily it goes wrong in the subjects I teach, unless you phrase the questions so precisely that you already know the answer."

"ChatGPT is one of several 'Large Language Models' (LLMs), and while it is the most advanced, its impact on our case-based educational model should not be overstated. Its capability in certain areas, such as coding assistance, is impressive, but it has some limitations:

ChatGPT does not know the truth and, at best, estimates what is considered the truth according to the data to which it has access (the entire Internet), with all its biases;
still 'hallucinates' a lot, offering information that does not exist or is erroneous; and
is still useless for complex analysis and decision making because it lacks the necessary critical thinking."

Others point out the virtues of ChatGPT: in an HBP webinar, Professor Mitchell Weiss, a self-described "lover of the case method," explains how to use it to maximize learning in preparing cases and clarifying concepts, with the example of a student struggling to understand the difference between "network effects" and "virality."

But Professor Weiss does not address the dilemma of the teacher who wants to distinguish between what the student can do independently and what ChatGPT can do. When students submit generated text as if it were their own, without attribution to the true source, this is plagiarism, and there are tools for detecting it, such as GPTZero.

There is consensus among the universities interviewed by El Financiero that generative AI such as ChatGPT is an everyday reality that facilitates a variety of tasks, and that it is important for students to know what they can and cannot expect from it. A good practice is to allow the use of ChatGPT, but to require that students explain why and how they are using it, and how its use enriches their educational experience.

ChatGPT can be wrong, but it shows a certain humility. When I asked questions about a case, it acknowledged that it had not been trained on cases written after a certain date. It offered a generic solution to the situation I described, but cautioned that the advice might not be relevant since it did not know the context. And so it was: its recommendations were irrelevant.

In short, ChatGPT can be a valuable academic assistant, but like many useful tools, it can be misused.
- John C. Ickis

Image by pch.vector on Freepik (edited)