When Jocelyn Leitzinger asked her university students to write about times they had witnessed discrimination, she noticed that a woman named Sally was the victim in many of the stories.

"It was very clear that ChatGPT had decided this is a common woman's name," said Leitzinger, who teaches an undergraduate class on business and society at the University of Illinois in Chicago. "They weren't even coming up with their own anecdotal stories about their own lives," she said.

Leitzinger estimated that around half of her 180 students used ChatGPT inappropriately at some point last semester, including when writing about the ethics of artificial intelligence (AI), a practice she called both "ironic" and "mind-boggling."