NEW YORK — When college administrator Lance Eaton created a working spreadsheet about the generative AI policies adopted by universities last spring, it was mostly filled with entries about how to ban tools like ChatGPT.
But now the list, which is updated by educators at both small and large U.S. and international universities, is considerably different: Schools are encouraging and even teaching students how to best use these tools.
"Earlier on, we saw a knee-jerk reaction to AI by banning it going into spring semester, but now the talk is about why it makes sense for students to use it," Eaton, an administrator at Rhode Island-based College Unbound, told CNN.
He said his growing list continues to be discussed and shared in popular AI-focused Facebook groups, such as Higher Ed Discussions of Writing and AI, and the Google group AI in Education.
"It's really helped educators see how others are adapting to and framing AI in the classroom," Eaton said. "AI is still going to feel uncomfortable, but now they can now go in and see how a university or a range of different courses, from coding to sociology, are approaching it."
With more experts expecting the continued application of artificial intelligence, professors now fear ignoring or discouraging the use of it will be a disservice to students and leave many behind when entering the workforce.
Since it was made available in late November, ChatGPT has been used to generate original essays, stories and song lyrics in response to user prompts. It has drafted research paper abstracts that fooled some scientists and passed exams at esteemed universities. The technology, like similar tools such as Google's Bard, is trained on vast amounts of online data to generate responses to user prompts. While these tools gained traction among users, they also raised concerns about inaccuracies, cheating, the spread of misinformation and the potential to perpetuate biases.
Students are already using AI
According to a study conducted by higher education research group Intelligent.com, about 30% of college students used ChatGPT for schoolwork this past academic year, and it was used most often in English classes.
Jules White, an associate professor of computer science at Vanderbilt University, believes professors should be explicit in the first few days of school about the course's stance on using AI and that it should be included in the syllabus.
"It cannot be ignored," he said. "I think it's incredibly important for students, faculty and alumni to become experts in AI because it will be so transformative across every industry in demand so we provide the right training."
Vanderbilt is among the early leaders taking a strong stance in support of generative AI, offering university-wide training and workshops to faculty and students. A three-week, 18-hour online course White taught this summer drew more than 90,000 students, and his paper on "prompt engineering" best practices is routinely cited among academics.
"The biggest challenge is with how you frame the instructions, or 'prompts,'" he said. "It has a profound impact on the quality of the response and asking the same thing in various ways can get dramatically different results. We want to make sure our community knows how to effectively leverage this."
Prompt engineering jobs, which typically require basic programming experience, can pay up to $300,000.
Although White said concerns around cheating still exist, he believes students who want to plagiarize can still seek out other methods such as Wikipedia or Google searches. Instead, students should be taught that "if they use it in other ways, they will be far more successful."
The shift toward AI in the classroom
Diane Gayeski, a professor of communications at Ithaca College, said she plans to incorporate ChatGPT and other tools in her fall curriculum, similar to her approach in the spring. She previously asked students to collaborate with the tool to come up with interview questions for assignments, write social media posts and critique the output based on the prompts given.
"My job is to prepare students for PR, communications and social media managers, and people in these fields are already using AI tools as part of their everyday work to be more efficient," she said. "I need to make sure they understand how they work, but I do want them to cite when ChatGPT is being used."
Gayeski added that as long as there is transparency, there should be no shame in adopting the technology.
Some schools are hiring outside experts to teach both faculty and students about how to use AI tools. Tyler Tarver, a former high school principal who now teaches educators about tech tool strategies, said he's made over 50 speeches at schools and conferences across Texas, Arkansas and Illinois over the past few months. He also offers an online three-hour training for educators.
"Teachers need to learn how to use it because even if they never use it, their students will," Tarver said.
Tarver said that he teaches students, for example, how the tools can be used to catch grammar mistakes, and how teachers can use them to assist with grading. "It can cut down on teacher bias," Tarver said.
He argues a teacher's impression of a student can color grading even after the student's work has improved over time. By running an assignment through ChatGPT and asking it to grade the sentence structure on a scale from 1 to 10, the response can "serve as a second pair of eyes to make sure they're not missing anything," Tarver said.
"That shouldn't be the final grade — teachers shouldn't use it to cheat or cut corners either — but it can help inform grading," he said. "The bottom line is that this is like when the car was invented. You don't want to be the last person in the horse and buggy."