Professor asks students to create AI guidelines for teaching

Florida International University students create their own AI guidelines.

PhotographyLink/iStock/Getty Images Plus

Rafael Moron and Lexy Modrono were used to their professors at Florida International University either glossing over guidelines on the emerging uses of generative artificial intelligence or avoiding discussions about AI altogether.

“Very few courses talked about it,” said Moron, who graduated from FIU in May. “Most of the time, the policy was to prohibit AI, and if it was used, it would be considered plagiarism, plain and simple.”

FIU has a general AI policy that closely mirrors its plagiarism policy, according to an Inside Higher Ed survey of provosts from the beginning of the year.

So Moron and Modrono were surprised when they and a dozen other FIU students were asked earlier this year to develop their own AI guidelines for a course on rhetorical theory and practice.

“I was definitely a little surprised because since AI has become more accessible, I think professors have very strict rules,” Modrono said. “So it was surprising to know that we had a say in shaping the policies.”

Christine Martorana, an associate professor of writing at FIU, spent two semesters giving students in several courses the opportunity to create their own policies for using AI.

“Trying to control the use of AI is counterproductive,” she said. “As a professor, I don’t want to take that stance, and that’s not the relationship I want to have with my students. I tried to create a policy myself, and there were so many ways to do it that it became, ‘Let’s share this with the students and see what they come up with.’”

In the spring semester, students were divided into small groups to come up with what they thought were best practices, which they then presented to the whole class to refine. In a shorter summer course, Martorana had students review the spring semester guidelines and revise them to develop their own.

“Personally, I definitely felt more valued as a student,” Modrono said. “I felt like she recognized that we are responsible students and we know what we are doing.”

Common views emerged—namely, that AI should not be used to plagiarize—but disagreements also emerged. For example, students in the spring semester course decided it was OK to use AI for brainstorming, while students in the summer course decided that brainstorming was only allowed when students were alone and not in a classroom with peers. Spring semester students said generative AI could be used to organize a paper, while summer course students said the technology should not be used for outlining.

The guidelines for both semesters addressed the use of AI in courses and the citation of AI use in papers and other course materials.

Martorana recognizes that AI will be “an inevitable part” of writing and communicating in the future, and says she believes policy-making is a useful way to prepare students for that future.

“I wanted them to participate,” she said. “I wanted them to first understand [AI] and second to take ownership of the policy, because they themselves had worked on its development.”

Brianna Dusseault, executive director of the Center on Reinventing Public Education, said that while she hasn’t heard of other professors asking their students to weigh in specifically on AI policies, it’s a tactic educators — even elementary school teachers — use in their classes by asking students to help set general policies.

“You set norms and create expectations together over the course of the year,” she said. “AI is a new area, but this type of task, where you involve students in co-creating their learning environment, would make sense.”

Dusseault, whose center is currently conducting studies on the use of AI, pointed to her (and others’) research showing that acceptance of AI is generally lower among professors than among students.

“This is an example of a professor playing a role that universities in general may not be ready for,” she said. “We’re still trying to make it understandable to adults, let alone students.”

Both Dusseault and Martorana said that student involvement in crafting AI policies can improve AI literacy, citing the amount of research students had to do on the ethical — and unethical — uses of the technology. Martorana added that discussion of AI ethics factored into discussions throughout the semester, with students asking whether their use of AI fits into the policies they created.

“I’ve been teaching since 2008, and I’ve never had students ask me about ethics and academic integrity,” she said. “To me, that suggested that students continued to think about it throughout the semester and that the topics of conversation were more open.”

Martorana will again give her students the opportunity to create their own AI policies this fall, expanding the practice to freshmen in addition to her advanced courses.

“If you try to control the use of AI, you’re ultimately fighting a losing battle,” she said. “As AI technology continues to advance, our policies need to take a more productive and positive approach. Instead of saying, ‘You shouldn’t do this,’ we need to instead show, ‘These are the ways we can use it in our course.'”
