For the fall semester, Elon launched a new student guide for the use of artificial intelligence on campus, in partnership with the American Association of Colleges and Universities. Mustafa Akben, Elon's director of artificial intelligence integration, said Elon's existing AI policy will remain the same: the use of AI in classrooms will vary from instructor to instructor.
“We are not replacing human labor,” Akben said. “Each class, each teacher and each instructor has their autonomy to decide whether they are going to use, prohibit or partially allow students to use these tools.”
This is also reflected in Elon's Honor Code, which forbids the use of tools not authorized by faculty members.
The student guide includes sections on how students can best use AI productively and ethically, concerns about AI, and how AI can be applied in their careers.
In a study published by ResumeTemplates, managers said they were more likely to prefer a candidate with less work experience but more experience with AI. This growth of AI in the workforce is one reason Elon is working to stay ahead of the curve with new technologies, Akben said.
This year, Elon launched other AI tools for students, including an academic advising tool, and will launch ElonGPT — first for staff and faculty, then for students. ElonGPT is software similar to ChatGPT, but it does not use people's data to train the AI, Akben said. It will be free and will help students use AI ethically and responsibly, he said.
The advising chatbot was tested this summer and can answer questions related to academic advising and enrollment. It can also answer other Elon-related questions, such as the best time to bring a parent to campus and where they can stay.
One of the reasons Elon is looking to use its own AI program, Akben said, is a report published by Tyton Partners. The report found that half of students regularly use generative AI — and 75% of students who already use AI will continue to do so even if teachers or schools ban it. That means teachers will need to be aware of it and teach accordingly, Akben said.
According to Akben, a more productive use of AI is for brainstorming and synthesis work. That is why ElonGPT, once launched for students, will include safeguards that prevent the chatbot from doing homework for students. Akben said that letting AI do students' work for them defeats the purpose of taking a course and learning from it.
“You have to be critical,” Akben said. “Make sure you’re learning something as part of your training and use AI for that purpose, not just to offload some tasks.”
The current version of ElonGPT, designed this semester for faculty and staff, is not geared toward learning and teaching and will not include the same safeguards, Akben said. Staff and faculty uses of ElonGPT could include creating end-of-year reports, facilitating conversations and summarizing documents, he said.
If there are any issues or unexpected responses from ElonGPT, Akben said, faculty and staff are encouraged to report them, and he and his team will work to resolve them. Because conversations with ElonGPT are people's personal data, Elon will have no way to monitor its use, Akben said, so the university will rely on survey responses from participants to gauge its effectiveness. Akben also said he wants to hear feedback from students about what AI support would be helpful for them at Elon and beyond.
“I really want to learn,” Akben said. “How can I help them successfully complete their program and be career ready?”