Since the release of ChatGPT, universities worldwide have been discussing how to handle the AI-based tool in teaching. While much of the discussion centres on options for control and restriction, Macromedia University is taking a different approach:
From now on, the use of AI-supported tools in projects and final theses is generally permitted. The condition is that the tools used and the interactions with them are documented in an AI directory.
Prof. Dr. Dr. Castulus Kolo, President of Macromedia University of Applied Sciences:
“Artificial intelligence is already an integral part of many digital work areas. If we want to make our students employable, we have to prepare them for these realities. We are breaking new ground to do this – after all, we see ourselves as pioneers in education, not as nostalgics.”
Central to the Macromedia regulation is meticulous documentation of the tools and input commands (“prompts”) used. “The AI directory contains all prompts used and names the specific AI applications,” states the examination board’s decision of 10 March 2023.
Prof. Dr. Joschka Mütterlein, Dean of Macromedia University and Professor of Digital Technologies & Coding, emphasises the practical dimension of this regulation.
“AI-supported tools are the future! We are convinced that they strengthen the creative and analytical skills of the students and promote their ability to reflect. With our regulation, we can convey the opportunities of these instruments – and at the same time meet our high academic standards. This is what innovative teaching looks like to me.”
Dr. Cornelia Albert, Chairwoman of the Examination Board, is monitoring the further development of examination tasks.
“Examination tasks must focus even more strongly on competences. If AI-supported tools are used, this must be done fairly and transparently and taken into account accordingly in the assessment. If students use an AI without creating an AI directory, this counts as use of an unauthorised source or unauthorised assistance. That constitutes an attempt at deception and, like other attempts at deception, can lead to a warning.”
(IMH)