"We're increasingly less like teachers and more like police officers": How are universities fighting plagiarism?
Most faculties have concluded that they cannot detect work done with AI and are choosing instead to change the way they assess.


Barcelona. "I don't know how to tell you, the feeling is quite discouraging," admits Antoni Espinosa, Vice-Rector for Digital Policies at the Autonomous University of Barcelona (UAB). "When you read the third report produced with artificial intelligence (AI), you see that they are just as poor and homogeneous, and hardly of high quality," he laments.
A little over two years ago, ChatGPT went, almost overnight, from being a completely unknown tool to becoming a headache for universities. At the time, several faculties said that in the short term they did not foresee changing their protocols in response to this new technology, though they did begin to consider whether the way students were assessed needed to be rethought.
Now, two academic years later, the outlook is somewhat different: there is less fear of AI-powered text generation tools, but also more resignation after realizing that, at least for the moment, the option of having filters that detect fraudulent work done with these systems does not seem viable.
"The first thing we did was regulate how students who plagiarize using artificial intelligence should be sanctioned because, at the end of the day, this is like any student who copies using another technique. It is a fraudulent practice and, therefore, should be treated like any other fraudulent practice," says the Vice-Rector for Planning at Pompeu Fabra University (UPF). In this way, Pompeu Fabra, along with the University of Barcelona (UB), the Polytechnic University of Catalonia (UPC), and the Open University of Catalonia (UOC), has adapted its protocols to include the unauthorized use of tools such as ChatGPT among fraudulent behaviors. The Autonomous University, for its part, considered that its regulations already penalize plagiarism "regardless of the context in which it occurs," Espinosa explains.
But, as the saying goes, every law has its loophole. The fact that it is prohibited does not mean that, in practice, universities can detect when a student has used AI fraudulently in order to penalize the behavior. Tools like ChatGPT rewrite the information they draw on, so plagiarism detectors do not flag that content as fraudulent.
"We're increasingly less like teachers and more like police officers. In the end, the university degree won't be worth anything," warns a professor who has been teaching at the UPC for more than thirty years. Several professors at Catalan universities consulted by ARA, both those who have been in the classroom for years and those who have joined recently, acknowledge the same. "I can suspect, and even be certain, that my student didn't do that work, but I have no way to prove it," complains a professor at the Autonomous University of Barcelona.
"Obviously, sometimes, when you receive certain assignments, due to the way they are written or even the level of their content, these suspicions arise," admits Joan Gispets, Vice-Rector for University Policy at the UPC. He explains that in these cases, they have a conversation with the student and "from there we decide what to do."
In fact, broadly speaking, most public universities in Catalonia act similarly. "We always speak with the student, but if it is truly evident that this fraudulent activity has occurred, then they are given a zero," explains Conxita Amat, Vice-Rector for Teaching at the UB. Once it is determined that the content is considered plagiarism, the consequences at all universities range from failing the specific activity to failing the entire subject. In very serious cases, the student can even be temporarily expelled from the university.
Less weight on traditional assignments
Despite being vigilant—and offering training to professors—to detect possible cases in which the work is not done by the student, universities are increasingly focusing on making changes to the way they assess. "In many cases, we have opted to give less weight to work that students can complete independently at home and shift the emphasis toward oral presentations or in-person exams in the classroom," explains Gispets.
At the UOC, where distance learning makes these practices more difficult, safeguards have also been reinforced in other ways. "First, we have implemented educational measures because we want students to understand what they can and cannot do in assessment activities. This is especially important with AI because, since it is so recent, students are sometimes unaware they are doing something inappropriate," explains Robert Clarisó, an expert in teaching with AI at the university. They have also introduced new assessment methods that "seek to ensure the identity of the student and their authorship of the assessment activities," for example through synchronous interviews or video presentations.
"Most professors are already proposing activities that cannot be solved with artificial intelligence, for example, asking for analytical tasks that AI cannot perform," insists the UPF vice-rector. However, he warns that universities are not yet doing "the most interesting thing this tool can offer": preparing students to perform different types of work than they do today. "We are using tools that help us prepare them better, but the role of artificial intelligence is only didactic and pedagogical. The problem is that we are preparing them for a world that will change in five years and for a labor market that will be different from today's, one we do not yet know," he warns.
However, despite favoring the use of these tools, all the universities insist that they cannot replace student work. "What we aim for, and what most students also aim for, is to learn content and procedures, to acquire a series of skills they will need later and that a machine won't provide for them," concludes the UB vice-rector.