Regarding your report (UK universities warned to "stress-test" assessments as 92% of students use AI, 26 February), for centuries universities have regarded themselves as repositories of knowledge and truth. This began to break down when experts were no longer valued, critical thinking was undermined and public discourse became increasingly polarised.
In this world, traditional sources of knowledge have been rejected. Books, journal articles and old media are being challenged by changes in how information is presented and retrieved, particularly via apps and social media. This has led to the "Tinderisation" of knowledge.
The curated reading lists, for example, that academics spend time researching, highlighting key thinkers and writings, are often bypassed by students in favour of Google searches. If a student doesn't like what they read, they can simply swipe left. Algorithms can then push students in unexpected directions, often steering them away from academic rigour towards non-academic sources.
It is important that students have access to learning materials 24/7. But has knowledge become just another convenience food? It is available at the press of a button online, can even be delivered to your door, and there are so many outlets to choose from. There may be quantity, but not necessarily quality: AI is the ultimate convenience food.
This raises fundamental questions not only about what we mean by knowledge, but also about what the role of education and academics will be in the future. I can appreciate the benefits of AI in science, economics or mathematics, where facts are often indisputable, but what of the humanities and social sciences, where much is contestable?
We are rapidly losing ground to profound societal changes that could have unimaginable consequences for universities if we do not respond quickly.
Prof Andrew Moran
London Metropolitan University
As a university lecturer in the humanities, where essays remain a key mode of assessment, I am not surprised to learn that there has been an explosive rise in the use of AI. It is aggressively promoted as a time-saver by tech companies, and a wider political discourse only reinforces this view without questioning AI's limits and ethics.
While AI can be useful in several academic contexts – drafting basic reports and conducting initial research, for example – its use by students to write essays is indicative of the devaluation of the humanities and a misunderstanding of critical thinking.
"How can I know what I think till I see what I say?" asked the great novelist EM Forster. He meant that writing is a sophisticated form of thinking, and that learning to write well – feeling one's way through the development of an idea or argument – is at the heart of what it means to think. When we ask AI to write an essay, we are not just outsourcing labour; we are outsourcing our thinking and its development, which over time will make us duller and less intelligent.
In a neoliberal technological age in which we are often fixated on the product rather than the process by which it was made, it is hardly surprising that the true value of writing is being overlooked. Students are simply taking their cues from a world that is losing touch with the irreplaceable value of human creativity and critical thinking.
Dr Ben Wilkinson
Sheffield