We spoke with several staff members about the ethics of AI in education. Read on to find out more about the benefits and challenges of using AI and how our staff are educating students on using AI effectively.
AI is becoming an ever-greater part of our daily lives, and no sector is untouched by it. One of the biggest questions for universities right now is how both students and staff can use AI ethically while protecting the quality of their courses and qualifications.
We reached out to staff across the University to get a broader understanding of how AI is changing teaching and how it should, and just as importantly shouldn’t, be used when it comes to education.
Patrick Grant, an Associate Professor and our Project Director for Legal Tech and Innovation, was keen to discuss how the University is incorporating AI into teaching, stating: “We’re not outsourcing education. We continue to provide students with a strong foundation on which to build a career. We teach students to do a job and provide them with the skills and knowledge they need to do the job well.”
For Patrick, it’s important that students understand what AI is and what it isn’t. AI is a tool to be used, but it’s essential that students don’t place too much trust in it. Similarly, James Bissett, our Research Support Manager, said: “AI tools, as with other digital tools, can often behave like enthusiastic puppies: if you ask them to fetch the stick, they’ll bring you back a stick (hopefully), but it might not be the stick you wanted or need.”
Patrick referenced ELIZA, one of the first computer programs to process natural language, developed by Joseph Weizenbaum in 1966, to highlight his point on overreliance. People who used ELIZA bonded with it and felt understood and supported. In reality, users were simply attributing intelligence to the program, a mistake some still make today with modern chatbots.
Patrick also referenced a recent study by the Massachusetts Institute of Technology (MIT) which examined “the neural and behavioural consequences of Large Language Model (LLM) assisted essay writing”. The four-month study found “LLM users consistently underperformed at neural, linguistic, and behavioural levels… [which] raise concerns about the long-term educational implications of LLM reliance and underscore the need for deeper inquiry into AI's role in learning.”
James highlights that “the challenge is how you present these tools alongside guidance on how to best use them.” Patrick believes this can be achieved by educating students about how AI works, so they better understand its benefits and limitations. He also noted the importance of teaching students about where AI is headed and where it could lead in the future.
Thoughts from our staff on AI in education
Lauren Snowden, Programme and Student Lead
“Whilst AI can be helpful in some circumstances, it concerns me that an over-reliance on AI risks detracting from the knowledge and skills students need to develop as part of their journey to becoming a professional.”
“I agree we should be using the tools at our disposal and considering ways to improve our practices to promote work/life balance, but we need to do so in a way that doesn’t compromise reliability or integrity. As part of this, it’s important to not automatically assume accuracy of the material AI is providing. Having to check the accuracy of the material can, in itself, be more time consuming than just completing the work outright.”
Caitlin Worton-Scott, Senior Lecturer
“When used well, AI can support students by filling learning gaps, offering quick feedback and helping them approach topics from different angles. It’s a tool that can enhance – not shortcut – your learning.”
“Do use AI to test your understanding and revise challenging topics. Do use it to compare explanations until you find one that makes sense. However, don’t treat AI as an unquestionable source of truth, and don’t let it do the thinking or writing that should be your own; that undermines both your skills and academic integrity.”
James Bissett, Research Support Manager
“AI tools can often supplement or support you in completing a task but are often not a good substitute for a human agent where tasks require any level of critical thinking or subjective review.”
“AI tools can be useful in a range of activities, but you should always supplement that with your own critical evaluation of what it might suggest. Use AI as a starting point or springboard to identify reliable sources, then test the accuracy or comprehensiveness of information presented and identify gaps in what might be presented to you.”
Dr Morag Duffin, Director of Student Success, Equality, Diversity and Inclusion
“My main concern about the use of AI in education is how to navigate the inherent biases within AI. Within our Access and Participation Plan with the Office for Students we’ve identified one of the largest risks to equality of opportunity for students from underrepresented groups in higher education is the University replicating the inequalities of certain professions. For instance, we teach a legal curriculum that is (by necessity as it is determined by the Law) based on historical inequalities. The use of AI in education presents a similar risk. AI replicates the biases of society and history; therefore, when using AI we need to actively challenge those biases.”
Practical uses of AI in education
The University of Law is going to begin using Harvey, a legal AI platform, across its law courses. With the use of AI already commonplace in many of the biggest law firms, it makes sense for prospective lawyers to learn to use the technology. However, the emphasis is on the fact that Harvey is a tool: it’s there to assist legal professionals, not replace them.
In his position with the Library and Learning Skills team, James has supported the use of AI and suggests the following practical uses:
- Identifying keywords you may not have thought of to help you search for literature on a specific topic.
- Generating AI summaries of texts. However, James said this should be done “specifically to manage your time and prioritise what to read or focus on, not to replace critical reading, evaluation and synthesis into your work.”
- Answering quick, simple questions about new terms you come across and are unsure of. Some AI tools go beyond simple definitions and also signpost potentially useful further reading or concepts.
James also said: “There are numerous academic databases and library discovery tools which now embed AI. There obviously needs to be some level of subject knowledge to assess the real value, and potential gaps, in what these AI discovery tools offer. This highlights the risk of students new to a topic not having the broad subject knowledge to fully assess if what is presented really is the most relevant or best sources on a topic.”
This feeds back into what Patrick was keen to highlight: first and foremost, students need a strong base of subject knowledge, and that will come from quality teaching, not from AI.
Explore our range of courses and take the first steps towards your professional career today.