
University staff common room

AI use in humanities PhDs

49 replies

NutsAndMay · 10/12/2025 08:58

How are you handling this? We are seeing AI all the time in proposals, and consider it a reason to turn them down. But I’ve recently found a funded PhD student using large swathes of AI-(re)written material in a thesis. The institution doesn’t really seem to have a process or policy on this. As a humanities scholar I think it’s ludicrous and totally unacceptable! What do you think? And what do you do?

titchy · 10/12/2025 09:11

Can’t comment specifically about PGR students, but doesn’t your institution have an AI use policy that applies to all students? Ours essentially says that AI can be used as a tool, but (obviously!) not as a source. If they’ve used AI to create a more coherent structure to their argument is that any different from using a grammar/spell check? As long as the basis of the argument is sound, referenced and they can defend it isn’t that OK?

If they’ve got the AI to summarise journal papers though it’s not likely to actually reflect the paper accurately - AI just isn’t good enough (yet) - at least in the sciences. So easy to draw a red line through and ask student to read for themselves.

bibliomania · 10/12/2025 09:14

I don't have the answer, @NutsAndMay , but it's a huge question. My institution has been wrestling with a research student's request to use generative AI as a reasonable adjustment. He has a disability which impacts on his ability to organise material and communicate it succinctly - but isn't a PhD partly about the ability to do this, not just about the knowledge itself?

surreygirly · 10/12/2025 09:22

With the proliferation of AI the more important question is what is the point of doing a humanities subject
Whilst interesting they will not offer people job opportunities within 5 years
AI is sucking up jobs every day now
Students need to do science or medical based or learn a trade

NutsAndMay · 10/12/2025 09:31

@bibliomania Exactly! The PhD is a process of learning - learning how to identify and select and process sources; learning how to refine your written expression; learning how to structure an argument; learning how to generate ideas from the source material…

titchy · 10/12/2025 09:32

surreygirly · 10/12/2025 09:22

With the proliferation of AI the more important question is what is the point of doing a humanities subject
Whilst interesting they will not offer people job opportunities within 5 years
AI is sucking up jobs every day now
Students need to do science or medical based or learn a trade

Seriously? You don’t see the point of humanities? You think education is solely about getting a job? Philistine. 😠

titchy · 10/12/2025 09:36

Does your institution offer academic writing modules? Even if aimed at Masters that might be useful. I get the PP’s point about needing to craft the argument, but presumably if the student couldn’t do that on paper they won’t be able to in a viva?

DallasMajor · 10/12/2025 09:38

titchy · 10/12/2025 09:32

Seriously? You don’t see the point of humanities? You think education is solely about getting a job? Philistine. 😠

Or a realist.

Education is swiftly becoming a preserve of the very wealthy.

NutsAndMay · 10/12/2025 09:44

@titchy I do think it’s different from using a spellchecker. A spellchecker will flag and correct spelling mistakes and grammatical errors but it doesn’t actively change what’s there - and it gives the user the opportunity to see the mistake and learn from it. AI corrects without showing its workings, but more than that it actively intervenes - it makes substantive changes, not just corrective ones. It changes the voice of the piece. It overwrites the human…

NutsAndMay · 10/12/2025 09:45

surreygirly · 10/12/2025 09:22

With the proliferation of AI the more important question is what is the point of doing a humanities subject
Whilst interesting they will not offer people job opportunities within 5 years
AI is sucking up jobs every day now
Students need to do science or medical based or learn a trade

If you think the sciences and medicine and trades are going to be left peacefully alone by AI and automation you’re going to have a shock…

rhabarbarmarmelade · 10/12/2025 09:52

Defending my own turf, but I think it is precisely the humanities that retain meaning at this higher level, compared to the sciences. Sciences, coding, medicine etc. all succumb to data crunching and analysis far more readily than the contextually and historically nuanced humanities.
But yes, as for the PhD student asking for special measures, I think it is absurd… because what is one then being asked to assess? AI’s ability to rationalise, synthesise etc.? It will, I guarantee, be poor quality.

titchy · 10/12/2025 09:59

bibliomania · 10/12/2025 09:14

I don't have the answer, @NutsAndMay , but it's a huge question. My institution has been wrestling with a research student's request to use generative AI as a reasonable adjustment. He has a disability which impacts on his ability to organise material and communicate it succinctly - but isn't a PhD partly about the ability to do this, not just about the knowledge itself?

The viva would be interesting…

PotolKimchi · 10/12/2025 10:04

I have seen a university document that allows students to organise their thoughts etc. using AI, but not to write using it. But it has to be done through the university’s tie-up with Copilot, and all logs and all prompts then have to be submitted WITH the thesis, so examiners can see how much AI contributed.
The bigger problem in the humanities is that people are churning out grant applications a dime a dozen using AI, and grant bodies and their review boards (I sit on a couple of big ones) are truly overwhelmed.

KnickerlessParsons · 10/12/2025 10:05

We’re encouraged to use AI at work. It’s a skill they need to learn in the modern world.
Using AI is not the same as plagiarising, which I would frown on.

Mydadsbirthday · 10/12/2025 10:59

bibliomania · 10/12/2025 09:14

I don't have the answer, @NutsAndMay , but it's a huge question. My institution has been wrestling with a research student's request to use generative AI as a reasonable adjustment. He has a disability which impacts on his ability to organise material and communicate it succinctly - but isn't a PhD partly about the ability to do this, not just about the knowledge itself?

I don't understand how you can become a research student if you don't have the ability to organise material and communicate it. Isn't this literally what a research student needs to be able to do? Why are they in this role?

PotolKimchi · 10/12/2025 11:22

You can use AI at work, but the work still has to be your own, and you have to have enough expertise to know when AI is wrong, has missed nuance, or is oversimplifying something.

NutsAndMay · 10/12/2025 12:04

PotolKimchi · 10/12/2025 10:04

I have seen a university document that allows students to organise their thoughts etc. using AI, but not to write using it. But it has to be done through the university’s tie-up with Copilot, and all logs and all prompts then have to be submitted WITH the thesis, so examiners can see how much AI contributed.
The bigger problem in the humanities is that people are churning out grant applications a dime a dozen using AI, and grant bodies and their review boards (I sit on a couple of big ones) are truly overwhelmed.

That’s an interesting approach, to allow certain uses of AI but require submission of the logs. It doesn’t necessarily stop unauthorised uses of AI though. And how does the institution police (and penalise) over-use? It’s a fine line between AI organising your thoughts and AI introducing material, or AI writing for you… and once you’re in the viva the student would have a decent case for saying the university allows it and they’ve not done anything wrong, or only a little misdemeanour, versus no AI being allowed meaning a much stricter threshold. Though maybe that ship has sailed…

The point about grant applications is sobering. Have the funders taken any action yet, do you know? Any specifications on what’s acceptable in terms of AI use and what’s not?

NutsAndMay · 10/12/2025 17:19

@KnickerlessParsons How would you say using AI is different from plagiarising? Because it’s the intellectual property of a machine rather than a person? Or something else?

PigeonsandSquirrels · 10/12/2025 17:39

surreygirly · 10/12/2025 09:22

With the proliferation of AI the more important question is what is the point of doing a humanities subject
Whilst interesting they will not offer people job opportunities within 5 years
AI is sucking up jobs every day now
Students need to do science or medical based or learn a trade

Personally - as someone with two humanities degrees and an ongoing medical degree - I think AI is actually worse at humanities and creative arts than it is at medical/science topics. It has way more data on medical science to analyse and spew out than it does on, say, plays of Maundy in the 1400s or Cartesian dualism.

KnickerlessParsons · 10/12/2025 23:11

NutsAndMay · 10/12/2025 17:19

@KnickerlessParsons How would you say using AI is different from plagiarising? Because it’s the intellectual property of a machine rather than a person? Or something else?

I use copilot on my own work to make it read better.
I can ask copilot to take something I’ve written and make it more succinct, more professional, more casual in tone, whatever.
I’ve used it to eg summarise a huge ppt presentation someone has sent me, and to pick out the key points.
I’ve used it to help me write requirements for some new software. I’ve used it to give me the bones of a business plan to submit to the bank when asking for a loan, and loads more.
I wouldn’t consider any of that plagiarising.

Bungle2168 · 10/12/2025 23:21

surreygirly · 10/12/2025 09:22

With the proliferation of AI the more important question is what is the point of doing a humanities subject
Whilst interesting they will not offer people job opportunities within 5 years
AI is sucking up jobs every day now
Students need to do science or medical based or learn a trade

STEM and certain “hands on” vocations may be more resilient, for the time being, but AI is gradually encroaching on them, too.

catontheironingboard · 10/12/2025 23:24

bibliomania · 10/12/2025 09:14

I don't have the answer, @NutsAndMay , but it's a huge question. My institution has been wrestling with a research student's request to use generative AI as a reasonable adjustment. He has a disability which impacts on his ability to organise material and communicate it succinctly - but isn't a PhD partly about the ability to do this, not just about the knowledge itself?

This seems to me to be ludicrous. There’s reasonable adjustment, and then there’s not doing a core part of the assessed task of a PhD.

The student would get short shrift for asking to use an essay farm as a reasonable adjustment. Generative AI is no different. Unreasonable adjustment more like!

catontheironingboard · 10/12/2025 23:27

surreygirly · 10/12/2025 09:22

With the proliferation of AI the more important question is what is the point of doing a humanities subject
Whilst interesting they will not offer people job opportunities within 5 years
AI is sucking up jobs every day now
Students need to do science or medical based or learn a trade

😆 hah! It’s the quantitative sciences and software/tech industries that are most at risk from AI - not the humanities. Are you under the impression that AI can actually think?

Bungle2168 · 10/12/2025 23:30

NutsAndMay · 10/12/2025 17:19

@KnickerlessParsons How would you say using AI is different from plagiarising? Because it’s the intellectual property of a machine rather than a person? Or something else?

I think that using AI as a brainstorming tool is an acceptable use of the technology. Critiquing and revising a thesis may be acceptable, but above and beyond that you are straying into plagiarism.

…which is why I have reverted to written, formal examination where possible!

Your institution will have, or will be in the process of developing, an AI policy. Make sure the students are aware of it. Furthermore, insist that any AI usage is a) fully documented, and b) linked to their university Google/Microsoft account.

However, I get the impression that some universities are less concerned about academic rigour and more concerned about data leakage and copyright infringement.

catontheironingboard · 10/12/2025 23:35

PotolKimchi · 10/12/2025 11:22

You can use AI at work, but the work still has to be your own, and you have to have enough expertise to know when AI is wrong, has missed nuance, or is oversimplifying something.

I don’t get this. AI is wrong the vast majority of the time for writing in humanities fields. I even sometimes look at how it summarises student essays just to see what it does, and the initially plausible summaries it produces turn out to bear no relation to the actual content of a student’s essay once you’ve read both.

People only think it’s good if they don’t actually know where all the mistakes are. It’s like reading newspaper articles on a topic you know a lot about: as an expert you can see they are full of errors; as a non-expert you take them as perfectly fine. AI runs the very real risk of just allowing people who take it as true or accurate to just recirculate errors or misrepresentations that the AI has made.

If you don’t do your own research and work how will you know when the AI is wrong? You have to do it yourself anyway, to be able to have the skills to tell. So you might as well just do it yourself in the first place, rather than do it twice, once in order to correct the mistakes of an AI!

If you know your field, and you want to produce really new, original work not rehashed mediocre stuff, AI is actually a timewaster rather than a useful tool. My job is to produce original thought and research that no-one has produced before. I’m not going to do that by using AI, which by design just recombines all the mediocrity that’s out there already, in order to produce yet more mediocrity.

QBTheRoundestOfBees · 10/12/2025 23:36

KnickerlessParsons · 10/12/2025 23:11

I use copilot on my own work to make it read better.
I can ask copilot to take something I’ve written and make it more succinct, more professional, more casual in tone, whatever.
I’ve used it to eg summarise a huge ppt presentation someone has sent me, and to pick out the key points.
I’ve used it to help me write requirements for some new software. I’ve used it to give me the bones of a business plan to submit to the bank when asking for a loan, and loads more.
I wouldn’t consider any of that plagiarising.

Copilot writes terribly, though, in my opinion.