
University staff common room

This board is for university-based professionals. Find discussions about A Levels and universities on our Further education forum.

Do you use AI?

19 replies

tunainatin · 23/05/2025 09:27

I'm a senior academic, and about a year ago a colleague recommended non-generative AI for writing a protocol. Since then I have increasingly used it for things such as protocols, ethics applications, funding applications and job descriptions. I have been much more reluctant to use generative AI, both due to ethical concerns and concerns about accuracy, but have recently used it to find and summarise references for something that isn't for publication.
My workload is enormous, and this makes me more efficient - many tasks don't develop my current skills, they just take a lot of time, and AI gets them done quickly so I can concentrate on other things.
I'm fairly open about this with close colleagues, many of whom also use it. However, I feel hesitant to discuss it more widely, and a bit unsure of the ethics - I heard someone on Radio 4 say 'why should I read something that someone couldn't be bothered to write?'.
Just wondering if anyone else uses it and if so what for? Or is anyone dead against it?

OP posts:
AnotherAngryAcademic · 23/05/2025 09:55

'why should I read something that someone couldn't be bothered to write?'

This summarises my views!

My main concerns are about ethics and sustainability, but I recognise that AI can save individuals time.

The problem is where someone is using AI to do something they can't do themselves. This is why I think it is SO problematic in HE - in addition to the ethics and sustainability issues.

It is one thing to ask AI to summarise sources if one is capable of doing that task independently, but many (most?) students frankly are not.

It is one thing to ask AI to generate a draft if one is capable of producing scholarly writing independently, but many (most?) students now do not seem able to do this.

It is one thing to ask AI to produce written material that one is competent to assess for cogency and accuracy, but most students frankly do not seem capable of this.

And the more students use AI to do these things, rather than doing them themselves, the less capable they are going to be.

I am currently quite literally on the brink of handing in my resignation from a teaching-only side gig (which I started because I wanted to do some teaching in addition to my research role!) as I am so sick and tired of reading students' AI-generated nonsense rather than work they have produced themselves.

SusanLittle76 · 23/05/2025 10:04

Well, as an academic I am sure you would value a peer-reviewed piece of original work more than, say, a personal opinion - or worse, an AI-generated opinion. However, for me, as long as the AI-generated piece is marked as such, or at least not presented as original, I would have neither a personal ethical nor a societal moral dilemma stopping me being more efficient. Even better might be a caveat saying the work is AI-generated but has been personally reviewed - although the effort of proofreading may make it equivalent to just researching trustworthy sources in the first place.

atriskacademic · 23/05/2025 10:57

@tunainatin What non-generative AI are you using? - just out of interest!

I have also used generative AI for certain tasks and am not ashamed to do so. For instance, I composed an internal funding application. All parts were written by myself, but then I had to run to pick my son up from school and had one part left - the lay summary in 100 words. I fully admit running this through generative AI, making small changes and putting it on the form. This doesn't take anything away from the fact that the intellectual input before it was mine!

I also used it for my promotion application. I have a lot of activities that could have gone in any of the three categories (research, teaching, citizenship) as all of my activities are very interlinked. So I used ChatGPT's temporary chat function to read the promotion criteria and help me sort my bullet points of activities into categories.

I use generative AI for monkey work and to enhance my work, not replace it.

parietal · 23/05/2025 11:17

My lab uses AI as a tool for specific bits of work (transcription, some video coding). I don't use it as a writing tool because I don't trust it enough - I think there are too many errors and too much bland nonsense.

My students use it to help them code, and probably to write - there are big debates at the university about what to do about AI contributions to MSc and PhD level work. Not sure what the answers are.

tunainatin · 23/05/2025 11:30

Thanks for all these interesting replies. @atriskacademic the way you have used it is similar to how I use it. It's those tasks which involve a lot of sorting that it really helps with. In this way, maybe it's not so different to data analysis software?
It's obviously very different for someone who has already proven their academic skills to use it, compared with a student using it, but even here I wonder if we are thinking about it in the wrong way. For example, if I have a machine to make bread and don't have to knead it myself, I still get bread and also have time to make other things - could we see AI in a similar way?
The environmental concern is a separate issue; as others have said, AI is not the only contributor. My feeling is that technology got us into this mess and ultimately may get us out of it too.

OP posts:
AnotherAngryAcademic · 23/05/2025 11:41

I think the bread comparison falls down if the bread maker doesn't work, or if one can't afford a bread maker (or the space or power to run it... or indeed if society can't afford the space or water to run server banks). If an individual had first learned to make bread before getting the bread maker, they would be OK. If they had only learned to use the machine, they would not...

These dangers are already clearly apparent, eg in healthcare settings, which grind to a halt when there is a computer issue. South London hospitals and GPs could not order tests for months after a hacking incident, for example. Clinicians with the skills to, eg, support women delivering babies in breech presentation at term are in very short supply, because most junior staff will not have experience of this. (Slightly different reason, but the bottom line is the same: because they have not seen/done it themselves, they don't have the skills when those skills are needed in an emergency.)

I am as partial to a time saving hack as anyone else, but I do not see the risk of the skill loss as benign, and the environmental implications are staggering (see eg the volume of water needed to keep those server banks cool).

Sorry to be such a killjoy 🙁

tunainatin · 23/05/2025 12:01

@AnotherAngryAcademic don't be sorry, I was playing devil's advocate, and I think your first paragraph sums it up perfectly - you need to learn to do it before getting someone or something else to do it for you. And as someone who was denied a breech birth because no midwife felt experienced enough to undertake it, I get your other point too!

OP posts:
SusanLittle76 · 23/05/2025 12:19

I think the bread analogy could work in some ways.
In this case:
How would the bread be presented? (homemade or otherwise)
How is homemade defined? (what constitutes an original piece of academic work?)
Who/what are your intended consumers?
What is the quality of the final product? (is it diminished or enhanced by machine involvement, even partial, vs human?)
I mean, can anything ever be considered truly original? Aren't we all influenced in some way by what already exists? Maybe we just reframe pre-existing things into a different form.
I know this is philosophical, but the AI thing lends itself to that.

Smoronic · 23/05/2025 18:51

I use it to cut out hours of dithering over wording emails. And I've used it for rephrasing my research for websites and things.

We've had training at work on it and have been encouraged to use it for these kinds of purposes. Students are also allowed to use it within a fair use policy (my feeling is they use it far more than 'fair use', though).

I have noticed a gender divide among my colleagues, though. The men are using it far more to do actual academic work (literature reviews, writing papers, data analysis), whereas the women say this is unprofessional. I err on the side of the latter, but I think it's going to be yet another reason why men get ahead. If you can AI your way to another 10 papers a year, you'll be promoted far quicker than slow-but-thoughtful Mary.

Amibeingunfeasible · 23/05/2025 20:33

How the fuck do you do this? I need to save me some time. Can it write our REF return??

FrostyMorn · 23/05/2025 20:41

I'm sure people on this thread will know this, but ChatGPT makes up fake references. They look real - they may use real academic authors' names and real journal names along with a plausible-looking title (given that AI is scraping real data) - but these elements are assembled into something that doesn't exist.

I wonder how many students are submitting essays with made-up references.

Smoronic · 23/05/2025 21:56

FrostyMorn · 23/05/2025 20:41

I'm sure people on this thread will know this, but ChatGPT makes up fake references. They look real - they may use real academic authors' names and real journal names along with a plausible-looking title (given that AI is scraping real data) - but these elements are assembled into something that doesn't exist.

I wonder how many students are submitting essays with made-up references.

The problem is that it's getting better. ChatGPT 3 did this, but the later versions are more reliable. I now suspect some students are uploading all the lecture recordings, the PowerPoint slides, the assessment overview, the marking criteria, the readings and their last three pieces of work, then asking it to work with all that to create an essay in their own writing style.

You can definitely spot bad AI prompting. But as people get better at prompting, it'll be very difficult to see.

Marasme · 24/05/2025 14:13

I use it regularly as a sounding board to help me decide on a path to follow in the context of institutional conflict, mostly because I get really ensnared in politics at work and it brings me down - I ask it to help me phrase constructive and firm replies with less emotion.

I also use it to build protocols, write bits of website, fill in internal applications, review my own appraisal writing, etc...

It saves me time, and, having read a lot on the actual environmental impact, I came to the conclusion that it is not worse than a lot of other "normal" behaviours in academia (excessive coffee drinking, flying to conferences, eating meat, and streaming content at night for revenge bedtime procrastination).

Acinonyx2 · 25/05/2025 11:33

Similar to the previous post - very useful as a sounding board on various dilemmas (it has given some surprisingly good advice of late) and for checking the logic and organisation of ideas. It's definitely still GIGO, but the quality of the back-and-forth discussion has radically improved recently. I don't use it for writing text, but often for outlining/organising, and sometimes for generating questions (exams etc). In the past, when we had take-away exams, I used to check students' references for fakes - and some were really riddled with them. We've gone back to in-person exams for this reason. Overall, though, I'm rather isolated in my dept and enjoy having 'someone' to go back and forth over ideas with.

aridapricot · 26/05/2025 20:19

I started off using it to shorten texts - when an abstract or description cannot be longer than 200 words but you're stuck at 219 and cannot think of anything to cut. I've found it works well, though you might still need to tinker with the result - nothing beats actual expertise for spotting when an important piece of information has been missed.
A few weeks ago I was attending a workshop where we had to come up with a pitch for industry - I am mildly embarrassed to confess that I did use ChatGPT to come up with an initial draft as I was really stuck (the business idea was mine; it was just about how it was pitched). ChatGPT came up with a really witty wordplay which I then built upon. I am not proud of myself for doing this, but I tell myself that this wasn't assessed work, I did put in work anyway (the original idea, plus reworking the wordplay), and this was an artificial exercise - we weren't really pitching to industry, so it's not as if I won anything out of it.
Evil me likes to use ChatGPT to draft certain e-mail replies. I am HoD in a department populated by arrogant men, all of whom are older than me and most of whom are more senior - constant testing of my boundaries, constant mansplaining, constant minimising of my expertise. When I am fed up and need a curt reply, I ask ChatGPT to draft it, mostly for two reasons: a) I am a non-native speaker, and tone and subtlety are sometimes difficult for me (or take a very long time); b) there's something powerful in indirectly saying "I am not going to dignify your childish mansplaining behaviour with a human-generated answer".

ICantPretend · 29/05/2025 19:32

Which gen AI are people using? A few people have mentioned ChatGPT, but I'm wondering if there's one that does slightly better with academic stuff?

So far I've tried it out on wording a few things but it hasn't really felt like I'm saving time in the end. Maybe I'm not good at writing effective prompts.

foxglovetree · 07/06/2025 23:05

The only AI I use is DeepL, which is really good for translating foreign languages. It has expanded the range of texts I can deal with (since I can only read three or four foreign languages myself).

As a matter of principle I don't use other generative AI, even for mundane things. Writing, synthesising, summarising, etc are all skills I've worked over a career to build and be effective at, and to keep any skill live you need to keep it active - and that includes boring stuff like writing student reports or conference abstracts. It would be rather as though I had become a violinist after years of training and practising for hours a day, then outsourced my practice to a robot and was surprised that my scales ended up rusty. Also, it would be hypocritical of me to impress on my students the need to build these skills for themselves if I then can't be arsed in my own life.

Onceacnowcsheo · 08/06/2025 11:26

I use it for career therapy. It's non-judgemental and has helped me get a clear perspective on what options are open to me, and on how others are dealing with similar situations. I do recommend it!

Loepa · 24/06/2025 18:02

I've used it to write a promotion application as well - I ended up not using much of it, but it was helpful as a starting point. I think it's quite good for that sort of generic corporate-bullshit writing. I wouldn't use it to write papers, especially as I really like writing papers. But I can see it could be useful for some lit-reviewing-type activity as well (I mean the specialist tools that only scan academic databases) and it's definitely helpful for transcription. I wouldn't trust it on anything I don't already know about, as it does make stuff up.
