Established 2005 Registered Charity No. 1110656

Scottish Charity Register No. SC043760


Artificial care

October 01 2025

On the troubling advance of AI in care for young people, especially those facing homelessness. Written by Lucy, an independent advocate supporting young people in the care system

I work for what was a charitable enterprise supporting young people in care. Following buy-out after buy-out, though, it is now classified as a “multi-solution provider”. And the pay is still shit!

My role is to support young people making complaints against their local authorities regarding their time in care. From endless changes of social worker and placement to a constant turnover of inexperienced and unsuitable staff, and much more besides, most young people in the care system have a lot to complain about.

I could write a huge amount about the issues affecting the care system, but right now I want to address how the complaints young people have are being outsourced to AI.

One of the touchstones of organisations outside the statutory sector is the importance of the voice of the child. Already young people in care feel voiceless. Very few children ever ask to go into care no matter how bad their situation. What they feel is that no one is listening to them, that no one hears what they want.  

Unfortunately, being taken into care really is the only option sometimes but having the opportunity to speak to someone independent of the care system, who will listen, record and support the young person, and who can share their views at meetings, can make a huge difference to all involved. The housing situation for young people is one example. This is something already very difficult for all young people given the current climate but it can be especially challenging for young people leaving care without the support of families.  

Once a young person in care turns 18, we no longer expect them to fend entirely for themselves, which is what used to happen. However, neither are they able to access the local authority housing they are legally entitled to, mainly because it just doesn’t exist any longer. Young people negotiating leaving care need ethical, professional support to ensure they get the services they require and deserve.

Advocacy is also a service that provides real value to young people in cases where things have gone wrong. How can a young person share information and get redress? Hopefully via an advocate who will listen to them, put their words on paper and relay them to their local authority. Or perhaps, it will be something closer to what I was advised to do recently: “put the information into ChatGPT and off you go”.

Where is the voice of the young person in this? How do they feel when they see their words in bland AI speak? The suggestion was that AI could add in relevant legislation but this is not a court situation – this is a child trying to voice their distress about how the care system has treated them. It is about how they have been ignored and hurt.

Managers in the local authority will surely recognise AI and may well reply with AI, making an already difficult situation even harder. Many young people find the responses hard enough as it is.

Then we have the concerns about accuracy. Is ChatGPT using the correct legislation? Does it fit with the young person’s complaint? Is it relevant? 

Finally, and perhaps most concerningly, is the question of where their information is going. We should never share a young person’s personal information – even if they consent. This is their personal life story and we have no right to feed it into any sort of algorithm.

When the use of AI was raised with higher management at my organisation, it was treated seriously, and we were told it was not acceptable. Still, you have to ask how aware other overworked services are that this is happening, and how many reports are quietly being given a “helping hand” by AI. The end result is a “why not” attitude and an erosion of care.

Even if you could clear the hurdles of accuracy and reliability, how would these already failed young people feel about their words being outsourced to AI?

This is an issue that certainly goes beyond young people in the care system, who already face disproportionately high rates of homelessness once out of the system. And you have to ask yourself: where will it stop? Are case workers using AI to read reports? Are they using algorithms to make decisions about people’s lives?

For me and the work that I do, it really comes back to each young person’s voice. If we choose AI, we are choosing once again not to hear those voices; yet again, young people lose control over their own words. We cannot allow this to happen.
