The rise of AI in social care

By David Rees, Ted Edmondson

The MJ

30 July 2025

In January 2025, the Prime Minister laid out an ambitious and comprehensive vision for Artificial Intelligence (AI) in the UK. With the Government set to integrate AI into its plans for public service reform, including local government, AI is likely to be a major focus — and especially so in social care, given the references to technology in the NHS 10-year plan.

There is clearly an appetite for these developments. In their most recent annual care technology survey, PA Consulting and the TSA found that eight out of 10 senior leaders in adult social care agreed that emerging technologies will be central to the long-term provision of care. Importantly, they wanted to focus on supporting and accelerating their development now. The good news is that AI in social care is not starting from zero: a number of councils are already embracing the AI agenda.

During the pandemic, Hampshire CC and its Technology Enabled Care (TEC) Argenti partnership developed the Wellbeing Automated Call Service (WACS), an AI-powered solution built on AWS Connect that supported vulnerable residents on the shielded list. Built in just seven days, WACS used automated phone calls and AI to identify urgent needs based on simple responses. It contacted more than 53,000 people through over 200,000 calls, enabling 4,000 residents to access emergency food and medication. The system completed in a few days work that would have taken the council's contact centre 280 days.

More recently, Kingston was the first council to pilot an AI-powered solution aiming to streamline the process of writing case notes and assessments. Co-developed with Beam, the Magic Notes tool has delivered impressive results: social workers report an average reduction of 50 to 60% in the time taken to complete case notes and assessments, with more than 96% accuracy in the automated transcription. Recognising the tool's scalability, Kingston is exploring its wider adoption, enabling resources to be reallocated toward prevention services.

These represent a strong start, and we anticipate the use of AI in social care will accelerate over the next five years. One area for development is proactive prevention: AI can allow social care commissioners to leverage their own data to better manage care demand and improve outcomes for vulnerable people. An example is Taking Care's recently launched Hälo offer. At its core, Hälo uses AI to determine preventative interventions based on Alarm Receiving Centre (ARC) call data (e.g. the nature of calls, their frequency and timing), with the added opportunity to build on wearable technology data (e.g. from a Fitbit) to support an individual's care needs.

Similarly, AI will be increasingly used to monitor changes in behaviour and identify situations where an early response can prevent care needs from escalating. For example, Kent CC's Technology Enhanced Lives Service (TELS) is about to extend its current use of ‘Howz' with machine learning that identifies areas of potential concern and allows the Kent team to provide support rapidly. Howz sensors monitor daily activities, informing families and carers of changes in behaviour that may require an intervention or a change in care. In one example, a family noticed via Howz reporting an increase in a relative's visits to the bathroom; in consultation with their GP, it transpired the individual had contracted a urinary tract infection (UTI).

In addition to using AI to support individuals directly receiving care, Hampshire is looking to use it to support unpaid carers in the future. It has commissioned a Microsoft-based proof of concept that uses generative AI within its Argenti TEC service. Based on the type of unpaid carer and the circumstances of the individual they support, a wide range of factors are assessed to recommend solutions — whether consumer technology or specialised TEC products — to help the carer. Once the AI solution has been fully evaluated, the council aims to share it with its third-sector partners and make it directly available to its community. Given that, nationally, there are 11.9 million unpaid carers, many providing over 50 hours of care each week, the benefits of wider adoption could be significant.

But where should councils start in putting these developments in place safely? To minimise the risks, they need to focus on three things.

Firstly, make sure your data is in a good state. AI relies on access to significant amounts of data related to the particular focus of its activity, and for AI to work effectively that data needs to be accurate, up to date and accessible. Data owners must know where their data needs cleansing and make sure that this work happens. Making data accessible is a harder task and requires a broader view of IT infrastructure and its integration: data often sits in silos, whether within individual systems or in Microsoft Excel spreadsheets, making AI much less effective.

Next, ensure that you redesign service delivery processes appropriately, with humans at the centre of any key decisions. Simply using AI to run existing processes faster is like ‘buying a faster horse' and will not leverage the capabilities it offers. It also risks cutting humans, with their professional opinions and real-life experience, too far out of the loop by automating parts of the process that would benefit from human control. Remove humans from critical decision points and you end up in a situation where ‘the computer says no' in a dispassionate, uncaring way, with all the consequences for public trust in the care system that entails.

Finally, do not automatically trust the AI. There have been several examples of automated algorithms containing inherent biases against minority communities, made worse because the agencies running them, at least initially, trusted the systems over the service users. Continuous monitoring and audit of AI service delivery will help mitigate some of these risks.

By putting these safeguards in place councils will be in a strong position to realise the potential of AI to truly transform social care.

This article was first published in The MJ.
