The Information Commissioner’s Office (ICO) has released a landmark Tech Futures report exploring the rise of agentic artificial intelligence — autonomous AI systems capable of making automated decisions and acting on behalf of users. Agentic AI systems can process and generate personal data at scale, making it harder to apply fundamental data protection principles such as data minimisation, purpose limitation and accountability. Without careful design and governance, these systems may expose individuals’ personal data.
Agentic AI’s ability to process personal data at scale and to make or influence decisions autonomously heightens the relevance of the Data (Use and Access) Act 2025, which relaxed certain restrictions on automated decision-making under UK data protection law — for example, by expanding the ‘lawful bases’ on which an organisation may use automated techniques to make decisions about individuals based on their personal data.
Privacy & Protection at the Forefront
At the core of the ICO’s messaging is a reaffirmation that existing UK data protection law applies to agentic AI. This means developers and adopters must ensure that processing is lawful, transparent and fair, and that individuals’ rights — including access, rectification and the right to contest automated decisions — can be exercised effectively. The ICO stresses that privacy and compliance cannot be an afterthought but must be built in from the earliest stages of system design.
The ICO also warns of specific risks unique to agentic AI, such as:
- Ambiguous accountability in complex AI supply chains, making it unclear who controls or processes personal data.
- Unnecessary or excessive processing if agents collect or infer information beyond what is required.
- Heightened cybersecurity threats, given the autonomous nature and broad access privileges of agents.
Industry and Public Sector Implications
The report’s timing aligns with increasing adoption of agentic technologies in commercial and public services. Government bodies and healthcare providers are exploring agentic systems to automate routine tasks, while the private sector — notably in retail and finance — is testing autonomous assistants to enhance customer engagement. The ICO’s message is clear: the data protection implications must be understood and managed before widespread deployment.
Next Steps and Regulatory Trajectory
Although the Tech Futures report is not formal regulation, the ICO intends to consult later in 2026 on new draft guidance on automated decision-making in light of the Data (Use and Access) Act 2025. That consultation should contribute to clearer expectations for AI governance.
For businesses developing or procuring agentic AI systems, now is the moment to review data governance practices, conduct robust data protection impact assessments and engage with emerging regulatory guidance as it takes shape.
At Howes Percival, we can carry out a data protection compliance assessment of your business and advise on compliance and on meeting your data protection and privacy needs. Our Data Protection & Privacy solicitors in Leicester, Milton Keynes, Oxford and Northampton are on hand to answer any questions you may have. For further information or to discuss how we can assist you, please contact Hannah Steggles, Paula Dumbill or Miles Barnes.
The information on this site about legal matters is provided as a general guide only. Although we try to ensure that all of the information on this site is accurate and up to date, this cannot be guaranteed. The information on this site should not be relied upon or construed as constituting legal advice and Howes Percival LLP disclaims liability in relation to its use. You should seek appropriate legal advice before taking or refraining from taking any action.