Data protection and cybersecurity – what busy GCs need to know

It is often said that the speed of technological progress outstrips the law’s ability to keep pace. Nowhere is this more apparent than in data protection and cybersecurity, where almost daily reports of high-tech innovation contrast sharply with laborious and often ill-informed legislative debates over regulatory frameworks that are too frequently impractical. However tempting, putting forthcoming legislation at the bottom of a burgeoning in-tray can necessitate serious ‘catch-up’ when the rules come into force and risks missing opportunities to influence the debate. Legal developments affecting public and private sector organisations are myriad, but in this article we’ve identified our top five – it could easily have been a top 15 – together with our thoughts on what in-house lawyers should consider.

1. The rise and regulation of artificial intelligence

Released in November 2022, OpenAI’s ChatGPT, swiftly followed by similar chatbots from Google, Microsoft and Meta, has since dominated tech news. Artificial intelligence – an umbrella term for algorithmic technologies which often attempt to imitate human thought to accomplish tasks – promises to revolutionise the modern world, leading many organisations to consider ways of applying it in their fields of activity.

Machine learning is emerging as the dominant form of AI. It is not sentient; rather, it identifies patterns in vast training data sets, often containing personal data, and uses them to generate synthetic output based on statistically probable sequences. A particular subset of machine learning is the large language model (LLM), which pieces together statistically probable sequences of words in a conversational manner. ChatGPT, and the even more powerful and ‘ethical’ GPT-4 model that succeeded it, are built on LLMs.
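
To make the idea concrete, here is a minimal, purely illustrative Python sketch (the training text is invented) of the statistical principle described above: count which words follow which in the training data, then generate text by sampling statistically probable continuations. Production LLMs rest on the same underlying idea but use neural networks trained on vast corpora of sub-word tokens.

```python
# A toy model of "statistically probable sequences of words": count word
# bigrams in (invented) training text, then sample likely continuations.
# Conceptual sketch only; real LLMs use large neural networks.
import random
from collections import Counter, defaultdict

training_text = "the board approved the policy and the board reviewed the policy"

follows = defaultdict(Counter)  # word -> counts of the words that follow it
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def next_word(word: str) -> str:
    """Sample a next word, weighted by how often it followed `word` in training."""
    candidates = follows[word]
    choices, weights = zip(*candidates.items())
    return random.choices(choices, weights=weights)[0]

# Generate a short, statistically probable sequence starting from "the".
word, generated = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    generated.append(word)
print(" ".join(generated))
```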

As with any technological innovation, AI attracts ‘early adopters’ and the UK Government is an enthusiastic supporter. It is keen to place Britain in the vanguard of the AI revolution, extolling its potential to grow and transform the country’s industrial and economic landscape. It has issued a national AI strategy backed with nearly £1bn of investment and aims to make Britain a global AI superpower[1]. In March 2023, it followed up with a white paper[2] proposing a ‘pro-innovation’ general AI regulatory framework to be implemented by existing sectoral regulators and based on five high-level principles: safety, security and robustness; appropriate transparency and ‘explainability’; fairness; accountability and governance; and contestability and redress. Keeping the regulation light-touch will, it is hoped, avoid stifling innovation and slowing AI adoption.

Businesses, keen to reap the benefits, are already deploying AI to streamline their operations, for example to automate customer support, produce documents and online content, and to detect fraud. Facing commercial pressure to match their competitors, others will inevitably do likewise. As the AI revolution gathers pace, however, concerns have arisen about its potential negative impacts, not only on the jobs of workers whose tasks AI might accomplish more quickly and accurately, but also on compliance with data protection legislation.

The UK currently has no AI-specific data protection regime and, insofar as AI processes personal data, it is regulated by the UK GDPR and the Data Protection Act 2018 (DPA). The Information Commissioner’s Office (ICO) has produced best practice data protection guidance for senior management, GCs, Data Protection Officers (DPOs), risk managers and tech specialists of companies using AI[3]. This is the ‘go to’ guidance and the ICO’s detailed recommendations address foundational data protection issues in the context of AI use, including lawfulness, fairness and transparency.

The ICO almost always regards the deployment of AI involving personal data as ‘high risk’, triggering the need for a Data Protection Impact Assessment (DPIA). Organisations considering using AI to mechanise decision-making should pay particular attention to article 22 of the UK GDPR which, except in specified circumstances, prohibits solely automated decision-making about an individual which would have a legal or similar effect on them. This would include, for example, decisions about whether to hire or dismiss an employee, or whether to grant someone a loan.
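
For illustration only, the sketch below shows one way a business might operationalise the article 22 principle in a decision pipeline: decisions with legal or similarly significant effect are routed for meaningful human review rather than finalised by the model alone. The field names and the threshold are invented assumptions, not a prescribed compliance mechanism.

```python
# A conceptual sketch of a human-in-the-loop gate reflecting the article 22
# principle: decisions with legal or similarly significant effect should not
# be finalised by automation alone (absent a specified exception).
# Field names and the 0.7 threshold are invented for illustration.
from dataclasses import dataclass

@dataclass
class LoanDecision:
    applicant_id: str
    model_score: float          # output of an automated scoring model
    legally_significant: bool   # e.g. refusing someone credit
    human_reviewed: bool = False

def finalise(decision: LoanDecision) -> str:
    if decision.legally_significant and not decision.human_reviewed:
        # Route for meaningful human review instead of deciding automatically.
        return "escalate to human reviewer"
    return "approve" if decision.model_score >= 0.7 else "decline"

print(finalise(LoanDecision("A-001", model_score=0.45, legally_significant=True)))
# -> escalate to human reviewer
```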

A key consideration when introducing AI is how data subject rights over personal data – for example, the rights of access, restriction and erasure – might be affected. Such rights apply both to the personal data on which the AI is trained and to personal data processed once the AI becomes operational. Businesses procuring an AI service should therefore confirm it facilitates their obligations as a data controller to individuals whose data they process, and those contracting out AI functions to third-party data processors should ensure the Data Processing Agreement with the outsourced provider includes an obligation to assist in compliance with data subject rights.

With media reports of individuals inputting highly sensitive information into ChatGPT, including corporate strategy documents and patient health data[4], to be subsumed and exploited by the algorithm, data security should be uppermost in the minds of those responsible for introducing AI. Some well-known corporates have restricted access to LLMs in the workplace or have issued warnings to staff about using such generative AI tools. The UK’s National Cyber Security Centre, whose functions include supporting businesses and SMEs with cybersecurity, has blogged about the risks of using LLMs[5], warning users not to input sensitive data when interrogating them. Prudent employers may wish to introduce appropriate staff training and issue policies prohibiting employees from inputting confidential information, proprietary material and trade secrets into publicly accessible LLMs. Aside from staff risks, external ‘threat actors’ will undoubtedly use AI to improve malware and increase the sophistication of social-engineering attacks on organisations. Those responsible for online corporate security must stay up to date with methods of attack and appropriate protection and mitigation strategies.
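
Such a policy can also be reinforced technically. The sketch below is one hedged illustration: a simple pre-submission filter that checks prompts against patterns for material that must not leave the organisation before anything reaches an external LLM. The patterns and the send_to_llm() stub are invented for illustration; real deployments would rely on dedicated data loss prevention tooling.

```python
# A minimal pre-submission filter between staff and a public LLM. The
# patterns and the send_to_llm() stub are invented for illustration; real
# deployments would use dedicated data loss prevention tooling.
import re

BLOCKED_PATTERNS = {
    "UK National Insurance number": re.compile(r"\b[A-Z]{2}\d{6}[A-Z]\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "classification marking": re.compile(r"(?i)\bstrictly confidential\b"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return the names of any blocked patterns found in the prompt."""
    return [name for name, pattern in BLOCKED_PATTERNS.items()
            if pattern.search(prompt)]

def send_to_llm(prompt: str) -> str:
    raise NotImplementedError("placeholder for the external LLM API call")

def submit_prompt(prompt: str) -> str:
    violations = check_prompt(prompt)
    if violations:
        # Refuse to transmit and tell the user why.
        raise ValueError(f"Prompt blocked; contains: {', '.join(violations)}")
    return send_to_llm(prompt)

try:
    submit_prompt("Summarise: applicant QQ123456C was refused a loan")
except ValueError as err:
    print(err)  # -> Prompt blocked; contains: UK National Insurance number
```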

2. The ICO’s fresh regulatory approach

The introduction of the GDPR in 2018, with its raft of obligations for organisations, made data protection a standing item on boardroom agendas, with compliance a constant concern. The Information Commissioner, John Edwards, in office since January 2022, has sketched out a fresh approach to enforcing UK data protection legislation of which organisations should take note[6].

Facing down criticism for being ‘soft’ on data breaches (only five penalties were imposed for GDPR breaches in 2022[7]), the Information Commissioner has rejected the number or quantum of financial penalties as the metric by which the ICO’s enforcement activity should be judged. Instead, he has emphasised a graduated approach to non-compliance, reserving financial penalties for breaches which risk the greatest harm or where businesses have profited from failure to observe data protection obligations.

Since June 2022, the ICO has been trialling a ‘partnership approach’ with the public sector to encourage data protection compliance; financial penalties will only be imposed in the most egregious of cases. The ICO will use its discretion to reduce financial penalties where they would merely drain public funds, instead issuing public reprimands and enforcement notices more frequently.

To facilitate transparency and accountability, the ICO will now publish all reprimands issued from January 2022 unless there is good reason not to. Moreover, the ICO is publicly releasing the high-level data sets of self-reported personal data breaches which did not result in regulatory action[8]. While this greater openness is intended to promote general data protection awareness and bring about behavioural change, it inevitably carries reputational risks for the individual businesses concerned, making compliance with data protection legislation ever more important.

3. Online Safety Bill

The Online Safety Bill (OSB), which broadly applies to providers of internet services and search engines, has had a tempestuous passage through the House of Commons and debate has now moved to the Lords. It is expected to be enacted later this year, though implementation by the proposed online safety regulator, OFCOM, will take several more years[9].

Originally mooted in 2019, the OSB continues to generate amendments such that comments on its protean measures are soon outdated; out-of-scope organisations would be forgiven for thinking it irrelevant to them. However, its potential impact on end-to-end encryption (E2EE) deserves wider understanding.

Computer-based encryption has become ubiquitous and is now taken for granted in everything from e-commerce and video-conferencing to secure messaging. Major messaging platforms such as WhatsApp and iMessage implement E2EE by default, and others such as Telegram offer it as an option. However, criminals also exploit E2EE to evade detection when disseminating terrorist material and child sexual exploitation and abuse (CSEA) material.
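
For readers wanting a concrete picture, the following minimal Python sketch uses the PyNaCl library to show the property at the heart of the debate: only the endpoints hold the private keys, so a platform relaying the message sees ciphertext it cannot read. Real messaging apps layer more elaborate protocols (for example the Signal protocol, with forward secrecy) on top of primitives like these, so this illustrates the principle only.

```python
# A minimal sketch of the E2EE principle using the PyNaCl library
# (pip install pynacl). Private keys never leave the endpoints, so the
# platform relaying the ciphertext cannot decrypt it.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair on its own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"privileged and confidential advice")

# The platform sees only ciphertext; Bob decrypts at his end.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"privileged and confidential advice"
```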

To tackle this problem, the government wishes to give OFCOM the power to issue notices requiring platform providers to ‘use their best endeavours’ to develop/source and deploy technology to identify and prevent users encountering terrorist and CSEA material on their platforms[10]. OFCOM would judge whether a company had complied with a notice, and the obligation to do so would be reinforced with the threat of financial penalties of up to 10% of an offending company’s global turnover.

It is widely believed the amendment paves the way for a surveillance technique known as client-side scanning (CSS), which involves downloading software onto individual devices such as smartphones, tablets and computers to allow law enforcement authorities to conduct algorithmic scanning for prohibited material.
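
In deliberately simplified terms, the sketch below illustrates the hash-matching idea usually associated with CSS: content is fingerprinted on the device before encryption and compared against a supplied list of fingerprints of known prohibited material. Real proposals use perceptual hashes, which match visually similar images rather than identical bytes; the SHA-256 stand-in and the blocklist here are invented for illustration.

```python
# A deliberately simplified sketch of the hash-matching idea behind
# client-side scanning: fingerprint content on the device before it is
# encrypted and compare it against a supplied list of known prohibited
# material. Real proposals use perceptual hashes (matching similar images,
# not identical bytes); SHA-256 and this blocklist are stand-ins.
import hashlib

# Hypothetical fingerprints of known prohibited files, supplied to the device.
BLOCKLIST = {hashlib.sha256(b"known prohibited file contents").hexdigest()}

def clears_on_device_check(file_bytes: bytes) -> bool:
    """Return True if the content does not match the blocklist."""
    return hashlib.sha256(file_bytes).hexdigest() not in BLOCKLIST

outgoing = b"an ordinary holiday photo"
if clears_on_device_check(outgoing):
    print("No match: encrypt and send as normal.")
else:
    print("Match: content blocked and potentially reported.")
```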

Proponents of CSS regard it as minimally invasive and as avoiding traditional security concerns about ‘backdoors’ into encrypted services. Opponents argue it would be prone to manipulation by criminals and hostile states, and that it is a slippery slope to mass surveillance. The debate has aroused fierce controversy, with WhatsApp and Signal both threatening to withdraw from the UK rather than compromise E2EE on their services[11]. The debate’s outcome will have profound implications for the confidentiality of professional and personal communications.

4. Reform of UK data protection regime

The UK GDPR and DPA are the foundations of the UK’s data protection regime and proposed changes should be on the radar of in-house counsel. Change to existing data protection laws is inevitable in some form – without it they risk being swept away together with other EU-derived laws on 31 December 2023 by the Retained EU Law (Revocation and Reform) Bill 2022 currently before the UK Parliament.

The government’s proposals, in the Data Protection and Digital Information Bill (DPDIB)[12], aim to create a pro-growth and pro-innovation data protection framework whilst alleviating compliance burdens on organisations and boosting the economy. Businesses may welcome some of the proposed practical amendments: allowing refusal of vexatious or excessive subject access requests; removing the obligation on controllers and processors to keep processing records except for high-risk processing; the automatic recognition of certain legitimate interests as a lawful processing basis without the need for a legitimate interests assessment; and (given the potential of AI) the relaxation of the prohibition on solely automated decision-making. Other proposed changes seem largely cosmetic: the replacement of DPOs by ‘senior responsible individuals’ and the substitution of DPIAs with ‘assessments of high risk processing’.

Though the government acknowledges that the UK GDPR has improved cyber security standards[13], it claims the legislation has caused an 8.1% decline in profit for British businesses and laments the estimated $1.1bn which FTSE 350 businesses needed to invest in data protection compliance[14]. Having made that investment, however, some businesses may be wary of proposed changes requiring further financial outlay, or which might be regarded as diminishing data security, with potentially catastrophic effects in terms of vulnerability to ransom hacks and data loss, aside from the threat of regulatory action. Businesses will also wish to weigh the costs and benefits of changing their current data protection standards if doing so risked non-compliance with EU data protection standards, creating greater friction in their ability to trade in Europe.

5. EU data adequacy

On the horizon, but of particular interest to companies doing European business, is EU data adequacy. The European Commission (EC) adopted an adequacy decision in the UK’s favour on 28 June 2021, allowing the continued free flow of personal data between Britain and its continental neighbours post-Brexit. The decision, which contains a four-year ‘sunset clause’, may be renewed if the UK maintains data protection standards essentially equivalent to those of the EU. Some estimates have put the worth of adequacy at between £1bn and £1.6bn to the UK economy. Losing it would not prevent UK-EU data flows, but the average cost to British businesses of introducing alternative data transfer mechanisms, for example standard contractual clauses, could range from £10,000 for a small firm to over £162,000 for a larger business[15].

Aspects of the DPDIB now before Parliament, for example reduced ICO independence and the UK’s determination to grant the US and other third countries adequacy despite European misgivings, arguably jeopardise the essential equivalence of the UK’s data protection regime and, with it, the likelihood that the EC will renew its adequacy decision in the UK’s favour. Companies trading with EU countries, or whose supply chains involve entities in EU member states, should remain alert to threats to adequacy and take advice on mitigating risk as necessary.

How can we help?

BCL Solicitors’ expertise encompasses the law relating to investigatory powers, data protection, computer misuse and data sharing. Many of our engagements arise from demands for information by state authorities, the use of new technologies for investigative purposes, and the enforcement of data rights and obligations. Our work involves both contentious matters and compliance guidance. Our clients include big tech companies, SMEs and start-ups, as well as corporates facing regulatory investigation, falling victim to cyber-attacks, or receiving information requests from law enforcement agencies, data subjects and third parties.

Notes

  1. https://www.gov.uk/government/publications/national-ai-strategy/national-ai-strategy-html-version
  2. https://www.gov.uk/government/publications/ai-regulation-a-pro-innovation-approach
  3. https://ico.org.uk/media/for-organisations/guide-to-data-protection/key-dp-themes/guidance-on-ai-and-data-protection-2-0.pdf and https://ico.org.uk/media/for-organisations/guide-to-data-protection/key-dp-themes/explaining-decisions-made-with-artificial-intelligence-1-0.pdf
  4. https://darkreading.com/risk/employees-feeding-sensitive-business-data-chatgpt-raising-security-fears
  5. https://ncsc.gov.uk/blog-post/chatgpt-and-large-language-models-whats-the-risk
  6. https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2022/11/how-the-ico-enforces-a-new-strategic-approach-to-regulatory-action/
  7. https://www.urmconsulting.com/blog/analysis-of-fines-imposed-by-the-information-commissioners-office-in-2022
  8. https://ico.org.uk/about-the-ico/our-information/complaints-and-concerns-data-sets/self-reported-personal-data-breach-cases/
  9. https://ofcom.org.uk/_data/assets/pdf_file/0016/240442/online-safety-roadmap.pdf
  10. Online Safety Bill (as introduced to House of Lords), s110
  11. https://techcrunch.com/2023/03/10/uk-osb-e2ee-warning/
  12. https://publications.parliament.uk/pa/bills/cbill/58-03/0265/220265v2.pdf
  13. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1053023/national-cyber-security-strategy-amend.pdf at para 21
  14. https://www.gov.uk/government/news/british-businesses-to-save-billions-under-new-uk-version-of-gdpr
  15. https://www.ucl.ac.uk/european-institute/files/ucl_nef_data-inadequacy.pdf