Living the GDPR

Two years on from the implementation of the GDPR and the Data Protection Act 2018, DAC Beachcroft’s data protection specialists consider the challenges their clients face in this now-established data protection climate.

The GDPR has brought about a real culture change. No longer is it just the legal and compliance teams who talk about the GDPR: business and operations teams themselves have become familiar with terms such as ‘data protection impact assessment’ and ‘right to erasure’. DAC Beachcroft’s data protection team consists of experts with sector-specific knowledge, including in financial services and health, who understand that different sectors face different challenges under the GDPR.

Rhiannon Webster, DAC Beachcroft partner and data protection specialist, said, ‘Although, in general, each sector is subject to the same law, there are surprisingly diverse regimes operating in each one. By that I mean not only the additional guidance and regulation which exists at a sector level, but also differences in culture and in the maturity of operating in a regulated regime’.

‘We have taken the opportunity below to provide the insights of our data protection specialists in our key sectors of health and financial services within the UK. We have also taken a look at the data protection enforcement regime within Ireland. The GDPR brought with it a promise of harmonisation of data protection law across Europe and the potential for fewer barriers to cross-border trade through its creation of the “one stop shop”. Through this mechanism, organisations with multiple offices across Europe can appoint a lead supervisory authority to minimise the number of regulators they need to deal with. The Irish regulator, the DPC, has therefore become an unlikely “David” to some giants of the technology sector. Our Irish data protection team looks below at the activities of the DPC in Ireland over the last few years.’

Financial services

For the financial services industry, regulation and the prospect of fines and other regulatory action was something with which it was already very familiar, with the first real wholesale regulation of financial services firms being introduced in the late 1980s. The financial services sector was therefore better placed than other sectors to respond to black letter requirements of the GDPR, with many firms making use of existing governance frameworks and IT infrastructure to give effect to these requirements. Even before the GDPR came into force, the Financial Conduct Authority had the power to issue unlimited fines for regulatory breaches, and these included breaches of data protection laws.

The bigger challenge for these firms has been learning how to interpret the black letter requirements in the face of ever increasing technological development. Financial services firms, like those in many other sectors, are increasingly looking to make use of new technologies to get the most out of the data they hold. In many cases these uses – for example ‘big data’ activities – do not sit comfortably alongside the GDPR, and the firms’ advisers need to find ways to give effect to the commercial demands of the business while ensuring compliance with the GDPR.

Taking a more detailed look at big data: insurance companies, for example, analyse the massive data sets they hold – which will usually include personal data belonging to policyholders, collected at the underwriting or claims stages – to better understand the behaviour of their customers and so to market to them, underwrite policies and manage claims more effectively.

One challenge firms have faced when it comes to making use of their ‘data lakes’ for big data purposes is that the GDPR requires that personal data is held only for as long as necessary for the specific purpose for which it was obtained. This challenge applies both to the historic data held by firms and to data collected going forward. Anonymisation offers a way of continuing to use the data free of regulation; however, the threshold for anonymisation is high, and the process of anonymisation is itself subject to the requirements of the GDPR, eg ensuring that correct privacy notices were provided and that the processing is lawful.

New technologies such as artificial intelligence bring multiple advantages to financial services firms: the ability to trawl through different data sets and produce results more quickly and accurately, the potential to help firms fight fraud more effectively, and more accurate and personalised pricing. Fintech, with its ability to automate and streamline processes and customer interaction, has the potential to open up new markets and free up business capacity.

However, the GDPR has meant that firms now need to take a step back before diving into new technologies and new uses of personal data and ensure that data protection is considered and built in from the start: ‘data protection by design’. This is a reasonably new concept for financial services firms, with only a handful having such processes in place before the GDPR. Firms have had to build additional governance and time into new projects to ensure the data protection implications of new technologies are properly considered and any risks mitigated.

Jade Kowalski is a partner and Charlotte Halford a senior associate in DAC Beachcroft’s financial services specialist data protection team. The team looks forward to helping guide its financial services clients through the next wave of technological change. These compliance steps are all the more pressing in light of the parallel rise in data subjects’ awareness of personal data regulation and of their rights under it. It is vital that firms get data protection right from the outset, not only to ensure compliance with the GDPR, but also to minimise the risk of data subject complaints and the reputational and cost implications that often accompany them.


Health

In the health sector, perhaps unlike in other sectors, the GDPR has not precipitated a dramatic change in the approach to data protection compliance. Information comprised in health records is just one of a number of categories of special category data. However, given the inherent sensitivity of information relating to individuals’ health, it has always been safeguarded as closely as possible by those who process it, and so in that sense little has changed. The health sector has not been completely unaffected, though: in particular, as individuals have become more aware of their rights, the sector has, like others, seen an increase in data subject access requests (DSARs) and data breach claims.

The secretary of state for health and social care has consistently affirmed his commitment to embracing medical technology within the NHS, as exemplified by the announcement earlier this year of a £140m artificial intelligence fund, which will provide funding to companies that apply and satisfy the relevant criteria. And, of course, recent challenges in the face of the Covid-19 pandemic have pressed the fast-forward button on health technology.

Development and adoption of health technology inevitably entails the use of health data, often in significant quantities, and collaboration with the private sector. Although the GDPR establishes a number of clear parameters in respect of the use of health data, it is often more permissive than the common law duty of confidentiality which, in general terms, requires patient consent for any uses of health information other than direct care and treatment, or for the information to be anonymised. There are clearly legal risks for failing to discharge those obligations, but also reputational ones, as exemplified by the news coverage earlier this year that supposedly anonymised data shared with pharmaceutical companies may not have been sufficiently de-identified.

The Topol review gave an indication of the benefits of data-driven technology, such as the Electronic Frailty Index, used locally to identify elderly people at risk of adverse health outcomes, and developments in the use of AI to interpret medical images. However, such benefits are realised in a fragmented and localised way, developing piecemeal within individual hospitals rather than across the health system as a whole. This is largely because of the difficulty of obtaining consensus on the parameters of confidentiality and the requirements for obtaining valid consents, which can hamper the adoption of new technology. This is also a prominent theme in the Institute of Global Health Innovation’s recent report on maximising the benefit of NHS data for patients, which recommended that the Information Commissioner’s Office (ICO) issue NHS-specific guidance on data sharing in the NHS, and that a transparent public debate about data access by private sector research organisations be followed by the establishment of clear rules.

Darryn Hale is a senior associate and data protection specialist in DAC Beachcroft’s health team. The team has extensive experience of advising private and public bodies on health information law, including on the launch of the NHS app and private sector AI, as well as on single patient records bringing together health, including mental health, and social care across London. The Covid-19 pandemic demonstrates how the debate around the balance between individual and community interests is developing, and it will be interesting to see how the concept of ‘reasonable expectations’, long championed by the National Data Guardian, evolves in relation to the use of confidential patient data within the NHS. In particular, the need for more rapid uptake of medical technology for potential public benefit will have to be weighed against the impact on the protection of individual rights. We expect this to be a rapidly evolving issue.


Ireland

Ireland’s data protection regulator, the Data Protection Commission (DPC), is tasked with being the lead supervisory authority for some of the largest tech companies in the world, including Facebook, Google, Apple and Microsoft. Notwithstanding its reach, the DPC has not, to date, imposed any fines under the GDPR, nor has it flexed its powers in any significant way as lead supervisory authority in relation to these tech giants. In 2019, when CNIL, France’s data protection regulator, fined Google €50m for breaching the GDPR, the CNIL determined that it possessed the regulatory jurisdiction to consider a complaint and that the ‘one stop shop’ mechanism was not applicable: Google’s EU headquarters in Ireland did not have any decision-making power in respect of the data processing in question and, as such, could not be said to be Google’s ‘main establishment’ for the purposes of the ‘one stop shop’ principle. Indeed, CNIL ultimately concluded that none of Google’s EU bases had decision-making power in respect of the data processing in question, and that Google therefore did not have a main establishment within the EU at all.

The DPC’s apparent reluctance to fine companies contrasts with the approach of a number of regulators across Europe, in particular those in the UK and France, although we would note that there has not been the avalanche of fines that some commentators predicted before the GDPR, and regulators across Europe are issuing large-scale fines in exceptional circumstances only. The DPC has, however, announced that it is conducting a number of statutory inquiries into companies including Facebook, Twitter and Apple, and the Commissioner, Helen Dixon, has publicly stated that GDPR fines are ‘an inevitability’. As of 31 December 2019, the DPC had 70 ongoing statutory inquiries, including 49 domestic inquiries.

The DPC continues to face funding challenges. It is currently operating on a total budget of €16.9m and is entirely reliant on government funding; Irish legislation dictates that any fines imposed under the GDPR are to be paid into the state exchequer. Dixon publicly expressed her disappointment at the increase in government funding for the DPC in 2020, outlining that it was ‘less than one third of the funding’ that her office had requested in its annual budget submission.

Notwithstanding budget constraints, in 2019 the DPC significantly increased its output of guidance, responded to more complaints and queries, and began conducting larger-scale statutory inquiries. Going forward, Dixon has indicated that the DPC hopes to move away from the ‘first principles’ of the GDPR and into the ‘meat of data protection by design’, to ensure that next-generation technology will not create the problems Ireland ‘sleep walked into’ over the last 20 years. Culturally, there would appear to be a growing understanding and awareness of data protection legislation amongst the general populace in Ireland since the introduction of the GDPR: in its most recent annual report, the DPC outlined that in 2019 there had been a 75% increase in the total number of complaints received by the office.

It is clearly an interesting and challenging time in Ireland for both the data regulator and data protection and privacy lawyers. Rowena McCormack is a partner and Charlotte Burke an associate in DAC Beachcroft’s data protection team in our Dublin office.