This country-specific Q&A provides an overview of the technology, media and telecom laws and regulations that apply in the United Kingdom.
This Q&A is part of the global guide to TMT. For a full list of jurisdictional Q&As visit http://www.inhouselawyer.co.uk/practice-areas/tmt-3rd-edition/
What is the regulatory regime for technology?
See answers to the questions below.
Are communications networks or services regulated?
Yes, communications networks and services are regulated. However, most providers of communications networks and services do not require a licence or specific authorisation to operate; rather, they have 'general authorisation', meaning that they can operate provided they comply with a set of general rules which are largely set out in the General Conditions of Entitlement (which are established under section 45 of the Communications Act 2003).
The exceptions to the principle of general authorisation include:
- networks or services using radio spectrum (except where exempted by the government);
- mobile operators wishing to (i) establish and use wireless telegraphy base stations, or (ii) install or use wireless telegraphy apparatus;
- satellite operators;
- multiplex operators; and
- certain premium rate services regulated by the Phone-paid Services Authority.
The communications services and networks which are in scope for the purposes of being regulated are:
- 'electronic communication networks' - i.e. the system (and its associated apparatus, equipment, software and stored data) which is used to transmit signals; and
- 'electronic communication services' - i.e. the conveying of signals over the electronic communication network.
In the absence of an agreement between the EU and the UK, from the date of the UK's withdrawal from the EU, providers of electronic communications networks and/or services established in the UK will no longer benefit from the general authorisation regime within the European Union. As such, the remaining EU Member States may introduce additional authorisation requirements on providers established in the UK.
A UK withdrawal will mean that the EU regulatory framework ceases to apply in the UK. However, as the framework has already been transposed into UK law through national legislation, this is unlikely to have a significant immediate effect. The UK government will nonetheless remove from current UK legislation provisions referring to EU institutions and processes that will no longer be applicable: Ofcom will no longer be required to notify certain proposed measures to the European Commission prior to their implementation, and will no longer need to comply with the Commission's Recommendations on non-discrimination obligations and costing methodologies for network access.
If so, what activities are covered and what licences or authorisations are required?
The use of radio spectrum requires a licence from Ofcom (section 8, Wireless Telegraphy Act 2006 (WTA)). Radio spectrum is auctioned by Ofcom from time to time, for vast sums running into billions of pounds.
Digital TV and radio broadcasting requires various multiplex and radio broadcast licences under the Broadcasting Act 1996. These are distinct from the licences for the relevant spectrum, which is licensed under the WTA. Licences are granted by Ofcom for a fixed fee; an annual charge calculated as a percentage of revenue may also be payable.
Operating a satellite in outer space requires a licence, which is obtained from the UK Space Agency.
Is there any specific regulator for the provisions of communications-related services?
Yes - communications networks and services are primarily regulated by Ofcom.
Are they independent of government control?
Ofcom, whilst independent of government control:
- must act in accordance with its powers and duties set out in law;
- is accountable to the UK Parliament; and
- is funded from regulatory fees and grant-in-aid from the UK government.
Following amendments made by the Digital Economy Act 2017, Ofcom must have regard to the government's statements of strategic priorities relating to telecoms and radio spectrum management (sections 2A-2C of the Communications Act 2003, inserted by section 98 of the 2017 Act).
Are platform providers (social media, content sharing, information search engines) regulated?
Platform providers in the UK are regulated to an extent by the Electronic Commerce (EC Directive) Regulations 2002, which implement Articles 12-15 of EU Directive 2000/31/EC. The regime applies to content which appears on platforms but in respect of which the platform operator performs only certain technical functions (i.e. services whose primary function is hosting content contributed by others). The legislation also imposes obligations on a seller before a contract is formed, including information that must be provided to the consumer.
The Statutory Code of Practice for providers of online social media platforms has been published in accordance with Section 103 of the Digital Economy Act 2017. The Code provides guidance for social media platforms, in advance of the new regulatory framework envisaged in the Online Harms White Paper. It sets out actions that the Government believes social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviours on their sites.
If so, does the reach of the regulator extend outside your jurisdiction?
In April 2018 the European Commission proposed new rules to ensure transparency and fairness when dealing with platforms. The draft regulation will apply to online intermediaries, including: e-commerce marketplaces, app stores, price comparison tools and search engines. The EU Commission has also launched a public consultation on the issue of 'fake news'.
In Germany, the rules require platforms with over two million users to remove potentially illegal material within 24 hours of being notified, or face fines of up to EUR 50 million.
Does a telecoms operator need to be domiciled in the country?
There are no requirements for a communications provider to be domiciled in the UK prior to or during the provision of services, and there are no foreign ownership restrictions.
Are there any restrictions on foreign ownership of telecoms operators?
Potentially, Ofcom does have the right to, amongst other things, revoke licences for the installation and use of wireless telegraphy equipment where necessary in the interests of national security. This could, theoretically, be used to restrict foreign ownership of certain telecoms operators, although this right is unlikely to be invoked.
Are there any regulations covering interconnection between operators?
Yes - the rules on interconnection are governed by a mixture of communications- and competition-related laws.
The General Conditions require that a provider of public electronic communications networks negotiates with another provider of public electronic communications networks based in the EC with a view to concluding an agreement (or an amendment to an existing agreement) for interconnection within a reasonable period. This is a general obligation which applies to all providers of public electronic communications networks.
In addition, Ofcom has the power to impose access-related conditions on providers of electronic communication networks for the purposes of securing:
- efficiency (e.g. by making associated facilities available);
- sustainable competition;
- efficient investment and innovation; and
- the greatest possible benefit for the end-users of public electronic communications services.
If so are these different for operators with market power?
These access-related conditions may include obligations to share the use of electronic communications apparatus (and to apportion and contribute towards the costs of this sharing) where there are no viable alternative arrangements. Hence, these conditions are more likely to be imposed on providers of electronic communications networks with significant market power.
Moreover, general European competition law applies in respect of anti-competitive agreements and the abuse of dominant positions.
What are the principal consumer protection regulations that apply specifically to telecoms services?
Various consumer-specific provisions are set out in the General Conditions of Entitlement. A 'consumer' is defined as someone who uses or requests a service for purposes which are outside his or her trade, business or profession.
Specific obligations relating to consumers include:
- the requirement to include certain minimum terms in consumer contracts;
- certain parameters regarding the term and termination rights under the consumer contract (e.g. so that the procedures for contract termination do not act as disincentives for consumers against changing their communications provider);
- a requirement to make certain information available to the consumer (e.g. any access charges, the payment terms, the existence of any termination rights and any termination procedures);
- number portability; and
- various restrictions on sales and marketing activities.
What legal protections are offered in relation to the creators of computer software?
Creators of computer software are entitled to copyright protection through the Copyright, Designs and Patents Act 1988 ("CDPA"). This gives the owner of the software the exclusive right to use and distribute it for a period of 70 years from the end of the calendar year in which the author of the software died (section 12(2)).
However, if the work is computer-generated, where there is no human author of the work copyright expires 50 years from the end of the calendar year in which the work was made (section 12(7)).
Elements of a computer program, such as screen displays and graphics may give the creator of computer software design rights under the Community Design Regulation (6/2002/EC) and/or the Registered Designs Act 1949, although a computer program itself does not attract a design right.
Patents are not available for computer software "as such" under the Patents Act 1977, although the Court of Appeal in Aerotel Ltd v Telco Holdings Ltd and Macrossan's Application [2006] EWCA Civ 1371 set out guidance on when computer software may be patentable, which is currently followed by the Intellectual Property Office when deciding whether or not to grant a patent.
Do you recognise specific intellectual property rights in respect of data/databases?
As a general principle, there are no intellectual property rights ("IPRs") in data itself, although databases may be protected by IPRs. The CDPA gives copyright protection to the author of a database for a period of 70 years from the end of the calendar year in which the author died (section 12(2)). Moral rights, such as the right to be identified as the author of the database (sections 77-79), will be granted to the author, unless the database was created in the course of an employee's employment (sections 79(3) and 82(1)).
The Copyright and Rights in Database Regulations 1997 give the author a database right for 15 years from the end of the calendar year in which the making of the database was completed, or if a substantial change is made to the contents of a database so the database can be considered to be a "substantial new investment", 15 years from the end of the calendar year in which the substantial change was made (Regulation 17).
A patent may be available under the Patents Act 1977 if the database can be shown to achieve a technical effect that is novel and inventive (section 1). Databases used to implement new business methods are not, however, patentable (section 1(2)(c)).
What key protections exist for personal data?
Personal data (being any data which - alone or in combination with other information in the hands of the party in question - would enable a living person to be individually identified) is subject to detailed regulation and protection by way of the General Data Protection Regulation (GDPR).
The main rights afforded to individuals generally under the GDPR are:
- the right to be informed - individuals have the right to be informed about the collection and use of their personal data;
- the right of access - individuals have the right to access their personal data and supplementary information. This right allows individuals to be aware of and verify the lawfulness of the processing;
- the right to rectification - individuals have the right to have inaccurate personal data rectified, or completed if it is incomplete;
- the right to erasure - individuals have the right to have personal data erased;
- the right to restrict processing - individuals have the right to request the restriction or suppression of their personal data;
- the right to data portability - individuals have the right to obtain and reuse their personal data for their own purposes across different services; and
- the right to object - individuals have the right to object to: (i) processing based on legitimate interests or the performance of a task in the public interest / exercise of an official authority; (ii) direct marketing; and (iii) processing for the purposes of scientific or historical research and statistics.
Under the GDPR, data controllers may only collect and process personal data when certain specific conditions are met, including:
- where the data subject has consented;
- where it is necessary for a contract to which the data subject is a party; and
- where there is a "legitimate reason" for processing which does not itself damage the data subject's rights, freedoms or own legitimate interests.
More stringent rules apply to special categories of personal data (e.g. as to health or sexual orientation etc.).
All data controllers must take appropriate technical and organisational measures to safeguard against unauthorised or unlawful processing, and against accidental loss of or destruction of personal data. The ICO does not mandate any particular standard in this regard but recommends adherence to ISO 27001.
Are there restrictions on the transfer of personal data overseas?
Under the GDPR, personal data may only be transferred outside of the EU in compliance with the conditions for transfer set out in Chapter V of the GDPR. The main two situations in which it is permissible for personal data to be transferred outside of the EU are:
- transfers on the basis of an adequacy decision - i.e. where the Commission has decided that a third country, a territory or one or more specified sectors within that third country, or the international organisation in question ensures an adequate level of protection. Such a transfer shall not require any specific authorisation (on exit date the UK will be a third country outside the EEA); and
- transfers subject to appropriate safeguards - i.e. if the Commission has not made a relevant adequacy decision, a controller or processor may transfer personal data to a third country or an international organisation only if the controller or processor has provided appropriate safeguards, and on condition that enforceable data subject rights and effective legal remedies for data subjects are available.
The appropriate safeguards to include are set out in Article 46(2) of the GDPR.
What is the maximum fine that can be applied for breach of data protection laws?
The Information Commissioner's Office has (unless and until Brexit occurs) the power to levy fines pursuant to the GDPR. The maximum fine is the higher of €20m or 4% of annual worldwide turnover for breaches of, for example, the basic principles of processing (e.g. regarding consent), with a lower tier of the higher of €10m or 2% of annual worldwide turnover for breaches of more ancillary obligations such as security arrangements or breach notification.
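The "higher of" mechanics of the two fine tiers can be sketched as follows (a simple illustration of the Article 83 calculation, using hypothetical turnover figures; the function name is our own):

```python
def max_gdpr_fine(annual_worldwide_turnover_eur: float, tier: str = "upper") -> float:
    """Return the theoretical maximum fine under GDPR Article 83.

    Upper tier (Art. 83(5)): the higher of EUR 20m or 4% of turnover.
    Lower tier (Art. 83(4)): the higher of EUR 10m or 2% of turnover.
    """
    fixed, pct = (20_000_000, 0.04) if tier == "upper" else (10_000_000, 0.02)
    return max(fixed, annual_worldwide_turnover_eur * pct)

# A group with EUR 1bn turnover: 4% (EUR 40m) exceeds the EUR 20m floor.
print(max_gdpr_fine(1_000_000_000))          # 40000000.0
# A smaller business: the fixed EUR 10m floor applies at the lower tier.
print(max_gdpr_fine(100_000_000, "lower"))   # 10000000.0
```

Note that the percentage route only bites once turnover exceeds €500m (upper tier) or €500m (lower tier); below that, the fixed floor is the maximum.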
What additional protections have been implemented, over and above the GDPR requirements?
The Data Protection Act 2018 is a complete data protection system. As well as governing general data covered by the GDPR, it covers all other general data, law enforcement data and national security data. The Act also applies a number of agreed modifications to the GDPR to make it work for the benefit of the UK in areas such as academic research, financial services and child protection.
The Privacy and Electronic Communications Regulations (PECR) sit alongside the Data Protection Act and the GDPR. They give people specific privacy rights in relation to electronic communications. They implement European Directive 2002/58/EC and set out specific rules on marketing calls, emails, texts and faxes; cookies; keeping communications services secure; and customer privacy regarding traffic and location data. PECR was amended in 2019 to ban cold-calling of pensions schemes in certain circumstances.
Are there any regulatory guidelines or legal restrictions applicable to cloud-based services?
There are currently no specific 'Cloud laws' in the UK, but the direction of travel in the EU is towards a harmonised 'certification' scheme, although this stops short of full-blown regulation. A recent study found that ISO 27001 was still the most commonly-adopted certification scheme among the most prominent Cloud Service Providers (CSPs).
In addition, many sector-specific regulatory initiatives (either issued by administrative or supervisory authorities or by the industry itself) have been issued which may further fuel the drive towards national cloud regulations. Some of these initiatives are binding, such as the guidelines issued by several financial supervisory bodies, whereas the guidelines of data protection authorities may not as such be binding but nonetheless tend to lead to a best practice standard.
For example, in the financial services sector, the Financial Conduct Authority (FCA) has stated that financial services companies operating in the UK can make use of cloud-based services without falling foul of regulatory obligations. The published guidance (https://www.fca.org.uk/publication/finalised-guidance/fg16-5.pdf) is not binding, but the FCA has said it expects firms to take note of it and use it to inform their systems and controls on outsourcing.
Aside from sector-specific guidance, the key restriction applicable to cloud-based services will depend upon the nature of the data being placed in the cloud. In the event that the data is personal data then the points made at 10 and 11 above will apply.
Are there specific requirements for the validity of an electronic signature?
EU Regulation 910/2014 ("Electronic Identification Regulation"), which has direct effect in the UK, sets out the validity requirements for electronic signatures. Under the Electronic Identification Regulation, a 'qualified electronic signature' has the same effect as a handwritten signature (Article 25(2)) as long as it was created by a qualified electronic signature creation device and is based on a qualified certificate for electronic signatures (Article 3(12)).
The validity requirements for a qualified electronic signature are set out in Article 26 and Annexes I and II of the Electronic Identification Regulation and include the following: the signature must be uniquely linked to the signatory (Article 26(a)), the signatory must be identifiable (Article 26(b)), the qualified electronic signature creation device must have appropriate technical and procedural measures to ensure that the confidentiality of the signature is assured (Paragraph 1(a), Annex II) and the qualified certificate for electronic signatures must clearly indicate the name or pseudonym of the signatory (Paragraph (d), Annex I).
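The Article 26(a) requirement that a signature be "uniquely linked to the signatory" is delivered in practice by public-key cryptography: only the holder of the private key can produce the signature, while anyone holding the public key can verify it. A minimal sketch follows, using textbook RSA with a deliberately tiny key; a real qualified electronic signature additionally requires a certified creation device and a qualified certificate, neither of which this toy models:

```python
import hashlib

# Toy RSA keypair -- illustrative only; real signatures use keys of 2048+ bits.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def _digest(message: bytes) -> int:
    # Hash the message and reduce it into the RSA modulus range.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    # Only the private-key holder can compute this value:
    # the signature is 'uniquely linked to the signatory'.
    return pow(_digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the public key (n, e) can check the link.
    return pow(signature, e, n) == _digest(message)

msg = b"I agree to the contract terms."
sig = sign(msg)
print(verify(msg, sig))             # True
print(verify(msg, (sig + 1) % n))   # False: a tampered signature fails
```

The legal framework then layers identity assurance on top of this mathematics: the qualified certificate binds the public key to a named (or pseudonymous) signatory, per Annex I.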
In respect of electronic signatures at a national level, the UK takes a relatively progressive approach in respect of electronic signatures as confirmed by the guidance note published by The Law Society Company Law Committee and The City of London Law Society Law and Financial Law Committees (found here: https://www.lawsociety.org.uk/support-services/advice/practice-notes/execution-of-a-document-using-an-electronic-signature/). However, there is still some uncertainty as to whether electronic signatures will be sufficient for the valid execution and/or filing of certain documents subject to statutory requirements (e.g. guarantees, or certain documents that require registration at HM Land Registry).
In the event of an outsourcing of IT services, would any employees, assets or third party contracts transfer automatically to the outsourcing supplier?
No transfers of assets or third party contracts would occur automatically. However, there will frequently be detailed contract provisions negotiated between the parties to the outsourcing arrangement to facilitate this. In the case of the other signatories to the third party contracts, their consent to the proposed transfer of their contracts to the new outsource service provider will ordinarily be required.
If there are individuals who are wholly or substantially engaged in the services/functions which are being outsourced, however (and whether they be employed by the customer entity or its other service providers), then their contracts of employment may transfer automatically to the outsource service provider by virtue of the Transfer of Undertakings (Protection of Employment) Regulations 2006 (TUPE). In such event, all of their rights and obligations (including claims arising from employment related mistreatment by their previous employer) will transfer to the outsource service provider.
If a software program which purports to be a form of A.I. malfunctions, who is liable?
Ordinarily, there will be strict liability for the producer of defective products for consumers (Consumer Protection Act 1987), and this would include products which are themselves software or which include software components. Where such defects have resulted from computer-assisted design or other software-assisted processes, it will ordinarily be the person who programmed the CAD tool who will then face liability. However, this is all predicated on a principle of causal connection, i.e. "because of A + B, C necessarily came next". When the software starts to make decisions for itself based upon its "learning" from what it observes/receives from external sources, and so ceases to be predictable, this liability concept becomes more strained.
There are discussions as to whether AI could be given a separate legal personality, and so be held accountable in a similar way to a company. In addition, the licensors/programmers responsible for the AI could be open to claims through vicarious liability. Alternatively, a framework similar to that applied to animals which cause harm or damage, where there is strict liability on their owner(s), could be applied. However, neither of these discussions has been fully developed and, for the time being at least, it seems most likely that the licensor/programmer of the A.I. product would be liable pursuant to the strict liability regime in the Consumer Protection Act referred to above. In the business (i.e. non-consumer) context, contractual provisions will usually specify where liability will sit in any event.
What key laws exist in terms of: (a) obligations as to the maintenance of cybersecurity; (b) and the criminality of hacking/DDOS attacks?
a) Obligations as to the maintenance of cybersecurity;
The key laws imposing obligations on companies to maintain cyber-security include the General Data Protection Regulation ("GDPR"), the Data Protection Act 2018 ("DPA 2018"), the Network and Information Systems Regulations 2018 ("NIS Regulations"), the Communications Act 2003 ("2003 Act") and the Privacy and Electronic Communications (EC Directive) Regulations 2003 ("2003 Regulations").
Under the DPA 2018, which supplements the GDPR, controllers are subject to various obligations, including to select a processor that sufficiently guarantees appropriate technical and organisational measures. Specifically, Article 32 of the GDPR requires controllers and processors to implement measures that ensure a level of data security appropriate to the level of risk presented by processing personal data – this includes encryption. In the event of a data breach, there is a mandatory legal duty to notify the ICO of the breach within 72 hours of the controller becoming aware of the incident (Article 33). The breach must also be notified directly to the data subjects concerned where it is likely to result in a high risk to the rights and freedoms of natural persons.
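Article 32(1)(a) expressly names "pseudonymisation and encryption of personal data" among the appropriate measures. A minimal sketch of keyed pseudonymisation follows (the function name and record structure are illustrative assumptions; a real deployment would hold the key in an HSM or key vault, separate from the dataset):

```python
import hashlib
import hmac
import secrets

# Secret key held separately from the pseudonymised dataset
# (assumed to be managed elsewhere, e.g. in an HSM or key vault).
PSEUDONYM_KEY = secrets.token_bytes(32)

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Without the key, the pseudonym cannot be re-linked to the individual;
    with it, the same identifier always maps to the same pseudonym, so
    records can still be joined for analysis.
    """
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"name": "Jane Example", "postcode": "AB1 2CD"}
safe_record = {**record, "name": pseudonymise(record["name"])}
print(safe_record["name"] != record["name"])  # True: direct identifier removed
```

Note that pseudonymised data remains personal data under the GDPR so long as the key exists, because re-identification is still possible; the technique reduces risk rather than removing the data from scope.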
The NIS Regulations focus on the availability of crucial network and information systems in order to protect critical infrastructure and apply to Operators of Essential Services ("OES") and Digital Service Providers ("DSP"), requiring OESs and DSPs to: (i) take appropriate technical and organisational measures to secure their network and information systems; (ii) take into account the latest developments and consider the potential risks facing the systems; (iii) take appropriate measures to prevent and minimise the impact of security incidents to ensure service continuity; and (iv) notify the relevant supervisory authority of any security incident having a significant impact on service continuity without undue delay.
Under the 2003 Act, public electronic communications network ("PECN") providers and public electronic communications service ("PECS") providers have an obligation to take technical and organisational measures to manage risks in respect of electronic communications (section 105A). This includes notifying Ofcom of any breaches (section 105B). PECS providers are also subject to obligations under the 2003 Regulations, which require them to take appropriate technical and organisational measures to safeguard the security of their services (Regulation 5(1)). PECS providers must inform the Information Commissioner's Office ("ICO") if there is a personal data breach (Regulation 5A(2)) and the individuals concerned if the breach is likely to adversely affect the personal data or privacy of the subscriber or user (Regulation 5A(3)).
Businesses operating in the financial services sector are also subject to the Senior Management Arrangements Systems and Controls ("SYSC") set out in the FCA Handbook and the STAR and CBEST standards developed by the Council for Registered Ethical Security Testers and the Bank of England. The SYSC provides obligations relating to governance, systems and controls that can directly or indirectly impose cyber security obligations on financial service providers (e.g. securing systems, managing risks, reducing the risk of financial crime and protecting client confidentiality). The STAR and CBEST standards allow financial services providers to demonstrate their cyber-security assurance by passing stipulated penetration and vulnerability tests.
Company directors also have an obligation to maintain cyber-security through the fiduciary duties they owe to their company, which are set out in the Companies Act 2006. These include the duty to promote the success of the company and to exercise reasonable care, skill and diligence while conducting their role (sections 172 and 174). Failure to understand and mitigate cyber risk (e.g. by failing to implement appropriate cyber-security measures) could equate to a breach of these duties, which could lead to a claim being brought against the directors by the company or its shareholders.
On 19 March 2019, the Cybersecurity Act ("2019 Act") was approved by the EU Parliament and will shortly be submitted to the EU Council for approval. The 2019 Act aims to strengthen the European Union Agency for Network and Information Security ("ENISA") as the EU competent authority on cybersecurity matters, and to introduce a common cybersecurity certification framework for a broad range of digital products and services. The 2019 Act will give ENISA a permanent mandate throughout the EU and give the EU the power to adopt cybersecurity certification schemes which will apply across the EU and may create further obligations on businesses as to the maintenance of cybersecurity. The extent to which Brexit will affect the implementation of the 2019 Act within the UK remains to be seen.
b) The criminality of hacking/DDOS attacks?
The Computer Misuse Act 1990 ("CMA 1990") covers the criminality of hacking and DDOS attacks. The Regulation of Investigatory Powers Act 2000 ("RIPA") also creates offences in respect of the unlawful interception of communications.
The CMA 1990 creates various offences relating to cybercrime including: unauthorised access to computer material (section 1(1)), unauthorised acts with intent to impair, or with recklessness as to impairing, operation of a computer (section 3) and impairing a computer such as to cause serious damage or a significant risk of causing serious damage of a material kind (section 3ZA(1)). Persons found guilty of an offence under sections 1(1) or 3 of the CMA 1990 are liable for a prison term of up to 12 months, or a fine, or both (sections 1(3) and 3(6)). Those found guilty of an offence under section 3ZA(1) are liable for a prison term of up to 14 years, or life if the offence creates a significant risk of serious damage to human welfare or national security, or a fine, or both (section 3ZA(6) and (7)).
Under RIPA, it is an offence to intentionally and without lawful authority intercept a communication in the course of its transmission via a public or private telecommunications system (section 1). Persons found guilty are liable to a prison term of up to two years, or a fine, or both (section 7).
What technology development will create the most legal change in your jurisdiction?
We maintain our view that blockchain has yet to truly come into its own: there is going to be disruption across multiple industries once the security and trust issues are addressed, and everything from real estate transactions to IP rights management will be affected.
Which current legal provision/regime creates the greatest impediment to economic development/ commerce?
It is impossible to overlook the impact of the uncertainty over Brexit, which at the time of publication remains unresolved. The severity of the commercial, legal and regulatory consequences for businesses in the UK cannot be overstated. Encouragingly, we have seen clients act in a nimble, proactive way and adapt even without knowing the full future outcomes. Although one might think businesses which are primarily digital would be able to act independently, this does not account for the people, in buildings, with skills and experience, who are physically located in the UK.
Do you believe your legal system specifically encourages or hinders digital services?
The UK legal system (when viewed in combination with the economic environment) is currently seen as conducive to digital startups and is seen as the epicentre for developments in AI and machine learning, globally. DLA Piper has run three iterations of its Tech Index, https://www.dlapiper.com/en/uk/focus/2018-european-tech-index/overview/ assessing the trends in digital innovation and growth prospects (and potential blocks). The 2018 edition highlighted that the key areas of focus are cyber security, IoT, AI and robotics, FinTech and Digital Transformation.
Concerns over cyber-attacks remain high among almost half of the companies interviewed, yet only one quarter have response plans in place, leaving those without such plans exposed to a major attack.
The research also highlights concerns over compliance regulations, staff skills and investment which could hold back the development of FinTech.
To what extent is your legal system ready to deal with the legal issues associated with artificial intelligence?
AI will place significant strains upon the English legal regime; many criminal acts and civil offences depend on questions of state of mind, foreseeability and/or intent… all of which are difficult enough to apply to fellow human beings, but which become near impossible to apply to a computer program which has no emotions or "intentions" per se, but which will rapidly cease to act in a manner which is foreseeable on the part of the original programmers. It is likely therefore that further legislation would be required so as to remove elements of doubt that would otherwise persist.
The House of Lords Report entitled "AI in the UK: ready, willing and able?" (found here: https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/100.pdf), published in April 2018, concluded that the UK is in a strong position to be among the world leaders in the development of artificial intelligence. The Report recommended various proposals and initiatives such as government-backed targeted procurement and a national policy framework to help the growth of AI and AI technologies. However, the Report does note that there are distinct areas of uncertainty regarding the adequacy of existing legislation should AI systems malfunction, underperform or otherwise make erroneous decisions which cause harm (as described above) and the Report called upon the Law Commission to provide clarity on these legal issues. In its response, the UK Government agreed with the House of Lords Report recommendations regarding the need for greater regulation of AI and welcomed the Law Commission's input into the legal issues surrounding AI.
Moreover in April 2018, the UK Government launched an industry, government and academic-led 'AI Sector Deal' to boost the UK’s global position as a leader in developing AI technologies. The latest Sector Deal publication from May 2019 (found here: https://www.gov.uk/government/publications/artificial-intelligence-sector-deal/ai-sector-deal), outlined recommendations to (among others): help attract AI talent from around the world; help raise and deliver funding for major upgrades to the UK digital and data infrastructure; and to help create a flexible approach to AI regulation that promotes innovation and growth of AI whilst protecting citizens and the environment. The launch and level of support directed towards this Sector Deal suggests that the UK is certainly trying to be proactive in not only promoting AI, but also addressing the legal challenges that it will pose in the UK.