This country-specific Q&A provides an overview of the technology laws and regulations that apply in France.
It covers communications networks and their operators, databases and software, data protection, AI and cybersecurity, as well as the author’s views on future legal developments in these areas.
This Q&A is part of the global guide to Technology. For a full list of jurisdictional Q&As visit http://www.inhouselawyer.co.uk/index.php/practice-areas/technology
Are communications networks or services regulated? If so what activities are covered and what licences or authorisations are required?
The installation and operation of networks open to the public and the provision of electronic communication services to the public are free, i.e. not subject to prior individual authorisation. Network operators and service providers (all of which are designated as ‘operators’) must, however, file a declaration with the national regulatory authority, the Autorité de Régulation des Communications Electroniques et des Postes (ARCEP). Through this declaration, the operators commit themselves to comply with the regulatory regime defined by the EU ‘Telecom Package’ (dating from 2002 and 2009) and the French Posts and Electronic Communications Code (CPCE, art.L.32 et seq.).
However, an individual authorization by the ARCEP is required for the use of certain frequency bands such as those allocated to mobile telephony services (GSM, UMTS), radio local loops, radio-relay systems or satellite networks. In these cases, the regulatory regime is reinforced through additional obligations that are set out in the operators’ licenses.
Conversely, no declaration to the ARCEP is required in respect of ‘independent networks’ (i.e. telecom services exchanged within closed user groups, such as Virtual Private Networks or ‘VPNs’) or of radio installations using short-range frequencies that are not dedicated to their users.
Is there any specific regulator for the provision of communications-related services? Is it independent of government control?
Communications networks and services are regulated by a national regulatory authority, the ARCEP (Autorité de Régulation des Communications Electroniques et des Postes).
This agency has been recognized as an ‘independent government authority’ by the French constitutional court since its inception in 1996. This implies that its members may not be removed from office during their term and that the agency may not receive orders or instructions from the government.
Certain rules issued by the ARCEP such as, for instance, concerning the use and operation of certain radio frequencies, must, nevertheless, be homologated by the ministry in charge of electronic communications. Further, the ministry keeps certain significant powers such as, for instance, to mandate a security check on the installations, networks and services of any operator. More broadly, the ARCEP is bound to collaborate with other regulatory bodies such as the one in charge of radio and television broadcasting activities (‘CSA’) and, at the EU level, the EU Commission and the BEREC Office (Office of the Body of European Regulators for Electronic Communications).
Does an operator need to be domiciled in the country? Are there any restrictions on foreign ownership of telecoms operators?
An operator is not required to be domiciled in France (i.e. to create a subsidiary, register a branch, or otherwise) in order to operate a network or provide communications services in the country. Only the declaration or authorization requirement described in Question 1 above will apply.
Pursuant to France’s international commitments there are no legal restrictions on foreign ownership of operators. More generally, under EU directives each EU Member State must make sure that access to its telecom market is not unduly restricted.
In this area, the ministry in charge of electronic communications and the ARCEP must nevertheless ensure that equivalence of treatment is respected regarding outbound as well as inbound traffic, including as regards the conditions of access to foreign networks.
Are there any regulations covering interconnection between operators? If so, are these different for operators with market power?
Interconnection between operators is regulated pursuant to the EU ‘Telecom Package’ and the French CPCE (see Question 1). For instance, an operator’s decision to refuse interconnection with another one (including a foreign one) must be substantiated. The ARCEP may control any interconnection agreement and may impose specific requirements on the parties in an ‘objective, transparent, non-discriminatory and proportionate manner’ (CPCE, art.L34-8).
In particular, the interconnection obligations of operators with significant market power are subject to systematic oversight. Furthermore, such operators may be required to publish an interconnection tariff and reference offer, and to provide such services to other operators under non-discriminatory terms.
Significantly, the legislator recently broadened the scope of entities that may be required to provide interconnection or access to infrastructures. This concerns for instance operators which exercise control on access to end users, and entities which run certain infrastructures such as the power grid or railways.
What are the principal consumer protection regulations that apply specifically to telecoms services?
Beyond the general provisions of the Consumer Code, an electronic communications operator is subject to specific requirements, in particular in terms of:
- real-time information on its offering and tariffs, on the consequences of unlawful use of its services by the customer (e.g. in respect of copyright infringement), on the means of protecting individual security and personal data, and on number portability, to list a few;
- insertion into consumer agreements of certain provisions, such as indemnification in the event of failure to maintain the proposed quality of service, and limitations on the possibility of requiring a minimum term of service; and
- performance of its service agreements, for instance the prohibition of additional access charges for calls to an after-sales service or helpdesk.
What legal protections are offered in relation to the creators of computer software?
Software programs are legally protected by copyright, provided they are original. According to case law, ‘original’ means that the way a program is written reflects the personality or the personal efforts of its author. Copyright grants the software publisher the exclusive right to authorize the use, copying and initial distribution of its program for a period of 70 years from the year of publication.
This legal protection applies to source code and object code regardless of the kind, form of expression, merit or purpose of the program. Copyright may also apply to preparatory design materials (e.g. specifications), graphical user interfaces or embedded multimedia elements, or even to the title of the program. However, the software medium (e.g. CD Rom), the ideas and concepts embodied into the software and, more generally, its functionality, are not protected.
Patent protection cannot apply to computer software programs ‘as such,’ but only insofar as they are used within patentable inventions (i.e. produce a ‘technical effect’).
Filing a piece of software with a registrar or notary public remains useful, however, with a view to obtaining evidence of its date of creation. Such a filing will comprise the source code, but its content will not be made public. From a practical viewpoint, confidentiality indeed remains the best protection for a program’s source code.
Are specific intellectual property rights in respect of data/databases recognised?
Pursuant to EU directive 96/9/EC of 11 March 1996, a database may be subject to both copyright, which may benefit its author in respect of his original selection or arrangement of the contents of the database, and to a specific, sui generis right that will inure to its ‘producer’ for a period of 15 years, irrespective of whether the database is in itself original.
Under the French Intellectual Property Code (IPC), the ‘producer’ of a database is defined as the person who initiated the investments in the database and assumed the associated risks, when the investment in the obtaining, verification or presentation of the contents is substantial from a financial, material or human standpoint. Thanks to this sui generis database right, the effort made in developing a database that is a compilation of information or commonplace data, such as a telephone directory or football match listing, may still be protected despite its lack of originality. This protection allows the producer to prohibit extraction or re-utilization of the whole contents or of a substantial part thereof.
Nevertheless, under the above rules only the database is protected, not the data per se: data is considered as information and, as a matter of principle, information is meant to circulate freely (unless made confidential by an exclusive information holder).
What key protections exist for personal data?
The processing of ‘personal data’ (defined as data which, alone or in combination, allows a natural person to be identified directly or indirectly) is governed in France by the Act No. 78-17 of 6 January 1978 (the 1978 Act). This act inspired the adoption at the EU level of EU directive 95/46/EC of 24 October 1995. These texts are about to be replaced on 25 May 2018, when EU regulation 2016/679 of 27 April 2016 (the ‘General Data Protection Regulation’ or GDPR) takes effect.
Under the 1978 Act and, afterwards, the GDPR, personal data may be collected and further processed only under certain conditions, such as:
- when the concerned person (‘data subject’) has consented;
- when it is necessary for the performance of a contract to which the data subject is a party, or to comply with a legal obligation imposed on the data controller; or
- where there is a ‘legitimate reason’ for the processing, provided this does not harm the data subject's fundamental rights and freedoms.
Furthermore, the ‘data controller’ (considered to be the person who determines the purposes and means of the data processing) must comply with several principles:
- the personal data must be processed lawfully, fairly and in a transparent manner;
- personal data must be collected for specified, explicit and legitimate purposes and must be subsequently processed in accordance with these purposes;
- the personal data that is collected must be adequate, relevant, and non-excessive in view of the purposes for which it is collected (this is called ‘data minimisation’);
- all personal data must be accurate and, when necessary, kept up to date;
- personal data must not be retained for longer than necessary in light of the purposes for which it is processed; and
- the data controller must implement appropriate organizational and technical measures to ensure the security and confidentiality of the personal data, both against unauthorized or unlawful processing and against accidental loss, destruction or damage to the data.
On their side, data subjects are granted certain specific rights that include the right to access data concerning them and to request correction or deletion of such data.
More stringent rules apply to ‘sensitive’ personal data (e.g. data relating to health or the sexual orientation of a person).
Generally speaking, the above principles and rules are detailed in regulations and recommendations issued by the national regulatory authority in charge of personal data protection, the Commission Nationale de l’Informatique et des Libertés (CNIL). This authority’s power to audit data processing operations and to impose sanctions is also among the key protections granted to individuals in this area.
Are there restrictions on the transfer of personal data overseas?
The transfer of personal data out of the territory of the European Union is prohibited unless the destination country or the recipient provides a level of protection considered as sufficient, that is, equivalent to the protection afforded within the EU. Please note that providing remote access to data from abroad, irrespective of where data is stored, is considered as a transfer.
Transfers may take place only under certain conditions defined by EU legislation, such as:
- according to a decision of the EU Commission, the non-EEA jurisdiction provides "adequate protection" (e.g. has laws commensurate with those of the EU Member States);
- the transfer is governed by the "standard contractual clauses" approved by the European Commission; or
- the data is transferred to the US and in compliance with the "Privacy Shield" program.
Current and future legislation provides for exceptions to this prohibition where the data subject expressly consents to the transfer, or where the transfer is necessary, for instance:
- to safeguard the life of the data subject;
- to safeguard public interest;
- for the establishment, exercise or defence of a legal claim before the courts;
- for the performance of a contract between the data controller and the data subject;
- for the conclusion or performance of a contract concluded, in the interest of the data subject, between the data controller and a third party.
What is the maximum fine that can be applied for breach of data protection laws?
Currently in France, the maximum fine that may be imposed by the CNIL amounts to 3 million euros. As from 25 May 2018, under the GDPR, the maximum amount will increase to 20 million euros or 4% of the worldwide turnover of the data controller, whichever is higher. This will concern, however, only certain types of breaches, such as non-compliance with the rights conferred on data subjects. The GDPR provides for graduated sanctions regarding other types of breaches.
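The GDPR ceiling described above is simply the higher of a fixed amount and a turnover-based amount. A minimal sketch (illustrative only; the actual fine in any given case is set by the supervisory authority well below this ceiling):

```python
def gdpr_max_fine(worldwide_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine for the most serious breaches:
    the higher of EUR 20 million or 4% of worldwide annual turnover."""
    return max(20_000_000, 0.04 * worldwide_turnover_eur)

# A controller with EUR 1 billion turnover: 4% (EUR 40 million) applies.
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
# A controller with EUR 100 million turnover: the EUR 20 million floor applies.
print(gdpr_max_fine(100_000_000))
```

The turnover-based alternative means the ceiling scales with the size of the undertaking rather than being capped at a flat amount, as was the case under the 1978 Act.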
Are there any restrictions applicable to cloud-based services?
Most if not all cloud-based services involve the processing of personal data in the sense of the 1978 Act and the GDPR (see Question 7). Until the GDPR comes into effect (May 2018), the clients of such services will remain ‘data controllers’ and must assume full responsibility to comply with the associated obligations (including those described in Questions 7 and 8), even where they delegate IT activities to cloud service providers (e.g. SaaS, PaaS, etc.).
Since 2012, however, the CNIL has recognized the possibility that cloud service providers be considered jointly liable, as “co-data controllers,” when they propose standard services under adhesion contracts and will not take instructions or allow audits from their customers. This situation is now fully addressed by the GDPR. Consequently, clients are required to define specifically how their obligations and responsibilities regarding personal data processing will be shared with their cloud service providers. These agreements between co-controllers must be made available ‘in essence’ to the concerned data subjects.
Furthermore, where the cloud service provider remains considered as acting on behalf of its client, as a sub-contractor or data ‘processor,’ it must comply with a series of specific obligations, including, for instance, to ensure that its employees and other persons authorized to process the personal data will be legally committed to confidentiality. A number of such requirements must be inserted into the service agreement with its client.
Beyond legal obligations, efforts are being made to foster the development of codes of conduct, standards (for instance SecNumCloud, initiated by the government agency ANSSI) and certifications. The French CPCE even defines the contents of an ‘electronic vault service,’ a variety of cloud service offering storage of electronic data or documents with full traceability and integrity, for which an administrative certification is available with a view to raising the confidence of prospective customers.
Aside from general texts, sector-specific guidance is also provided by various regulatory authorities such as in the bank and insurance sector, with the ACPR (Autorité de Contrôle Prudentiel et de Résolution), or the health sector, with the ASIP Santé (Agence Française de la Santé Numérique). For example, the ACPR encourages the firms it controls to take appropriate risk management measures along best practices that the agency described in an analysis issued in July 2013.
Are there specific requirements for the validity of an electronic signature?
According to the Civil Code provisions that implement EU legislation governing this matter (most recently, EU regulation 910/2014 of 23 July 2014), an electronic signature is considered as a ‘signature,’ that is, as effectively identifying the author of an act and showing his consent, only when it results from a reliable identification process that guarantees its link with the act. The process is deemed to be reliable when it fulfills certain requirements that are set out in regulations. Such requirements include the use of a certificate that must be delivered to the signature holder in person, as well as other requirements that are seldom fully satisfied.
As a consequence, so-called ‘electronic signatures’ in use on the market may most often not qualify as ‘electronic signatures’ under the law. This means that they may not be legally presumed to be reliable and to identify their users: when challenged before the courts, their users will have to demonstrate their evidentiary value.
Consequently an electronic document, which should in principle have the same evidentiary value as a paper document provided its author can be properly identified, will rarely meet this identification criterion, since that requires an electronic signature. Therefore, where a writing is required as evidence and is issued in electronic form, the lack of a lawful electronic signature will place the document at a lower rank than a paper document.
In the event of an outsourcing of IT services, would any employees, assets or third party contracts transfer automatically to the outsourcing supplier?
According to the Labour Code (art. L.1224-1), which implements EU directive 2001/23/EC of 12 March 2001 on safeguarding employees’ rights in the event of transfers of undertakings, businesses or parts thereof, an automatic transfer of all employment contracts may occur in the event of a change in the employer’s legal situation, in particular as a result of a sale or merger of an undertaking, provided the outsourced activities constitute an “autonomous economic entity” as defined by case law, that is, an organised group of persons and assets that will be able to continue business in pursuit of a specific goal.
As regards assets, an automatic transfer may take place in the frame of a company merger, a corporate split, or the contribution of a whole business branch that involves a transfer of all associated assets and liabilities. Agreements inherent in the person of the co-contracting party may, however, not follow the transfer if such other party does not give its consent thereto.
If a software program which purports to be an early form of A.I. malfunctions, who is liable?
As of today, a general principle is that a person is liable not only for damage caused by his or her own act, but also for damage caused by things in his or her custody (Civil Code, art.1242). The scope of this principle may shrink as things driven by Artificial Intelligence move away from the custody of their owner, which is precisely the purpose of AI. Disputes may then arise as to who is liable: the owner of the thing, its designer, the software programmer, or the user from whom the thing learned how to behave.
In furtherance of this liability principle and following EU legislation, a strict liability regime was enacted in 1998, which applies to the producer of a product in respect of damage caused by a defect in the product. This liability applies irrespective of whether the producer is bound to the victim by contract (Civil Code, art.1245 et seq.). Strict liability makes things easier for the victim, who may sue the manufacturer, a supplier of individual parts or, ultimately, the reseller of the product. The victim need only prove the product’s lack of safety.
This liability regime might address situations where AI is embedded in or associated with equipment or hardware, but less so a software program itself, insofar as software is not (yet) considered a ‘product.’ In addition, risks that arise after fabrication and sale, due to the AI’s autonomous learning curve, will ultimately cease to be predictable and can hardly be attributed to the initial designer.
In this context, the liability regime in respect of artificial intelligence seems likely to evolve.
What key laws exist in terms of obligations as to the maintenance of cybersecurity?
Key legal provisions in respect of cybersecurity include in particular:
- article 34 of the 1978 Act (see Question 7), which requires the data controller to ‘take all necessary precautions, in light of the nature of the data and of the risks presented by the processing, in order to preserve the security of the data and, in particular, to prevent the data being distorted, damaged or subject to unauthorized access by third parties.’ A data controller which does not conform to its security obligations under this article is liable to a criminal fine of up to 300,000 euros (1.5 million euros for legal persons);
- the Military Programming Act of 18 December 2013, pursuant to which the State may impose certain obligations, such as prohibiting the connection of certain systems to the internet; encourage the implementation of detection systems by certified providers; audit the security level of critical information systems; and, in the event of a major crisis, impose the necessary measures on ‘operators of vital importance.’ Further, legal persons designated as ‘operators of vital importance’ must strengthen at their own expense the security of the information systems they operate that are deemed vital, and they are required to report incidents to the relevant authorities so that entities potentially concerned can be given advance warning;
- EU directive 2016/1148 of 6 July 2016, which provides for, among other measures: a high common level of security of networks and information systems across Member States, including through standardization; security and notification requirements on operators of ‘essential services’ as well as on digital service providers; and the creation of a network of computer security incident response teams.
What key laws exist in terms of the criminality of hacking/DDOS attacks?
The Act n°88-19 of 5 January 1988 on computer fraud created various offences, such as fraudulent access to, or fraudulently remaining within, all or part of an automated data processing system, and thus covers the criminality of hacking and DDoS attacks. This act was amended in 2004 and 2013 and, most recently, by the Act n°2015-912 of 24 July 2015 on intelligence. These amendments increased the penalties applicable to offences against automated data processing systems, doubling the fines for some of them.
What technology development will create the most legal change in the jurisdiction?
The blockchain might generate the most substantial legal change, in France as elsewhere. As this technology relies on a chronological transaction database that is both distributed and encrypted, it ensures the integrity of the identification of the author of a legal act as well as apparently flawless traceability of the origin and subsequent stages of a transaction. Consequently, the system may be used for transactions such as money transfers, land registration, royalty collection, the issuance of bonds or diplomas, etc., above all without territorial boundaries or government regulation. To some extent the blockchain might also offer an effective alternative to the ‘electronic signature’ as devised under current legislation (see Question 11).
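The integrity and traceability properties described above rest on each record incorporating a cryptographic hash of the previous one. A minimal hash-chain sketch (illustrative only; a real blockchain adds distribution, consensus and signatures, and the transaction payloads here are invented):

```python
import hashlib
import json

def block_hash(prev_hash: str, payload: dict) -> str:
    """Hash a block's payload together with the previous block's hash,
    so any later alteration of an earlier block breaks the chain."""
    record = json.dumps({"prev": prev_hash, "data": payload}, sort_keys=True)
    return hashlib.sha256(record.encode()).hexdigest()

# Build a tiny chain of two transactions.
h0 = block_hash("0" * 64, {"tx": "A pays B 10"})
h1 = block_hash(h0, {"tx": "B pays C 5"})

# Tampering with the first transaction yields a different h0, which no
# longer matches the 'prev' value embedded in the second block's hash.
h0_tampered = block_hash("0" * 64, {"tx": "A pays B 100"})
print(h0_tampered != h0)  # True
```

Because every block's hash depends on all of its predecessors, retroactively modifying one transaction would require recomputing the entire subsequent chain, which the distributed copies held by other participants would reject.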
In parallel, more and more objects are connected to the Internet thanks to the Internet of Things, which the International Telecommunication Union defines as a ‘global infrastructure for the information society that provides advanced services by interconnecting objects (physical or virtual) with information and communication (…).’ The IoT offers another major example of a communications network developing regardless of territorial boundaries and with only limited government control.
Which current legal provision/regime creates the greatest impediment to economic development/ commerce?
The French tax regime comprises a few hundred different types of taxes (some estimates say about 360) and stretches over more than three thousand pages of tax regulations (for those that are codified; see www.legifrance.gouv.fr). The labour code, which is of about the same size and whose complexity is equally decried as an impediment to economic development, currently draws most of the criticism. To keep focused on e-commerce and the digital economy, however, cybercrime appears to pose a challenge of ever increasing magnitude and remains largely out of reach for legislators and regulators alike, as its scale has become global.
According to Interpol’s president Mireille Ballestrazzi, “it is clear that the national scale is not enough, we must act at European and global level. We hope that the Budapest Convention, drafted by the Council of Europe in 2005, will be transposed at the global level (…). It is a long-term struggle, because countries do not all have the same vision of what cybercrime is and how it should be treated (…). The more new technologies enter our daily lives, the greater the potential for infringements, and the more complex the fight against attacks.”
Do you believe the legal system specifically encourages or hinders digital services?
Digital services are governed in essence by EU legislation, which has largely shaped French consumer law over the last twenty years or so. National specificities are therefore to be found less in the legal system than in the economy.
However, France has lately become a determined proponent of digital administration, in particular through simplifying and transferring procedures to the internet (for instance with public procurement platforms and digital invoicing mechanisms) and through disseminating an increasing volume of open data (www.data.gouv.fr).
To what extent is the legal system ready to deal with the legal issues associated with artificial intelligence?
French authorities have for many years understood the importance of setting up mechanisms to foster pilot projects as well as large-scale experiments in the area of Artificial Intelligence (for instance, ‘France is AI,’ ScanR, etc.), but also the necessity to develop a regulatory framework that will protect consumers and citizens at large.
In this latter respect, the 1978 Act on personal data protection forbids any legal person from letting a decision legally binding on an individual be made on the sole basis of automated processing that profiles that individual or assesses certain aspects of his or her personality. Furthermore, the Act n°2016-1321 of 7 October 2016 for a Digital Republic establishes the right for a natural person to be informed when an individual decision concerning him or her is taken on the basis of an algorithm, and to request communication of the rules and main features of that processing.
In light of the risks raised by Artificial Intelligence, essentially the risk that it “would take off on its own, and re-design itself at an ever increasing rate” (Stephen Hawking), legislators may have to reinforce such principles (along with the liability regime, as explained in Question 13) as AI develops and becomes less and less understandable to those subject to its prescriptions.