France: Technology

This country-specific Q&A provides an overview of technology laws and regulations applicable in France.

It will cover communications networks and their operators, databases and software, data protection, AI and cybersecurity, as well as the author’s view on planned future reforms.

This Q&A is part of the global guide to Technology. For a full list of jurisdictional Q&As visit http://www.inhouselawyer.co.uk/index.php/practice-areas/technology

  1. Are communications networks or services regulated? If so what activities are covered and what licences or authorisations are required?

    The installation and operation of networks open to the public and the provision of electronic communications services to the public may be carried out freely. Network operators and service providers (all of which are designated as ‘operators’) must, however, file a declaration with the national regulatory authority, the Autorité de Régulation des Communications Electroniques et des Postes (ARCEP). Through this declaration, operators undertake to comply with the regulatory regime defined by the EU ‘Telecom Package’ (dating from 2002 and 2009, currently under revision) and the French Posts and Electronic Communications Code (CPCE, Art. L.32 et seq.).

    An individual authorisation by the ARCEP is required for the use of certain frequency bands, such as those allocated to mobile telephony services (GSM, UMTS), radio local loops, radio-relay systems or satellite networks. In these cases, the regulatory regime is reinforced through additional obligations that are set out in the operators’ licences.

    On the other hand, a declaration to the ARCEP is not required for ‘independent networks’ (i.e. telecom services exchanged within closed user groups, such as VPNs) or for radio installations using short-range frequencies that are not dedicated to their users (e.g. WiFi, Bluetooth).

  2. Is there any specific regulator for the provision of communications-related services? Are they independent of government control?

    Communications networks and services are regulated by a national regulatory authority, the ARCEP.

    This agency has been recognized as an ‘independent administrative authority’ by the French constitutional court since its inception in 1996 and is subject to the Act of 20 January 2017, which sets out the rules applicable to such bodies, such as those on incompatibilities, conflicts of interest and professional secrecy. Its members may not be removed during their term of office and may not receive orders or instructions from the government.

    The ARCEP’s decisions on the definition of market segments and on the remedies for insufficient competition are guided by the recommendations of the EU Commission, which also holds a veto power. The Authority’s regulations on the terms of use of the different categories of telecom networks and services are approved (homologated) by the ministry in charge of electronic communications. The ARCEP reports to a commission comprised of members of Parliament (the Commission Supérieure du Numérique et des Postes) and is frequently heard by Parliament. Its regulatory decisions are subject to the jurisdiction of France’s highest administrative court, the Conseil d’Etat, and its decisions on individual disputes between operators may be brought before the Paris Court of Appeals.

    Operating under such scrutiny, the ARCEP strives to strike a balance between the main market players and the government. Its position and role in the telecommunications landscape are well illustrated by its contribution to an agreement reached in January 2018 with the government and the mobile operators to extend 4G coverage throughout the territory, in particular through increased network sharing (mutualisation).

  3. Does an operator need to be domiciled in the country? Are there any restrictions on foreign ownership of telecoms operators?

    An operator is not required to be domiciled in France (i.e. to incorporate a subsidiary, register a branch or otherwise establish a local presence) in order to operate a network or provide communications services in the country.

    Pursuant to EU directives, each EU Member State must ensure that access to its telecom market is not unduly restricted. The ministry in charge of electronic communications and the ARCEP must nonetheless ensure that equivalence of treatment is respected regarding outbound and inbound traffic with foreign countries, including with regard to conditions of access to networks abroad.

    Foreign investments in France may be subject to prior approval by the Ministry of Economy when pertaining to sectors that involve the country’s interests in terms of public order, public security or national defense, such as with regard to the integrity, security and operating continuity of electronic communications services and networks.

  4. Are there any regulations covering interconnection between operators? If so are these different for operators with market power?

    Interconnection between operators is regulated pursuant to the EU ‘Telecom Package’ and the French CPCE (see Question 1). For instance, operators of networks open to the public must accept interconnection requests from their peers, unless their refusal is duly justified. The ARCEP may review any interconnection agreement, as well as any agreement for the sharing of a radio network, and may, in certain cases, impose specific requirements on the parties in an ‘objective, transparent, non-discriminatory and proportionate manner’. Similar duties apply to infrastructure managers such as railway or highway operators and to those who set up or manage optical fiber broadband lines to end users.

    Furthermore, the ARCEP identifies and lists the operators that have significant market power in a given market segment and may impose more stringent obligations on them, such as the obligation to publish an interconnection tariff and to offer services to other operators under non-discriminatory terms.

  5. What are the principal consumer protection regulations that apply specifically to telecoms services?

    In addition to the general provisions of the Consumer Code, an electronic communications operator is subject to specific requirements, in particular in terms of:

    • real-time information on its offering and tariffs, on the consequences of unlawful use of its services by the customer (e.g. in respect of copyright infringement), on the ways to protect individual security and personal data, and on number portability, among other items;
    • the insertion of certain provisions in consumer agreements, such as provisions on indemnification in case of failure to maintain the proposed quality of service, and limitations on the ability to require a minimum term of service;
    • the performance of its service agreements, for instance the prohibition of additional access charges for calls to an after-sales service or a helpdesk.

  6. What legal protections are offered in relation to the creators of computer software?

    Software programs are legally protected by copyright under the Intellectual Property Code (IPC), provided they are original. According to case law, ‘original’ means that the way a program is written reflects the author’s personality or personal efforts. Copyright grants the software publisher the exclusive right to authorize the use, copying and initial distribution of its program for a period of 70 years from the year of publication.

    This legal protection applies to source code and object code, regardless of the kind, form of expression, merit or purpose of the program. Copyright may also apply to preparatory design materials (e.g. specifications), graphical user interfaces or embedded multimedia elements, or even to the title of the program. However, the software medium (e.g. a CD-ROM), the ideas and concepts embodied in the software and, more generally, its functionality are not protected by copyright.

    Patent protection does not apply to computer software programs “per se,” but only insofar as they are used within patentable inventions (i.e. where they produce a “technical effect”). Filing a piece of software with a software registrar is still useful, because this will provide evidence of the date of creation. Other than that, confidentiality remains the best protection for the program’s source code.

  7. Do you recognise specific intellectual property rights in respect of data/databases?

    Pursuant to EU directive 96/9/EC of 11 March 1996, a database may be subject to both copyright, which may benefit its author in respect of his original selection or arrangement of the contents of the database, and to a specific, sui generis right that will inure to its ‘producer’ for a period of 15 years, irrespective of the originality of the database.

    Under the French Intellectual Property Code, the ‘producer’ of a database is defined as the person who initiated the investments in the database and assumed the associated risks, when the investment in the obtaining, verification or presentation of the contents is substantial from a financial, material or human standpoint. Due to this sui generis database right, the effort made in developing a database that is a compilation of information or commonplace data, such as a telephone directory or football match listing, may still be protected despite its lack of originality. This protection enables the producer to prohibit extraction or re-utilization of the whole contents or of a substantial part thereof.

    Nevertheless, under the above rules only the database is protected, not the data per se: data is considered as information and, as a matter of principle, information should circulate freely (unless made confidential by an exclusive information holder or recognized as confidential under the law).

  8. What key protections exist for personal data?

    The General Data Protection Regulation 2016/679 issued by the EU Parliament and Council on 27 April 2016 (‘RGPD’/‘GDPR’) became applicable on 25 May 2018, superseding the framework of the existing 1978 Act. Implementation texts will be adopted at both EU and national levels.

    Under the GDPR, personal data may be collected and further processed only under certain conditions, such as:

    • when the concerned person (the ‘data subject’) has consented;
    • when it is necessary for the performance of a contract to which the data subject is a party, or to comply with a legal obligation imposed on the data controller;
    • where it is necessary to safeguard an individual’s vital interests, or for the performance by the data controller of its public interest mission or official authority; or
    • where there is a ‘legitimate interest’ in the processing, provided this does not harm the data subject’s fundamental rights and freedoms.

    Other key protections must be complied with by the ‘data controller’ (i.e. the person who determines the purposes and means of the data processing), such as the following:

    • the personal data must be processed lawfully, fairly and in a transparent manner;
    • personal data must be collected for specified, explicit and legitimate purposes and must be subsequently processed in accordance with these purposes;
    • the personal data that is collected must be adequate, relevant, and non-excessive in view of the purposes for which it is collected (this is called ‘data minimisation’);
    • all personal data must be accurate and, when necessary, kept up to date;
    • personal data must not be retained for longer than necessary in light of the purposes for which it is processed; and
    • the data controller must implement appropriate organizational and technical measures to ensure the security and confidentiality of the personal data, both against unauthorized or unlawful processing and against accidental loss, destruction or damage to the data.

    Data subjects are granted certain specific rights that include the right to access their personal data and to request correction, deletion and/or portability of such data.

    More stringent rules may apply, depending on the sensitivity of data at stake. Individuals may file claims with their national authority (in France, Commission Nationale de l’Informatique et des Libertés - CNIL).

  9. Are there restrictions on the transfer of personal data overseas?

    The transfer of personal data out of the territory of the European Union is permitted only if the destination country provides a level of protection that is considered as “adequate” by the EU Commission, that is, equivalent to the protection afforded within the EU, or if the controller or processor has provided appropriate safeguards so that data subjects may still be able to enforce their rights when their data is transferred.

    It should be emphasized that remotely accessing data from abroad, irrespective of where the data is stored, is considered a transfer and therefore requires compliance with the GDPR.

    In the absence of an adequacy decision, the appropriate safeguards which the data controller or data processor in the third country must provide may include:

    • a legally binding and enforceable instrument between public authorities or bodies;
    • the data importer’s commitment to binding corporate rules applied within the group of enterprises to which both it and the data exporter belong;
    • joint subscription, with the data controller, to standard data protection clauses adopted or approved by the Commission for this purpose;
    • adherence to an approved code of conduct with binding and enforceable commitments; or
    • certification through an EU approved mechanism.

    Once such safeguards are in place, the concerned data may be transferred outside the EU without the need for an individual authorization, provided that enforceable data subject rights and effective legal remedies for data subjects remain available. As an alternative safeguard, the data exporter and data importer may agree on their own terms and conditions (rather than the EU standard clauses), but then the data transfer will be subject to prior authorization by the competent supervisory authority.

    The requirement for an adequacy decision by the EU Commission or for individual commitments to safeguards by the data importer may be set aside where:

    • the data subject has been informed of the possible risks of the transfer and expressly consents to such transfer; or
    • where the transfer is necessary for the performance of a contract between the data controller and the data subject; or necessary for important reasons of public interest; or for the establishment, exercise or defence of legal claims; or for the protection of an individual’s vital interests, where the data subject cannot provide consent.

    In addition, it should be noted that a data transfer required from the data importer by a judgment or an administrative decision issued in the third country will only be recognised or enforceable under EU law if it is based on an international agreement, such as a mutual legal assistance treaty.

  10. What is the maximum fine that can be applied for breach of data protection laws?

    Under the GDPR, the maximum fine that may be imposed by the CNIL is 20 million euros or 4% of the data controller’s total worldwide annual turnover, whichever is greater. However, this ceiling only applies to certain types of breaches, such as non-compliance with the rights conferred on data subjects. The GDPR provides for graduated sanctions for other types of breaches.
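
    By way of illustration only, the ‘whichever is greater’ rule can be sketched as follows (a minimal example in Python, using a hypothetical turnover figure):

    ```python
    def max_gdpr_fine(worldwide_annual_turnover_eur: float) -> float:
        """Upper bound of a GDPR fine for the most serious breaches:
        the greater of EUR 20 million or 4% of worldwide annual turnover."""
        return max(20_000_000, 0.04 * worldwide_annual_turnover_eur)

    # Hypothetical group with EUR 2 billion worldwide annual turnover:
    print(max_gdpr_fine(2_000_000_000))  # 80,000,000 -> the 4% prong applies
    ```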

  11. Are there any restrictions applicable to cloud-based services?

    Most, if not all, cloud-based services involve the processing and/or transfer of personal data within the meaning of the GDPR. Consequently, the clients of such services should be considered as ‘data controllers’ and must assume full responsibility for complying with the associated obligations from the beginning to the end of the processing (including those described in Questions 8 and 9), even though they might still believe that delegating their IT activities to cloud service providers should exempt them therefrom.

    When qualifying as ‘data processors,’ cloud service providers must comply with a series of specific obligations also set forth in the GDPR, including, in particular:

    • to process data only in accordance with documented instructions from their clients;
    • to take all measures required to ensure a level of security appropriate to the risks incurred by the data and data subjects;
    • to provide information as necessary (including through audits) to demonstrate that they comply with their obligations.

    Where cloud-based services consist of standard services governed by adhesion contracts, i.e. where the service provider plays a role in defining the ways and means and, possibly, the purposes of the data processing, the service provider may be considered jointly liable with its client towards the persons concerned (‘data subjects’). This is why the GDPR requires ‘data controllers’ and ‘data processors’ to specifically define, in their agreements, the allocation of their obligations and responsibilities regarding personal data processing.

    In order to help clarify such situations, efforts are being made to nurture the development of codes of conduct (such as by software end user groups), standards (for instance, SecNumCloud initiated by a government agency, ANSSI), as well as certifications.

    Aside from these general texts, sector-specific regulations applicable to the outsourcing of IT activities may apply to cloud-based services insofar as they form a subset of outsourcing services. Recommendations or guidelines specific to cloud-based services have also been issued by regulatory authorities such as the Autorité de Contrôle Prudentiel et de Résolution (ACPR), in the banking and insurance sector, and the CNIL.

  12. Are there specific requirements for the validity of an electronic signature?

    According to the Civil Code provisions that implement EU legislation governing this matter (most recently, EU Regulation 910/2014 of 23 July 2014, known as ‘eIDAS’), an electronic signature is considered a ‘signature’, that is, as effectively identifying the author of an act and evidencing his consent, only when it results from a reliable identification process that guarantees its connection with the act. Qualified electronic signatures are presumed by statute to offer such reliability and, consequently, to have the same legal effects as a handwritten signature, because they fulfil certain requirements that are set out in regulations.

    These requirements include the use of a qualified certificate, which must be delivered to the signatory in person, as well as other requirements that, in practice, are seldom fully satisfied. Accordingly, the so-called ‘electronic signatures’ in current use on the market may most often not be considered as ‘qualified electronic signatures’ under the law. This means that, when challenged before the courts, their users will have to demonstrate their probative value.
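
    For technical illustration only, the cryptographic mechanism that links a signature to the exact content of the act can be sketched as follows (a minimal Python example using the third-party ‘cryptography’ package; a qualified signature additionally requires a qualified certificate and a qualified creation device, which this sketch does not address):

    ```python
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Key pair standing in for the signatory's certificate (illustrative only).
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    act = b"Contract text agreed by the parties"
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    # Signing binds the signature to this exact content...
    signature = private_key.sign(act, pss, hashes.SHA256())

    # ...so any alteration of the act makes verification fail
    # (verify() raises InvalidSignature in that case).
    public_key.verify(signature, act, pss, hashes.SHA256())
    ```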

  13. In the event of an outsourcing of IT services, would any employees, assets or third party contracts transfer automatically to the outsourcing supplier?

    According to the Labor Code (Art. L.1224-1), which implements EU directive 2001/23/EC of 12 March 2001 on safeguarding employees’ rights in the event of transfers of undertakings, businesses or parts thereof, an automatic transfer of all employment contracts may occur in the event of a change in the employer’s legal situation, in particular as a result of a sale or merger of an undertaking, provided the outsourced activities constitute an “autonomous economic entity” as defined by case law, i.e. an organised group of persons and assets that is able to carry on a business activity in pursuit of a specific objective.

    As regards assets, an automatic transfer may take place in the context of a company merger, a corporate split, or the contribution of a whole business branch that involves a transfer of all associated assets and liabilities. However, agreements entered into in consideration of the person of the co-contracting party (“intuitu personae”) will not transfer if that party does not consent thereto.

  14. If a software program which purports to be an early form of A.I. malfunctions, who is liable?

    Currently, the general principle is that a person is liable not only for damage caused through his/her own acts, but also for damage caused by items under his/her custody (Civil Code, Art. 1242). The scope of application of this principle may shrink as items driven by Artificial Intelligence escape their owner’s effective custody.

    In addition to this liability principle and pursuant to EU legislation, a strict liability regime was enacted in 1998, which applies to the producer of a product in respect of damage caused by a defect in that product. This liability applies irrespective of whether or not the producer is bound to the victim by contract (French Civil Code, Art. 1245 et seq.). Strict liability makes things easier for the victim, who may sue the manufacturer, a supplier of individual parts or, ultimately, the reseller of the product. The victim need only prove the lack of safety of the product.

    This liability regime might address situations where AI is embedded in or associated with pieces of equipment or hardware, but less so a software program per se, insofar as software is not (yet) considered a ‘product.’ In addition, risks that arise after manufacture and sale, due to the AI’s autonomous learning curve, will ultimately cease to be predictable and can hardly be borne by the initial designer. Disputes may then arise as to who is liable: the owner of the item, the designer, the software programmer, or the user instructing the item…

    In this context, the liability regime in respect of artificial intelligence seems likely to evolve.

  15. What key laws exist in terms of obligations as to the maintenance of cyber security?

    Key legal provisions in respect of cybersecurity include in particular:

    • Article 32 et seq. of the GDPR which require the data controller and the data processor to implement appropriate technical and organizational measures to ensure a level of security appropriate to the risk incurred by the personal data they process, and to notify breaches to the supervisory authority (i.e. the CNIL), except those unlikely to result in a risk to the rights and freedoms of natural persons;
    • the Military Programming Act of 18 December 2013, pursuant to which the State may impose certain obligations on ‘operators of vital importance’, such as the prohibition of connecting certain systems to the internet; encourage the implementation of detection systems operated by certified providers; audit the security level of critical information systems; and, in the event of a major crisis, impose any necessary measures;
    • EU directive 2016/1148 of 6 July 2016 (the ‘NIS’ Directive) and Act no.2018-133 of 26 February 2018, which provide for, amongst other measures, a high common level of security of networks and information systems across Member States, including through standardization; security and notification requirements on operators of ‘essential services’ as well as on digital service providers; and the creation of a network of computer security incident response teams.

  16. What key laws exist in terms of the criminality of hacking/DDOS attacks?

    Act no.88-19 of 5 January 1988 on computer fraud (the ‘Godfrain Act’) created various offences, such as fraudulent access to, or continued presence within, all or part of an automated data processing system, and covers the criminality of hacking and DDOS attacks. This act was recently amended, in particular to increase the quantum of applicable penalties.

  17. What technology development will create the most legal change in your jurisdiction?

    Blockchain technology is generating the most substantial legal change, in France as elsewhere. As this technology relies on a chronological transaction database that is both distributed and encrypted, it ensures the integrity of the identification of the author of a legal act, as well as the apparently flawless traceability of the origin and subsequent stages of a transaction.
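
    As a purely technical illustration of this chaining and traceability, here is a minimal, simplified hash-chain sketch in Python (an actual blockchain also involves distribution across nodes and a consensus mechanism, which are omitted here):

    ```python
    import hashlib
    import json
    import time

    def new_block(data: dict, prev_hash: str) -> dict:
        """Each block commits to its predecessor's hash, so altering any past
        transaction changes every subsequent hash and becomes detectable."""
        block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        return block

    genesis = new_block({"tx": "initial allocation"}, prev_hash="0" * 64)
    second = new_block({"tx": "transfer of 10 tokens"}, prev_hash=genesis["hash"])

    # Verifying the chain: each block must reference the hash of its predecessor.
    assert second["prev_hash"] == genesis["hash"]
    ```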

    While countries are competing to attract Initial Coin Offerings (ICOs) with crypto-currencies based on this technology, this type of transaction challenges lawmakers because it bypasses the rules applicable to public offerings on stock markets, enables start-ups and businesses to raise funds beyond government agencies’ control, and offers anonymity despite anti-money laundering regulations. Currently, the tendency of the French regulator is to refrain from regulating this market directly, so as not to restrain its development, and rather to foster the development of reliable tools such as an optional visa that could be granted by the Autorité des Marchés Financiers (AMF) on token issuances.

    Similar challenges will be posed by blockchain in other areas of the law, such as with regard to financial transactions, land registration, royalty collection societies, the issuance of bonds, etc.

  18. Which current legal provision/regime creates the greatest impediment to economic development/commerce?

    The French tax regime appears to be cumbersome for many projects, due to its complexity and tax impact. As an example, while efforts are being made to foster the development of Initial Coin Offerings in France, the tax regime applicable to such transactions does not yet appear to be fully settled. Should a token be considered as a share or an interest in future revenues, it might then be subject to income tax, which would act as a deterrent compared with other countries.

  19. Do you believe your legal system specifically encourages or hinders digital services?

    Digital services are governed in essence by EU legislation, which has largely shaped French consumer law over the last twenty years or so. National specificities are therefore to be found less in the legal system than in the economy.

    However, France has recently become an ardent proponent of digital administration, in particular by simplifying procedures and moving them online (for instance, public procurement platforms and digital invoicing mechanisms) and by disseminating an increasing volume of open data (www.data.gouv.fr).

  20. To what extent is your legal system ready to deal with the legal issues associated with artificial intelligence?

    French authorities have for many years understood the importance of setting up mechanisms to foster pilot projects as well as large-scale experiments in the area of Artificial Intelligence (for instance, ‘France is AI,’ ScanR, etc.), but also the necessity to develop a regulatory framework that will protect consumers and citizens at large.

    In this respect, the GDPR prohibits a decision that is legally binding on an individual from being made on the sole basis of a data processing operation that profiles such individual or assesses certain aspects of his or her personality. Furthermore, Act no.2016-1321 of 7 October 2016 for a Digital Republic establishes the right of a natural person to be informed when an individual decision concerning him or her is taken on the basis of an algorithm, and to request communication of the rules and main features of that processing.

    In light of the risks raised by Artificial Intelligence (essentially, the risk that it “would take off on its own, and re-design itself at an ever increasing rate”, as Stephen Hawking put it), legislators may be required to reinforce such principles (along with the liability regime, as explained in Question 14) as AI develops and becomes less and less intelligible to those subject to its decisions.