Are platform providers (social media, content sharing, information search engines) regulated?
Technology (3rd edition)
Platform providers are not regulated in Armenia except for the following cases:
- If they apply for a tax exemption (where available under the Law of Armenia on State Support to the Information Technologies Sector); regulation is limited to the relevant tax law provisions;
- If they process personal data; regulation applies within the scope of the rules on personal data protection.
No, platform providers are not regulated in the country; consequently, all carriers and participants are free to choose the vendor that best suits their needs, as well as the service platform of their preference.
As mentioned above, the Press and Media Law has provided for a very broad definition of websites. Accordingly, such definition covers social media and may extend to cover search engines as well (if they provide advertisement content).
In addition to the above, the Cyber Crimes Law regulates the operations of content sharing and storage service providers although it does not require obtaining a license.
There are no special legal acts regulating social media, content sharing or information search engines. However, different legal acts regulate certain areas of these activities.
The Estonian Information Society Services Act stipulates requirements for information society service providers. Information society services are services which are provided without the parties being simultaneously present at the same location, and such services involve the processing, storage or transmission of information by electronic means intended for the digital processing and storage of data.
With regard to search engine providers, it is important to note that the Information Society Services Act provides for the restricted liability upon provision of information storage service. Where a service is provided that consists of the storage of information provided by a recipient of the service, the service provider is not liable for the information stored at the request of a recipient of the service when two conditions are met. First, the provider does not have actual knowledge of the contents of the information and, as regards claims for the compensation of damage, is not aware of facts or circumstances from which the illegal activity or information is apparent. Secondly, the provider, upon obtaining knowledge or awareness of the facts, acts expeditiously to remove or to disable access to the information.
The Estonian Media Services Act stipulates, among other things, the requirements for on-demand audiovisual media services. Most importantly, this act regulates the content of the media service provided. For example, the act stipulates that it is prohibited, in any programme, to incite hatred on the basis of sex, racial or ethnic origin, beliefs or religion, or to denigrate lawful behaviour or glorify violations of law.
In the Estonian Law of Obligations Act there is a clause which stipulates that it is unlawful to disclose incorrect information or incomplete or misleading factual information which interferes with the economic or professional activities of a person.
Providers of online public communication services that rely on computer algorithms to classify or reference content, goods or services offered or posted online by third parties, or that connect parties with a view to the sale of a good, the provision of a service, or the exchange or sharing of content, a good or a service, are considered "online platform operators" and are governed by provisions introduced into the Consumer Code in 2016.
Pursuant to these provisions, such operators must provide consumers with fair, clear and transparent information on the way their platform works, on any existing contractual or financial relationships that may influence classification or referencing, and on the rights and obligations of the parties with whom a connection is proposed. In addition, platform operators are encouraged to develop self-regulation through the elaboration of codes of conduct. The government officials in charge of competition, consumer affairs and fraud enforcement can investigate and record violations of the Code.
Furthermore, when platform providers delivering services in France determine the features of the services provided or the goods sold and fix their price, they are subject to certain obligations towards self-employed workers whom they connect with their customers. For example, pursuant to the Labour Code and subject to exceptions, they must cover the insurance contribution covering the risks of accidents at work or the contribution to the vocational training of these workers. They must refrain from sanctioning union membership or strike movements defending professional demands.
These provisions do not preclude such workers from being reclassified as employees if their relationship with a platform shows the features of an employment contract. Government bodies such as the one in charge of the family allowances fund (URSSAF) ensure that employee status is respected and can initiate proceedings to collect the associated contributions, as seen in disputes brought against Uber.
Yes. The regulation of such services will depend primarily on where in the Catalog each specific service might fall. For example, information search engines fall under the banner of Internet information services (Category B25 under the Catalog) and are regulated primarily by the MIIT. Social media and content sharing services will likely fall under the same Catalog category but may also involve other regulators, e.g., the National Radio and Television Administration if such services include any audio/video functions.
Platform providers are not regulated.
No specific regulation exists regarding “platform providers”.
However, several provisions of law apply to platform providers, such as:
- Legislative Decree no. 70 of 9 April 2003, implementing Directive 2000/31/EC on information society services, including electronic commerce;
- Legislative Decree no. 206 of 6 September 2005 (Italian Consumer Code);
- Regulation (EU) no. 2016/679 (GDPR) and Italian Legislative Decree no. 101/2018.
There are no laws or regulations specifically targeting platform providers (e.g., social media, content sharing, and information search engines) in general, but depending on the nature of their services and roles, they might be subject to certain industry-specific laws and other regulations. For instance, social media platform providers might be regulated by the Act on the Protection of Personal Information (APPI) with respect to their handling of personal data and/or the Telecom Act with respect to the privacy of communications between users on their platforms. One notable recent development in Japan is the enactment of the Private Lodging Business Act in 2017, under which platform providers are regulated as private lodging agents serving as brokers for private lodging services between guests and private lodging business operators (typically, landlords and lessees).
In general, platform providers are not regulated in Malaysia. Only telecoms operators which carry out the functions of Network Facilities Providers, Network Services Providers, Applications Service Providers and Content Applications Service Providers are regulated as provided in the CMA.
Currently, there is no specific legislation or regulation in Malta relating specifically to digital platforms. However, the e-Commerce Directive (Directive 2000/31/EC), which was transposed into Maltese law in 2002 through the Electronic Commerce Act (Chapter 426 of the Laws of Malta), creates a basic legal framework for online services including platform providers such as social media, content sharing and information search engines.
There is no specific regulation of platform providers. General consumer protection and privacy laws apply (e.g. Fair Trading Act 1986 ("Fair Trading Act"), Consumer Guarantees Act 1993 ("Consumer Guarantees Act"), and the Privacy Act 1993 ("Privacy Act")).
New Zealand consumer law applies to goods or services provided to people in, or business carried on in, New Zealand. The Commission can regulate such activities, and in doing so can initiate enforcement action against residents of other countries. The Privacy Act is discussed below.
The Harmful Digital Communications Act 2015 applies to online content hosts (including any organisation that hosts websites or social media platforms in New Zealand). Online content hosts may be civilly or criminally liable for the content on their websites unless they follow a prescribed process, which requires complaints to be received and dealt with in a prescribed way.
Platform providers might be subject to a variety of laws and regulations depending on the types of activities in which they engage. Laws such as consumer protection law, copyright law, competition law and the Indonesian Criminal Code might apply to platform providers. Generally, however, the law most relevant to platform providers is Law 11/2008 and its implementing regulations.
Law 11/2008 and its implementing regulations specifically apply to all Electronic System Providers ("ESPs"), a definition under which all platform providers fall, and to any activities conducted in an electronic system. Law 11/2008 sets out general rules on electronic system certification, privacy and domain names, as well as prohibitions on certain acts committed electronically (cybercrime).
In addition to the above, the MCI has also issued Circular Letter of the Minister of Communication and Informatics No. 3 of 2016 on the Provision of App-based Services and/or Internet-based Content (OTT) ("Circular Letter 3/2016"). Circular Letter 3/2016 is not legally binding on the public; however, it should be regarded as the MCI's formal position on matters relating to OTT in Indonesia. To provide legal certainty to OTT players in Indonesia, the MCI is currently preparing a regulation that will further govern OTT activities in Indonesia.
Generally, the above-mentioned platform providers (i.e. social media, content sharing, and information search engines) are not regulated. However, the PTA, being the telecom regulator in Pakistan, implements policies to block websites with blasphemous, un-Islamic, offensive, objectionable, unethical or immoral material. In this regard, the PTA, as and when directed by the Federal Government, can direct/require its licensees to implement IP/URL blocking/filtering protocols.
That said, under the Prevention of Electronic Crimes Act, 2016 (the "PECA"), whoever with dishonest intention (i) gains unauthorised access to any information system or data, (ii) without authorisation, copies or otherwise transmits or causes to be transmitted any data, (iii) interferes with or damages, or causes to be interfered with or damaged, any part or whole of an information system or data, or (iv) interferes with or damages, or causes to be interfered with or damaged, any part or whole of a critical information system or data, shall be punishable with imprisonment.
It is pertinent to note that the provisions of PECA are not only specific to the licensees (including MNOs) of PTA but the scope of PECA extends to every citizen of Pakistan, wherever he may be, and also to every other person for the time being in Pakistan. The same also applies to any act committed outside Pakistan by any person; whereby the act constitutes an offence under PECA and affects any (i) person, (ii) property, (iii) information system, or (iv) data, in Pakistan.
For the purposes of the foregoing:
(i) The term ‘information system’ includes, electronic system for creating, generating, sending, receiving, storing, reproducing, displaying, recording or processing any information;
(ii) The term ‘data’ includes, any representation of fact, information or concept for processing in an information system including source code or a program suitable to cause an electronic system for creating, generating, sending, receiving, storing, reproducing, displaying, recording or processing any text, message, data, voice, sound, database, video, signals, software, computer programs, any forms of speech, sound, data, signal, writing, image or video, to perform a function or data relating to a communication indicating its origin, destination, route, time, size, duration or type of service.
Further, PECA provides that a service provider shall, within its existing or required technical capability, retain its specified traffic data (data relating to a communication indicating its origin, destination, route, time, size, duration or type of service) for a minimum period of one year or such period as PTA may notify from time to time and, subject to the production of a warrant issued by the court, provide that data to the investigation agency or the authorised officer whenever so required.
For the purpose hereof, a ‘service provider’ includes a person who:
a) acts as a service provider in relation to sending, receiving, storing, processing or distributing any electronic communication, or the provision of other services in relation to electronic communication through an information system;
b) owns, possesses, operates, manages or controls a public switched network or provides telecommunication services; or
c) processes or stores data on behalf of such electronic communication service or users of such service.
Service providers are required to retain traffic data in accordance with all of the PECA's data retention requirements, preserving the data in its original form.
At present, there is no Romanian framework specifically targeting platform providers.
However, EU regulations such as the GDPR also apply to platform providers insofar as the collection, storage and processing of personal data are concerned.
Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (the "Copyright Directive") imposes specific obligations on online content-sharing service providers that perform an act of communication to the public (or of making available to the public) for the purposes of the Copyright Directive when they give the public access to copyright-protected works or other protected subject matter uploaded by their users. Member States are under an obligation to transpose the Copyright Directive by 7 June 2021.
Furthermore, as part of the digital market, there is currently a Proposal for a Regulation of the European Parliament and of the Council on promoting fairness and transparency for business users of online intermediation services undergoing the legislative process at EU level. The proposal essentially lays down rules to ensure that business users of online intermediation services and corporate website users in relation to online search engines are granted appropriate transparency and effective redress possibilities.
In addition, the European Commission is carrying out an in-depth analysis of algorithmic transparency and accountability. This project is aimed at providing for an in-depth policy-relevant study of the role of algorithms in the digital economy and society, in particular how algorithms shape, filter or personalize the information flows that they intermediate.
The European Commission has also issued the Recommendation on measures to effectively tackle illegal content online of 1 March 2018. The Recommendation encourages Member States and hosting service providers (in respect of content provided by content providers, which they store at the request of those content providers) to take effective, appropriate and proportionate measures to tackle illegal content online ("illegal content" meaning any information which is not in compliance with EU law or the law of a Member State concerned).
In any case, platform providers carrying on business in Romania must comply with other relevant areas of law imposing specific obligations and standards, such as the rules governing electronic commerce, consumer protection, misleading advertising, audio-visual services (where applicable), competition, etc.
Currently, platform providers are subject to limited regulation only. Under the TBA, platform providers are classified as value-added telecommunications business operators, and thus must either report to or register with the MSICT depending on the nature of the services they provide (see 1.2(2) above for more details). However, the MSICT does not often strictly enforce these reporting and registration requirements applicable to value-added telecommunications service providers, since the online service market is evolving rapidly, almost on a daily basis.
Also, platform providers qualify as telecommunications business operators or “information and communications service providers” (i.e., persons who provide information/data services over a telecommunications network) for the purposes of the Network Act. So, for example, platform providers’ protection of their customer/user information would be subject to the relevant requirements of the Network Act.
In Spain, there is as yet no regulation that specifically targets platform providers such as social media or content sharing services. However, video-sharing platforms will soon be covered by regulation, given the recent amendment of the Audiovisual Media Services Directive (Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010) by Directive (EU) 2018/1808, which establishes a specific audiovisual regulatory framework for those platforms. Member States have until 19 September 2020 to adopt the laws, regulations and administrative provisions necessary to comply with that directive.
On the other hand, Law No. 34/2002 of 11 July 2002 on Information Society Services and Electronic Commerce (hereinafter "the E-commerce Act") establishes the liability regime applicable to the intermediation activities carried out by Internet Service Providers ("ISPs"), which applies to platform providers. The E-commerce Act states that linking and hosting ISPs will not be liable for the information to which they direct or which they host if: (i) they do not have actual knowledge that the information is unlawful and (ii) where they do obtain such knowledge, they act diligently to remove or disable access to the content.
That said, the recently adopted Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market (hereinafter "Copyright Directive") places more direct responsibility on platforms to ensure that copyright-infringing content is not hosted on their sites. Specifically, Article 17 of the Copyright Directive states that online content-sharing service providers (a category into which YouTube or Facebook would fall) must obtain from rights holders a licence to communicate to the public the content protected by intellectual property rights that their users upload. Where a platform does not have the corresponding licence, it must adopt measures to prevent infringing content from being available on it, which in practice will involve resorting to content recognition technologies. This has been one of the most controversial issues of the Copyright Directive, so we will have to wait and see how it develops in practice and how it is implemented by Member States, which have until 7 June 2021 to do so.
In addition, other measures regarding the monitoring of online content that will affect platform providers are likely to be implemented at European level in the coming months. For example, there is an EU regulation proposal on "preventing the dissemination of terrorist content online" that would force platforms to remove terrorist content within one (1) hour; otherwise they could face substantial economic penalties.
Finally note that there are other regulations that indirectly affect platform providers. For example, a platform provider whose services fall within the scope of Regulation (EU) 2017/1128 on cross-border portability of online content services in the internal market shall comply with the obligations underlined in such regulation, which mainly consist in enabling a subscriber who is temporarily present in a Member State to access and use the online content service in the same manner as in the Member State of residence, including by providing access to the same content, on the same range and number of devices, for the same number of users and with the same range of functionalities.
Yes, the Bulletin Board System Act (Sw. Lag (1998:112) om ansvar för elektroniska anslagstavlor) and the GDPR are applicable. As a starting point, the person who publishes something in social media that is regarded as personal information is responsible for the personal information that the publication entails. The company that provides the platform may also be liable if the company has the ability to influence posts or determine which posts shall be published.
In some cases, a publication may be covered by the so-called private exemption in the GDPR. According to the private exemption, the GDPR shall not apply to the processing of personal data carried out by a natural person in the course of a purely personal or household activity. If a person publishes personal data for a wider audience, for example by publishing pictures or other things in social media, then it is not to be considered a matter of purely private nature. This means that the private exemption does not apply and the person who publishes becomes the data controller for the publication.
According to the Bulletin Board System Act the provider/administrator of an electronic bulletin board is required to provide information to anyone who uses the service about the provider’s/administrator’s identity and the extent to which incoming messages become available to other users. The provider/administrator should also have such oversight of the service that is "reasonably required with regard to the scope and direction of the business". According to the law, the provider of an electronic bulletin board is also responsible for removing or otherwise preventing messages from spreading if:
- the content obviously constitutes an unlawful threat, unlawful violation of personal integrity, incitement, agitation against an ethnic group, a child pornography offence or unlawful depiction of violence, or
- it is evident that the user has infringed copyright or rights protected by copyright law by submitting the message (e.g. attaching copyrighted material).
If the provider/administrator is responsible for electronic message boards and a message is posted that contains, for example, material infringing copyright or racist statements, the provider/administrator is obliged to remove it as soon as possible. If the provider/administrator does not do so within "reasonable time", the provider/administrator may be held liable for violations of the Bulletin Board System Act, which may result in a fine or even imprisonment.
Platform providers providing services in Taiwan must comply with all general legal requirements under Taiwan law, including without limitation laws and regulations regarding consumer protection, personal data protection, competition, etc. Currently, there are no laws or statutes specifically drafted to regulate platforms. One or two draft bills that would regulate digital platforms have been introduced, but they are still pending before the Legislative Yuan.
Meanwhile, if a platform engages in certain business subject to sectoral regulations, it will need to comply with those regulations. For example, Uber has been regulated as a "transportation" business in Taiwan, and the Taiwan government has been imposing unfriendly restrictions on Uber's business model. For sharing economy platforms delivering meals, such as Food Panda, Deliveroo and Uber Eats, the regulator in charge of the "restaurant" business has been trying to regulate them as part of the food industry.
Yes. Law No. 5651 regulates access-ban procedures for Internet content and determines the obligations and responsibilities of content providers, hosting providers and access providers. Search engines are not defined under the law.
Platform providers in the UK are regulated to an extent by the Electronic Commerce (EC Directive) Regulations 2002, which implement Articles 12-15 of EU Directive 2000/31/EC. The regime applies to content which appears on platforms but with respect to which the platform operator performs only certain technical functions (services whose primary function is hosting content contributed by others). The legislation also imposes obligations on a seller before a contract is formed and specifies information that must be provided to the consumer.
The Statutory Code of Practice for providers of online social media platforms has been published in accordance with Section 103 of the Digital Economy Act 2017. The Code provides guidance for social media platforms, in advance of the new regulatory framework envisaged in the Online Harms White Paper. It sets out actions that the Government believes social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviours on their sites.
Platform providers are subject to the regulations of the FCC, but are not regulated entities after the repeal of the net neutrality rules.