If a software program which purports to be an early form of A.I. malfunctions, who is liable?
Currently, liability for malfunctioning artificial intelligence programs may be based on tort, in which liability is found only where the developer was negligent or could foresee harm, or on contractual agreements such as End User License Agreements, in which the parties allocate the risk of liability among themselves.
The CCA provides that any producer shall be liable for the damage caused by a defect in his product. The term ‘producer’ has several meanings assigned to it by the CCA, including: ‘the manufacturer of a finished or processed product’; ‘manufacturer of a component part’; or any person who imports into an EEA State a product for sale, hire, leasing or any other form of distribution, where the product is manufactured or produced outside a Member State. The injured party shall solely be required to prove the damage, defect and causal link between the defect and damage and shall not be required to prove fault on the producer’s part. A supplier may be deemed to be a producer in the event that the producer or the importer of an imported product cannot be identified and the supplier fails to answer the injured party’s request to provide the identity and full address of the producer in question, within the stipulated time period.
When liability is a question between the software provider and a purchasing party, the allocation of liability would primarily depend on the contract between those parties and/or mandatory consumer protection law, where applicable. If no such regulation exists, or if the question of liability concerns a third party, the allocation of liability would be decided under Act no. 27 of 13 May 1988 on the Sale of Goods, non-statutory contract or tort law, or other applicable laws and regulations.
The choice of applicable law depends on how the software was provided, how and when the malfunction manifested itself, the consequences of said malfunction and an assessment of the respective parties’ degree of culpability.
For instance, if a company or consumer purchased the software on a physical disk, the Sale of Goods Act or Act no. 34 of 21 June 2002 on Consumer Protection would usually apply. Under these Acts, a software malfunction would normally constitute a defect that could give grounds for the purchasing party’s claims of repair, replacement, price reduction etc. While compensation may also be claimed in such instances, the aforementioned Acts exclude compensation for indirect losses.
While there is no specific legislation on A.I. in Turkish law, the owner and the producer of the software program would be held liable by applying “strict liability” rules under the Turkish Code of Obligations by analogy.
Under current law, AI developers and operators may be liable for AI malfunctions. When an AI malfunctions, the AI operator may be liable in the first instance, and the AI user may seek liquidated damages and other remedies under the user contract; the AI developer may ultimately be liable if its breach of the AI development contract caused the malfunction. Under the Consumer Protection Law, when the AI user’s legitimate interests are infringed due to an AI malfunction, the user may claim compensation from the AI operator, and the operator may in turn claim compensation from the developer if the developer is to blame for the malfunction. Where the AI user suffers personal injury or property damage due to a defect in the AI, then, in accordance with the Tort Liability Law, the user may seek damages from either the AI operator or the developer in the first instance, and the operator or the developer may be liable entirely or proportionally according to their respective fault in the malfunction.
Although liability regarding the use of artificial intelligence has not been addressed in a specific manner in Mexico, it is important to mention that Mexico, under the Federal Civil Code, follows the general rule of “strict liability”. This means that if artificial intelligence is understood to be a mechanism or instrument that is itself dangerous, the person using the artificial intelligence software program is obliged to repair the damage caused by such software program, even if that person does not act in an unlawful manner, unless that person proves that the damage was a consequence of the inexcusable fault or gross negligence of the injured party.
Ordinarily, there will be strict liability on the producer of defective products for consumers (Consumer Protection Act 1987), and this would include products which are themselves software or which include software components. Where such defects have resulted from computer-assisted design or other software-assisted processes, it will ordinarily be the person who programmed the CAD tool who will then face liability. However, this is all predicated on a principle of causal connection, ie "Because of A + B, C necessarily came next". When the software starts to make decisions for itself based upon its "learning" from what it observes/receives from external sources, and so ceases to be predictable, this liability concept becomes more strained. However, for the time being at least, it would seem most likely that the licensor/programmer of the A.I. product would be liable pursuant to the strict liability regime in the Consumer Protection Act as referred to above. In the business (ie non-consumer) context, contractual provisions will usually specify where liability will sit in any event.
Currently the Romanian national legal framework does not contain any explicit provisions with regard to any form of A.I. Therefore, the general rules on civil contractual liability and tort law, as well as administrative and criminal liability would apply on a case-by-case basis, depending on the specific circumstances of the case.
In Italy, there is not yet a specific piece of legislation dealing with issues relating to A.I. liability. Based on the general principles of Italian tort and contractual law, depending on the specific circumstances of the case, if the malfunction causes damages to a third party, there could be liability of both the provider of the A.I. solution and the person using the A.I. solution when it caused the damage.
Under certain circumstances, the supply of a software program can qualify as a sale of goods. The buyer can claim breach of contract if the software program is not in conformity with the agreement. The software program is not in conformity with the agreement if it does not have the qualities that the buyer, given the nature of the object and the statements of the seller about it, could have expected on the basis of the agreement. In case of a non-conforming delivery, the seller can be held liable vis-à-vis the buyer for malfunctions in the A.I. functionality.
Liability can also be based on general rules of unlawful conduct or certain strict liabilities. Art. 6:162 of the Dutch Civil Code qualifies an unlawful act as – amongst others – an act or omission in violation of a duty of care. If a person/entity fails to observe a duty of care and as a result brings certain risks into existence, said person/entity can be held liable for damages if said risks actually materialize. In relation to A.I. software programs, a duty of care rests on a number of parties, such as the creator, the reseller or the party actually using the A.I. software. The Dutch Civil Code also provides for certain types of strict liability which can apply in case of A.I. software malfunctions. In this respect, reference is made to the strict liability of the possessor of a movable thing if the A.I. software forms part of such a movable thing (i.e. as part of a robot, self-driving car etc.) (art. 6:173 of the Dutch Civil Code). Strict liability can also arise from the EU Product Liability Directive 85/374/EEC as implemented in articles 6:185 et seq. of the Dutch Civil Code.
Under the Consumer Code, product liability is based on a strict liability regime, and any entity that participates in the chain of development, distribution and/or offer of the product is jointly and severally liable for any product defect or malfunction. Therefore, all the entities of the production chain may be subject to liability. The main causes of action that may trigger liability relate to defects in the products and services, failure to provide clear information to consumers on the risks and limitations of the products and services, and misleading advertising. Therefore, liability may be triggered if the customer does not receive all information on how the A.I. works and any possible malfunctions/risks associated with the product or, in any case, if the product is deemed, by its nature, defective.
As for the Civil Code regime, which is usually applicable to contracts between corporations, liability is imposed on the entities that caused or contributed to the damage; in this scenario, joint liability may only be imposed based on express statutory or contractual provisions. In any event, the Civil Code contemplates the “theory of risk”, imposing strict liability on any service provider that offers services deemed to expose people to an unreasonable and unexpected risk.
As Indonesian law has yet to regulate artificial intelligence (“A.I.”), reference must be made to Law No. 8 of 1999 regarding Consumer Protection (April 20, 1999) (the “Consumer Protection Law”). Under the Consumer Protection Law, an entrepreneur shall be liable for any damages sustained by the consumer due to the goods and/or services produced or traded by such entrepreneur. As such, in the event of an A.I. malfunction, the entrepreneur shall be liable for all damages suffered by the consumer of such A.I.
It may be understood that any liability arising out of software programmes, whether such programmes appear to be in the form of artificial intelligence (“AI”) or not, may be addressed under tort law, criminal law, intellectual property law and the IT Act. Under the IT Act, if a software programme is designed in such a manner or is intended to be used for “hacking” or “identity theft” or for other computer-related offences as prescribed in the IT Act, then the creators of the software programme shall be held liable and accordingly be punished with imprisonment and/or a fine. Further, if a service provider using computer programmes, in the course of performing a contract, discloses an individual’s personal information, including sensitive personal data or information, without such individual’s consent as a result of a programme malfunction, then such service provider may be liable for penalties under the IT Act as specified in the response to Question 9 above. Again, the creators of the software programmes shall be held responsible under tort law for any nuisance caused or for any negligence by engineers of such software programmes. Further, where a software programme is premeditated to be used for the commission of criminal offences, the creators of the software programme would be held liable under criminal laws. As far as intellectual property laws are concerned, the Copyright Act, 1957, as specified in the response to Question 5 above, includes software programmes under the definition of “literary work”. Further, as specified in the response to Question 5, under the Patents Act, 1970, if the processes associated with the creation of computer programmes or their functionality make them novel, inventive and capable of industrial application, such processes are patentable. If a software programme infringes or misappropriates a third party’s intellectual property, the owners of the software programme will be held liable.
In case of bodily injury, several laws stipulate liability in different circumstances notwithstanding fault. For example, under the Liability for Defective Products Law, 1980, the manufacturer of a product will be liable for any bodily injury caused by a product malfunction. Likewise, the Road Accident Victims Compensation Law, 1975 imposes full liability on the driver of a vehicle for any bodily injury. However, it may be unclear who would be considered the driver.
With respect to damages other than bodily injury, different regimes will apply in accordance with contractual undertakings and applicable tort law theories (e.g., negligence and breach of duty of care).
If the A.I. software is sold to a consumer and subsequently malfunctions, the issue of liability may be governed by consumer protection laws such as the Sale of Goods Act (Chapter 393) and the Unfair Contract Terms Act (Chapter 396). The Sale of Goods Act imposes several implied terms which cannot be excluded by contract when dealing with consumers. These include implied conditions or warranties regarding title and lack of encumbrances, correspondence with description, satisfactory quality and fitness for purpose. Therefore, the A.I. software provider will be liable for any malfunction that results in a breach of these mandatory implied terms.
Otherwise, the issue of liability will generally depend on the contractual agreement between the A.I. software provider and the software user.
As of today, a general principle is that a person shall be considered as liable not only for damages he/she causes out of his/her own act, but also for those caused by things under his/her custody (Civil Code, art.1242). The scope of application of this principle may decrease as things that are prompted by Artificial Intelligence move away from the custody of their owner, as is the purpose of AI. Disputes may then arise as to who is liable: the owner of the thing, the designer, the software programmer, or the user from whom the thing learned how to behave…
In furtherance of this liability principle and following EU legislation, a strict liability regime was enacted in 1998, which applies to the producer of a product in respect of damages caused by a defect in his product. This liability applies irrespective of whether the producer is bound to the victim by contract (Civil Code, art. 1245 et seq.). Strict liability makes things easier for the victim, who may sue the manufacturer, a supplier of individual parts or, ultimately, the reseller of the product. The victim must only prove the lack of safety of the thing.
This liability regime might address situations where AI is embedded in or associated with pieces of equipment or hardware, but less so in respect of a software program itself, insofar as this is not (yet) considered a ‘product.’ In addition, risks that arise after fabrication and sale, due to the AI’s autonomous learning curve, will ultimately cease to be predictable and can hardly be attributed to the initial designer.
In this context, the liability regime in respect of artificial intelligence seems likely to evolve.
The liability for malfunctions of a software program which purports to be an early form of A.I. remains unsettled in German law. Three different approaches are discussed amongst legal scholars. One opinion attributes liability to the operator according to sections 280, 823 BGB. Another opinion would solve the problem with a new regulation on strict liability, independent of negligence and intent, similar to product liability. A third idea is to create a distinct legal entity for A.I. – the so-called “e-person” – as a counterpart to natural and legal persons.
There are no specific rules as regards A.I. functionality under Swiss law. From a civil law perspective, technology-neutral general Swiss liability law is applied, with the main sources of law being the Code of Obligations as regards fault-based and contractual liability, as well as the Federal Road Traffic Act of 19 December 1958, as amended (RTA), and the Federal Law on Product Liability of 18 June 1993, as amended (PLA), which address strict liability. The decisive factor for any liability is to whom the unlawful conduct is attributable. Only individuals or legal entities may be liable, while liability of a machine or A.I. functionality is excluded, meaning that liability for the operation of autonomous systems (including A.I. functionalities) must always be based on the act or omission of a person, irrespective of an integrated software’s capability to amend the underlying software code. A similar logic applies from a criminal law perspective: only individuals (and not machines) can be primarily criminally liable pursuant to the Swiss Criminal Code of 21 December 1937, as amended (SCC), meaning that an individual must have caused (by act or omission) an unlawful offense (e.g. a bodily injury or damage to an object) wilfully or negligently. A subsidiary liability of legal entities may apply if it is not possible to attribute an act or omission to an individual due to the inadequate organisation of such legal entity. Sanctions of up to CHF 5 million may be imposed for a felony or misdemeanour attributed to such legal entity.
Artificial intelligence is not regulated in Ecuador; therefore, in case of malfunction of a software program, liability will depend on the obligations that each of the parties agreed to in the contract, based on the principle of party autonomy. As such, if the contract determines that the software developer is responsible for its functioning, it is the developer who will be liable for compensating the damages caused by the malfunctioning of the system.
Accordingly, each contract will determine the degree of responsibility agreed to or even a limitation to it.