Robo-cop: EU wants AI-harmed corporations to be accountable

The European Commission proposed new rules on Wednesday that would force companies making software and products with artificial intelligence to compensate people harmed by their work. A new AI Liability Directive would make it easier to sue for damages when a person or organization is injured or suffers financial loss because of AI-powered drones, robots, or software such as automated hiring algorithms.

“The new rules will give people who have been hurt by AI systems an equal chance and access to a fair trial and compensation,” Didier Reynders, the Justice Commissioner, told reporters before the proposals were presented.

The draft law is the latest attempt by European officials to regulate artificial intelligence (AI) and set a global standard for doing so. It comes as the EU negotiates the AI Act, which would be the first law in the world to restrict high-risk uses of AI, such as facial recognition, “social scoring” systems, and AI-powered software for immigration and social benefits.

“If we want consumers and users to really trust AI applications, we need to make sure that it’s possible for them to get real compensation and real justice if they need it, without too many barriers like the opaqueness of the systems,” Reynders said.

Robo-cop

Under the new law, people harmed by AI could sue the provider, developer, or user of an AI system if it damaged their health or property or violated their fundamental rights, such as the right to privacy. Because AI technology is complex and opaque, building such cases has so far been difficult and very expensive for people who believe an AI system has harmed them.

Courts would gain more power to pry open AI companies’ “black boxes” and demand detailed information about the data used to develop the algorithms, their technical specifications, and how they control risks.

With this new access to information, victims could show that the damage came from a tech company that sold an AI system, or that the AI’s user, such as a university, employer, or government agency, failed to comply with obligations under other European laws like the AI Act or the directive on platform workers. Victims would also have to prove that the damage was caused by the AI application in question.

The European Commission also presented a revamped Product Liability Directive. The 1985 law does not cover new product categories such as connected devices, and the new rules are meant to make it easier for consumers to be compensated when a faulty software update, upgrade, or service causes harm. Online marketplaces are also in the crosshairs of the proposed product liability rules: they can be held liable if they fail to disclose the name of a trader to an injured person who requests it.

EU wants AI-harmed corporations to be accountable

The Commission’s plan still needs approval from the EU Council, made up of national governments, and the European Parliament. Parliament may take issue with the Commission’s decision to propose a liability regime weaker than the one the chamber had previously called for.

In 2020, the chamber asked the Commission to draft rules ensuring that people hurt by harmful AI can obtain compensation. Specifically, it asked that developers, providers, and users of high-risk autonomous AI be held legally responsible even for unintentional harm. But the EU executive opted for a “pragmatic” approach weaker than such a strict liability regime, saying the evidence was “not enough to justify” it.

“We chose the least amount of help,” Reynders said. “We need to see if changes in the world call for stricter rules in the future.” After five years, the Commission will assess whether a stricter regime is needed.
