EU to unveil AI rules to fight Big Brother fears
The EU is set to unveil a proposal to regulate the sprawling field of artificial intelligence, with the aim of reassuring the public against “Big Brother”-like abuses.
The European Commission, the EU’s executive arm, has been preparing the proposal for over a year, with big tech companies worrying that the bloc’s definition of AI is too broad.
The draft regulation would make “generalized surveillance” of the population off limits as well as any tech that was “used to manipulate the behavior, opinions or decisions” of citizens.
Anything resembling a social rating of individuals based on their behavior or personality would also be prohibited, the draft said.
Military applications of artificial intelligence will not be covered by the rules, which will require ratification by EU member states as well as the European Parliament.
Infringements could, depending on their seriousness, bring companies fines of up to four percent of global turnover.
Google and other tech giants are taking the EU’s AI strategy very seriously as Europe often sets a standard on how tech is regulated around the world.
Last year, Google warned that the EU’s definition of artificial intelligence was too broad and that Brussels must refrain from over-regulating a crucial technology.
The EU’s increasing ESG regulation and implications for business
The European Union (EU) is currently at the vanguard of environmental, social and governance (ESG) measures.
Two areas of development in particular are likely to have widespread repercussions for businesses: newly implemented obligations for ESG disclosures and likely forthcoming mandatory human rights, environmental and governance due diligence.
Their implementation is likely to have significant effects both for companies domiciled in the EU and for companies operating within it. Importantly, beyond compliance concerns, businesses will need to consider the attendant legal risks of publicly disclosing human rights and environmental risks in their operations and wider supply chain.
The EU's disclosure regime should be seen as the "first mover" among regulations of this kind, with similar measures under consideration in the U.S. and the UK. The UK has indicated that it will adopt the recommendations of the Task Force on Climate-related Financial Disclosures (TCFD) to make climate-related financial disclosures mandatory for certain firms by 2025.
The second area of prospective regulation concerns mandatory "due diligence" measures for human rights, environmental and governance concerns – essentially equivalent to ESG. The idea of due diligence legislation is linked to the UN Guiding Principles on Business and Human Rights, in which "human rights due diligence" refers to a process of assessing the actual and potential human rights impacts of a company's operations, integrating and acting upon the findings, tracking responses, and communicating how those impacts are addressed.
Mandatory due diligence measures coming out of Europe are likely to significantly affect how businesses approach ESG issues. These measures form part of a wave of consumer-focused regulation around ESG: rather than directly requiring businesses to change the way they work, the objective is for transparency obligations to prompt changes in business practices and to promote accountability, particularly for sustainability claims.
Those disclosures are likely to lead to increased scrutiny of businesses' decision-making around ESG issues, with implications for both legal and reputational risk. The best way for businesses to address these concerns – and to future-proof against upcoming due diligence legislation – is to take action on ESG issues and to comply with guidance such as the UN Guiding Principles on Business and Human Rights.