What’s going on here?
The European Parliament approved a legal framework limiting AI’s role in society, presumably after spending an evening working through a dystopian film library.
What does this mean?
Decades of literature have told us that AI agents aren’t exactly driven by a desire to protect humanity. And according to today’s scientists and tech wizards, that concern isn’t reserved for the world of fiction. So members of the European Parliament have backed rules that ban certain uses of AI, like detecting emotions in workplaces and schools, making decisions in high-stakes situations like job applications, and scraping CCTV footage to build facial recognition databases. Any companies that go against the grain could be landed with fines of up to €35 million ($37.7 million) or (oddly specifically) 7% of their worldwide sales.
Why should I care?
Zooming out: Pandora’s box.
The European Parliament is mainly concerned about AI’s ability to replace humans in important decision-making roles, which could lead to misinformation, bias, and privacy breaches. But lawmakers want to find the sweet spot, where AI could enhance human life and companies’ productivity by automating certain tasks without, you know, violating basic rights. Plus, come down too hard and Europe could fall behind economically, not least because foreign companies facing lighter rules elsewhere might simply steer clear of the region.
The bigger picture: High standards cost.
The US is drafting its own set of regulations too, trailing behind China, where AI product approvals are already in place. But Europe has a history of sticking to the stricter side, whether the issue is privacy, the environment, food safety, or consumer rights. And since regulations come with costs that businesses have to foot, that partly explains why Europe’s stock markets tend to be valued lower than their stateside equivalents. So unless European AI rules turn uncharacteristically lax, don’t count on that gap closing anytime soon.