AI’s biggest risk will be on full display during the 2024 election

Published 3:00 am Friday, January 26, 2024

As artificial intelligence continues to grow, so do calls to regulate the technology. Despite its potential for positive impact, numerous experts and legislators are raising concerns about the possible risks. Ian Bremmer, president and founder of The Eurasia Group, joined TheStreet to talk about the inherent risks of ungoverned AI as we head into the 2024 election.

Full Video Transcript Below:

SARA SILVERSTEIN: What's the biggest short-term risk for ungoverned AI?

IAN BREMMER: You know, I'm a huge enthusiast when it comes to AI. I believe that the level of productivity is extraordinary and the technology is moving very fast. You're going to see it used in every sector, in every company, and therefore you're not going to have really powerful companies and powerful individuals trying to stop it, which is what usually happens with a technological revolution. You get, you know, post-carbon energy and then all the coal and the oil people try to stop it, try to lobby against it. That's not happening in AI. So it's going to actually lead to a much bigger upside than people expect, much faster. But the technology is moving much faster than the ability to govern it. And that means that the negative externalities, which one would expect from such a transformative technology, are going to happen very quickly. And they aren't going to be contained or constrained.

What kind of negative externalities? Well, one obvious one is deepfakes and artificial intelligence used for disinformation. So, you know, in an election like the United States, where so much is at stake, where people are so angry, with so much chaos that could come, we're moving from disinformation to AI-driven disinformation. That's a very significant disruptive risk. Then there's also the question of what bad actors can do to just blow things up. So you use AI to code. It's very impressive. Use AI to create malware that's very dangerous and costly. Use AI to create vaccines. We love that. That actually got us out of COVID a lot faster than we otherwise would have. Use AI to create new viruses and new diseases. We don't like that so much. And as these incredible new AI tools roll out, tools that everyone has access to, some of which are open source, they will be used not just for productive purposes, but also by tinkerers and by bad actors. This is the first year we're going to start to see the negative impact of that more broadly.
