Ten years ago, I realized media literacy meant something different
As media theorist John Culkin first observed, we shape our technologies at the moment of their conception, but from that point forward they shape us. We humans designed the telephone, but from then on the telephone influenced how we communicated, conducted business, and conceived of the world. We also invented the automobile, but then rebuilt our cities around automotive travel and our geopolitics around fossil fuels.
This axiom holds true for technologies from the pencil to the birth control pill. But computers, algorithms, and artificial intelligences add another twist: after we launch them, they not only shape us but also begin to shape themselves. We give them an initial goal, then give them all the data they need to figure out how to accomplish it. From that point forward, we humans no longer fully understand how an AI may be processing information or modifying its tactics. The machine isn't conscious enough to tell us. It's just trying everything, and hanging onto what works.
Researchers have found, for example, that the algorithms running social media platforms tend to show people pictures of their ex-lovers having fun. No, users don’t want to see such images. But, through trial and error, the algorithms have discovered that showing us pictures of our exes having fun increases our engagement. We are drawn to click on those pictures and see what our exes are up to, and we’re more likely to do it if we’re jealous that they’ve found a new partner. The algorithms don’t know why this works, and they don’t care. They’re only trying to maximize whichever metric we’ve instructed them to pursue.
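This kind of blind trial-and-error optimization can be sketched with a simple bandit algorithm. The sketch below is purely illustrative, not how any real platform works: the `engagement` function, its click rates, and the "jealousy-inducing photo" framing are all invented assumptions. The point is only that the algorithm converges on whatever option pays off, with no model of why.

```python
import random

def epsilon_greedy(reward_fn, n_options, steps=10_000, epsilon=0.1, seed=0):
    """Trial-and-error metric maximization: occasionally try options at
    random (explore); otherwise repeat whichever option has produced the
    highest average reward so far (exploit). The algorithm never models
    *why* an option works, only that it does."""
    rng = random.Random(seed)
    totals = [0.0] * n_options   # cumulative reward per option
    counts = [0] * n_options     # times each option was shown
    for _ in range(steps):
        if rng.random() < epsilon or 0 in counts:
            choice = rng.randrange(n_options)  # explore
        else:
            # exploit: pick the option with the best average so far
            choice = max(range(n_options), key=lambda i: totals[i] / counts[i])
        totals[choice] += reward_fn(choice, rng)
        counts[choice] += 1
    return counts

# Hypothetical "engagement" signal: option 2 (say, a photo of an ex) happens
# to get clicked slightly more often, for reasons the algorithm never sees.
def engagement(option, rng):
    click_rate = [0.10, 0.12, 0.18][option]
    return 1.0 if rng.random() < click_rate else 0.0

counts = epsilon_greedy(engagement, n_options=3)
print(counts)  # how often each option ended up being shown
```

After enough trials, the option with the highest click rate dominates the counts, even though nothing in the code encodes what that option depicts or why people click on it.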
That’s why the original commands we give our computers are so important. Whatever values we embed (efficiency, growth, security, or compliance, for example) will be the values they achieve. And they’ll do so by whatever means happen to work. Machine intelligences will be using techniques that no one, not even the machines themselves, understands. And they will keep honing those techniques to generate better results, and then…