[This is part three of a three-part exploration of Artificial Intelligence (AI), where I explore the pop-culture effect on the human perception of “intelligence” – artificial and otherwise. The first two parts can be found here, and here.]

“By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.”

–Eliezer Yudkowsky

Artificial Intelligence is already here, but we don’t normally call it AI. It is in my phone, and it wakes up, hands free, whenever I bark “COMPUTER,” just like Montgomery Scott in Star Trek. It’s not perfect. It doesn’t listen unless I use a very precise command. Sometimes it forgets to listen altogether and I have to use my thumbs again.


AI is ubiquitous: it lives in our toys. It lives in the back-ends of web pages that help us contextualize our searches, often hilariously. It lives in our cars, and can notify first responders of an accident, even if the driver can’t.

Rudimentary AIs – like the ones that act as back-ends for consumer-facing apps – are just the beginning. If the information age is about anything at all, it is about collecting data. When we leave a review on Yelp, or “like,” share, or comment on a Facebook picture or story, we generate big data. Big data knows when you click through to another page, order up a ride from Lyft or Uber, or book a last-minute flight to New Orleans. It knows which TV shows you binge on, and where you prefer to buy ice cream. Right now, this awareness lives in a conceptual space where it possesses only potential utility. It could be used to offer you digital coupons, for example, or even recommend a hotel based on your itinerary. It won’t do so unless you specifically engage a particular app and ask it nicely first*, but the potential is there, and it’s growing.

Big data has two fundamental problems right now, and neither is related to mid- or long-term storage costs. The first is that each of us leaves a digital breadcrumb trail wherever we go, both in meatspace and in cyberspace, creating massive piles of data. The second is that it’s DATA, and most humans find complicated spreadsheets boring**.

“You want to protect the world, but you don’t want it to change.”

–Ultron (Avengers Age of Ultron)


I opened an advertisement I received in the mail this morning, from King Soopers, a local grocery chain here in Denver (and part of the Kroger family of stores). Inside were coupons, based on some recent purchases I made using our store discount card. One of my side gigs is with Instacart, and I am encouraged to use my card whenever I shop for customers at King Soopers. In theory, this means I always get about a dollar off, per gallon, whenever I fill my tank at a King Soopers gas station. In reality, it means I now get coupons for adult diapers and organic raspberries.

Big data hasn’t quite figured me out, though I suspect it already knows more about me – in the aggregate – than I would be prepared to share publicly. Most of that information is only useful after the fact, and then only if it’s not based on a faulty premise. I’m getting older, no doubt, but I don’t yet have use for adult diapers, or organic raspberries.

Doctors use another system, called Modernize Medicine, which aggregates information related to the care and treatment of human diseases and conditions. As far as I know, Modernize Medicine hasn’t recommended me for any work-camps yet, but I’m sure it’s analyzed the data….

Jokes aside, applications that can interpret data and create moments of insight are the holy grail of big data collection. It is only a matter of time before they – like the websites before them – begin to “talk” to one another in a meaningful way. That won’t necessarily make them “smart,” of course, but they will probably be sold that way. The underlying intelligence required to allow for networked databases is low: something on the order of a single pig. But the groundwork has been laid for a smarter, “general AI” which uses networked data to provide a kind of general intelligence about anyone, anytime. Such an application would be incredibly useful to me, obviously, but the truth is, it would be even more useful to those who want to sell me experiences throughout my life.

“The Revolution will be personalized. All Judgement days are personal days.”

–Muah Haha (The Complete Book of Keys)

The problem with pop-culture and artificial intelligence is that pop-culture is about stories, and stories need conflict like meatbags need O2. We have been conditioned to watch for signs of a robot rebellion – through our stories – but we are trained to overlook issues and problems that tend to develop over a longer story arc, because they are above our pay grade.


We are wired to use tools to augment our intelligence. We were wired to do so when our first common ancestor used a stick to pry tree bark open to get at the juicy source of protein hiding behind it. Our entire history as a species is about how we utilized our tools to augment our strength, intelligence and senses, to better adapt to the world around us.

When our personal digital assistants evolve to the point where each has access to most, if not all of the databases where our breadcrumb trails are stored – in real time – we will have effectively augmented our intelligence again, using a variation of our first tool.

By the time a general AI achieves human-like levels of awareness, that awareness will be so intertwined with our own that autonomous drones, and Matrix-like agents will become superfluous.

Keeping the AI outside of ourselves is just a stop-gap measure. The real goal is in integration with the data.

In reality, there is very little difference between a mechanical robot and a human body. The intangibles – whatever you are conditioned to believe in – are in the payload. The payload is where most of the potential utility actually lives.

Think about the modern phone. Most of us carry ours wherever we go, and are fairly lost without it. Our dependency has fostered countless memes, and some of them are even true. There’s nothing shocking about this condition, however. In the span of 10 years, we have replaced hundreds of everyday human tools with a single device. Some of those tools – like a relatively innate sense of direction – have been part of our kit for 100 thousand years or more. It – like our ability to remember seven- or ten-digit numbers – is, if I may paraphrase Roy Batty, “lost in time, like tears in the rain.”

The mega-tool which supplanted so many other tools isn’t going away soon. It won’t be replaced by nostalgia, or nicer public libraries. It may change shape a few times, but it is here to go, wherever we go.

In another 10 years it will live inside us: a couple of sensors, and a handful of chips. We will have effectively integrated our collective intelligence, and our frontier will shift again. That’s the way we grow.

Our AI will be as smart as we are before we know it, but we won’t call it AI, because it won’t be something separate from “us.” We will move the goalposts. It’s one of our best traits, actually.

We will use our tools to augment our intelligence, our strength, our reach, and our grasp, and we will always struggle to catch up to them. If it all seems to move too fast, take heart – every human generation before you felt exactly the same way about their own tools. The only difference now is the rate of change.

And there’s an app for that.


*Or just agree to the EULA, I suppose.

** This is why the humans who don’t are in charge of everything.

