Targeted Algorithms and Complete Openness

August 20th, 2010

Eric Schmidt comments on where Google is going:

“I actually think most people don’t want Google to answer their questions,” he elaborates. “They want Google to tell them what they should be doing next.”

Let’s say you’re walking down the street. Because of the info Google has collected about you, “we know roughly who you are, roughly what you care about, roughly who your friends are.” Google also knows, to within a foot, where you are. Mr. Schmidt leaves it to a listener to imagine the possibilities: If you need milk and there’s a place nearby to get milk, Google will remind you to get milk. It will tell you a store ahead has a collection of horse-racing posters, that a 19th-century murder you’ve been reading about took place on the next block.

Says Mr. Schmidt, a generation of powerful handheld devices is just around the corner that will be adept at surprising you with information that you didn’t know you wanted to know. “The thing that makes newspapers so fundamentally fascinating—that serendipity—can be calculated now. We can actually produce it electronically,” Mr. Schmidt says.

Mr. Schmidt is a believer in targeted advertising because, simply, he’s a believer in targeted everything: “The power of individual targeting—the technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them.”

Targeted, location-based technology could prove incredibly useful to us, or it could change how we live our lives for the worse. I have a few fears about this kind of technology.

None of my fears is a concern for privacy as it is typically understood. I am less afraid of personal information leaking out, or of it being abused by a company, than I am of a society where complete openness is not only accepted but expected. There’s a balance between the public and the private that, I think, keeps us level as individuals. A private sphere, where the individual is free to act, speak and think without regard for, or even thought of, what others might think, allows us to consider things free from outside influence. By thinking and doing free from public interference, we can look at the outside world from an outside perspective (that is, from our own individual perspective) and judge it by our own standards. A private sphere is necessary for individuals to exist at all.

Once we are expected to share everything with the world, things as meaningful as our goals, hopes, fears and health, and as banal as what clothes we’re wearing or what we’re doing at any given moment, who we are as individuals becomes part of the public. Those things are no longer facets of our own personal selves, ours alone to consider, but characteristics, data, to be judged by the public. And once they become part of the public, individuals will look at themselves from the public’s perspective, judging themselves not by their own standards but by the public’s, because they now consider themselves part of the public. A love of science fiction is no longer a unique characteristic but an oddity, an abnormality, something wrong, because no one else shares it.

This already exists in a related form, and to some extent always has. Women who think they are overweight even when they aren’t have fallen into this trap: they have internalized the media’s (that is, the “public’s”) definition of beauty, and now judge themselves by someone else’s standards. In that case, the harm is internalization, taking someone else’s standards and substituting them for your own, allowing them into your private sphere. But complete openness is something different. When individuals share everything about themselves, they are not just internalizing; they are accepting the majority’s standard as the standard. They have become the public.

That is my first fear: that complete openness will erode not just our privacy but our conception of the individual, and with it our ability to think critically about the world. Once we identify with the public and no longer see it from an outside perspective, we will be less able to find its faults.

My second fear is that Schmidt is right about targeted recommendations, and they will become the main means of “discovering” new things. Perhaps these algorithms will be exceptionally good at identifying things that we like; that’s fine. But what worries me is a society where people do not seek new things out on their own, through their own effort.

When you are trying to find something (a new book, a movie, a band), you are actively engaged. You have a defined idea of what you are looking for, and you are thinking about how to find it. This requires that you, on your own, discover what it is that you like and want more of. You are defining your ideals and tastes for yourself. You are thinking.

If we rely on algorithms to do this for us, we are also ceding the right to think and do for ourselves, to make personal judgments. If we would rather an algorithm decide who we are, we are letting others define us and what we believe. Why not have an algorithm that decides what our political beliefs are, and votes for us accordingly?

That sounds hyperbolic, but I don’t find it much different from allowing an algorithm to decide what music we should listen to, what movies we should watch, what food we should eat, or what people we should date. Rather than simply accepting every new technology, we must be skeptical of it. We should calmly and rationally think through how it will affect us and whether that effect is positive.