It’s an emerging issue of our time and place. They know too much about us, and we know too little about what they know. We do know that what they know about us is increasingly determining what they choose to give us to read. We wonder: What are we missing? And just who is making those decisions?
Today, in 2012, those questions are more pressing in our age of news deluge. We’re confronted at every turn, at every finger gesture, with more to read or view or listen to. It’s not just the web: It’s also the smartphone and especially the tablet, birthing new aggregator products — Google Currents and Yahoo Livestand have joined Flipboard, Pulse, Zite, and AOL Editions — every month. Compare for a moment the “top stories” you get on each side-by-side, and you’ll be amazed. How did they get there? Why are they so different?
Was it some checkbox I checked (or didn’t?!) at sign-in? Using Facebook to sign in seemed so easy, but how is that affecting what I get? Are all those Twitterees I followed determining my story selection? (Or maybe that’s why I’m getting so many Chinese and German stories?) Did I tell the Times to give the sports section such low priority? The questions are endless, a ball of twine we’ve spun in declaring some preferences in our profiles over the years, wound ever wider by the intended (or unintended) social curation of Facebook and Twitter, and multiplied by the unseen but all-knowing algorithms that think they know what we really want to read better than we do. (What if they are right? Hold that thought.)
The “theys” here aren’t just the digital behemoths. Everyone in the media business — think Netflix and The New York Times as much as Pandora and People — wants to do this simple thing better: serve their customers more of what they are likely to consume so that they’ll consume more — perhaps buying digital subscriptions, services, or goods and providing very targetable eyes for advertisers. It’s not a bad goal in and of itself, but sometimes it feels like it is being done to us, rather than for us.
Our concern, and even paranoia, is growing. Take Eli Pariser’s well-viewed (500,000 times, just on YouTube) May 2011 TED presentation on “filter bubbles,” which preceded his June-published book of the same name. In it, Pariser takes on the fickle faces of Facebook and Google, making “invisible algorithmic editing of the web” an issue. He tells the story of how a good progressive like himself, a founder of MoveOn.org, likes to keep in touch with conservative voices and included a number of them in his early Facebook pages.
He then describes how Facebook, as it watched his actual reading patterns — he tended to read his progressive friends more than his conservative ones — began surfacing the conservative posts less and less. His main choices, the ones on that all-important first screen of his consciousness, became those of like-minded people; the rest were buried deeper in his datastream, not easily surfaced. Over time, he lost the diversity he’d sought.
Citing the 57 unseen signals Google uses to personalize its results for us, Pariser notes that it’s a personalization that doesn’t even seem personalized, or easily comparable: “You can’t see how different your search results are from your friends’… We’re seeing a passing of the torch from human gatekeepers to algorithmic ones.”
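To see how quietly that kind of editing can creep in, here is a toy sketch in Python of an engagement-weighted feed that boosts whatever you click and lets everything else fade. It is my own assumption-laden stand-in, not Facebook’s actual (and unpublished) ranking logic; every name and weight in it is hypothetical.

```python
# Toy illustration of engagement-based personalization (hypothetical, not
# any real product's algorithm): sources you click rise, the rest decay.

from collections import defaultdict

class PersonalizedFeed:
    def __init__(self, decay=0.9, boost=1.0):
        self.affinity = defaultdict(lambda: 1.0)  # per-source score, starts neutral
        self.decay = decay    # every source fades a little each ranking cycle...
        self.boost = boost    # ...unless you keep clicking on it

    def record_click(self, source):
        self.affinity[source] += self.boost

    def rank(self, stories):
        # stories: list of (source, headline); decay old affinities, then sort
        for source in list(self.affinity):
            self.affinity[source] *= self.decay
        return sorted(stories, key=lambda s: self.affinity[s[0]], reverse=True)

feed = PersonalizedFeed()
for _ in range(10):                      # a week of reading mostly progressive friends
    feed.record_click("progressive_friend")

stories = [("conservative_friend", "Op-ed A"), ("progressive_friend", "Op-ed B")]
print(feed.rank(stories))  # the conservative post sinks, without anyone asking
```

No one checked a box saying “show me less of the other side”; the reordering simply follows the clicks. That is the invisible edit Pariser is describing.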
Pariser’s worries have been echoed by a motley crew we can call algorithmic and social skeptics. Slowly, Fear of Facebook has joined vague grumbles about Google and ruminations about Amazon’s all-knowing recommendations. Ping, we’ve got a new digital problem on our hands. Big Data — now well-advertised in every airport and every business magazine as the new business problem of the digital age to pay someone to solve — has gotten very personal. We are more than the sum of our data, we shout. And why does everyone else know more about me than I do?
The That’s My Datamine Era has arrived.
So we see Personal.com, a capitalist solution to the uber-capitalist usage of our data. I’ve been waiting for a Personal.com (and the similar Singly.com) to come along. What’s more American than having the marketplace harness the havoc that the marketplace hath wrought? So Personal comes along with the bold-but-simple notion that we should individually decide who gets to see our own data, our own preferences, and our own clickstreams — and be paid for the privilege of granting access (with Personal taking 10 percent of whatever bounty we take in from licensing our stuff).
It’s a big, and sensible, idea in and of itself. Skeptics believe the horse has left the barn, saying that so much data about us is already freely available out there to ad marketers as to make such personal databanks obsolete before they are born. They may be forgetting the power of politics. While the FCC, FTC, and others have flailed at the supposed excesses of digital behemoths, they’ve never figured out how to rein in those excesses. Granting consumers some rights over their own data — a Consumer Data Bill of Rights — would be a populist political issue, for either Republicans or Democrats or both. But, I digress.
I think there’s a way for us to reclaim our reading choices, and I’ll call it the News Dial-o-Matic, achievable with today’s technology.
Personal.com gives us 121 “gem” lockers, from “Address” to “Women’s Shoes,” with data lockers for golf scores, beer lists, books, house sitters, and lock combinations along the way. But here we want to focus on news. News, after all, is the currency of democracy. What we read, what she reads, what they read, what I read all matter. We know we have more choice than any generation in history. In this age of plenty, how do we harness it for our own good?
Let’s make it easy, and let’s use technology to solve the problem technology has created. Let’s think of three simple news reading controls that could right the balance among our own choices, the social whirl, and technology. We can even imagine them as three dials, nicely circular ones, that we can adjust with a flick of the finger or of the mouse, changing them at our whim or by time of day.
The three dials control the three converging factors we’d like to see determine our news diet.
The first dial, My Sources, is the traditional title-by-title source list: deciding which titles, from global news media to local blogs, I want in my news flow.
The second dial, My Networks, taps social curation, one of the coolest ideas to come along. Why should I have to rely only on myself to find what I like (within or in addition to My Sources) when lots of people like me are seeking similar content? My Facebook friends, though, will give me a very different take than those I follow on Twitter. My Gmail contact list would provide another view entirely. In fact, as Google Circles has philosophized, “You share different things with different people. But sharing the right stuff with the right people shouldn’t be a hassle.” The My Networks dial lets me tune my reading of different topics by different social groups. In addition, the newly announced NewsRight — the AP News Registry spin-off intended to market actionable intelligence about news reading in the U.S. — could even play a role here.
The third dial tames the all-knowing, ever-smarter algorithm, which isn’t going away — and we don’t want it to. We just want to control it — dial it down sometimes. I like thinking of it in sci-fi terms, and The Borg from “Star Trek” well illustrates its potential maniacal drive. (I love the Wikipedia Borg definition: “The Borg manifest as cybernetically-enhanced humanoid drones of multiple species, organized as an interconnected collective, the decisions of which are made by a hive mind, linked by subspace radio frequencies. The Borg inhabit a vast region of space in the Delta Quadrant of the galaxy, possessing millions of vessels and having conquered thousands of systems. They operate solely toward the fulfilling of one purpose: to ‘add the biological and technological distinctiveness of other species to [their] own’ in pursuit of their view of perfection.”) The Borg knows more about our habits than we’d like, and we can put that knowledge to good use, but let’s have us be the ones doing the dialing up and down.
Three simple round dials. They could harness the power of our minds, our relationships, and our technologies. They could utilize the smarts of human gatekeepers and of algorithmic ones. And they would return power to where it belongs, to us.
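What might those three dials look like under the hood? Here is a minimal sketch, in Python, of how their settings could be blended into a single ranking. The dial names follow this piece; the scoring inputs, weights, and function names are my own assumptions, not anything Personal.com, Facebook, or any aggregator actually exposes.

```python
# A minimal sketch of the News Dial-o-Matic. Every field and weight here is
# hypothetical; the point is only that the user holds the three dials.

def dial_o_matic(stories, sources=0.5, networks=0.3, algorithm=0.2):
    """Blend three dial settings (each 0 to 1) into one story ranking.

    Each story dict carries three normalized scores:
      'source_score'  - does it come from a title on My Sources?
      'network_score' - how heavily are My Networks sharing it?
      'algo_score'    - how strongly does My Algorithm predict I'll read it?
    """
    def blended(story):
        return (sources * story["source_score"]
                + networks * story["network_score"]
                + algorithm * story["algo_score"])
    return sorted(stories, key=blended, reverse=True)

stories = [
    {"headline": "City council votes tonight", "source_score": 0.9,
     "network_score": 0.2, "algo_score": 0.4},
    {"headline": "Viral clip everyone is sharing", "source_score": 0.1,
     "network_score": 0.9, "algo_score": 0.8},
]

# Morning setting: trust my hand-picked sources, dial the algorithm down.
print([s["headline"] for s in dial_o_matic(stories, sources=0.7, networks=0.2, algorithm=0.1)])

# Evening setting: let the social whirl and The Borg take over.
print([s["headline"] for s in dial_o_matic(stories, sources=0.1, networks=0.4, algorithm=0.5)])
```

The design point is simply that the weights live on our side of the screen: turn the algorithm dial toward zero and The Borg goes quiet; turn My Sources up in the morning and the hand-picked titles lead.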
Where are the dials? Who powers them? Facebook, the new home page of our time, would love to, but so would Google, Amazon, and Apple, among a legion of others. Personal.com would love to be that center, as would any major news site (The New York Times, Zite-powered CNN, Yahoo News). We’ll leave that question to the marketplace.
Lastly, what are the newsonomics of the News Dial-o-Matic? As we perfect what we want to read, the data capturing it becomes even more valuable to anyone wanting to sell us stuff. Whether that gets monetized by us directly (through the emerging Personals of the world) or by some mix of publishers, aggregators, and ad networks would be the next battleground. And then: What about a fourth dial, one we turn up and down to signal what we’re in the marketplace to buy right now? Wouldn’t that be worth a tidy sum?