It is no secret that Google and Facebook – the two websites the collective psyche is quickly coming to treat simply as ‘the internet’ – use algorithms based on past behavior to determine the relevance of content to individual users. But at this year’s TED conference, MoveOn.org executive director Eli Pariser pointed out some unfortunate side effects of personalized content streams.
These days, when one searches Google, the search engine adjusts and filters its results using 57 signals gleaned from the user’s past online behavior. It’s a similar story on Facebook, which modifies a user’s news feed according to, among other behaviors, the types of links she most often clicks and the friends with whom she interacts most.
Unfortunately, Pariser pointed out, such algorithms that decide what we see (and subsequently what we think) based on relevance are, well, perhaps making us dumber. What such relevance-based algorithms do is narrow the field of information to which the user is exposed and, over time, deny her almost anything other than content that runs along the same lines as her own opinions, sources and points of view.
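To see why that narrowing happens, consider a toy simulation of the feedback loop Pariser describes. This is emphatically not Google’s or Facebook’s actual ranking code – the topic names, the multiplicative `boost` factor, and the sampling scheme are all illustrative assumptions – but it shows how a feed that rewards past clicks can collapse onto one interest:

```python
import random
from collections import Counter

# Hypothetical topic labels, not any real signal a search engine uses.
TOPICS = ["politics", "memes", "science", "sports", "world-news"]

def pick_story(weights, rng):
    """Sample one topic, favoring topics the user has clicked before."""
    total = sum(weights.values())
    r = rng.random() * total
    for topic, weight in weights.items():
        r -= weight
        if r <= 0:
            return topic
    return topic  # fallback for floating-point edge cases

def simulate_feed(clicks=200, boost=1.5, seed=0):
    """Assume every shown story is 'clicked', multiplying its topic's weight.

    Returns a Counter of how often each topic was shown.
    """
    rng = random.Random(seed)
    weights = {t: 1.0 for t in TOPICS}  # start with a balanced feed
    shown = Counter()
    for _ in range(clicks):
        topic = pick_story(weights, rng)
        shown[topic] += 1
        weights[topic] *= boost  # the relevance feedback loop
    return shown
```

Running `simulate_feed()` shows winner-take-all behavior: whichever topic happens to attract early clicks grows its weight exponentially and soon crowds out everything else, which is the “narrowing” effect in miniature.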
“A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa,” said Mark Zuckerberg to his staff as he introduced the then-new concept of Facebook’s News Feed. This line of thinking is precisely the problem Pariser is referencing: users may indeed favor icanhazcheezburger content or autotuned celebrity freakouts, but a content stream tailored to such click behavior could theoretically turn a meme fanatic into a veritable zombie.
I’m sure this isn’t part of some nefarious master plan by Google and Facebook to make us all complacent drones more likely to lie down and accept the imminent Google/Facebook world takeover. And the argument that technology is making us dumber isn’t, in any capacity, a new one.
“We really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility,” said Pariser to the Facebook and Google execs present at TED. “The thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did… We need to make sure that they also show us things that are uncomfortable or challenging or important.” And in the meantime – given Pariser’s insights – you can opt out of Google’s personalized search feature. As for Facebook, I’ve yet to find a way to de-personalize its News Feed.