21 October 2011
Are Facebook’s and Google’s efforts to personalise our online experiences really a good thing? Or will they ultimately impose upon us a narrow perception of the world?
As online companies have evolved over the past decade, there has been a trend of tailoring their services to our individual tastes. Eli Pariser illustrated this concept at a recent TED presentation as a phenomenon he called online “filter bubbles”. This is where the algorithms driving online services (an algorithm is the maths that helps a program decide something) work to generate an online world tailored personally to you. The graphic he used to illustrate this is shown in this article. If the circles represent all the types and topics of information out in the internet-iverse, then your “filter bubble” only lets through the content it has defined as relevant to your interests.
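To make the idea concrete, here is a toy sketch of that filtering step. This is purely illustrative (the article names, topics and the `filter_bubble` function are all hypothetical, not any real platform’s algorithm): content only reaches your feed if it matches interests the system has already inferred about you.

```python
# Hypothetical example data; no real platform works exactly like this.
articles = [
    {"title": "Local football results", "topic": "sport"},
    {"title": "Friend's holiday photos", "topic": "friends"},
    {"title": "Famine relief appeal", "topic": "world-news"},
]

# Interests the system has inferred from your past clicks.
inferred_interests = {"sport", "friends"}

def filter_bubble(items, interests):
    """Keep only the items the algorithm deems 'relevant' to you."""
    return [item for item in items if item["topic"] in interests]

feed = filter_bubble(articles, inferred_interests)
# The world-news story never makes it into the feed.
```

The point of the sketch is simply that whatever falls outside the inferred interests is silently dropped before you ever see it.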
Eric Schmidt, the Chief Executive of Google, said “it will be very hard for people to watch or consume something that has not in some sense been tailored for them.”
This big world is getting smaller. So I guess this is another mighty step towards bringing the world’s bigness even closer to our daily lives. Hurray! It is now even easier to wade through the quagmire of information available and dig up the little gems we really care about.
However, let’s step back and look at this personalisation through a different set of glasses.
Mark Zuckerberg, the founder and CEO of Facebook, was asked how the Facebook News Feed determines the stories it displays. He responded:
“A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.”
Woah, dude that’s cold!
Unfortunately, Mark does have a point. When we’re online, what are we generally interested in finding? More often than not it’s what our friends are up to, how our football team performed on the weekend, or the fate of the unfortunate squirrel that found the decayed bag of nuts we threw out after cards night seven weeks ago. Even with deeper and more compelling information readily available to us, we rarely seek it out.
So are the online experience-tailoring algorithms created by the likes of Zuckerberg, Brin and Page intrinsically evil? Not really. As one of our articles pointed out a couple of weeks ago, rarely does an object have an intrinsic evilness about it; rather, it’s the use of the object, or the user, that determines its position on the “goodness-to-evilness scale”.
So the fault cannot be laid solely at the feet of platforms like Google and Facebook. Instead it is our own human nature that prioritises poisoned squirrels over the plights of a broken world.
So now we throw up our hands and concede that, as crude as Mark’s comment was, it’s true: we (mankind collectively) are selfish and don’t really care.
Hmmm, maybe not! Human nature or not, the fact remains that Mark’s comment is… well… cold, and I, for one, want more.
You can either “Be in Control or Be Controlled”. Off the cuff, it’s easy to think “Being Controlled” refers only to the algorithms that are becoming the gatekeepers of the internet. A responsibility certainly still rests on the shoulders of the online giants to make sure these algorithms don’t shelter us from a greater worldview. But really, they are just serving up information based on our past online activity. What I’m really driving at is that it’s ourselves, our own human nature, that needs the controlling.
Without even realising it, most people find themselves accessing self-indulgent information, basically because the present online framework encourages it. “Because I am interested in my interests, the algorithms show me more of these.” But to reach for something beyond ourselves requires us to stop seeking only our own gratification. In a way, it requires a level of self-control. The Bible lists self-control among the fruits of the Spirit, and I believe that what we encourage these “filter bubbles” to treat as relevant to our interests is a modern-day application of this Biblical principle.
This happens through the decisions we make on the topics we allow to hold our attention. It is important that we don’t allow ourselves to be insulated from new knowledge; otherwise we won’t be able to discover truth.
In a nutshell, my points are:
- we need to be aware of the self-perpetuating nature of major online platforms,
- we need to encourage the developers of these platforms to make sure the algorithms don’t shelter us from a greater worldview,
- and more than anything we need to “Be in Control” of ourselves so that we avoid insulation and leave the doors open to new knowledge and truth.
There needs to be a balance. I still believe it’s important to pursue content that we are personally interested in. I love checking the news and finding that Manchester United has flogged Chelsea 5-0. But for me to grow as an individual who can contribute to this world, my views, opinions, mindsets and knowledge must continually be challenged and exposed to things beyond my current understanding.
Left unchecked, the alternative is a world of people exposed only to elements of life already found within their existing psyche. And that can only result in an uneducated, self-indulgent and discriminatory society that will quickly find itself in a place darker than any it has previously escaped.
If you would like to watch the TED presentation by Eli Pariser, here it is.