Recently, a colleague sent me a link that has stuck in my brain: a nine-minute TED talk by Eli Pariser on the concept of internet filter bubbles. Eli argues that while the internet was created to foster global openness between individuals, the optimization going on behind the scenes is quietly destroying it. Essentially, companies have worked so hard to develop algorithms that make sure you see what you like that we now live in a world where we see what they want us to see. That scares me. In his talk he gives an example of two people running the same Google search (the term was "Egypt") and getting completely different front pages. And as your usage history grows, what you see narrows, until you really only see a small portion of the internet.

This filtering, in essence, leads to a world where people believe they are in a free e-world, able to roam wherever they please, when in reality they cannot anymore. The sites we use are starting to hold us hostage with these behind-the-scenes methods of "observing" your behavior. I do agree that this type of optimization can give folks a positive product experience (because it drives efficiency, control, and some measure of confidence). But can I really find what I want? And do unemotional algorithms monitoring my clicks UNDERSTAND what I really want to do? If I am a person who worries about their health and likes to read up on the issues I may think I have, are we doing consumers a favor by continuously taking them to sites with that type of information, alongside ads for medicine to treat the problem? Wouldn't we do an even greater service by UNDERSTANDING over time that this consumer is afraid, and that they also need to find sites that could help them deal with their fear of getting sick?
The question is this: couldn't something like natural language processing, which can understand the links between sentences, give us the ability to break these filter bubbles by taking an out-of-the-box approach to modeling behavior on web content? Instead of simply counting clicks, it could follow the connections within the language you write in your queries (or even let you write full sentences as queries rather than keywords) to reach a wider variety of information. And over time this approach would broaden the filter bubble by bringing in information that connects not only with WHAT you want to know but with HOW YOU FEEL while trying to find it out. This would at least give the consumer the option to go places that computers currently decide for them. Then you could get more control over your own online world and at least get out of the infophobic bubble you let the companies create for you behind the scenes. But I guess the real question is... shouldn't they just give us the option to click an "unfiltered" button before we click search?
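To make the idea a bit more concrete: one standard technique search engineers use to widen results is Maximal Marginal Relevance (MMR), which re-ranks documents to balance relevance against redundancy, so near-duplicate pages don't crowd out other viewpoints. The toy sketch below (my own illustration, not anything Pariser proposes; the bag-of-words scoring and all example data are made up for demonstration) shows the basic mechanics in plain Python:

```python
import math
from collections import Counter

def vectorize(text):
    """Toy bag-of-words vector: term -> count."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    num = sum(a[t] * b[t] for t in a.keys() & b.keys())
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def diversified_rank(query, documents, k=3, trade_off=0.7):
    """Maximal Marginal Relevance: repeatedly pick the document that is
    most relevant to the query AND least similar to what was already
    picked, so the result list covers more of the available viewpoints."""
    q = vectorize(query)
    vecs = [vectorize(d) for d in documents]
    # ignore documents with zero term overlap with the query
    remaining = [i for i in range(len(documents)) if cosine(q, vecs[i]) > 0]
    selected = []
    while remaining and len(selected) < k:
        def mmr(i):
            relevance = cosine(q, vecs[i])
            redundancy = max((cosine(vecs[i], vecs[j]) for j in selected),
                             default=0.0)
            return trade_off * relevance - (1 - trade_off) * redundancy
        best = max(remaining, key=mmr)
        selected.append(best)
        remaining.remove(best)
    return [documents[i] for i in selected]
```

Given a sentence-length query about Egypt and a pool containing two near-duplicate protest stories plus a travel page, the trade-off term pushes the travel page ahead of the second protest story, instead of serving the user the same kind of result twice. A real system would swap the bag-of-words vectors for genuine NLP representations of full-sentence queries, which is exactly where the "understand the link between sentences" idea would plug in.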