Facebook, Twitter, and just about every other site on the Web these days boast of their ability to serve you up exactly what you want to see, thanks to increasingly fine-tuned personalization technology. Of course, there is a downside to this so-called advance: what happens when you want to see a diversity of thought?
This issue came into focus earlier in May when Facebook users were up in arms over allegations that the social media site may have intentionally excluded news from conservative sites in its roundup of "Trending Topics." But according to some experts, the tech algorithms aren't necessarily the problem: Even when given a choice, people tend to only click on news sources and points of view that they usually read.
Even though people have more ways than ever to connect with others across the world, they tend to use technology to self-segregate, say Danah Boyd, the founder of research institute Data & Society, and Gillian Tett, the U.S. managing editor at the Financial Times and the author of The Silo Effect.
Speaking at the Techonomy conference in New York City Thursday, the two women--both anthropologists by training--discussed some of the pitfalls of what they call "data-centric technological development."
Both tech startups and big corporations are relying more heavily on plug-ins, cookies, and other technology to collect data on what kind of ads customers click on, what products they buy, and what articles they read, and then create more tailored content using that data. It's why you're more likely to see an ad about things to do in Miami after you've booked a vacation there.
But this means you're more likely to engage in what Tett refers to as "tribalism"--that is, engaging with people and content that align with your current views and beliefs.
"Twitter and other social media sites were created to help people flock together, but the question in my mind is are they causing people to flock together or fly apart?" says Tett.
Boyd has found evidence that online self-segregation existed even before social media sites started using algorithms to show us content only from the accounts we click on most. A study she conducted in 2009 found that, at the time, teenagers who used Facebook were more likely to come from wealthier families than those who used MySpace. She suggested this could reflect the fact that Facebook was created at Harvard, while MySpace gained popularity in Los Angeles's underground music scene.
If we're already more likely to interact with only people who share our beliefs, and tech companies subconsciously encourage us to do this, how then do we avoid thinking and acting only in accordance with our "tribes"? Boyd and Tett say that people need to make conscious efforts to engage with people outside of their political, social, economic, and geographical circles.
Tett suggests that Twitter users take a cue from former CEO Dick Costolo, who would choose about 20 new people to follow on the social media site every few weeks. Boyd says that as she comes across news articles on the Web, before scrolling past them, she stops to consider how reading about that particular news might broaden her worldview.
"I don't want to read another article about Syria, for example, yet I know as a citizen of this world it is extremely important for me to do that," Boyd says.