By Doug Pinkham
Public Affairs Council President
March 29, 2012

Personalized news delivery is The Next Big Thing in the media business. Tired of foreign wars or coverage of politicians you don’t support? No worries. Readers of The Washington Post can now create their own “Personal Post.” The artificial intelligence-driven service analyzes your browsing patterns, screens out the bad stuff and produces a stream of customized news.

Similar services exist at The New York Times, The Wall Street Journal, the Financial Times and other outlets. But many fret that personalized news — along with social networking sites — encourages people to wall themselves off from opposing views. Eli Pariser, author of The Filter Bubble, is concerned that media and social networking companies are trying too hard to give us what we want to see, not “what we need to see.”

When it comes to politics, personalized news delivery may be the journalistic equivalent of ordering two desserts and skipping the fresh vegetables. Critics believe that what you don’t consume — or don’t know — will hurt you. And personalized news makes it easy to choose your preferred diet.

I felt the same way until I read two recent studies — one from academia and one from the Pew Research Center’s Internet & American Life Project.

In the first study, Neil Thurman and Steve Schifferes at City University London tracked the personalization of national news sites in the U.K. and the U.S. to determine how news organizations predict readers’ preferences and what those preferences mean for society.

Thurman and Schifferes found that only a small number of readers are willing to spend the time it takes to personalize their news content. As a result, media companies have shifted toward a passive form of personalization that relies on software algorithms to figure out what we want to read based on our past behavior. (It’s kind of like the way your TiVo recorder decides you like Tom Cruise movies after you’ve watched Mission: Impossible II.) These algorithms have become increasingly sophisticated.

The authors also record significant growth in “social collaborative filtering,” in which a social network (usually Facebook’s Activity Feed plugin) shows media articles that a user’s friends have read or recommended. The obvious problem with this approach is that it relies heavily on the assumption that one person’s specific interests are shared by hundreds of others. Plus, relatively few articles are posted through the Activity Feed. This means we are not likely to get all our personalized news via Facebook anytime soon.

So what about the impact on society? The authors note that news providers (especially newspapers) are still responsible for most original reporting and that aggressive deployment of personalized news delivery seems to help media sites charge subscription fees. “If personalization helps build audiences and shift revenues from search providers, content aggregators and other intermediaries to the ‘content creators,’” they argue, “deliberative democracy may actually be better sustained.” What’s more, automated links to related articles may be increasing the diversity of sources connected to mainstream news sites.

In the second study, a new poll from Pew shows that “birds of a feather don’t always flock together on social networking sites when it comes to politics.” It turns out that “friends” do in fact disagree with “friends” on social networking sites such as Facebook — and most of the time, nobody cares.

Of the users whose friends post political content, only 25 percent mostly or always agree with these postings; the rest agree only sometimes or never. Most important, when there is disagreement, 66 percent of such users ignore the posts and 28 percent respond with comments or posts of their own. “Interestingly enough,” says Pew, “there were no differences in these responses among party partisans or different ideological groups.” Kind of sounds like a well-functioning democracy, doesn’t it?

Eighteen percent of social media users have grown tired of at least one friend’s political rants, however, and have blocked, unfriended or hidden that person from view. Usually this results from somebody posting too frequently about politics or being offensive or argumentative.

For critics concerned that personalized news and social media sites reduce exposure to new ways of thinking, here’s a surprising statistic: 38 percent of social network users have discovered through friends’ postings that those friends’ political beliefs were not what they expected.

As Pew observes, social network users “are like other Americans in that many are not particularly passionate about politics.” (Note to Washington insiders: The report says “many friendships are not centered on political discussion.”)

Even though most users don’t choose friends based on political affiliation, social networking sites do provide tools for weighing in on public policy issues. For example, nearly half of users have hit the “like” button in response to political comments or material, 38 percent have posted positive comments in response to a political post or status update, and 16 percent have friended or followed someone because of shared political views.

These numbers represent opportunities for advocates to educate and influence, but the survey’s overall results are a reminder that Facebook and related sites are not political networks. They are social networks, which means politicos need to tread lightly.

Like personalized news delivery services, Facebook probably won’t deprive us of the chance to hear important information and contrary opinions. This is good news in a complex and increasingly partisan world.

Of course, we still have the right to ignore information and opinions — or perhaps unfriend the person who is broadcasting his political views. Active listening, unfortunately, can’t be enforced.

Comments? Email me.