What do we want Facebook to be?

The Washington Post

As you probably know, Facebook has been under fire for the way it edits its "trending topics." Gizmodo reported earlier this month that trending news editors at the social network had purposefully suppressed news from a conservative political viewpoint. Facebook confirmed that chief executive Mark Zuckerberg is going to meet with top conservatives in California — including Glenn Beck and Arthur Brooks — in response to the claims, which Facebook has denied.

Although many pixels have been spilled over what Facebook did, with what intent and to what effect, the real question, to me, is this: What do we want Facebook to be, anyway?

Do we want Facebook to act as a news site? It certainly never started out that way — it was supposed to give you headlines about your friends, not world events. To some extent, that is still what Facebook is; impartiality, a watchword for American journalism, has never been one for Facebook. In fact, it actively didn't want to be impartial — it wanted to be personalized. Liberal users might be more interested in liberal news. Conservatives might want to see conservative news. Facebook doesn't care one way or the other, as long as people see what they want.

Yet we made Facebook into a news source. According to a recent study from the American Press Institute, 51 percent of Americans get their news from social media sites and, the study said, Facebook is the most-consulted network.

And it has adapted according to our habits. When Facebook added the trending topics feature in 2014, it didn't really fit with what most people thought of as Facebook's mission as a social network. Right off the bat, people criticized Facebook for showing fluffy news — the gossipy, scandalous, "you won't believe" stories we all profess to hate but click on anyway. So, over time, the company has crafted guidelines to promote "real news" — or at least more timely news — on the site.

Facebook turned to curation, fueled in part by humans and in part by machines, to make things seem fair. The problem is that while we like to pretend data is neutral, algorithms — because they are written by humans — have bias.

It's also worth noting that Facebook didn't set out to become a gatekeeper for the news; that's a side effect. We might also recognize that it's dealing with something it was never really designed to do.

None of this is to say Facebook doesn't deserve some of the criticism it's getting now. It saw how people were using the site, capitalized on those behaviors and pursued a larger role in media. And, it's not stopping — Facebook wants to be our personal assistant and our customer service representative. In that ambitious expansion, Facebook risks more challenges like this in the future, and more responsibilities — every new Facebook feature seems to bring the tech giant potentially in competition with a new industry.

So, yes, Facebook should learn some things. This controversy is a good thing overall and can be used to improve this specific product. Maybe it can also give Facebook pause before it wades into its next arena. But it should also reflect something back to us about how we use the network, what we want out of it and what it can reasonably deliver.

Hayley Tsukayama covers consumer technology for The Washington Post.

Copyright © 2017, The Virginia Gazette