But I discovered over the past week that Facebook has yet even to acknowledge that this responsibility exists when it deals with news.
Andy Mitchell, Facebook’s director of news and global media partnerships, came to speak at the international journalism festival in Perugia. Each month, 1.4bn people use Facebook. In America, 30% of adults get their news via Facebook (27% in the UK); 88% of millennials in the US do so (71% in Italy). That makes Mitchell one of the most powerful – if not the most powerful – news distributors on the planet.
And what Mitchell had to say was straightforward in most ways and extremely odd in one important omission.
Facebook wants to improve the “experience” (this word cropped up a lot) of people getting their news on mobile. Links to clunky news sites load slowly, and Facebook is talking to major sites (such as the New York Times and Buzzfeed) about embedding their journalism directly in Facebook. Every statistic underlines how much people like getting their news on Facebook.
This was all fascinating, but there wasn’t any mention of how Facebook sees and handles its role as a news gatekeeper, influencing both the detail and flow of what people see. The issue didn’t come up until right at the end of Mitchell’s session, when a Scandinavian questioner asked Mitchell about instances of Facebook cutting out material from the news linked from his organisation, and an Italian student followed up.
Mitchell batted both questions away without addressing either directly.
I then asked Mitchell whether he thought Facebook was in any way accountable to its community for the integrity of its news feed. Mitchell, by now looking pretty pissed off, repeated that Facebook wanted people to have a “great experience”, that the feed gives them “what they’re interested in” and that Facebook’s feed should be “complementary” to other news sources. In short, he didn’t begin to answer the question.
For a senior news executive with such gatekeeping and distribution power to evade these questions is condescending, to say the least. Facebook is not, and knows quite well it is not, a neutral machine passing on news. Its algorithm chooses what people see, it has “community standards” that material must meet and it has to operate within the laws of many countries.
Shaping news
To imply, as Mitchell did, that Facebook doesn’t have responsibilities in journalism has to be false. And, at least in the long run, it won’t work; in the end these issues have to be faced. Facebook is a private company which has grown and made billions by very successfully keeping more people on its site for longer and longer. I can imagine that any suggestion that there are responsibilities which distract from that mission must seem like a nuisance.
Google once claimed something similar. Its executives would sit in newspaper offices and claim, with perfectly straight faces, that Google was not a media company. As this stance gradually looked more and more absurd, Google grew up and began to discuss its own power in the media.
It was difficult to pass a day in Perugia without being reminded of how Facebook is making (usually via its algorithms) news decisions every hour. Someone reminded me of the survey in the US which showed large percentages of respondents quite unaware that Facebook has an adjustable formula which determines what their news feed shows. Rasmus Kleis Nielsen mentioned in a presentation the disagreements which temporarily took news from the Danish media company Berlingske off Facebook (at issue was a picture of some hippies in the 1960s frolicking nude in the sea). There was another row in Denmark when Facebook objected to a picture of Michelangelo’s (also nude) statue of David. An editor for the Turkish daily Milliyet reminded me that Facebook has strict rules about how Kurdish flags are seen on its feed in Turkey.
My blog post about that short and unrevealing exchange with Andy Mitchell touched some kind of nerve, to judge by the large response. Jay Rosen of New York University, who first wrote powerfully on this subject in 2014, returned to it with an appeal which credited Facebook with caring about news but asked it to stop pretending that how it handles news is of no interest to anyone but itself.
Facebook has to start recognising that our questions are real. We are not suggesting that it “edits” NewsFeed in the same way that a newspaper editor once edited the front page. It’s a very different way. That’s why we’re asking about it! We are not suggesting that algorithms work in the same way that elites deciding what’s news once operated. It’s a different way. That’s why we’re asking about it!
Sooner or later, Facebook’s state of denial will have to end.
This article was originally published on The Conversation.