Update II - May 10, 2016: Facebook issued a lengthy statement Monday night further addressing the Gizmodo report. Gizmodo also reported that editors were "instructed to artificially 'inject' selected stories into the trending news module." We were told that humans do not choose trending topics, and Facebook has not responded to our questions asking for clarification. This could mean a number of things: Facebook's guidelines have changed since this story was published; Gizmodo's story is inaccurate; the human editors were acting outside management's guidelines; or Facebook gave us inaccurate information.

Earlier this year, Facebook denied criticisms that its Trending feature was surfacing news stories that were biased against conservatives. The original accusations of bias came from a disgruntled ex-editor at Facebook, who leaked internal Trending training materials to Gizmodo.

But in an abrupt reversal, the company fired all the human editors for Trending on Friday afternoon, replacing them with an algorithm that promotes stories based entirely on what Facebook users are talking about. Within 72 hours, according to the Washington Post, the top story on Trending was about how Fox News icon Megyn Kelly was a pro-Clinton "traitor" who had been fired (she wasn't).

In a post about the changes, Facebook said the early move to eliminate human editors was a direct response to "the feedback we got from the Facebook community earlier this year," an oblique reference to the raging controversy unleashed by the Gizmodo revelations. Facebook explained that the new, non-human Trending module is personalized "based on a number of factors, including Pages you've liked, your location (e.g., home state sports news), the previous trending topics with which you've interacted, and what is trending across Facebook overall." Instead of paying humans to "write topic descriptions and short story summaries," the company said "we're relying on an algorithm to pull excerpts directly from news stories." Which is why millions of Facebook readers this morning saw the "news" that Megyn Kelly is a traitor who has been fired.

Supposedly, humans are still involved with Trending in a few ways, such as "confirming that a topic is tied to a current news event in the real world." But that process appears to have a few bugs. As Abby Ohlheiser puts it in the Washington Post:

The trending "news" article about Kelly is an Ending the Fed article that is basically just a block quote of a blog post from National Insider Politics, which itself was actually aggregated from a third conservative site, Conservative 101. All three sites use the same "BREAKING" headline.

The Conservative 101 post is three paragraphs long and basically reads as anti-Kelly fan fiction accusing her of being a "closet liberal" who is about to be removed from the network by a Trump-supporting O'Reilly. It cites exactly one news source as a basis for its speculation: the Vanity Fair piece. There were so many problems with this story, ranging from plagiarism to falsity, that even a fairly simple-minded robot editor should have caught them.

Though the human editors were always expendable (they were mostly there to train the Trending algorithm), they were still engaging in quality control to weed out blatant falsehoods and non-news like #lunch. The training package offered tips on, among other things, how to curate news from an RSS feed of reputable sources when the stories provided by Facebook users were false or repetitive.

And after Trending latched on to the fake Kelly scoop, it appears that human intervention might still be required to make Facebook's algorithms a legitimate source of news after all. The Trending algorithm is clearly not ready for prime time, or maybe Facebook is just trying to redefine what it calls "a breadth of ideas and commentary about a variety of topics."