Finally, we have John Herrman's post from the Times, which gets to the root of the "Fake News" story. He gets it right and summarizes it better than I did in this previous post. There are really three problems. The first is simply the nature of the World Wide Web and the Internet, which, like any truly global market, is practically unregulated in important ways. Not much we can do about fake news here. The second problem is user-generated content published on Facebook, which is, and will remain, uncurated: the responsibility of Facebook users, much of it consisting of ridiculous, unsubstantiated opinion and outright lies. It'll be impossible for Facebook to be the arbiter of truth in this domain either.
The THIRD problem, on the other hand, is a much simpler and narrower one, and it's something Facebook can do something about. It's about links on Facebook that tempt more than 757 million daily average users away from a Facebook-controlled (and monetized) experience out into the World WILD Web. And naturally, when Facebook addresses this third problem, it is going to do so in a way that strengthens its core advertising business, whether or not it has any impact at all on the quality of news.
Consider this characterization of what makes a “fake news” site a bad platform citizen: It uses Facebook to capture receptive audiences by spreading lies and then converts those audiences into money by borrowing them from Facebook, luring them to an outside site larded with obnoxious ads. The site’s sin of fabrication is made worse by its profit motive, which is cast here as a sort of arbitrage scheme. But an acceptable news site does more or less the same thing: It uses Facebook to capture receptive audiences by spreading not-lies and then converts those audiences into money by luring them to an outside site not-quite larded with not-as-obnoxious ads. In either case, Facebook users are being taken out of the safe confines of the platform into areas that Facebook does not and cannot control.

Watch this space. Facebook will fix this by making it harder to link to ANY outside content... including this blog post.
Facebook’s plan for “fake news” is no doubt intended to curb certain types of misinformation. But it’s also a continuation of the company’s bigger and more consequential project — to capture the experiences of the web it wants and from which it can profit, but to insulate itself from the parts that it doesn’t and can’t. This may help solve a problem within the ecosystem of outside publishers — an ecosystem that, in the distribution machinery of Facebook, is becoming redundant, and perhaps even obsolete.

And it's not just about news. It's about EVERYTHING in the world.
In the run-up to Facebook’s initial public offering, Mark Zuckerberg told investors that the company makes decisions “not optimizing for what’s going to happen in the next year, but to set us up to really be in this world where every product experience you have is social, and that’s all powered by Facebook.”

Like Google, Facebook aims to be your platform for EVERYTHING. While this will have a huge impact on Facebook's bottom line and probably its stock price as well, it's unlikely to curb most of the fake news content that is shared between users.
A verified user telling a lie, be it a friend from high school or the president-elect, isn’t breaking the rules; he is, as his checkmark suggests, who he represents himself to be. A post making false claims about a product is Facebook’s problem only if that post is labeled an ad. A user video promoting a conspiracy theory becomes a problem only when it leads to the violation of community guidelines against, for example, user harassment. Facebook contains a lot more than just news, including a great deal of content that is newslike, partisan, widely shared and often misleading. Content that has been, and will be, immune from current “fake news” critiques and crackdowns, because it never had the opportunity to declare itself news in the first place. To publish lies as “news” is to break a promise; to publish lies as “content” is not.

And so the bubble effect of social media and the powerful voice it gives to community leaders (for good or for ill) are unlikely to fade.