Facebook joins Google in striking out at fake news sites


Who would have thought it would take an election in which Donald Trump became the next president of the United States for Google and Facebook to realize that many people actually do believe everything on the web, no matter how ridiculous or outlandish?

How many times have you heard someone say, “I read about that,” without saying where? Nine times out of ten, you can bet they read it on the internet. Even President-elect Donald Trump said he got a lot of his information from it.

So now, after both Facebook and Google got egg on their faces over this, they’ve decided to go after fake news sites that have been around for quite some time. As is usually the case, it took something embarrassing happening to get these two behemoths to make a real change.

Over the last week, two of the world’s biggest internet companies have faced mounting criticism over how fake news on their sites may have influenced the presidential election’s outcome.

On Monday, those companies responded by making it clear that they would not tolerate such misinformation by taking pointed aim at fake news sites’ revenue sources.

Google kicked off the action on Monday afternoon when the Silicon Valley search giant said it would ban websites that peddle fake news from using its online advertising service. Hours later, Facebook, the social network, updated the language in its ad policy, which already says it will not display ads on sites that show misleading or illegal content, to include fake news sites.

“We have updated the policy to explicitly clarify that this applies to fake news,” a Facebook spokesman said in a statement. “Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance.”

Taken together, the decisions were a clear signal that the tech behemoths could no longer ignore the growing outcry over their power in distributing information to the American electorate.

Facebook has been at the epicenter of that debate, accused by some commentators of swinging some voters in favor of President-elect Donald J. Trump through misleading and outright wrong stories that spread quickly via the social network. One such false story claimed that Pope Francis had endorsed Mr. Trump.

Google did not escape the glare, with critics saying the company gave too much prominence to false news stories. On Sunday, the site Mediaite reported that the top result on a Google search for “final election vote count 2016” was a link to a story on a website called 70News that wrongly stated that Mr. Trump, who won the Electoral College, was ahead of his Democratic challenger, Hillary Clinton, in the popular vote.

By Monday evening, the fake story had fallen to No. 2 in a search for those terms. Google says software algorithms that use hundreds of factors determine the ranking of news stories.

“The goal of search is to provide the most relevant and useful results for our users,” Andrea Faville, a Google spokeswoman, said in a statement. “In this case, we clearly didn’t get it right, but we are continually working to improve our algorithms.”

Facebook’s decision to clarify its ad policy language is notable because Mark Zuckerberg, the social network’s chief executive, has repeatedly fobbed off criticism that the company had an effect on how people voted. In a post on his Facebook page over the weekend, he said that 99 percent of what people see on the site is authentic, and only a tiny amount is fake news and hoaxes.

“Over all, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other,” Mr. Zuckerberg wrote.

Yet within Facebook, employees and executives have been increasingly questioning their responsibilities and role in influencing the electorate, media outlets reported on Saturday.

Facebook’s ad policy update will not stem the flow of fake news stories that spread through the news feeds that people see when they visit the social network.

Facebook has long spoken of how it helped influence and stoke democratic movements in places like the Middle East, and it tells its advertisers that it can help sway its users with ads. Facebook reaches 1.8 billion people around the globe, and the company is one of the largest distributors of news online. A Pew Research Center study said that nearly half of American adults rely on Facebook as a news source.

Google’s decision on Monday relates to the Google AdSense system that independent web publishers use to display advertising on their sites, generating revenue when ads are seen or clicked on. The advertisers pay Google, and Google pays a portion of those proceeds to the publishers. More than two million publishers use Google’s advertising network.

For some time, Google has had policies in place prohibiting misleading advertisements from its system, including promotions for counterfeit goods and weight-loss scams. Google’s new policy, which it said would go into effect “imminently,” will extend its ban on misrepresentative content to the websites its advertisements run on.

“Moving forward, we will restrict ad serving on pages that misrepresent, misstate or conceal information about the publisher, the publisher’s content or the primary purpose of the web property,” Ms. Faville said.

Ms. Faville said that the policy change had been in the works for a while and was not in reaction to the election.

It remains to be seen how effective Google’s new policy on fake news will be in practice. The policy will rely on a combination of automated and human reviews to help determine what is fake. Although satire sites like The Onion are not the target of the policy, it is not clear whether some of them, which often run fake news stories written for humorous effect, will be inadvertently affected by Google’s change.


Where It All Started and How

Facebook CEO Mark Zuckerberg says the idea that fake news spread on Facebook influenced the outcome of the U.S. election is “crazy.”

Still, the majority of Americans (six in 10) say they get at least some news from social media, mostly Facebook, according to the Pew Research Center. While a lot of this news comes from established outlets, whether CNN or BuzzFeed News, misinformation spreads on Facebook just as information does, shared by users, recommended by software and amplified by both.

Sources of spurious information have ranged from news articles produced by “content farms” for the sole purpose of getting clicks, to “hyperpartisan” sites from both sides of the political spectrum, churning out stories that are misleading at best.

Case in point: “FBI AGENT SUSPECTED IN HILLARY EMAIL LEAKS FOUND DEAD IN APPARENT MURDER-SUICIDE,” a fabricated headline from a fake news site called the Denver Guardian, was shared thousands of times in the days leading up to the election.

Is it possible that voters were swayed for or against a candidate, much like those same people might buy a product after seeing an ad on Facebook?

Zuckerberg says voters deserve more credit.

During an interview Thursday with “The Facebook Effect” author David Kirkpatrick, Zuckerberg said the idea that people voted the way they did because of bogus information on Facebook shows a “profound lack of empathy” for supporters of Donald Trump.

“Voters make decisions based on their lived experience,” he said.

Given the acerbic political contest from which the country just emerged, in which countless longtime friends and even family members were unfriended, many are left to wonder whether an alternative American history would be unfolding today if not for Facebook, Twitter and the like.

This, after all, was the first truly social media election, playing out on Twitter and Facebook as much as, or more than, it did on major networks, in living rooms and around watercoolers.

But isn’t social media just a reflection of our world as it exists? Has Facebook become an easy scapegoat when the answer is far more complex?

While Pew found that many believe political discussions on social media to be “uniquely angry and disrespectful,” a comparable number have the same impression of face-to-face political conversations, whether with Democrats, Republicans or members of another party.

FILTER BUBBLE?

When it comes to Facebook users, Zuckerberg said almost everyone has friends on the “other side.” Even if 90 percent of your friends are Democrats, for example, 10 percent will be Republican. Still, that’s not a very big number, and the idea of a “filter bubble,” that social media allows people to surround themselves only with the people and ideas with whom they agree, has been a hot topic this election cycle.

“By far the biggest filter in the system is not that the content isn’t there, that you don’t have friends who support the other candidate or that are of another religion,” Zuckerberg said. “But it’s that you just don’t click on it. You actually tune it out when you see it. I don’t know what to do about that.”

A DIFFICULT LINE

Facebook has long denied that it’s a publisher or a media company, or that it acts remotely like either. Its cheery slogan – to make the world more “open and connected” – seemingly invites a broad range of viewpoints, diverse, lively discussion and the free flow of information, rather than censorship.

But it could also make clamping down on fake news difficult. At a time when everyone seems entitled, not just to their own opinions, but to their own facts, one person’s misleading headline might be another person’s heartfelt truth.

“We take misinformation on Facebook very seriously,” Adam Mosseri, the executive in charge of Facebook’s news feed, said in a statement to the tech blog TechCrunch this week. “We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation.”

Facebook acknowledges that it has more work to do, and it seems to be putting a lot of faith in the power of data, artificial intelligence and algorithms as the solution.

Over the summer, Facebook fired the small group of journalists in charge of its “trending” items and replaced them with an algorithm. The catalyst appeared to be a report in a tech blog, based on an anonymous source, that the editors routinely suppressed conservative viewpoints.

Subsequently, fake stories ahead of the election began to trend.