
Facebook, false news and the debate about the wisdom of the masses

In 1906 the scientist Francis Galton made a singular discovery: when the independent answers of a group of people were combined, the resulting conclusion was surprisingly accurate. The experiment was one of the inspirations for James Surowiecki's famous book "The Wisdom of Crowds", which argues that collective opinion on certain questions frequently turns out to be correct.

Now that collective wisdom or intelligence is set to become part of Facebook. The company, widely criticized over the controversial false news that circulated during the last US elections, has already launched a plan to combat the problem. One of the keys to that plan is precisely to use you as part of that intelligent mass that helps detect false news. Recent history, however, suggests that this may not be such a good idea.


The Wisdom of the Masses Works… Sometimes

In Galton's experiment, the crowd tried to estimate the weight of a slaughtered ox, a task well suited to the method: each of the roughly 800 people surveyed gave an estimate, and the median of all those figures turned out to be surprisingly close to the animal's actual weight. It was off by only 0.8%.
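The aggregation Galton used can be sketched in a few lines. The estimates below are illustrative numbers, not his original data; only the true weight of 1,198 pounds comes from the usual account of the experiment.

```python
import statistics

# Hypothetical crowd estimates (in pounds) for the ox's weight.
true_weight = 1198
estimates = [1050, 1100, 1150, 1180, 1200, 1210, 1250, 1300, 1400, 900]

# The "wisdom of the crowd" is simply the median of the guesses.
crowd_estimate = statistics.median(estimates)
error_pct = abs(crowd_estimate - true_weight) / true_weight * 100
print(f"median estimate: {crowd_estimate}, error: {error_pct:.1f}%")
```

The median, unlike the mean, is robust to a few wildly wrong guesses, which is part of why the aggregate can beat most individual estimates when the answers are independent.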

The problem with that experiment is that although the approach works well for questions with an objective answer, it does not work so well for issues where no such objective conclusion exists. When it comes to assessing the quality or accuracy of a story, things get difficult. It is something the makers of news aggregators such as Reddit, or Digg in its first incarnation, know a lot about, and it also produces surprising results in services where users rate products.

The perfect example is Amazon: Mikhail Gronas, a researcher at Dartmouth College, uncovered a strange and striking pattern in the ratings of certain books on Amazon. Some titles had better ratings (the 'Harry Potter' books) and others worse, but a third group showed a curious distribution: many one-star scores and many five-star scores, producing a horseshoe-shaped curve. These were the controversial books that drew as much criticism as praise, and they achieved something unique: very high sales.
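A minimal sketch of how such a horseshoe pattern could be detected in a list of star ratings. The threshold and the sample data are assumptions chosen for illustration, not anything from Gronas's study.

```python
from collections import Counter

def is_horseshoe(ratings, threshold=0.6):
    """Heuristic: a list of 1-5 star ratings is 'horseshoe shaped'
    when the extreme scores (1 and 5) together dominate the total."""
    counts = Counter(ratings)
    extreme = counts[1] + counts[5]
    return extreme / len(ratings) >= threshold

consensus = [4, 4, 5, 4, 3, 4, 5, 4]          # broadly liked book
controversial = [1, 5, 1, 5, 5, 1, 2, 5, 1]   # as many pans as raves

print(is_horseshoe(consensus), is_horseshoe(controversial))
```

A consensus title concentrates its scores in the middle-to-high range, while the controversial one splits between the two extremes, which is exactly the bimodal curve described above.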

Voting and bias

The problem with this system is that a reader who is about to buy a book and sees those scores no longer acts independently, but with a bias, whether positive or negative. That alone is enough to break the premises of the experiment Francis Galton had conducted a century before.

Digg in its first period, when all users voted on stories, Menéame today, and above all Reddit all make use of that same principle: users themselves judge the quality of a story from their own point of view. Each of these services has (or, in Digg's case, had) an algorithm to weigh the positive votes and, where they exist, the negative ones, which turns the community itself into the moderator of that content.
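As a sketch of the kind of algorithm involved, here is a simplified Reddit-style "hot" ranking that combines net votes (on a logarithmic scale, so the earliest votes matter most) with submission time. The constants are illustrative; each service uses its own formula.

```python
import math
from datetime import datetime, timezone

EPOCH = datetime(2005, 12, 8, tzinfo=timezone.utc)  # arbitrary reference date

def hot_score(ups: int, downs: int, posted_at: datetime) -> float:
    """Rank a story by net votes (log scale) plus a recency bonus:
    newer stories need exponentially more votes to beat older ones."""
    score = ups - downs
    order = math.log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted_at - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)

old_hit = hot_score(100, 2, datetime(2024, 1, 1, tzinfo=timezone.utc))
fresh_post = hot_score(100, 2, datetime(2024, 1, 2, tzinfo=timezone.utc))
print(fresh_post > old_hit)  # equal votes: the newer story ranks higher
```

The log term is what makes these systems vulnerable to coordinated early voting: a handful of accounts acting in the first minutes moves a story far more than hundreds of honest votes later on.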


That creates several problems. The first is groups of users who try to game the system and steer it to their own advantage, whatever that may be. Many of these services have repeatedly had trouble with users running multiple accounts (a problem the comment systems of this and other media share as well) who try to act as lobbyists for or against certain content. So-called voting 'mafias' appear, and users cluster into subgroups that form, almost automatically, for and against particular aspects of the very same issue.

But that is not the only problem: by the time we see the votes, our perception of the story is already conditioned by them. We do not judge a story the same way when many people have rated it positively as when they have rated it negatively. In addition, the people who devote the most time to these systems end up shaping the kind of content that appears on those sites. These social news sites must also strive to avoid becoming another example of the so-called 'Peter Principle', which states:

People who do their job well are promoted to positions of greater responsibility, until they reach a position whose demands they can no longer meet, thus arriving at their maximum level of incompetence.

Avoiding all these problems while harnessing the wisdom of the masses consistently is very complex, and not even Reddit, the anarchic website par excellence, has been safe from them. A few days ago it emerged that its co-founder, Steve Huffman, had used his administrator "superpowers" to modify comments from Trump supporters, something that has left both Huffman and Reddit's reputation for independence in a very bad position.

Facebook and the community as a false news detector

Facebook therefore faces a titanic task. Users will be able to flag any story that appears in their feed as false, and from there a team of researchers employed full time by Facebook will assess whether, after all those votes, a story can actually be labeled false or suspected of being so.

From there, other mechanisms are triggered: certain organizations will collaborate in the task of validating and fact-checking stories, and if a story is found to be false it will be marked as suspicious. Such stories will not disappear from the system, but users who want to share them and "spread lies" will have to do so after being clearly and explicitly warned about what they are doing.
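The flow described above can be sketched as a toy model. The flag threshold, the names, and the structure are all assumptions for illustration; Facebook has not published these details.

```python
from dataclasses import dataclass

FLAG_THRESHOLD = 10  # hypothetical: the real review criteria are not public

@dataclass
class Story:
    url: str
    flags: int = 0
    disputed: bool = False  # set by human fact-checkers, not by votes alone

def flag(story: Story) -> bool:
    """A user marks the story as false; returns True once it has
    accumulated enough flags to be queued for human review."""
    story.flags += 1
    return story.flags >= FLAG_THRESHOLD

def share(story: Story) -> str:
    # Disputed stories are not removed, only labeled before sharing.
    if story.disputed:
        return f"WARNING: disputed by fact-checkers. Share {story.url} anyway?"
    return f"shared {story.url}"
```

The key point of the design, as described in the article, is that user flags only queue a story for human review; the "disputed" label and the sharing warning come from the fact-checking organizations, not from the vote count itself.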

That may work, but Facebook's size is so daunting that the task of checking false news could prove totally unsustainable. Aaron Sharockman, executive director of PolitiFact (one of the organizations that will collaborate in the effort), said: "There have always been more things to fact-check than people to do it. I do not expect that to change." Still, he is optimistic about this type of feature.

But the problem neither Sharockman nor Facebook discusses is precisely that giving that power to users, interesting as it is, may end up creating on Facebook the same problems that communities like Reddit, Digg or Menéame have had in the past. Dealing with them is particularly complex, so it will be interesting to see how events unfold. The false ones and the true ones alike.