Is it possible to use statistics to mislead an uninformed person?

Any given finding might be wrong, but if the same result is found in subsequent studies, confidence in its validity grows rapidly. Flipping a fair coin ten times can turn up eight heads purely by chance, just as a single study can turn up a fluke. But if you get eight or nine heads in the next trial, and the next after that, you can be pretty sure that the coin is biased. In particle physics, researchers must clear a much higher bar before declaring a discovery, such as the Higgs boson last year, and replication is still essential. Even with only a 1 in 6 million chance of a fluke, few experts would have believed the result if two independent experiments had not both found similarly strong evidence.
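
The arithmetic behind that intuition is easy to check. Here is a minimal sketch (an illustration added here, not from the original article; the specific numbers are assumptions) using the binomial distribution: eight or more heads in ten flips of a fair coin is merely uncommon, but getting that result in three independent runs is overwhelming evidence of bias.

```python
from math import comb

def prob_at_least(k: int, n: int, p: float = 0.5) -> float:
    """Probability of at least k successes in n independent trials."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

one_run = prob_at_least(8, 10)   # ~0.055: unusual, but happens by chance
three_runs = one_run ** 3        # ~1.6e-4: the same fluke three times running

print(f"P(>= 8 heads in 10 flips, fair coin): {one_run:.4f}")
print(f"P(that result in three independent runs): {three_runs:.2e}")
```

This is why replication, rather than any single P value, is what drives confidence in a result.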

But often in science, studies are too difficult or expensive to repeat, or nobody wants to bother—sometimes because a lot of money or prestige is on the line. When a study finds nothing interesting, researchers might not even attempt to publish it—or they might not be able to get it published if they tried.

All these factors conspire to give positive findings, many of which are likely flukes, more attention than they deserve—especially in the media. For one thing, journalists are eager to report the first instance of a finding, just as scientists are.

Even setting such curses aside, first reports are likely to be wrong in many cases. Suppose one lab tests an arsenal of 100 candidate drugs, looking for one that reduces symptoms by a statistically significant amount.

Say only one of the candidates actually works. For a P value threshold of 0.05, you would also expect about five of the ineffective drugs to show a statistically significant benefit just by chance. So, in this simplified example, the odds that the first report of an effective drug is right are just 1 in 6. The first report will most likely be one of the flukes. Reporters sometimes do write about subsequent papers in popular research fields, like cancer research or cloning.
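
The 1-in-6 figure follows directly from those assumptions. A quick sketch, using the 100-candidate numbers assumed above:

```python
n_candidates = 100  # drugs screened (assumed above)
n_effective = 1     # drugs that genuinely work
alpha = 0.05        # P value threshold for "statistical significance"

# Ineffective drugs expected to clear the threshold purely by chance:
expected_flukes = (n_candidates - n_effective) * alpha  # ~5

# Optimistically assume the one real drug is also detected:
true_positives = n_effective

odds_right = true_positives / (true_positives + expected_flukes)
print(f"Expected flukes: {expected_flukes:.1f}")
print(f"P(a given positive report is the real drug): {odds_right:.2f}")  # ~0.17
```

Note that this assumes perfect statistical power; with realistic power, the real drug is sometimes missed and the odds are even worse.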

Hot topics make news, but they also magnify errors. In these competitive fields, numerous labs around the world pursue the same goal. Say that, in the course of a year, those labs run 1,000 experiments testing an effect that does not actually exist, and 50 published papers report statistically significant results.

Those 50 published papers represent the 5 percent of the time that fluke data would appear significant. Presumably (although, of course, not always), previous scientific belief is based on previous scientific data.
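
A short simulation makes the scale concrete. Under a true null hypothesis, P values from a well-behaved test are uniformly distributed, so if (as assumed above) 1,000 experiments chase a nonexistent effect, roughly 50 will still cross the 0.05 line:

```python
import random

random.seed(42)

n_experiments = 1000  # labs testing the same (nonexistent) effect
alpha = 0.05

# Under the null hypothesis, each experiment's P value is Uniform(0, 1)
p_values = [random.random() for _ in range(n_experiments)]
flukes = sum(p < alpha for p in p_values)

print(f"'Significant' results from pure noise: {flukes} of {n_experiments}")
# If only significant results get published, all ~50 papers report phantom effects.
```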

There is usually no reason to believe that one new study is right and all previous studies are wrong, unless the new data come from improved methods or a more advanced instrument. Yet for journalists, a result that contradicts previous belief is rarely a warning sign; instead, it is usually a green light to go with the story. So the general criteria of newsworthiness—a first report, in a hot field, producing findings contrary to previous belief—seem designed specifically to select the scientific papers most likely to be bogus.

One such study involved a fish—an Atlantic salmon—placed in a brain scanner and shown various pictures of human activity. The analysis flagged statistically significant brain activity, even though the fish in the scanner was dead; the researchers ran the test precisely to reveal the quirks of statistical significance.

Tom Siegfried is a freelance writer in northern Virginia and former editor in chief of Science News.

Now I feel that people will tend to look for confirmation of their biases and the radical transparency will not shine a cleansing light.

There is ever more information competing for attention, for credibility and for influence, and the competition will complicate and intensify the search for veracity. Of course, many are less interested in veracity than in winning the competition. Producers have an easy publishing platform for reaching wide audiences, and those audiences are flocking to such sources.

The audiences typically are looking for information that fits their belief systems, so it is a really tough problem. Starr Roxanne Hiltz. They are happy hearing what confirms their views. And people can gain more from creating fake information, both in money and in notoriety, than they can from keeping it from occurring. The political environment is bad, starting with the president of the U.S. himself. There are multiple information streams, public and private, that spread this information online.

We also cannot trust the businesses and industries that develop and facilitate these digital texts and tools to make changes that will significantly improve the situation. It seems unlikely that government can play a meaningful role as referee. We are too polarized. Too many Americans will live in political and social subcultures that champion false information and encourage use of sites that present it.

There were also those among these expert respondents who said inequities, perceived and real, are at the root of much of the misinformation being produced. It is impossible to make the information environment a rational, disinterested space; it will always be susceptible to pressure.

People will continue to cosset their own cognitive biases. When there is value in misinformation, it will rule. Big political players have just learned how to play this game. The current [information] models are driven by clickbait, and that is not the foundation of a sustainable economic model.

There is too much incentive to spread disinformation, fake news, malware and the rest. Governments and organizations are major actors in this space.

As long as these incentives exist, actors will find a way to exploit them. These incentives are not amenable to technological resolution, as they are social, political and cultural in nature.

Solving this problem will require larger changes in society. A number of respondents mentioned market capitalism as a primary obstacle to improving the information environment: the information that gets disseminated will be biased, shaped by monetary interests. Technology firms eschew accountability for the impact of their inventions on society and have not developed the principles or practices needed to deal with such complex issues.

They are like biomedical or nuclear technology firms absent any ethics rules, ethics training or philosophy. It would be wonderful to believe otherwise, and I hope that other commentators will be able to convince me. Conflict sells, especially to the opposition party; the opposition news agency will therefore be incentivized to push a narrative and agenda. Any safeguards will appear as a way to further control the narrative and propagandize the population.

They cited several reasons. A share of respondents said a lack of commonly shared knowledge leads many in society to doubt the reliability of everything, causing them to simply drop out of civic participation and depleting the number of active and informed citizens. The success of Donald Trump will be a flaming signal that this strategy works, alongside the variety of technologies now in development and early deployment that can exacerbate this problem.

Philip J. These are the main causes of the deterioration of a public domain of shared facts as the basis for discourse and political debate. These people will drop out of the normal flow of information.

Jamais Cascio. What is truth? What is a fact? Who gets to decide? Each side can have real facts, but it is which facts get gathered that matters in coming to a conclusion; who will determine which facts will be considered, or what even counts as a fact? Some respondents predicted that a larger digital divide will form.

Those who pursue more-accurate information and rely on better-informed sources will separate from those who are not selective enough, or who do not invest either the time or the money in doing so. Anonymous respondent. Doing so will take a combination of organizational and technological tools but, above all, will require a sharpened sense of good judgment and access to diverse, including rivalrous, sources.

Outside this, chaos will reign. However, when consumers are not directly paying for such accuracy, it will certainly mean a greater degree of misinformation in the public sphere. That means the continuing bifurcation of haves and have-nots, when it comes to trusted news and information.

Many who see little hope for improvement of the information environment said technology will not save society from distortions, half-truths, lies and weaponized narratives. In the arms race between those who want to falsify information and those who want to produce accurate information, the former will always have an advantage.

David Conrad. Paul N. Many of those who expect no improvement of the information environment said those who wish to spread misinformation are highly motivated to use innovative tricks to stay ahead of the methods meant to stop them. They said certain actors in government, business and other individuals with propaganda agendas are highly driven to make technology work in their favor in the spread of misinformation, and there will continue to be more of them.

There are a lot of rich and unethical people, politicians, non-state actors and state actors who are strongly incentivized to get fake information out there to serve their selfish purposes. Jason Hong. Scott Spangler, principal data scientist at IBM Watson Health, said technologies now exist that make fake information almost impossible to discern and flag, filter or block.

Lastly, the incentives are all wrong. Those wanting to spread misinformation will always be able to find ways to circumvent whatever controls are put in place. Some respondents expect a dramatic rise in the manipulation of the information environment by nation-states, by individual political actors and by groups wishing to spread propaganda.

Their purpose is to raise fears that serve their agendas, create or deepen silos and echo chambers, divide people and set them upon each other, and paralyze or confuse public understanding of the political, social and economic landscape. Anonymous project leader for a science institute. This has been referred to as the weaponization of public narratives. Social media platforms such as Facebook, Reddit and Twitter appear to be prime battlegrounds. Bots are often employed, and AI is expected to be implemented heavily in the information wars to magnify the speed and impact of messaging.

Messages can now be tailored with devastating accuracy. Furthermore, information is a source of power and thus a source of contemporary warfare. An emeritus professor of communication at a U.S. university said the traditional news media is being replaced by social media, where there are few if any moral or ethical guidelines or constraints on the performance of informational roles. The existence of clickbait sites makes it easy for conspiracy theories to be rapidly spread by people who do not bother to read entire articles, nor look for trusted sources.

Given that there is freedom of speech, I wonder how the situation can ever improve. Most users just read the headline, comment and share without digesting the entire article or thinking critically about its content, if they read the article at all. The rise of new and highly varied voices with differing agendas and motivations might generally be considered a good thing.

But some of these experts said the recent major successes by misinformation manipulators have created a threatening environment in which many in the public are encouraging platform providers and governments to expand surveillance. Some of these experts expect that such systems will act to identify perceived misbehaviors and label, block, filter or remove some online content and even ban some posters from further posting.

Retired professor. This will end up being a censored information reality, said a distinguished professor emeritus of political science at a U.S. university.

Misinformation can be dangerous. Months before the election of then-presidential candidate Donald Trump, misleading stories were used to manufacture rage among unsuspecting users on social media.

In one instance, a false conspiracy theory about then-Democratic presidential candidate Hillary Clinton inspired a man to travel from North Carolina to a pizzeria in Washington, D.C. Research has shown that false information travels faster than the truth on social media.

In 2018, a study from the Massachusetts Institute of Technology found that misinformation moved six times faster than the truth on Twitter. Researchers analyzed 126,000 cascades of tweets, shared by 3 million people more than 4.5 million times.

Their results suggested human Twitter users trafficked in inaccurate stories more frequently than bots, the computer-operated social media accounts that share content. Thirty-nine percent of Americans say the news media is responsible for vetting misleading information.

Another 18 percent say companies like Facebook, Twitter or Google are responsible. Seventy-five percent of U. The public lacks confidence in major social media companies despite tech giants such as Facebook and Twitter having promised to take steps to prevent election interference on their platforms.

In April 2018, Facebook founder and chairman Mark Zuckerberg testified before Congress about election security. I think we have to work on that. Google is already doing this to some degree. It operates a little-known grant scheme that allows certain NGOs to place high-ranking adverts in response to certain searches. It is used by groups like the Samaritans so their pages rank highly in a search by someone looking for information about suicide, for example.

But Google says anti-radicalisation charities could also seek to promote their message on searches about so-called Islamic State, for example. But there are understandable fears about powerful internet companies filtering what people see - even within these organisations themselves. For those leading the push to fact-check information, better tagging of accurate content online would be a preferable approach, allowing people to make up their own minds about the information.

We need to tag and structure quality content in effective ways. Mantzarlis believes part of the solution will be providing people with the resources to fact-check information for themselves. He is planning to develop a database of sources that professional fact-checkers use and intends to make it freely available. This is a problem that governments around the world are facing as the public views what they tell them with increasing scepticism.

Nesta, a UK-based charity that supports innovation, has been looking at some of the challenges that face democracy in the digital era and how the internet can be harnessed to get people more engaged. Eddie Copeland, director of government innovation at Nesta, points to an example in Taiwan where members of the public can propose ideas and help formulate them into legislation.

But that means facing up to our own bad habits. She and her team have been working for years to identify fake news on the internet. Will Moy agrees. He argues that by slipping into lazy cynicism about what we are being told, we allow those who lie to us to get away with it.

Instead, he thinks we should be interrogating what they say and holding them to account.

Lies, propaganda and fake news: A challenge for our age. By Richard Gray. With news sources splintering and falsehoods spreading widely online, can anything be done? Richard Gray takes an in-depth look at how we got here — and hears from the researchers and innovators seeking to save the truth.

Having a large number of people in a society who are misinformed is absolutely devastating and extremely difficult to cope with — Stephan Lewandowsky, University of Bristol.

Alternative histories

Working out who to trust and who not to believe has been a facet of human life since our ancestors began living in complex societies.

There is great concern about how we control the dissemination of things that seem to be untrue — Paul Resnick, University of Michigan. For every fact there is a counterfact and all those counterfacts and facts look identical online — Kevin Kelly, co-founder Wired magazine.

People are quicker to assume they are being lied to but less quick to assume people they agree with are lying, which is a dangerous tendency — Will Moy, director of Full Fact.


