The Five Laws of Disinformation

Discussion in 'Politics' started by gwb-trading, Jul 14, 2022.

  1. gwb-trading

    The Five Laws of Disinformation
    https://www.smerconish.com/exclusive-content/the-five-laws-of-disinformation

    The world has just witnessed what a motivated country is willing to do to ensure that its goals are met. However, while the current situation in Ukraine has been heartbreaking for most of the world, it's important to understand that the first shots of the invasion were not fired on February 24, 2022. The attack started years before, with a vast disinformation campaign that subtly drove a wedge into Ukrainian society and, in turn, helped destabilize the legitimate government in Kyiv.

    This article is not about Russia or Ukraine, but rather about laying out the framework that disinformation follows, so that we may understand how incredibly harmful it can be to its target simply by virtue of what it is.

    So, without further ado, here are the Five Laws of Disinformation:

    1: Disinformation Easily Turns Into Misinformation

    While disinformation and misinformation tend to go hand-in-hand, there is one core difference between them: intent.

    Disinformation is blatantly false information created with the deliberate intent to deceive its targets. Misinformation is information spread by unsuspecting individuals who believe that the content they're absorbing is accurate. This misinformation is typically alarming to observers in some way, so they feel an ethical obligation to help by sharing it as widely as possible, intending to inform others.

    This is how disinformation morphs into misinformation. Those with malicious intent are banking on the poorly informed to spread their crafted lies with earnest intent. Combine the motivation of an unsuspecting population with free and open social media platforms that connect the world, and a serious problem arises: the primary drivers of disinformation, usually nation-states, are essentially given access to a supercharger to spread their fake news.

    2: Disinformation Twists Truth to Prey on Confirmation Bias

    The most successful disinformation campaigns throughout the Digital Age are those that modify truth, or partial truth, to suit the purposes of the perpetrators. Furthermore, their lies are crafted to seem believable to a targeted population who already agrees with the original truth that is being, unknowingly, corrupted.

    For the disinformation to spread quickly online as misinformation, this perversion requires the confirmation bias of the targeted population.

    One of the most successful examples of this problem is Russia's Internet Research Agency, which, during the 2016 U.S. presidential election, crafted thousands of paid advertisements on Facebook designed to drive a wedge into American society. Ads supporting and deriding U.S. law enforcement were released on the same day to specific political populations, with the intent of driving the two sides of the debate further apart.

    Using the truth that the United States has a history of police brutality towards minorities, the disinformation ads intensified the rhetoric for the political right by saying that the police were being unfairly targeted for something that isn't a problem, so it's important to support "Blue Lives Matter." Conversely, an ad targeting the political left posed the question: "How many more black men have to be killed…" with ruthless impunity.

    Note that the truth is there, yet blown out of proportion to rile up, essentially, everyone involved. Confirmation bias is what made both of these disinformation campaigns go viral on both sides of the political aisle.

    3: Disinformation is Antithetical to the Society it Targets, Which Benefits an Adversary

    The core goal of disinformation is to degrade and destabilize society, thus benefiting one of the society's adversaries.

    China has launched disinformation campaigns against the populations of Hong Kong and Taiwan with the intent of destabilizing each region. If China can sow doubt and discord in both populations, the net benefit is an easier time gaining control over each. In Hong Kong, disinformation campaigns helped the Communist Party of China easily take over the city and establish rule under its handpicked local leaders. The disinformation blitz included demonstrably false information about the democratically elected leaders it sought to replace.

    If disinformation did not benefit its perpetrators, then there would be no point in utilizing it.

    4: Successful Disinformation Cannot Be Obvious

    Hiding in plain sight is the name of the game in disinformation. While disinformation often utilizes hyperbole, the most successful campaigns tone down the hyperbole in an attempt to make their claims seem rational and, therefore, believable.

    A textbook example of this is using the real problem of child trafficking as a gateway into conspiracy theories that just feel right to the reader. No one can deny that child trafficking is a serious issue. So, when that issue is embellished with demonstrably false information, such as the claim that children are being sold through the retail website Wayfair, the lie can seem logical to the targeted audience.

    5: Disinformation Can and Must Be Combated Whenever Possible

    Left unchecked, disinformation damages a society deeply. The United States seems to have a vast political divide, when in actuality, studies show that most of the population sits closer to the ideological middle and is simply tired of all the "screaming" by a rather vocal minority on both sides of the political aisle. Those minorities are the ones being pumped full of disinformation that urges them to act out online and, sadly, in person as well.

    However, not all is lost, as disinformation can be combated through proper education. Training a population in critical thinking and how to spot fake news goes a long way toward lowering the amount of disinformation and the online rhetoric related to it. Finland identified a large disinformation campaign created by Russia during its 2015 election, so the Finns embarked on a nationwide educational campaign that successfully thwarted Russia's attempts.

    Ultimately, the goal of any society should be for its members to live harmoniously with each other. Conflict will always be present, but so can the rational discourse needed to debate and resolve issues amicably.

    If a population begins to lose its democratic roots, then the world is lost to the machinations of the autocrats. Here's hoping we all learn just how much damage we have already experienced, and take action to stem the tide.
     
  2. Mercor

    Perfect summation of the media since day one of the Trump Presidency.
     
  3. gwb-trading

    Musk's X/Twitter is misinformation central.

    Study shows relatively low number of superspreaders responsible for large portion of misinformation on Twitter
    https://phys.org/news/2024-05-superspreaders-responsible-large-portion-misinformation.html


    [Figure: Classification of Superspreader Accounts]

    A small team of social media analysts at Indiana University has found that a major portion of tweets spreading disinformation are sent by a surprisingly small percentage of a given userbase.

    In their study, published in PLOS ONE, the group reviewed 2,397,388 tweets posted on Twitter (now X) that were flagged as having low credibility, along with the accounts that were sending them.

    Over the past several years, media researchers have found that social media sites such as Facebook, Twitter and Instagram can have a major impact on personal beliefs and social issues, including those of a political nature. Prior research has also shown that because of such influence, foreign entities have been posting entries on social media sites with the intention of swaying public opinion on a variety of issues.

    In this new study, the research team found that it does not take a lot of influencers to sway the beliefs and/or opinions of large numbers of people. This, they suggest, is due to the impact of what they describe as superspreaders.

    Like the superspreaders that were labeled as such during the pandemic, superspreaders on the internet have the ability to "infect" large numbers of people due to their reputation.

    To learn more about influence on social media, the research team focused their efforts on Twitter. They collected 10 months of data, which added up to 2,397,388 tweets sent by 448,103 users, and then parsed it, looking for tweets that were flagged as containing low-credibility information.

    They found that approximately a third of the low-credibility tweets had been posted by people using just 10 accounts, and that just 1,000 accounts were responsible for posting approximately 70% of such tweets.
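
    As a rough illustration (a sketch only, not the study's actual code, and the per-tweet account list is a hypothetical input format), the concentration statistic being described, the share of flagged tweets attributable to the top-k accounts, can be computed in a few lines of Python:

        from collections import Counter

        def top_k_share(account_ids, k):
            # account_ids: one entry per low-credibility tweet, naming the
            # account that posted it (hypothetical input format).
            counts = Counter(account_ids)
            top_total = sum(n for _, n in counts.most_common(k))
            return top_total / len(account_ids)

        # Toy data: one account posts 6 of 10 flagged tweets.
        tweets = ["a"] * 6 + ["b", "c", "d", "e"]
        print(top_k_share(tweets, 1))  # 0.6 -> the busiest account posted 60%

    On the study's dataset, k = 10 would give roughly one-third and k = 1,000 roughly 70%, which is what makes the "superspreader" label apt.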

    They note that the majority of the superspreader accounts could not be traced to an individual, though a number belonged to high-profile posters, such as politicians and generalist influencers like Donald Trump Jr.

    The researchers note that many of the superspreader accounts they identified were disabled during a push by Twitter in 2020 to reduce the amount of disinformation on the site. But that trend has since reversed as the site, now rebranded as X, has taken a new direction following its takeover by Elon Musk.
     
  4. gwb-trading

    The state of the media, 2024 version.

    Phony 'news' portals surpass US newspaper sites, researchers say
    https://phys.org/news/2024-06-phony-news-portals-surpass-newspaper.html

    Partisan websites masquerading as media outlets now outnumber American newspaper sites, a research group that tracks misinformation said Tuesday, highlighting a local news crisis in a year of high-stakes elections.

    Hundreds of sites mimicking news outlets—many of them powered by artificial intelligence—have cropped up in recent months, fueling an explosion of polarizing or false narratives that are stoking alarm as the race for the White House intensifies.

    At least 1,265 "pink slime" outlets—politically motivated websites that present themselves as independent local news outlets—have been identified, the US-based research group NewsGuard said in a report.

    By comparison, 1,213 websites of local newspapers were operating in the United States last year, according to Northwestern University's "local news initiative" project.

    "The odds are now better than 50-50 that if you see a news website purporting to cover local news, it's fake," the NewsGuard report said.

    Nearly half of the partisan sites were targeted at swing states, according to an analysis by the news site Axios, in what appears to be an effort to sway political beliefs ahead of the November election expected to be between President Joe Biden and Donald Trump.

    Those sites include a network of 167 Russian disinformation sites that NewsGuard said were linked to John Mark Dougan, a US former law enforcement officer who fled to Moscow.

    The other sites are backed by conservative as well as influential left-leaning groups such as Metric Media, Courier Newsroom and States Newsroom, the report said.

    The rise of pink slime comes amid a rapid decline of local newspapers, many of which have either shut down or suffered extensive layoffs due to economic headwinds.

    A study by Northwestern University last year identified 204 counties out of some 3,000 in the United States as "news deserts," having "no newspapers, local digital sites, public radio newsrooms or ethnic publications."

    Newspapers are continuing to vanish at an average rate of more than two per week, the study said.

    It added that the United States has lost almost two-thirds of its newspaper journalists since 2005.

    "With traditional newspapers disappearing... pink slime sites are rushing in to fill the void," NewsGuard's report said.

    "Consequently, millions of Americans are left without legitimate local coverage."

    Propaganda-spewing partisan websites have typically relied on armies of writers, but generative artificial intelligence tools now offer a significantly cheaper and faster way to fabricate content that is often hard to distinguish from authentic information.

    These websites underscore the potential of AI-powered tools—chatbots, photo generators and voice cloners—to turbocharge misinformation while further eroding trust in traditional media, researchers say.