Disinformation, the practice of blending real and fake information with the goal of duping a government or influencing public opinion, has its origins in the Soviet Union. But disinformation is no longer the exclusive domain of government intelligence agencies.
Today’s disinformation scene has evolved into a marketplace in which services are contracted, laborers are paid and shameless opinions and fake readers are bought and sold. This industry is emerging around the world. Some of the private-sector players are driven by political motives, some by profit and others by a mix of the two.
Public relations firms have recruited social media influencers in France and Germany to spread falsehoods. Politicians have hired staff to create fake Facebook accounts in Honduras. And Kenyan Twitter influencers are paid 15 times more than many people make in a day for promoting political hashtags.
South Korea has been at the forefront of online disinformation. Western societies began raising concerns about disinformation in 2016, spurred by the U.S. presidential election and the Brexit referendum. In South Korea, however, media reported the first formal disinformation operation back in 2008.
As a researcher who studies digital audiences, I’ve found that South Korea’s 13-year-long disinformation history demonstrates how technology, economics and culture interact to enable the disinformation industry.
Most importantly, South Korea's experience offers a lesson for the U.S. and other countries. The ultimate power of disinformation lies less in the people perpetrating it or the techniques they use than in the ideas and memories a society is vulnerable to, and in how readily that society fuels the rumor mill.
The origin of South Korean disinformation can be traced back to the nation’s National Intelligence Service (NIS), which is equivalent to the U.S. Central Intelligence Agency. The NIS formed teams in 2010 to interfere in domestic elections by attacking a political candidate it opposed.
The NIS hired more than 70 full-time workers who managed fake, or so-called sock puppet, accounts. The agency recruited a group called Team Alpha, which was composed of civilian part-timers who had ideological and financial interests in working for the NIS. By 2012, the scale of the operation had grown to 3,500 part-time workers.
In one private-sector case, the company's client was a close political aide of the current president, Moon Jae-in.
In contrast to NIS-driven disinformation campaigns, which use disinformation as a propaganda tool for the government, some of the private-sector players are chameleonlike, changing ideological and topical positions in pursuit of their business interests.
These private-sector operations have achieved greater cost effectiveness than government operations by skillfully using bots to amplify fake engagements, involving social media entrepreneurs like YouTubers and outsourcing trolling to cheap laborers.
In South Korea, Cold War rhetoric has been particularly visible across all types of disinformation operations. The campaigns typically portray the conflict with North Korea and the battle against Communism as being at the center of public discourse in South Korea.
In reality, nationwide polls have painted a very different picture. For example, even when North Korea's nuclear threat was at a peak in 2017, fewer than 10 percent of respondents picked North Korea's saber-rattling as their priority concern, compared with more than 45 percent who selected economic policy.
My research on South Korean social media rumors in 2013 showed that the disinformation rhetoric continued on social media even after the formal disinformation campaign ended, which indicates how powerful these themes are. Today my research team and I continue to see references to the same themes.
The disinformation industry is enabled by the three prongs of today's digital media industry: an attention economy, algorithmic and computational technologies, and a participatory culture. In online media, the most important currency is audience attention.
Metrics such as the number of page views, likes, shares and comments quantify attention, which is then converted into economic and social capital.
Ideally, these metrics should be a product of networked users’ spontaneous and voluntary participation. Disinformation operations more often than not manufacture these metrics by using bots, hiring influencers, paying for crowdsourcing and developing computational tricks to game a platform’s algorithms.
Historically, democracies have relied on polls to understand public opinion. Despite their limitations, nationwide polls conducted by credible organizations, such as Gallup and Pew Research, follow rigorous methodological standards to capture the distribution of opinions in society as representatively as possible.
Public discourse on social media has emerged as an alternative means of assessing public opinion. Digital audience and web traffic analytic tools are widely available to measure the trends of online discourse. However, people can be misled when purveyors of disinformation manufacture opinions expressed online and falsely amplify the metrics about those opinions.
To counter the disinformation industry wherever it emerges, governments, media and the public need to understand not just the who and the how, but also the what – a society’s controversial ideologies and collective memories. These are the most valuable currency in the disinformation marketplace.
(K. Hazel Kwon is a U.S.-Korea NextGen Scholar under the sponsorship of the Korea Foundation and an Associate Professor of Journalism and Digital Audiences, Arizona State University)
(This is an opinion piece and the views expressed above are the author’s own. The Quint neither endorses nor is responsible for the same. This article was originally published on The Conversation. Read the original article here.)