Securing Democracy Dispatch
News and Commentary
Tracking China’s efforts to expand its influence in Europe: A new report by the Global Public Policy Institute finds that “China’s rapidly increasing political influencing efforts in Europe and the self-confident promotion of its authoritarian ideals pose a significant challenge to liberal democracy as well as Europe’s values and interests.” Report co-authors Thorsten Benner, Jan Gaspers, Mareike Ohlberg, Lucrezia Poggetti, and Kristin Shi-Kupfer describe China’s “flexibly influencing toolkit” of overt and covert tactics, deployed primarily in “three arenas: political and economic elites, media and public opinion, and civil society and academia,” and warn that “European states increasingly tend to adjust their policies in fits of ‘preemptive obedience’ to curry favor with the Chinese side.” Rick Noack issues a similar warning in The Washington Post that “Europe’s embrace of China, even as it warns against Russian meddling, might benefit from a certain degree of wariness,” citing “China’s infrastructure investments in eastern and southern Europe in cash-strapped countries such as Greece … [and] improved Chinese-Norwegian trade ties [that] have coincided with a Norwegian effort to drop some of its human rights criticism of Beijing.” Noack also describes differences between Russian and Chinese influence operations, finding that while the Russians “have mainly shaped the discourse of the wider public while offending Europe's ruling elites, China's influence mainly appears to target leading politicians, academics, and journalists in an active outreach effort at conferences, receptions or less public meetings.” (GPPI, The Washington Post)
Russian hackers target the U.S. defense industry as former DHS head warns about Russian cyber-threats: Former DHS Secretary Jeh Johnson sounded the alarm about the threat Russian hackers pose to U.S. voting infrastructure, pointing to states that have “done little to nothing ‘to actually harden their cybersecurity,’” after Jeanette Manfra, the head of cybersecurity at the Department of Homeland Security, reconfirmed that Russian hackers targeted 21 states in 2016 and penetrated a small number of them. The Associated Press also reported that the Russian hacking group Fancy Bear targeted “at least 87 people working on militarized drones, missiles, rockets, stealth fighter jets, cloud-computing platforms or other sensitive activities” in an effort to obtain “sensitive U.S. defense technology.” According to Charles Sowell, a former senior advisor to the U.S. Office of the Director of National Intelligence, “The programs that they appear to target and the people who work on those programs are some of the most forward-leaning, advanced technologies … And if those programs are compromised in any way, then our competitive advantage and our defense is compromised.” While it is unclear what may have been stolen, the discovery exposes a “national vulnerability in cybersecurity: poorly protected email and barely any direct notification to victims.” (NBC News, Associated Press)
Russia targets its influence operations: According to The Guardian, Russia is attempting to hack the Oscars by waging a disinformation smear campaign against Feras Fayyad, director of the Oscar-nominated film “Last Men in Aleppo.” Since his film was nominated, Russian state-run media outlets have portrayed it as a “propaganda piece funded by Western governments” and an “Al-Qaida promotional film.” And Jason Schwartz in Politico reports that Russian-influenced Twitter accounts tracked by the Hamilton 68 dashboard, building on the success of the #releasethememo campaign, have accelerated their push of “issues related to the ‘deep state.’” In the Financial Times, Simon Kuper discusses Russia’s use of disinformation to meddle in European politics, finding that it tailors its narratives to each country’s domestic situation. Kuper believes that “Putin has found disinformation a cheaper and arguably more effective route to influence than tanks or foreign investments.” As a reminder that Russia seeks to sow chaos by appealing to sentiments on both the left and the right, Anton Shekhovtsov tweeted: “The German far-left Die Linke party is now competing with the far-right AfD: Russian media report that a Die Linke delegation headed by Andreas Maurer will soon illegally visit Russia-annexed Crimea. Maurer already did that in March 2017.” (The Guardian, Politico, Financial Times, Twitter)
Unilever adjusts its ad buys as more evidence of social media-influenced bias builds: Unilever, one of the world’s largest advertisers, announced that it “will not invest in platforms or environments that do not protect our children or which create division in society, and promote anger or hate.” Unilever CMO Keith Weed stated that the decision is directed at “having a positive impact on society and whether we as a company want to engage with companies that are not committed to making a positive impact.” Unilever’s decision comes as new studies show how bias is cultivated among social media users. According to Jack Nicas in The Wall Street Journal, YouTube’s recommendations reinforce those biases. Nicas writes that “YouTube is the new television, with more than 1.5 billion users,” and that its “recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven’t shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints.” Distinguishing YouTube from Facebook and Twitter, where users see content from those they follow, Nicas states that “YouTube takes an active role in pushing information to users they likely wouldn’t have otherwise seen.” The Financial Times published research by Phil Howard of the Oxford Internet Institute showing that “Trump supporters are more polarized and share a wider range of misinformation than any other U.S. audience group on Twitter.” The study finds that “low-quality, extremist, sensationalist and conspiratorial news published in the U.S. was overwhelmingly consumed and shared by rightwing social network users.” (The Wall Street Journal, Twitter, The Guardian)
Twitter changes the way it shows retweets and CNN uncovers Russian propaganda on the platform: Twitter confirmed to Slate that it will change the way it shows retweets; according to April Glaser, the platform will, “instead of showing how many times a tweet has been retweeted … display a new metric that includes a combination of replies and retweets that shows how many ‘people are talking about this,’” in an effort to do away with its “massive bot infestation.” But Dipayan Ghosh, writing in Slate, questions how much the change will accomplish, describing how “the purveyors of disinformation have clearly determined that large-scale social media platforms offer a tremendous opportunity to move people to believe their messaging.” The “key to their tremendous ongoing success,” according to Ghosh, “is their use of the audience segmentation tools developed by the leading Internet advertising platforms,” which allow disinformation purveyors to target “demographic groups that are homogenous across certain set of characteristics.” For Ghosh, “A more thorough solution must begin with the segregation of the interests of the disinformation agent and the Internet platform.” And CNN reported this week that it uncovered “hundreds of Russian propaganda videos” on the platform that Twitter “removed only after CNN brought them to Twitter's attention,” which “raises new questions about the nature of the company's effort to find and remove content produced by Russians trying to meddle in American politics, and how comprehensive it has been.” According to Senator Mark Warner, "Twitter shouldn't wait for Congress or anybody else to send a 'to-do' list with specific accounts to delete … The company needs to take responsibility and be proactive about stopping Russians and other bad actors who are abusing its platform." (Slate, CNN)
Experts weigh in on solutions as majority of Americans believe Russia will influence U.S. midterms: Clifford May writes in the Washington Times that Russian “disinformation is not a synonym for misinformation. The latter implies information that happens to be wrong. The former implies an attempt to deceive public opinion for strategic purposes.” Discussing Putin’s disinformation objective, May believes “his mission is to restore the power Russia lost when the Soviet Union collapsed. And, in his calculus, strengthening Russia and weakening the West amount to the same thing.” Citing the Alliance for Securing Democracy’s Laura Rosenberger and Jamie Fly and their push for “a whole-of-government response … that cuts across national security and domestic policy spaces,” May also calls for the U.S. government to improve its offensive and defensive cyber capabilities and to “get way ahead in the race for artificial intelligence.” As a new NBC News poll finds that 57 percent of Americans believe “Russia will try to influence this year's midterms,” there is no time to wait to shore up our defenses. (The Washington Times, NBC News)
Russian trolls on Tumblr and the influence of fake personas on Twitter: BuzzFeed and Jonathan Albright, director of the Tow Center for Digital Journalism at Columbia University, uncovered Russian trolls’ use of yet another social media platform: Tumblr. They found that “the blogging platform was in fact home to a powerful, largely unrevealed network of Russian trolls focused on black issues and activism … targeting mostly teenage and twenty-something African Americans.” While the research shows that many of the Russian-run Tumblr accounts share names with accounts on other social media platforms that have been linked to the Internet Research Agency, Tumblr has not been subject to the same congressional scrutiny as the other platforms. And Thomas Rid, professor of strategic studies at the Johns Hopkins School for Advanced International Studies, describes how fake accounts shape what real users see: “A lot of users on Twitter think they don’t follow any bots … and therefore think that doesn’t concern me … but that’s not how this works. For example, imagine you see a tweet, a post on Twitter, and that has been retweeted or liked thousands, tens of thousands of times. You never check whether these retweets, which make something appear very important and viral, whether they actually are real or not. So it’s possible to give actual messages, like a hashtag, to give it more lift and more weight through automation and through automated abuse.” (BuzzFeed, PBS Newshour)
Governments respond to the threat of disinformation: In advance of elections in July, Mexico's National Electoral Institute (INE) is partnering with Facebook to combat disinformation, and it also plans to work with Google and Twitter. Lorenzo Córdova, the president of INE, said in a statement: "On the one hand, the identification of fake news, and on the other hand, the shared conviction between Facebook and INE that the best way to combat the so-called fake news, is to generate accurate, valid and objective information," calling the agreement the first of its kind worldwide. And Ukraine is partnering with the United States and the U.K. on a new media literacy campaign entitled “Learn to Discern.” According to U.S. Ambassador to Ukraine Marie Yovanovitch, “education is absolutely the key to cultivating smart media consumers and to countering disinformation.” As Twitter admitted that 49 Russian accounts actively tried to influence the Brexit referendum, 11 members of the U.K. Parliament questioned representatives from Google, Twitter, and Facebook in the United States on Thursday, with Julian Knight, one of the British lawmakers, asking, “Why has your self regulation so demonstrably failed and how many chances do you need?” As reported by The Washington Post, “The British lawmakers appeared to have no qualms with questioning the moral integrity and fundamental business practices of the American tech darlings.” While the line of questioning focused on Russian interference, British lawmakers were also concerned with the effects of “sophisticated ad operations,” with many suggesting that “new regulations may be required to boost transparency in online advertising and to better police deliberate lies and misinformation.” (BuzzFeed, KyivPost, engadget, The Washington Post)
Our Take
Laura Rosenberger, director of the Alliance for Securing Democracy, and Jamie Fly, senior fellow and director of the Asia program at the German Marshall Fund, discuss Russia’s efforts to meddle in the United States before the 2018 midterm elections with NPR’s Tim Mak, stating: "These are not networks that are necessarily always traditional propaganda ... a lot of it is just trying to rip apart Americans, to sow chaos within our political system, to pit Americans of both parties against each other." (NPR)
Speaking with Clifford May, founder and president of the Foundation for Defense of Democracies, on February 6 about “Russia’s Disinformation Offensive,” Laura Rosenberger, director of the Alliance for Securing Democracy, and Jamie Fly, senior fellow and director of the Asia program at the German Marshall Fund, call for a bipartisan response to “combat Putin’s disinformation offensive.” (FDD)
Clint Watts, senior fellow at the Alliance for Securing Democracy, appeared on NBC’s Meet the Press to discuss Russia’s use of information warfare to undermine our democracies, including its sophisticated targeting of voters in swing states during the 2016 presidential election. (NBC)
Hamilton 68 dashboard
Hamilton 68 dashboard: A week after the FISA memo was released, the pro-Kremlin influence network has done its part to keep the conspiratorial “deep state” fires stoked on social media. This past week saw the promotion and amplification of no fewer than half a dozen hashtags related to the memo, from #obamagate and #obamaknew to #fisagate. Beyond the hashtag shuffle, Russian-linked accounts continued their assault on the U.S. justice system by seeding Twitter with a steady diet of content meant to undermine faith in the rule of law. Close to half of the top URLs promoted by the network pushed deep state narratives, including Uranium One, the FISA memo, attacks against Peter Strzok and the Mueller investigation, and various conspiracy theories. Since the launch of the dashboard, content focused on undermining law enforcement and the Justice Department has increased steadily, suggesting an attempt not only to divide Americans but also to erode faith in our systems of government.
Quote of the Week
“The Russians were involved in our elections for one simple reason: to erode trust in our democratic institutions. We're allowing that to happen by not being able to have this conversation dispassionately.”
- Representative Will Hurd (R-TX), February 5, 2018
Worst of the Week
As flagged in this space two weeks ago, Russian-linked accounts have been actively pushing content from a 4chan message board where a user (or users) known as QAnon, who claims to be an intelligence insider with a Q security clearance, purports to have compromising information on a smorgasbord of deep state scandals. This week’s addition, which, surprise-surprise, has been picked up and promoted by InfoWars, is that former President Clinton, in the now-infamous tarmac meeting with then-Attorney General Loretta Lynch, offered Lynch Antonin Scalia’s open Supreme Court seat in exchange for her promise to intervene in the investigation into Hillary Clinton’s e-mail server. If the alleged quid pro quo weren’t tawdry enough, QAnon’s posts (and the InfoWars article) suggest that Scalia was actually murdered as part of this plot to remake the Justice Department. While this specific conspiracy theory clearly lives on the fringes of the Internet, it dovetails neatly with the broader deep state narrative being pushed by Russian-linked accounts.