Russian Twitter trolls pounced on the University of Missouri’s woes in 2015 using the same techniques they applied to disrupt the 2016 presidential election, a U.S. Air Force officer wrote in an article published recently in Strategic Studies Quarterly.

In the aftermath of the Nov. 9, 2015, resignation of UM System President Tim Wolfe during protests over racial issues, some feared a violent white backlash.

It was fueled in part by a real post on the anonymous social app Yik Yak from Hunter Park, then a student at Missouri University of Science and Technology in Rolla, threatening to “shoot every black person I see.” The fear was amplified and spread by a now-suspended Twitter account that warned, “The cops are marching with the KKK! They beat up my little brother! Watch out!” The tweet included a photo of a black child with a severely bruised face and the hashtag #PrayForMizzou.

The fear generated by the real and fake posts caused many students to stay home on Nov. 10, with several professors canceling classes and some stores along Ninth Street near campus closing their doors.

The tweet and photo were fake, Lt. Col. Jarred Prier, director of operations for the 20th Bomb Squadron, wrote in “Commanding the Trend: Social Media as Information Warfare” for the winter edition of Strategic Studies Quarterly. Prier’s article expands on the master’s degree thesis he wrote while studying at Air University’s School of Advanced Air and Space Studies.

The Twitter account, with the handle @FanFan1911 and user name Jermaine while tweeting about Mizzou, was used to spread panic about a fake chemical factory fire in St. Mary Parish, La., in 2014 and fear of Syrian refugees in Germany in 2016, Prier wrote. The account’s original MU tweets were retweeted by an army of 70 robot accounts and hundreds of legitimate users and became part of the huge volume of tweets about the university at that time, he wrote.

“The rapidly spreading image of a bruised little boy was generating legitimate outrage across the country and around the world,” Prier wrote. “However, a quick Google image search for ‘bruised black child’ revealed the picture that ‘Jermaine’ attached to the tweet was a picture of an African American child who was beaten by police in Ohio over one year earlier. The image and the narrative were part of a larger plot to spread fear and distrust. It worked.”

Payton Head, then-president of the Missouri Students Association, took the bait, Prier notes in his article. In a Facebook post, Head warned students to stay away from windows in residence halls. “The KKK has been confirmed to be sighted on campus. I’m working with the MUPD, the state trooper and National Guard,” Head wrote.

Head quickly retracted and deleted the post, apologizing for sharing misinformation, which he said came from “multiple incorrect sources.” The Missouri National Guard confirmed it had been contacted about the rumored KKK presence on campus, but an official said the Guard was “never working with anyone” to respond to the rumor.

“In a state of alarm, I was concerned for all students of the University of Missouri and wanted to ensure that everyone was safe,” Head wrote in his apology. “The last thing needed is to incite more fear in the hearts of our community.”

Prier, a 2003 graduate of MU’s ROTC program, explains in the article the techniques Russian online agents use to disrupt political life in democracies and why those techniques are effective.

A human actor writes a fake post, and robot accounts created on the same platform automatically spread it. To reach a wide audience, the post uses words or phrases that are already getting heavy attention online, or trending, which inserts it into a conversation already underway, Prier wrote.

“One of the primary principles of propaganda is that the message must resonate with the target,” he wrote. “Therefore, when presented with information that is within your belief structure, your bias is confirmed and you accept the propaganda.”

To convince those not already disposed to believe, the keys are repetition and getting a trusted media source to report the story line pushed by the propaganda, Prier wrote. Several media outlets reported on fears that the KKK was on campus before the rumor was put to rest.

The episode helped create and maintain a false narrative that the MU campus was wracked by violence or riots during the protests, which were peaceful.

Prier’s study “would certainly help explain the origin of that ‘news’ that we were trying to combat and in some cases continue to do so today,” MU spokesman Christian Basi said Tuesday.

The discovery of Russian trolls using events at MU to sow distrust isn’t especially surprising, said state Sen. Caleb Rowden, R-Columbia.

“I think there are a lot of people out there, maybe they are Russian, maybe they are not, but there are a lot of people out there who want to instigate and divide people on Twitter and other places,” he said.

The selection of @FanFan1911’s tweet about the KKK to highlight in the study wasn’t entirely random. Prier noticed the original tweet about the KKK on Nov. 11, 2015, and responded by saying “stop spreading lies” and posting a link to a Huffington Post report from 2013 about the beating.

But @FanFan1911 wasn’t the only account he studied for the article, Prier wrote.

“I mention only one particular user in this article, but I also monitored a dozen or so accounts that contributed to that hoax,” he wrote. “Each account followed a pattern that also happened to align with noted Russian influence operations in Europe and eventually in the U.S. presidential election.”

The effort targeting MU may have been a warm-up for 2016. The Russian efforts to insert fake reports into news coverage of the 2016 presidential campaign were extremely successful, Prier wrote. One of the most-shared stories about the election on Facebook was a false report that Pope Francis had endorsed Republican candidate Donald Trump, he states.

“Command of the trend enables the contemporary propaganda model, to create a ‘firehose of information’ that permits the insertion of false narratives over time and at all times,” Prier wrote.