
Tough questions, few answers as social media companies face Congress 

From left, Google's Law Enforcement and Information Security Director Richard Salgado, Twitter's Acting General Counsel Sean Edgett, and Facebook's General Counsel Colin Stretch, appear together during a Senate Committee on the Judiciary, Subcommittee on Crime and Terrorism hearing on Capitol Hill in Washington, Tuesday, Oct. 31, 2017. (AP Photo/Andrew Harnik)

Testimony by representatives of Facebook, Twitter, and Google on Capitol Hill Tuesday revealed that the reach of Russia-linked content intended to influence the 2016 election and sow discord in the U.S. was far greater than the companies initially reported, and it underscored that the threat posed by foreign disinformation and propaganda goes well beyond paid advertisements.

Facebook General Counsel Colin Stretch, Twitter Acting General Counsel Sean Edgett, and Google Law Enforcement and Information Security Director Richard Salgado appeared before a Senate Judiciary subcommittee Tuesday, and the companies will meet with both the Senate and House Intelligence Committees Wednesday.

Facebook had initially reported that accounts connected to Russia purchased about 3,000 ads that reached about 10 million Americans, although more than half were viewed after the election. Stretch told senators Tuesday that, once unpaid posts are included, content linked to the Internet Research Agency “troll farm” may have been seen by 126 million users over the past two years.

In addition, Facebook found about 120,000 pieces of “Russia-linked content” on Instagram and deleted over 170 accounts on the photo-sharing service.

Similarly, Twitter announced last month that about 200 accounts linked to Russia had been identified and suspended. On Tuesday, Edgett confirmed that more than 2,700 accounts connected with the Internet Research Agency posted 131,000 tweets between September and November 2016. In addition, 36,000 bot accounts produced 1.4 million tweets in that time that were viewed 288 million times.

Google has largely avoided the spotlight that has shone on Facebook and Twitter since last fall, and its platforms appear to have played a much smaller role in Russia’s meddling efforts. Salgado reported that the Internet Research Agency purchased about $4,700 in ads on its platforms that were not narrowly targeted to specific audiences.

The company also identified 1,108 videos comprising 43 hours of political content on YouTube channels associated with the Russian campaign. Those videos were viewed a total of 309,000 times in the U.S. between June 2015 and November 2016.

Lawmakers concerned by apparent foreign efforts to manipulate Americans online recently introduced the bipartisan Honest Ads Act, intended to force internet companies to disclose more about political advertisers and maintain a public file of paid electioneering communications. The bill would apply to ads promoting specific candidates as well as those addressing contentious political issues.

The Senate version of the bill was co-sponsored by Sens. Amy Klobuchar (D-Minn.), Mark Warner (D-Va.), and John McCain (R-Ariz.). The legislation is backed by several pro-transparency and democracy organizations.

“Online political advertising represents an enormous marketplace, and today there is almost no transparency,” Warner said in a statement. “The Russians realized this, and took advantage in 2016 to spread disinformation and misinformation in an organized effort to divide and distract us.”

McCain told reporters Tuesday that the latest news is just further evidence that this law is needed, and he expects even more revelations in the future.

“This is a centipede and other shoes will drop,” he said.

Klobuchar said prior to the hearing that she appreciates some of the steps Facebook and Twitter have announced to combat foreign interference but that they do not go far enough.

“Every company is doing something different, announcing something different,” she said. “You need rules of the road that cover these companies, and you need an enforcement mechanism.”

She suggested public disclosure of information about Facebook advertisers would have made it easier to identify and dismiss Russian propaganda.

“The Hillary Clinton campaign could have had a fighting chance to say, ‘Well, that's not real,’” she said. “And the media would be able to say, ‘Where are these ads coming from?’”

Klobuchar also noted that radio, TV, and print outlets are already subject to regulations similar to those the senators are proposing and have proven capable of identifying political content.

“I cannot wait to ask them why they claim it would be hard for them to ferret out what issue ads are,” she said.

The Internet Association, a trade group that represents about 40 internet companies including Facebook, Twitter, and Google, thanked lawmakers for introducing the bill last week but stopped short of endorsing it.

“We are reviewing the legislation and look forward to further engagement with the sponsors,” President and CEO Michael Beckerman said in a statement. “This is an important issue that deserves attention and the internet industry is working with legislators in both the House and Senate interested in political advertising legislation.”

On Tuesday, the group laid out its own principles for legislation and regulation, mostly focused on transparency and uniformity in the rules for ads that are clearly identifiable as “election advertising.”

“Legislation or regulation should establish clear definitions and objective criteria to trigger removal or disclosure of election advertising,” a press release stated. “Vague definitions that leave a lot of room for interpretation will create significant challenges for the regulator and/or the platform responsible for deciding which ads fall into the category of ads to be disclosed or removed.”

The Internet Association warned against requirements that would discourage legitimate political speech or hold platforms liable for claims made by advertisers. It also recommended that Congress strengthen the government’s authority to prevent foreign actors from interfering in elections.

Sen. Lindsey Graham (R-S.C.) said during his opening statement at the hearing that the purpose is to figure out “how do we keep the good and deal with the bad” on social media. Crafting legislation that effectively impedes foreign efforts to spread propaganda to U.S. internet users may be more difficult than senators expect, though.

“How to address foreign election influence in the digital age is one of the great challenges of our generation,” said Laura DeNardis, faculty director of the Internet Governance Lab at American University. “No one stakeholder can solve all of these problems. The principle of an informed citizenry is a bedrock of American democracy, and preserving this principle requires actions from private industry, government, and citizens.”

The internet poses unique challenges because of the way information travels without borders, is easily shared, and can be microtargeted to specifically receptive audiences.

“Implementing transparency and accountability in the online environment is necessary but is also complicated because of the byzantine complexity, self-service and automated approach, and massive scale of online ads,” DeNardis said.

According to Katherine Haenschen, an assistant professor of communication at Virginia Tech, lawmakers trying to apply to social media posts the same rules they apply to radio or television ads may be overlooking other potent promotional tools that are available to users.

“Thinking about it like an ad, like a TV or radio commercial, doesn’t quite get at the unique paid features of Facebook,” she said.

Pages, posts, and events can be promoted without traditional ads, and Haenschen suspects even free posts from Russian accounts that generated significant traffic were getting some sort of signal boost.

“It’s really hard to create a Facebook page that gets tens if not hundreds of thousands of fans without some paid feature,” she said.

Once a post reaches a receptive audience, though, those people will share it, and their friends will give it more credence because it comes from someone they know. The few thousand ads the Russian accounts paid for, the content most easily regulated, are only the tip of the problem.

Experts say companies like Twitter and Facebook cannot realistically block all content from foreign actors intended to influence U.S. politics, but they can do more than they currently do.

“They can't enforce a policy like ‘We will ban posts sent at the behest of Russian officials with the goal of stirring up confusion and unrest in U.S. politics,’” said Drew Margolin, an assistant professor at Cornell University who studies collective communication behavior.

They can make it more difficult and expensive to distribute such content, though, for instance by more effectively identifying and scrutinizing coordinated campaigns or by limiting the use of anonymous and unverified accounts.

“This doesn't guarantee they won't happen again, but like TSA increased security after 9/11, when there is almost no oversight in place, it's often easy to make big gains by just taking simple steps,” he said.

If nothing else, the massive reach of the Russia-linked content means companies have to drop the “we’re just a platform, we don’t mean any harm” defense and recognize their public responsibility, according to Margolin.

“Whether sought or not or desired or not, they must not only address the specific concerns related to the 2016 election in particular, but all of the ways that their sites might be used to harm the public interest while serving their business goals,” he said.

James Scott, a senior fellow at the Institute for Critical Infrastructure Technology and the Center for Cyber-Influence Operations Studies, said Facebook’s only recent adoption of new guidelines for ads makes clear the companies could have done more, sooner, to fight foreign interference.

“Social media platforms control what content transmits on their platform,” he said. “Rather than accept any ad, it is within their capabilities to vet the message and source of the content before it is presented to users.”

Scott fears that changing policies or laws now may prove to be too late.

“Now instead of proactively combating emerging influence operations, legislators and companies are retroactively addressing the consequences,” he said. “Legislation regulating corporate dragnet surveillance and foreign interference was necessary years ago.”

Cybersecurity experts told Sinclair earlier this month that social media companies will not be able to police this content on their own, and the public must learn to inoculate itself against Russian misinformation. Scott agrees that public awareness and cyber-hygiene need to improve, but he emphasized that this does not absolve the platforms of responsibility.

“Companies that interact with consumers for hours of their daily lives and that collect data on their every action should at the very least be required to protect those same users from influence operations perpetrated by special interests and foreign threat actors,” he said.

When social media users are left to fend for themselves against foreign influences, the experience is only as secure as the people sharing content into their feeds make it.

“The problem is that, as with any public good, I am not insulated from the failure of others to be responsible,” Margolin said. “The spread of news and information is not about personal consumption, but the collective consequences. If I remain informed, but everyone around me chooses not to be, I can suffer consequences.”
