- On Wednesday, lawyers from Facebook, Twitter, and Google are appearing before the Senate Intelligence Committee and the House Intelligence Committee to answer questions about Russia-linked election meddling in the US.
- The first hearing begins at 9:30am ET. The second hearing begins at 2pm ET.
Top lawyers from Facebook, Twitter, and Google are testifying before the Senate and House Intelligence Committees on Wednesday, a day after they were grilled over exactly what their companies knew and when they knew it.
The first Wednesday hearing, where the general counsels of Facebook, Twitter, and Google will appear before the Senate Intelligence Committee, kicks off at 9:30am ET.
You can watch a livestream of the 9:30am ET Senate Intelligence Committee hearing right here.
Colin Stretch, Facebook's general counsel, will represent Facebook. Sean Edgett, Twitter's acting general counsel, will represent Twitter. Kent Walker, Google's general counsel, who reports to CEO Sundar Pichai, will represent Google.
The second hearing of the day kicks off at 2pm ET, when the same lawyers appear before the House Intelligence Committee's investigation into election interference. You'll be able to watch the 2pm ET hearing live here.
For a full recap of Tuesday's hearing, where Facebook's general counsel, Colin Stretch, admitted "there were signals we missed," click here.
Opening remarks
Senate Intel Committee Vice Chairman Sen. Mark Warner began Wednesday's hearing with some prepared remarks, stressing that "not one of us is doing enough to stop" Russian operatives from hijacking the "national conversation" in an attempt to "make Americans angry."
In this age of social media, you can't afford to waste too much time - or too many characters - in getting the point across, so I'll get straight to the bottom line.
Russian operatives are attempting to infiltrate and manipulate American social media to hijack the national conversation and to make Americans angry, to set us against ourselves and to undermine our democracy. They did it during the 2016 U.S. presidential campaign. They are still doing it now. And not one of us is doing enough to stop it.
That is why we are here today.
In many ways, this threat is not new. Russians have been conducting information warfare for decades.
But what is new is the advent of social media tools with the power to magnify propaganda and fake news on a scale that was unimaginable back in the days of the Berlin Wall. Today's tools seem almost purpose-built for Russian disinformation techniques.
Russia's playbook is simple, but formidable. It works like this:
- Disinformation agents set up thousands of fake accounts, groups and pages across a wide array of platforms.
- These fake accounts populate content on Facebook, Instagram, Twitter, YouTube, Reddit, LinkedIn, and others.
- Each of these fake accounts spends months developing networks of real people to follow and like their content, boosted by tools like paid ads and automated bots. Most of their real-life followers have no idea they are caught up in this web.
- These networks are later utilized to push an array of disinformation, including stolen emails, state-led propaganda (like RT and Sputnik), fake news, and divisive content.
The goal here is to get this content into the news feeds of as many potentially receptive Americans as possible and to covertly and subtly push them in the direction the Kremlin wants them to go.
As one who deeply respects the tech industry and was involved in the tech business for twenty years, it has taken me some time to really understand this threat. Even I struggle to keep up with the language and mechanics. The difference between bots, trolls, and fake accounts. How they generate Likes, Tweets, and Shares. And how all of these players and actions are combined into an online ecosystem.
What is clear, however, is that this playbook offers a tremendous bang for the disinformation buck. With just a small amount of money, adversaries use hackers to steal and weaponize data, trolls to craft disinformation, fake accounts to build networks, bots to drive traffic, and ads to target new audiences. They can force propaganda into the mainstream and wreak havoc on our online discourse. That's a big return on investment.
So where do we go from here?
It will take all of us - the platform companies, the United States government, and the American people - to deal with this new and evolving threat.
Social media and the innovative tools each of you have developed have changed our world for the better. You have transformed the way we do everything from shopping for groceries to growing our small businesses. But Russia's actions are further exposing the dark underbelly of the ecosystem you have created. And there is no doubt that their successful campaign will be replicated by other adversaries - both nation states and terrorists - that wish to do harm to democracies around the globe.
As such, each of you here today needs to commit more resources to identifying bad actors and, when possible, preventing them from abusing our social media ecosystem.
Thanks in part to pressure from this Committee, each company has uncovered some evidence of the ways Russians exploited their platforms during the 2016 election.
For Facebook, much of the attention has been focused on the paid ads Russian trolls targeted to Americans. However, these ads are just the tip of a very large iceberg. The real story is the amount of misinformation and divisive content that was pushed for free on Russian-backed Pages, which then spread widely on the News Feeds of tens of millions of Americans.
According to data Facebook has provided, 120 Russian-backed Pages built a network of over 3.3 million real people. From these now-suspended Pages, 80,000 organic unpaid posts reached an estimated 126 million real people. That is an astonishing reach from just one group in St. Petersburg. And I doubt that the so-called Internet Research Agency represents the only Russian trolls out there. Facebook has more work to do to see how deep this goes, including looking into the reach of the IRA-backed Instagram posts, which represent another 120,000 pieces of content.
The anonymity provided by Twitter and the speed by which it shares news makes it an ideal tool to spread disinformation. According to one study, during the 2016 campaign, junk news actually outperformed real news in some battleground states in the lead-up to Election Day. Another study found that bots generated one out of every five political messages posted on Twitter over the entire presidential campaign.
I'm concerned that Twitter seems to be vastly under-estimating the number of fake accounts and bots pushing disinformation. Independent researchers have estimated that up to 15% of Twitter accounts - or potentially 48 million accounts - are fake or automated. Despite evidence of significant incursion and outreach from researchers, Twitter has, to date, only uncovered a small percentage of that activity. Though, I am pleased to see that number has been rising in recent weeks.
Google's search algorithms continue to have problems in surfacing fake news or propaganda. Though we can't necessarily attribute it to the Russian effort, false stories and unsubstantiated rumors were elevated on Google Search during the recent mass shooting in Las Vegas. Meanwhile, YouTube has become RT's go-to platform. You have also now uncovered 1,100 videos associated with this campaign. Much more of this content was likely spread through other platforms.
It is not just the platforms that need to do more. The U.S. government has thus far proven incapable of adapting to meet this 21st century challenge. Unfortunately, I believe this effort is suffering, in part, because of a lack of leadership at the top. We have a President who remains unwilling to acknowledge the threat that Russia poses to our democracy. President Trump should stop actively delegitimizing American journalism and acknowledge and address this real threat posed by Russian propaganda.
Congress, too, must do more. We need to recognize that current law was not built to address these threats. I have partnered with Senators Klobuchar and McCain on a light-touch legislative approach, which I hope my colleagues will review. The Honest Ads Act is a national security bill intended to protect our elections from foreign influence.
Finally - but perhaps most importantly - the American people also need to be aware of what is happening on our news feeds. We all need to take a more discerning approach to what we are reading and sharing, and who we are connecting with online. We need to recognize that the person at the other end of that Facebook or Twitter argument may not be a real person at all.
The fact is that this Russian weapon has already proven its success and cost effectiveness. We can all be assured that other adversaries, including foreign intelligence operatives and potentially terrorist organizations, are reading their playbook and already taking action. We don't have the luxury of waiting for this Committee's final report before taking action to respond to this threat to our democracy.
To our witnesses today, I hope you will detail what you saw in this last election and tell us what steps you will undertake to get ready for the next one. We welcome your participation and encourage your continued commitment to addressing this shared responsibility.
Chairman Richard Burr also gave an opening statement, welcoming the general counsels from Facebook, Twitter, and Google and noting that Wednesday's hearing was a chance for them to "correct the record," since "my sense is that not all aspects of those stories have been told accurately."