What YouTube taught me about Facebook Live and violent footage
A "technical glitch": that's what Facebook said caused the temporary removal from its service of the video capturing Philando Castile's shooting. Ben Thompson has a smart Stratechery column today about Facebook's emerging role as a Journalism Company, and the internal discussions (debates?) that are likely occurring.
I saw this firsthand during my time at YouTube, where, in addition to running the product org from 2007 to 2011, I sat on our Policy Committee. This group provided cross-functional input into the Community Standards maintained by our superb Policy and Operations teams. While some of the topics (what defines a "fetish video"?) had humorous aspects, overall there wasn't a more important document at the company than our Community Standards, or a more important job than enforcing those guidelines on behalf of YouTube's users.
While the Community Standards certainly evolved throughout my time at YouTube, it was really the Middle East protests, and eventually the Arab Spring, that forced us to formally confront questions of "newsworthiness" for content that would otherwise violate our policies around graphic violence. Remember Neda?
These were videos that depicted - or seemed to depict - the injury and death of protesting citizens at the hands of authoritarian governments, soldiers, or third-party agents. They often arrived on the site with little contextual data - sometimes none at all. And they weren't always uploaded from regional IP addresses, given the need to obfuscate identity or tunnel out of regions where YouTube was blocked - some were even transported on memory cards out of the Middle East to places where collaborators or NGOs could safely upload them.
Most of us wanted to work hard to keep these videos up on the site, and it took a collaborative effort to make that possible. Steps we took included:
- Partnering with in-region or specialized human rights NGOs to help us verify content as authentic.
- Training our review teams to distinguish graphic violence that violated our guidelines from footage with genuine newsworthiness. Remember, we weren't proactively reviewing videos, so it was user flagging and other algorithmic signals that put a video into the review queue (a rough sketch of this routing follows the list).
- Creating an interstitial page that warned viewers about a video's violent content, rather than removing the video entirely.
- Building features such as face blurring, which let uploaders obfuscate portions of a video to protect identities where exposure could lead to collateral damage - for example, the faces of people at a peaceful protest in a country where such organizing is outlawed (a sketch of the blurring technique also appears below).
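To make the review-queue mechanics concrete, here's a minimal sketch of how reactive moderation routing might work. Everything here - the Video fields, the thresholds, the decision names - is my own hypothetical illustration, not YouTube's actual implementation, which has never been public:

```python
from dataclasses import dataclass

# Hypothetical signals attached to an uploaded video; these names and
# values are illustrative assumptions, not real YouTube internals.
@dataclass
class Video:
    video_id: str
    user_flags: int = 0               # "graphic violence" flags from viewers
    algo_violence_score: float = 0.0  # classifier confidence, 0..1
    verified_by_ngo: bool = False     # authenticated by a human-rights partner

FLAG_THRESHOLD = 5    # assumed tuning values, purely illustrative
ALGO_THRESHOLD = 0.8

def needs_human_review(v: Video) -> bool:
    # Reactive moderation: nothing is reviewed proactively; a video enters
    # the queue only when users or algorithms raise a signal.
    return v.user_flags >= FLAG_THRESHOLD or v.algo_violence_score >= ALGO_THRESHOLD

def reviewer_decision(v: Video, is_newsworthy: bool) -> str:
    # A trained reviewer separates gratuitous violence from footage
    # with documentary value, as described above.
    if not is_newsworthy:
        return "remove"        # violates the graphic-violence policy
    if v.verified_by_ngo:
        return "interstitial"  # keep it up, behind a warning page
    return "escalate"          # hold for further verification

if __name__ == "__main__":
    clip = Video("abc123", user_flags=7, algo_violence_score=0.4)
    if needs_human_review(clip):
        print(reviewer_decision(clip, is_newsworthy=True))  # -> "escalate"
```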
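And for the face-blurring feature, here's a minimal sketch of the general technique using OpenCV's bundled Haar-cascade face detector. Again, this is my illustration of the idea rather than YouTube's pipeline; the file names and tuning parameters are assumptions, and a production system would use far more robust detection plus frame-to-frame tracking:

```python
import cv2

# OpenCV (the opencv-python package) ships a pretrained frontal-face
# Haar cascade; a production system would use a stronger detector.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def blur_faces(frame):
    """Detect faces in one frame and blur each detected region in place."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = frame[y:y + h, x:x + w]
        # A heavy Gaussian blur makes the face unrecognizable;
        # the kernel dimensions must be odd.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame

def blur_video(src_path: str, dst_path: str) -> None:
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out.write(blur_faces(frame))
    cap.release()
    out.release()

# Usage (hypothetical file names): blur_video("protest_raw.mp4", "protest_blurred.mp4")
```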
What's most important - IMO - is that YouTube executives started to think about human rights activists as a "user segment" we wanted to support. Commercially, there had been a lot of effort to understand what "celebrities," "media companies," "musicians," etc. wanted from the platform. But until we put some elbow grease behind supporting activism as a use case for YouTube, our understanding of it was anecdotal and at risk of being lost while we prioritized monetization, celebs, and other tangible business goals.
So my hope for Facebook Live - and Periscope - is that they don't just feel proud of having built tools that can help activism occur, but continue to put resources behind supporting activists. Not just with #StayWoke t-shirts (although those are cool), but with engineering resources and policy decisions.
Hunter Walk is a partner at Homebrew, a venture capital firm. He previously ran consumer product management at YouTube and was a founding member of the product and marketing team at Linden Lab, creator of the online virtual world Second Life. This post originally appeared on his blog and is republished here with permission.