
Facebook's former director of monetization says Facebook intentionally made its product as addictive as cigarettes — and now he fears it could cause 'civil war'

Aaron Holmes   

  • A former Facebook director lashed out at the company's business model during testimony before Congress on Thursday, saying Facebook's focus on driving engagement outweighs its consideration of potential harms.
  • In his prepared remarks before a House committee hearing, Facebook's former director of monetization Tim Kendall said Facebook "took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset."
  • Now, Kendall says he's worried Facebook is contributing to extremism in the US and is "pushing ourselves to the brink of a civil war."

Facebook's former director of monetization Tim Kendall says he had a role in making Facebook as addictive as cigarettes — and worries that Facebook could be just as damaging to its users.

In testimony before the House Consumer Protection and Commerce Subcommittee published Thursday, Kendall accused Facebook of building algorithms that have facilitated the spread of misinformation, encouraged divisive rhetoric, and laid the groundwork for a "mental health crisis."

"We took a page from Big Tobacco's playbook, working to make our offering addictive at the outset," Kendall said in prepared remarks submitted to lawmakers ahead of Thursday's hearing. "The social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding — at worst, I fear we are pushing ourselves to the brink of a civil war."

Kendall, who is now CEO of the time-management app Moment, joined Facebook as its first director of monetization in 2006 and remained in the role until 2010. He said he initially thought the job would involve balancing Facebook's interest in revenue with the wellbeing of its users, but that the company prioritized profits over everything else.

"We sought to mine as much attention as humanly possible and turn into historically unprecedented profits," Kendall said.

Facebook's algorithm rewards shocking content and divisive rhetoric because it evokes extreme emotional responses, holding users' attention and generating more ad revenue, Kendall told lawmakers.

"These algorithms have brought out the worst in us. They've literally rewired our brains so that we're detached from reality and immersed in tribalism," he said.

Facebook did not immediately respond to Business Insider's request for comment on Kendall's testimony.

Kendall isn't the first former Facebook employee to raise concerns about the platform's capacity to sow division. A Facebook engineer quit in protest last month, accusing the company of "profiting off hate." More recently, a fired Facebook data scientist reportedly wrote a whistleblower memo accusing the company of failing to direct enough resources to fighting misinformation.

Facebook has also faced activist campaigns urging it to more robustly crack down on misinformation and hate speech. More than 1,000 companies joined an advertiser boycott of the platform this summer led by civil rights activists, and this month, influencers staged a day of protest over hate speech on Facebook and Instagram.

At Thursday's hearing, lawmakers said the spread of misinformation on Facebook could be cause for future government regulation of social media platforms.

"Driven by profit and power and in the face of obvious harm, these mega-companies successfully have convinced governments all over the world to essentially leave them alone ... big tech has helped divide our nations and has stoked genocide in others," said Rep. Jan Schakowsky, an Illinois Democrat who chairs the House Consumer Protection and Commerce Subcommittee.

Meanwhile, Republicans on the subcommittee focused primarily on claims of anti-conservative bias, a frequent talking point of President Donald Trump. They characterized platforms' occasional fact-checks of Trump's posts that violate misinformation policies as censorship. While Trump has frequently railed against those fact-checks, Republicans have provided minimal evidence of broader censorship of conservative ideas.

"Free speech is increasingly under attack," said Rep. Cathy Rodgers of Washington, the ranking Republican on the subcommittee. "I am extremely concerned when platforms apply inconsistent content moderation policies for their own purposes ... there's no clearer example of a platform using its power for political purposes than Twitter, singling out President Trump."

Republicans and Democrats alike said they supported reforming Section 230, the law that shields social media platforms from legal liability for the content of users' posts. Attorney General William Barr announced Wednesday that the Department of Justice has urged Congress to amend the law, but did not immediately elaborate on how it should be changed.
