- The UK general election has been defined by online ads, memes, and misinformation on social media, most of which is visible to the public and to journalists.
- But cybersecurity experts warned that the election risks being compromised by disinformation campaigns spread through the messaging service WhatsApp, whose private messages are effectively impossible for outsiders to analyse.
- The Facebook-owned service has already been forced to take action in both India and Brazil, where far-right propaganda was shared widely through private messages during elections.
- One researcher told Business Insider they "can't even imagine" the amount of misinformation being shared as British voters go to the polls.
- WhatsApp says it has made a number of changes in recent months to help combat misinformation on its platform.
Cybersecurity experts have warned the UK general election could be compromised by hidden misinformation campaigns on WhatsApp, just as the British public heads to the polls.
WhatsApp, owned by Facebook since 2014, has become a key battleground in the war on political disinformation around the world. In the last 18 months, elections in India and Brazil have seen far-right activists sharing false information via the messaging app, with WhatsApp taking direct action to limit the spread of propaganda.
Last month, Sky News reported that British Hindus had been targeted en masse with WhatsApp messages urging them to vote against Labour in the election, accusing the party of being "anti-India" and "anti-Hindu". The messages had been spread through WhatsApp's "forward" feature. WhatsApp limited the ability to forward messages globally in order to reduce the spread of false information.
Because WhatsApp messages are private, unlike public posts shared on Twitter or Facebook, cybersecurity researchers warned there could be "many more" disinformation campaigns going on that the wider public is unaware of.
"The difficult part of WhatsApp is that it's made up of closed groups," said Priscilla Moriuchi, a former cybersecurity expert at the NSA, who says the company should consider sharing some of its data with the research community.
Moriuchi, now head of strategic threat development at Recorded Future, said one way forward could be the analysis of metadata. Metadata is typically defined as "data about data" — in this case, details such as timestamps, location, and profile information, rather than the contents of messages themselves.
"You wouldn't necessarily need to see private messages," Moriuchi said. "You can feed them into a machine without ever looking at them, and you could try and register patterns associated with misinformation."
Researchers have not uncovered widespread evidence that WhatsApp is being used to spread false information about the general election, but experts say that's partly because it's so hard to investigate.
Moriuchi added: "When the election is so tight, and just a few swing seats could decide the fate of the country, I think it's worth investigating the extent to which these smaller communities are being targeted."
'We would never know unless those targeted raise the alarm'
The British government's home secretary, Priti Patel, has previously said governments should be allowed to read people's WhatsApp messages, claiming the end-to-end encryption of Facebook's messaging platforms risks hindering police investigations. Facebook has said it will not weaken WhatsApp's encryption or provide any kind of backdoor.
Andy Patel, a researcher at cybersecurity firm F-Secure, said it was "already difficult enough" analyzing disinformation campaigns on more public platforms like Facebook or Twitter.
"I can't even imagine what's going on," he said. "With WhatsApp, because it's all private, researchers essentially have to crowdsource their information. There could be any number of campaigns being run right now, but we would never know unless those targeted raise the alarm.
"We know companies like Cambridge Analytica exist. We know they're out there... It's certainly not outside the realms of possibility that they could be influencing our election through WhatsApp right now."
WhatsApp has said it will provide some information about its users to police, such as names, email addresses, and profile pictures, when it receives "valid legal requests" from authorities. It also publishes regular reports detailing how it deals with these requests.
The company told Business Insider it "cares deeply about the safety" of its users. It added: "We have made a number of product changes in recent months specifically to combat the harmful effects of misinformation and to protect election integrity."
These include using machine learning technology to identify and ban spam accounts, limiting the number of people users can forward the same message to, labelling messages that have been forwarded, and asking for user permission before they are added to a group.
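One of those changes, the cap on how many chats a message can be forwarded to at once, can be illustrated with a minimal sketch. The class names, the limit of 5, and the labelling logic are assumptions made for this example; WhatsApp's real limits have varied over time and its implementation is not public.

```python
FORWARD_LIMIT = 5  # illustrative cap, not WhatsApp's actual value

class Message:
    def __init__(self, text, forward_count=0):
        self.text = text
        self.forward_count = forward_count
        # Forwarded messages carry a visible label, as the article notes
        self.label = "forwarded" if forward_count > 0 else None

def forward(message, recipients):
    """Return forwarded copies, refusing to exceed the recipient cap."""
    if len(recipients) > FORWARD_LIMIT:
        raise ValueError(f"can only forward to {FORWARD_LIMIT} chats at once")
    return [Message(message.text, message.forward_count + 1)
            for _ in recipients]
```

The design intuition is simple: rate-limiting the forward mechanism slows viral spread without the platform ever inspecting message contents.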
WhatsApp declined to comment further.