
Donald Trump's winning formula can still be misused by companies — sites and links you share reveal your stance on a topic

Tech | 3 min read

  • New research shows that sharing links or simply ‘liking’ a post can help companies figure out your stance on a particular topic, irrespective of the content.
  • A similar formula was allegedly used by Cambridge Analytica to help Trump win the 2016 US Presidential election.
  • The researchers now want to use their findings to develop ways to keep companies from automatically detecting user preferences, and so protect user privacy.
Social media is a treasure trove of information on how people react to events and public figures — something most people became familiar with after Donald Trump’s tryst with Cambridge Analytica during the 2016 US Presidential Election.

Researchers have now found that it’s possible for companies to find patterns in how and what you share online, even while ignoring the actual content.

Simply sharing a link or ‘liking’ someone else’s post can be enough for companies to figure out what your opinion might be on a particular topic, according to a study published as a preprint on arXiv.

Even if you never discuss a topic directly, analysts can use the context of your activity to figure out how you would feel about it.

An upgrade to Cambridge Analytica

This phenomenon was pushed into the spotlight by the Cambridge Analytica scandal in 2018.

Donald Trump’s campaign hired Cambridge Analytica during the 2016 US Presidential race to target voters and design his political advertising strategy.

The company used data analysis, data mining and data brokerage on information from over 50 million Facebook users without their permission to target them with personalised political advertisements depending on their tastes and preferences.

Three sets of networks

Conducting a text-independent study, the researchers used three sets of networks for their analysis on the SemEval stance database — which includes 400 tweets on five political, social and religious topics — to figure out just how telling your online activity can be.

The first set is called the ‘interaction network’. This includes the accounts and web domains a user interacts with and quotes — retweets, replies, mentions and links. This network covers active social media users.

But if you’re a passive or so-called ‘silent’ user, the other two networks can decipher your opinions too. The ‘preference network’ includes the accounts and websites in the tweets that a user likes, while the ‘connection network’ looks at the accounts the user follows and the accounts that follow the user.

Even if you might not share or post content, your preferences can be determined using other users and accounts.
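The three networks described above can be pictured as simple sets of accounts and domains extracted from a user’s activity. The sketch below is purely illustrative — the field names and data layout are invented, not taken from the study:

```python
# Illustrative sketch: building the three network feature sets from a
# user's activity record. All field names here are hypothetical.

def extract_networks(user):
    """Return the interaction, preference and connection feature sets."""
    # Interaction network: accounts and domains the user actively engages
    # with via retweets, replies, mentions and shared links.
    interaction = set()
    for tweet in user.get("tweets", []):
        interaction.update(tweet.get("mentions", []))
        interaction.update(tweet.get("retweeted_accounts", []))
        interaction.update(tweet.get("domains", []))

    # Preference network: authors and domains of the tweets the user likes
    # (captures 'silent' users who rarely post).
    preference = set()
    for liked in user.get("likes", []):
        preference.add(liked["author"])
        preference.update(liked.get("domains", []))

    # Connection network: who the user follows, and who follows the user.
    connection = set(user.get("follows", [])) | set(user.get("followers", []))

    return {
        "interaction": interaction,
        "preference": preference,
        "connection": connection,
    }
```

A user who never posts still produces non-empty preference and connection sets, which is why the study can profile passive accounts.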

What your online activity says about you

In the ‘connection network’, social influence manifests itself through users’ friends. Essentially, you tend to follow people who share the same opinion as you on a topic.

According to the study, users with a stance against the legalisation of abortion tend to follow accounts like @prolifeyouth and @march_for_life.

But for the other two networks, socially influential accounts include news handles.

For instance, having @telegraph in your ‘interaction network’ has a positive correlation with a favourable stance towards climate change. But news accounts are less accurate at determining a user’s stance on issues like the feminist movement or atheism.

Overall, the researchers found that network features are better equipped to detect a person’s opinion than text or content alone. Combining the two can produce even better results.
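The idea of blending the two signals can be shown with a toy scorer. Everything below is invented for illustration — the seed accounts (beyond the two named in the study), the keywords and the weighting are assumptions, not the researchers’ actual model:

```python
# Toy stance scorer: blends network overlap with text overlap.
# Seed lists are hypothetical, except @prolifeyouth and @march_for_life,
# which the study names as accounts followed by anti-legalisation users.

SEED_ACCOUNTS = {
    "against": {"@prolifeyouth", "@march_for_life"},
    "favor": {"@reprorights"},  # invented example account
}
SEED_WORDS = {
    "against": {"prolife"},     # invented example keywords
    "favor": {"prochoice"},
}

def stance_score(networks, tokens, alpha=0.5):
    """Pick the stance whose seeds best match the user.

    networks: dict of feature sets (interaction/preference/connection).
    tokens:   words from the user's own posts (may be empty for silent users).
    alpha:    weight on the network signal vs. the text signal.
    """
    feats = set().union(*networks.values())
    scores = {}
    for stance in SEED_ACCOUNTS:
        net_overlap = len(feats & SEED_ACCOUNTS[stance])
        txt_overlap = len(set(tokens) & SEED_WORDS[stance])
        scores[stance] = alpha * net_overlap + (1 - alpha) * txt_overlap
    return max(scores, key=scores.get)
```

With `alpha` between 0 and 1, a silent user (empty `tokens`) can still be scored from network features alone — which is the study’s core point.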

No such thing as a ‘neutral’ opinion

The researchers also found that there’s no such thing as not having an opinion or a ‘neutral stance’. Everyone has an opinion that’s biased one way or another.

Realising how vulnerable this leaves user data, since preferences can be detected even when users never discuss a topic, the researchers now want to develop methods to counter automatic stance detection and protect user privacy.

See also:
These malfunctioning AI incidents show the need for stronger user privacy measures

Facebook still has incredible control over your data

Here’s what global tech CEOs have to say about India's data protection laws.
