
How to Prevent Digital Racial Profiling

Written by Micah Roberts

Our capacity to be manipulated is broad and well-documented. Semantic games when posing questions can result in radical changes in answers. George Monbiot cites a survey to illustrate the power of emotive lexical choice: 65% of Americans agreed that the government was spending too little on “assistance to the poor”, yet only 25% agreed that it was spending too little on “welfare”. This classic method of framing bears no comparison to the powerful repurposing of psychographics, typically a retail marketing technique relying on people’s personality, interests, opinions, and lifestyles, for use in the political sphere.

Political micro-targeting is a pervasive and dominating tool used across multiple jurisdictions that has evaded the reach of the law. Political campaigns and private actors have been able to target and exploit people’s anxieties and psychological proclivities via social media platforms to influence their decision-making. In the digital age, the influence these actors have in politics can rupture an individual’s process of autonomous and intelligible decision-making.

Unfortunately, the obscure and deceptive use of harvested personal data, and the connection between political campaigns and wealthy private actors, has unveiled an approach that seeks to personalise and localise advertisements and ultimately to direct debate instead of facilitating it. The satisfaction of a ravenous appetite for influencing voters through political micro-targeting is an affront to democracy, since it capitalises on fear to justify power grabs. Our democracy is further afflicted by the small number of global platforms, such as Facebook and Google, which consistently introduce initiatives to process information of huge volume and varied scope, resulting in what the American author and scholar Shoshana Zuboff calls ‘asymmetries in knowledge’, which are exploited by those with funds. This is damaging: election results can be decided by whoever pours the most money into micro-targeting.

Political campaigns can serve our democratic process well. Campaigns that truthfully and coherently outline a political stance, despite having an elective purpose, can inform voters to make decisions that reflect their core beliefs. More has to be done by the UK government and regulators to ensure that campaigns do not continue to drift away from their legitimate function.

Government and Regulatory Response

Political micro-targeting is not simply a one-off phenomenon embodied by the Cambridge Analytica scandal, unveiled by investigative journalist Carole Cadwalladr’s year-long investigation. Given the looming potential of a general election, it is paramount that the UK’s electoral law is updated to reflect the shift in political campaigning from the physical to the digital. In response to the Digital, Culture, Media and Sport Committee’s report outlining its recognition of issues concerning online political campaigning, fake news and disinformation, the government outlined plans to work closely with the Electoral Commission and social media companies to amend spending rules and policies on political ads. However, the government’s response shied away from any law preventing the posting of factually incorrect or false statements, or from palpable statutory changes more generally.

The Information Commissioner’s Office (ICO) report ‘Democracy Disrupted?’ (2018) makes ten recommendations, and the ICO has issued eleven political parties with letters of warning. These letters outlined non-compliance and areas of concern in advance of compulsory audits of the parties. Notably, the ICO has requested that the government introduce a code of practice, underpinned by primary legislation, on the use of personal data in political campaigns, a request reiterated in the Digital, Culture, Media and Sport Committee’s “Disinformation and ‘fake news’” report (February 2019). Until then, the Information Commissioner, Elizabeth Denham, has called for an ‘ethical pause’, for campaigners and platforms alike to deeply consider their wider responsibilities. The ‘Democracy Disrupted?’ report also highlights that political parties’ “special status” is recognised in law, which allows them to process political opinion data when carrying out legitimate activities.

However, the report also outlines parties’ responsibility as data controllers “to comply with the requirements of the law including data protection principles.” Unfortunately, this conflicted wording clearly leaves some scope for misuse. The “special status” of political parties (under s.8 Data Protection Act 2018) may prove an issue in the lead-up to the upcoming election tabled by Boris Johnson (12th December 2019), given rife criticism of the lack of transparency about “fair processing” despite enhanced privacy notice requirements under the GDPR.

However, the report highlights the DPA 2018’s attempt to stifle the extent to which data is harvested by asserting that businesses supplying data to political parties cannot “repurpose that personal data for political campaigning” without the informed consent of individuals. The ICO has already launched investigations into the misuse of personal data by online platforms. Had the new Data Protection Act 2018 been in place when the ICO began its investigation into Facebook, its Notice of Intent to impose a fine of 4% of annual turnover would have totalled £315 million, instead of the £500,000 penalty issued for the platform’s “lack of transparency and security issues” in contravention of the Data Protection Act 1998. However, Denham has been warned that fines are not the most effective remedy for radically preventing the misuse of personal data.

Steps taken by Social Media companies

Richard Allan, Facebook’s Vice President of Policy Solutions, has outlined three ways of dealing with micro-targeting’s disruption of democracy in the lead-up to the election: (1) removing fake accounts, (2) introducing transparency to political advertisements, and (3) setting up a dedicated operations team to monitor activity while the UK general election is underway.

These steps show that, despite Facebook clearly making progress to quell the effects of micro-targeting and the spread of misinformation in the UK, hazardous issues remain. It is plain to see that Facebook’s political problems are the product of a deeper systemic issue residing at the heart of the ‘free’ business model of online platforms, namely the surveillance and modification of human behaviour for revenue. Despite Richard Allan’s agreement with the DCMS Committee and the ICO that political campaigns need more rules in the digital era, a conflict of interest is self-evident.

UK electoral law should mimic and work in unison with data protection advancements, to provide clarity to all stakeholders (political parties, platforms and other private actors). If UK electoral law is not revamped to reflect the peak of the digital and the trough of the physical, then we risk leaving private platforms, like Facebook, to set policy unilaterally about what is permissible in political campaigns and to adjudicate over a domain of 2.5 billion monthly users.

Parliament and regulators have yet to decide, and entrench in statute, answers to multiple questions about digital political campaigning: what constitutes a political ad, the spending threshold above which political ads must declare their funders, and the rules governing the framing and content of political ads. Hopefully, the lead-up to the election will see increased scrutiny of the digital threat to British democracy, rather than being crowded out by Brexit.
