A growing focus on data privacy among regulators across the globe is forcing companies to rethink how they deliver personalised content and marketing to users online.
Where once it was easy to slurp up personal data without people even being aware, in recent years we’ve seen growing public and political unease about just how much companies know about individuals.
As algorithms delivering personalised content and ads across social networks, apps and the web have become ubiquitous, so more people are expressing concerns about who is accessing their data and how it’s being used.
These have been heightened by ongoing high-profile data breaches and scandals such as the Facebook/Cambridge Analytica affair. But many people simply feel slightly violated when, for example, their smartphone displays a barrage of ads for similar products to those they may have casually searched for on their work computer hours earlier.
The EU responded to growing concerns over data privacy with its General Data Protection Regulation (GDPR), which came into force last May threatening big financial penalties for those not collecting and securing personal data lawfully.
Its scope is wide-ranging, aiming both to ensure companies take compliance and security more seriously and to give individuals proper control over the use of their personal data. The US looks to be following suit, too, with growing support for initiatives such as the California Consumer Privacy Act, which comes into force next year.
Legislation hasn’t yet fully bedded down, but as penalties become more widely applied and test cases clarify remaining grey areas, it looks increasingly likely that many of those operating in the personalised marketing space – most notably retail marketers and adtech firms – will need to make fundamental changes to their business models.
Will Robertson, a partner at law firm Osborne Clarke, says: “Historically, advertisers have relied on their legitimate interests to strengthen relationships with customers and raise brand awareness when engaging in personalised advertising.
“They’ve argued consumers prefer relevant and engaging content rather than blanket spam messaging. However, critics argue the level of processing involved is much more intrusive than an average consumer would expect. Indeed, there are a number of ongoing regulatory complaints against the adtech industry that specifically take aim at the online personalised advertising system.”
Among those who have adtech abuses in their crosshairs is the Open Rights Group (ORG), which campaigns for digital civil liberties in the UK. In September 2018, ORG executive director Jim Killock was among a clutch of complainants against Google and IAB Europe for GDPR violations in their real-time bidding systems.
“Whenever a targeted ad is served to an internet user, this is the result of their personal data being shared and bid out to potentially hundreds of companies,” ORG states. “That data can include a user’s location, browsing habits and other intimate details. The bid request process does not protect against unauthorised access to personal data, which is a violation of GDPR rules.”
Since then, a number of similar complaints against the industry have been filed in Poland, Belgium, Luxembourg, the Netherlands and Spain.
“There must be consent and a clear means for the consumer to know who has their data and a means to have it corrected or destroyed,” says ORG’s Killock. “We are a long way from that in the ad market. It functions on trust and maximises sharing without regard to data rights such as retrieval, correction or deletion.”
In June 2019, the Information Commissioner’s Office (ICO) issued its update report into adtech and real-time bidding, which agreed with the complainants that current data collection and use practices were unlawful. “The ICO has confirmed massive illegality on behalf of the adtech industry,” says Killock. “They should be insisting on remedies and fast.”
In the meantime, users are increasingly turning to individual means of protecting their privacy, such as VPNs and online ad blockers. Paul Bischoff, privacy advocate at tech research firm Comparitech.com, believes this will result in some firms shifting their focus to personalising ads around content rather than users.
“So instead of saying, ‘This person bought cat food three days ago, so we’ll show them an ad for cat litter,’ the logic is more like, ‘This person is reading an article about dogs, so we’ll show them an ad for dog treats.’ This model doesn’t intrude on the user’s privacy, and I would be surprised if it did not surface as an alternative to more data-hungry ad networks,” says Bischoff.
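The distinction Bischoff draws can be sketched in a few lines: contextual targeting matches an ad to the page being read, never to the reader. This is a minimal illustration only – the keyword-to-ad mapping and function names are hypothetical, not any real ad network’s API.

```python
# Hypothetical mapping from content keywords to ad categories.
# Note: no user profile, history or identifier is consulted anywhere.
AD_CATEGORIES = {
    "dog": "dog treats",
    "cat": "cat litter",
    "travel": "holiday insurance",
}

def pick_ad(article_text: str) -> str:
    """Choose an ad based only on the words in the article being read."""
    words = article_text.lower().split()
    for keyword, ad in AD_CATEGORIES.items():
        if keyword in words:
            return ad
    return "generic brand awareness"  # fallback when no topic matches

print(pick_ad("Ten tips for training your dog"))
```

Because the function’s only input is the page content, there is no personal data to secure, share or delete – which is precisely why this model sidesteps the GDPR concerns raised against real-time bidding.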
But many companies in the business of personalised advertising and marketing see GDPR not as a threat, but a catalyst for more effective personalisation.
David Snocken, director of data and strategy at internet marketing and advertising firm Clicksco, says: “In Europe, the impact of getting GDPR wrong is pretty hefty, so organisations are having to take it seriously. That provides a push for industry to change, but there’s a pull as well – the benefits that come from better-quality personalisation.
“For a long time the industry has focused on reach – getting their ads in front of as many people as possible. Now we need to move from quantity to quality – targeting fewer people, but more accurately.”
People will consent to the use of their personal data, Snocken believes, but only if they are rewarded with far more valuable and compelling personalised online experiences for doing so.
“Good quality personalisation shouldn’t be intrusive – it should be a seamless experience from the consumer’s point of view, based on a detailed knowledge of where they are in the buying process,” says Snocken. “Using high-quality data on consumers – with their consent – in order to deliver a better experience is ultimately where the industry needs to move to.”
Raj De Datta, CEO at search, merchandising and content platform Bloomreach, agrees. “Consumers are not impressed with a constant flow of advertisements, but they do value a flawless online journey. By using data to create the perfect online experience, businesses have the potential to increase engagement and profitability,” says De Datta.
At the moment, however, the mechanisms by which consumers can control the use of their data are often confusing and intrusive. “Most consumers want a simple way of exercising some control, but without it being too onerous,” says Clicksco’s Snocken.
ORG’s Killock also believes there’s a need for more transparent mechanisms, but he believes best practice is to minimise data sharing. “For instance, there are now companies seeking to give users more control of their information by using systems that do their profiling on the user’s device rather than sharing information onwards,” he says.
Such systems could finally square the privacy versus personalisation circle, giving people a genuine guarantee of privacy while allowing them to easily share data with companies they trust in return for ever more compelling personalised services.
Indeed, they could be the only way to realise the full benefits of personalisation in areas where people have traditionally been reluctant to share data because of its sensitivity – like health and finance.
Personal data security
Julian Ranger, founder of personal data security platform Digi.me, says: “There is only one person who knows all about you and that’s you. So why can’t you be the aggregating platform? You can bring all your data together and it can be completely decentralised – each person keeps hold of their own.”
Digi.me is currently working with organisations like Barclays, the BBC and BT to prove the system works effectively. Ranger says: “Companies never actually see, touch or hold the data. Individuals decide where to store it – Dropbox, Google Drive, etc – and it is fully encrypted.”
The system employs a layered consent model – any app can ask a user for data using a certificate system and if the user agrees the data is then processed on their own device without ever leaving it. “That means people can share massively more data with full privacy,” says Ranger.
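The layered consent model Ranger describes – an app presents a certificate, the user decides, and any processing then happens locally – can be sketched as follows. This is an illustrative outline under stated assumptions: all names (ConsentRequest, process_on_device, the interest labels) are hypothetical and do not reflect digi.me’s actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRequest:
    app_name: str
    certificate: str  # stands in for the app's signed certificate
    purpose: str      # what the app says it will use the result for

def process_on_device(user_data: dict, request: ConsentRequest,
                      user_agrees: bool) -> Optional[str]:
    """Run the app's analysis locally; raw data never leaves the device."""
    if not user_agrees:
        return None  # no consent: nothing is processed or shared
    # Only a derived, non-identifying label is returned to the app;
    # the underlying purchase history stays on the device.
    purchases = user_data.get("purchases", [])
    if any("running shoes" in p for p in purchases):
        return "interested-in-fitness"
    return "no-match"

req = ConsentRequest("FitnessApp", "cert-123", "interest segmentation")
data = {"purchases": ["running shoes size 9", "paperback novel"]}
print(process_on_device(data, req, user_agrees=True))
```

The key property is that the company sees only the derived label, never the data it was computed from – which is what allows people to “share massively more data with full privacy”.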
There are other potential solutions, too. For example, the Enigma initiative is developing a protocol that will allow people to secure and control data on decentralised blockchains.
Any organisation looking to personalise marketing in future must appreciate that the days of data slurping are on the way out. And as users regain control, the onus will be on businesses to give customers compelling reasons to share their data – and the confidence they can be trusted with it.