Opting out is criminal
I was recently intrigued by an article on how ‘one woman hid her pregnancy from big data’. The woman was Janet Vertesi, associate professor of sociology at Princeton University, who wanted to keep her pregnancy hidden from the internet – and succeeded, getting through the whole nine months without being served a single nappy ad. Vertesi and her husband followed a strict process that involved surfing baby sites via Tor’s anonymous browser, paying for goods in cash and keeping all ‘new arrival’ social chat off-limits. The cost of staying ‘baby marketing free’ left the couple feeling, by her own admission, like “criminals”.

Vertesi stated the experiment was not about consumption, but about “resisting the act of tracking it”. A central theme was privacy, in particular data and the methods used to collect it. Because she was an expectant mother, her data was worth far more than the average consumer’s, and in marketers’ intense pursuit of the 360-degree customer view, Vertesi discovered that ‘opting out’ was treated with suspicion.

For me, there was a further dimension to consider, one intrinsically linked to privacy: personalisation, particularly its impact on our retail and information-gathering experiences. Was Vertesi’s experience more or less positive because of the lack of personalised content, or was it simply the process of ‘hiding’ that was problematic?


Filtering the bubble – the problem (and benefit) of personalisation

In 2011 Eli Pariser published ‘The Filter Bubble: What the Internet Is Hiding from You’. It discussed the concept of internet users being trapped in their own unique digital bubbles, where they are fed content based on information known about them, such as location, search history and click behaviour. The consequence is that people are slowly having their ability to ‘discover’ eroded. Pariser cites the likes of Facebook’s personalised news feed and Google’s personalised search as key components of the bubble.
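
To make the mechanism concrete, here is a minimal, hypothetical sketch of how a feed might be ranked against what is already known about a user. It is not Facebook’s or Google’s actual algorithm; the profile fields, topic labels and weighting are invented purely for illustration.

```python
from collections import Counter

def rank_feed(items, profile):
    """Rank feed items against a user's known location and click history.

    `items` is a list of dicts with 'topic' and 'location' keys, and
    `profile` holds the user's location and click history. Both structures
    and the scoring are illustrative assumptions, not a real product's logic.
    """
    click_counts = Counter(profile["click_history"])  # topics the user already engages with

    def score(item):
        s = click_counts[item["topic"]]               # familiar topics float to the top
        if item["location"] == profile["location"]:
            s += 1                                    # local content gets a small boost
        return s

    # Highest-scoring items are shown first; unfamiliar topics sink out of view.
    return sorted(items, key=score, reverse=True)

feed = rank_feed(
    [{"topic": "politics", "location": "UK"},
     {"topic": "gardening", "location": "FR"},
     {"topic": "politics", "location": "FR"}],
    {"location": "UK", "click_history": ["politics", "politics", "sport"]},
)
```

The toy example simply shows that once ranking is driven by past behaviour, content you have never engaged with tends to drop out of view, which is the bubble Pariser describes.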

Since publication, there have been numerous articles suggesting that the bubble has burst. Whilst I’m inclined to agree, I think the concept still has merit. What we see online is manipulated (albeit based on our behaviour), which means we are used to seeing things that we might like or tend to agree with. However, this experience isn’t limited to technology; it’s societal.

If there’s something we don’t want to engage with on television, we switch over; it’s the same with a newspaper – we turn the page. We choose to spend the majority of our time with people who share similar viewpoints. Humans are creatures of habit who find it difficult to break routine. Personalisation of our digital experiences can’t simply be seen as a limiter on our overall experience of news, opinion and products; in many cases it enhances it.


Transparency and customisation

Personalisation can apply a unique richness to digital experiences, delivering contextual information and offers directly into our hands. However, there’s a flipside when it becomes too smart – as when US retailer Target knew a teenager was pregnant before her own father did.

There is an emerging polarisation within personalisation: it’s either too sophisticated or, as in Vertesi’s experience, too difficult to opt out of. A personalised experience needs to be delivered through user-initiated customisation, with greater transparency from organisations about what data they want and why.
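
As a purely hypothetical sketch of what that could look like in practice, the snippet below gates each personalisation signal behind an explicit, user-set consent. The signal names and consent structure are my own invention, not any particular company’s implementation.

```python
# Hypothetical per-signal consent: personalisation only draws on data the
# user has explicitly switched on, and the user can see exactly what is used.
consents = {"location": True, "search_history": False, "purchase_history": False}

user_data = {
    "location": "Bristol",
    "search_history": ["pushchairs", "cots"],
    "purchase_history": ["folic acid"],
}

def build_personalisation_profile(data, consents):
    """Return only the signals the user has opted in to sharing."""
    return {signal: value for signal, value in data.items() if consents.get(signal)}

profile = build_personalisation_profile(user_data, consents)
print(profile)  # {'location': 'Bristol'} - everything else stays private
```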

A move towards this can be seen in the renewed emphasis on tools that empower users to control how far their experiences are personalised. For example, the search engine DuckDuckGo provides non-personalised search results, browser extensions like Ghostery show who is tracking you, and the last 18 months have seen the emergence of companies like Datacoup and Meeco – personal data marketplaces born of a need to redress the data value exchange between customer and business. In using the likes of Datacoup, people are beginning to take greater control of the information they’ll see and, ultimately, the choices they’re presented with.

Personalisation has become ingrained in our digital experiences; it’s what people have come to expect and, I’d argue, it enhances those experiences. However, our focus shouldn’t be centred on it ‘limiting discovery or influencing choice’. What we should be concerned about is having greater control and transparency over what is presented to us.