Digital Lifestyle - Why companies need to talk about "Privacy Anxiety"

Data privacy gets a lot of attention these days. But it needs the right kind of attention.

As a mandatory task, a legal issue, and a more or less annoying admin topic, data privacy is a regular item on the management agenda. Yet, over-saturated with GDPR requirements, cookie banners and “the right to be forgotten”, it is easy to overlook that protecting customer data in the age of digitalization actually touches the core of any business that runs online channels. In fact, it can play an existential role in corporate communications and in future-proofing the company’s business model.

We are only now beginning to reckon with a phenomenon that is as disturbing as it was predictable: Privacy Anxiety.


What online businesses need to know about the psychology of their customers

So, I am not complaining that there is too little talk about privacy. Privacy is very present in the business context, from the C-level down to almost all corporate departments. However, it is usually discussed within a very limited scope:

  • What do I actually have to do to meet the minimum requirements?
  • How do I legally secure my company?
  • And how can I reconcile the legal requirements with the apparently unavoidable pressure to collect and store masses of customer data for marketing and service purposes?

Almost unaffected by and unconnected to all this is the private perspective on the responsible handling of personal data that we all know so well. I am talking about the mental load of "always-on" digital overkill.

Reaching for the smartphone right after waking up, social media posts specially designed for *bedtime scrolling*, checking the phone 80 to 100 times a day, the physical unease and Fear Of Missing Out (belittled by the nickname "FOMO") when you haven't looked at your phone for more than three hours...

Concerns about an unhealthy digital lifestyle are no longer a taboo. They are part of the zeitgeist: 'Digitize yourself!', 'Improve yourself!', but also: 'Carpe diem!' and 'Know how to switch off sometimes'. Naturally, there are - funnily enough - lots of digital wellbeing apps to choose from...

And this is where the social concern for digital overkill ends: we acknowledge it in conversation, we install a well-being app, but apart from that, any concrete practical measures to get better are deferred to the level of individual agency. Those who feel the ill effects of the "always-on" lifestyle are said to be responsible for themselves. If they are not happy with some aspects of their life, they need to change their own behaviour. And if it's really bad, undergo therapy. It's personal, so why should companies care?

I think this reasoning reveals a dangerous fallacy, because these two perspectives - caring for customer data and caring for personal lives - belong together.

Busting a common myth: Personal data privacy is not a private matter

The combination of two commonly held beliefs leads companies onto a deceptive path of false security concerning their clients' privacy claims. It does not seem evident why the impact of digital consumption on private life should be tied to a business responsibility. But I will argue that any company collecting digital data must feel responsible; even more, that it will actually want to ensure that all privacy needs are met.

On the one hand, we have successfully separated the consumer as market participant from the private individual eager to protect and not share their data, forgetting that they are one and the same human being.
This separation is so sharp that even when someone manages to recover after struggling with digital addiction, this is seen as evidence that framing privacy concerns as an individual responsibility is indeed justified. The possibility of recovery seemingly puts the burden of responsibility on the consumers' shoulders. Yet the fact that the majority struggles to find a healthy balance (e.g. the role of social media in teen anxiety) is never discussed as possible evidence to the contrary. What if the struggle is the norm precisely because specific factors in the environment are setting consumers up for failure? Why should the individual be the one solving the problem then?

And on the other hand, a very different idea is firmly planted in our minds, too. It is the thought that, as with all things, in digital life too "the dose makes the poison."

Statements such as "You can deal with anything in a healthy or unhealthy way" shrug off concerns by pretending that the level of privacy is a mere question of choice, as if any kind of consumption scenario were open to any person.


But this obviously does not hold true in the case of securing private data.

The information contained in private data is indeed about a person's private life. But once it is externalized and made accessible through the use of digital devices, users cannot opt out of data collection the way they can simply not buy a consumer good, like chocolate. Privacy is in fact not a good or a service at all. It is a relationship: a relationship with those who, sitting at the other end, receive, read, store, use, and perhaps dispose of one's data.

Every industrial revolution has its symbolic disease

Having your private data held by other people is of course a scary thing, especially if you had no real choice in the matter and no transparency about what happens to it. This will inevitably have an impact on people’s health. History shows that this is no minor thing, and that we can prepare for the consequences by learning from the past.

It is fair to say that the internet has transformed how we do business, along with all other aspects of life. We have experienced a deep transformation over the last 20 to 30 years.

As with all innovations that come to define a new age, so too with digitization: it is all too easy to regard everything as new, unprecedented and incomparable. Ultimately, however, we are currently 'only' experiencing another industrial revolution. Structural comparisons can be drawn. And so far, every industrial revolution has produced its symbolic disease.

This became particularly clear with the first industrial revolution, which brought us mechanization: large factories and the archetype of the industrial factory worker who, suffering from notoriously bad working and living conditions, fell victim to a tuberculosis epidemic. The health of the workforce (or more aptly: counterbalancing the suddenly reduced lifespan) became such a critical failure factor for business that companies and insurers began to finance cures. (Pulmonary sanatoriums such as Beelitz near Berlin paint a picture of this period that is as impressive as it is gruesome.)


After the automation of production through programming, we are now in the fourth industrial revolution, with the internet deeply rooted in infrastructure, industry and commerce, and connected digital devices established as standard household appliances.

Today, when we think of the specific health drawbacks and diseases of our time, the high incidence of back and neck pain and other musculoskeletal disorders comes to mind first and foremost. "Tech neck" seems to be our only issue with staring down at smartphone and laptop screens. It is not.

Largely unnoticed, however, is the rise in mental health problems. Mental illnesses and behavioral disorders such as depression, burnout, ADHD, dementia, dissociative disorders, anxiety disorders and addictive behavior have risen over the years and cause great and widespread suffering in our time.

Is it not obvious, then, to draw a connection between the formative technology of our age and health issues arising from changed behavior?

Data as a "means of payment"?

I think it is quite obvious that every digitally active company is implicated when its customers struggle with the consequences of the alienation of their private data. But it may still happen that companies do not want to accept this responsibility, or will point the finger somewhere else.

So, let’s put this aside for a moment and follow a different line of thought.

People who struggle with the effects of the digital lifestyle will inevitably become more cautious in their behavior, including being wary of sharing personal data. And yes, companies want and need this data: it has become central to their marketing, sales and even production activities.

So, when companies think about higher privacy requirements, they are not only talking about spending more money on data storage, management, and security. Having less personal data at their fingertips also translates into less visibility in marketing, lower conversion in sales, and poorer market fit in product strategy. In short, increased privacy standards translate into a double loss: more spending and less revenue.

From this angle, it is not far-fetched to link personal data very closely to money. Perhaps we could even go so far as to call it a modern means of payment.

Users "pay" by sharing their personal data. This is how platforms such as Facebook, Instagram, TikTok and even Google Search can be used without fees: the data collected along the way is a far more valuable asset. (Soberly translated: "If you don't have to pay for a service, you are the product.")

When private details are no longer private

The understanding that you have to hand over your data, or else be obliged to pay to keep your privacy intact, is particularly obvious in the way many news portals try to monetize their service.

Publishing was dealt a huge blow by freely available online content, and publishing houses and platforms are still trying to find their way to a new business model. Many now allow free access to quality content if the reader enables tracking, and if not, flat-out ask for a one-time payment or a subscription fee.

Here, the equation has already been made: data can seemingly be translated into money. The exchange rate: the data you share by reading one article equals 3.50 euros.

However, what sounds like a nice and tidy solution is a fundamentally false analogy, because money and data differ in one very important respect.

Money (fiat money!) is simply printed paper or a number on an account. It is unconnected to the people who own it and can therefore flow freely back and forth between them.

Just as money is not bad in itself (it depends on what you do with it), so too collected data is not bad in itself; it depends on who treats it with what conscience and intentions.

Data, however, is a completely different matter, and it is precisely from this difference that the very particular potential for abuse and threat emanates.

Personal data not only speaks about one's person and one's life; it belongs to the person themselves. It is a digital part of one's self. And the possibility of any kind of unplanned use, manipulation or "expropriation" is already an attack on one's personal identity.

Introducing "Privacy Anxiety"

This is why losing your data is bad for you even if it is not used to do harm. The crack in your digital identity is the problem. On top of that, data brokers and business models like Cambridge Analytica's are not a surreal nightmare; they do exist and are actual market players.

In addition, the inclusive and democratizing effect of the Internet acts as a massive amplifier:

Losing private data, and thus experiencing an attack on the right to one's self, is not a luxury problem of privileged individuals. Nor is it a milieu problem, like the tuberculosis of the working class in late 19th-century Berlin. Online commerce, social media, smartphones and GPS networks are part of everyday life everywhere, at all times and for people of all ages. It affects everyone and is thus a mass phenomenon.


In this context, a new mental disorder is emerging that may become the disease of our time. It manifests in many ways and does not yet have a specific name, but by analogy to the phenomena of "Nuclear Anxiety" and "Climate Anxiety", it would be appropriate to call it "Privacy Anxiety": the fear of losing privacy.

The true costs of Privacy Anxiety

The General Data Protection Regulation (GDPR) is currently the standard set of requirements for companies to responsibly protect customer data. And it does in fact already confirm the companies’ responsibility in the process of data collection, as it applies the “polluter pays” principle: the data collector bears responsibility for the consequences. It is also universal, applying to all cases, not just to the event of damage caused. A certain level of quality in security and communication must be maintained as soon as the first data packet is read and stored, whether the company ever intends to use it or not.

Now that GDPR is in place, companies might feel that they have done their fair share when it comes to data privacy. But GDPR operates on the level between the legislator and companies qua data collectors. It completely disregards the active role of customers and the demands they place on the companies they do business with.

Above all, GDPR is no cure for privacy anxiety, because consumers are still expected to hand over their data and might not want to. GDPR is a tweak to the rules of the game, but it does not change the game itself.

Fearful clients will avoid sharing data and will eventually avoid doing business with companies they do not feel safe with. This poses the huge risk of losing customer base and hence revenue. So the question of how to secure and maintain customer relationships is at least as important as the question of GDPR compliance.

Clients who do not feel safe can destroy the very basis of market and business, even if you are GDPR compliant. Business is built on trust, and trust grows between people; it is not a tech feature.

For this reason, companies should take the initiative to counteract privacy anxiety by setting high privacy standards that fulfil actual customers' needs and, most importantly, by starting to communicate authentically and honestly with their customers about why and how they are working with their data.

Why online retailers in particular need a new approach to privacy in order to prevent permanent damage to their business will be discussed in the next article.