Is the price of your privacy a new flashlight app?
Remember the dodo bird? No surprise if you don’t – it disappeared a long time ago. Unfortunately, our privacy looks like it’s heading the same way as our now legendary feathered friend. It doesn’t seem to matter that everyone agrees privacy is extremely important – we’re still happily giving it up in exchange for a navigation, social, or search app, or even a flashlight app.

We are giving up our privacy much the way Manhattan was allegedly bought from the Native Americans for a mere 60 guilders in 1626 (roughly $1,000 in 2006 values). We could argue about whether we sold our privacy at a fair price or too cheaply, but the fact remains that most of us have chosen to sell it to companies (and continue to sell it each day) in exchange for new apps, nicer GUIs, and so on.

So instead of privacy, perhaps we should talk about trust – whether we can trust the entities that have our private information to use it wisely, according to our contractual agreements and implicit expectations.
The contractual agreement sets some boundaries, such as whether a company can sell our data to third parties or spam us with advertising. Most people don’t bother to read these (lengthy) contracts, even though many of them are quite permissive, giving companies plenty of flexibility to exploit our data.

The implicit expectations are our assumptions about what will be done with the data and whether it will benefit us. Listening in on our conversations and sending targeted advertising might be acceptable to some; using the same information to update the bank about our financial problems (even if permitted under the general contract) won’t meet most people’s expectations. And although companies that breach the contractual agreement or the implicit expectations can find themselves sanctioned by legal entities and/or social media, that still doesn’t guarantee they won’t breach our trust, intentionally or unintentionally.

Trust is also about trusting those who “acquire” our private data to keep it safe: protected from hackers and from leaks.

So today, when I sell or give away my privacy, I ask myself: do I trust the “buying” entity? (Personally, I would trust Google with my data more than an anonymous flashlight app developer.) And when possible – and if it makes sense – I prefer to pay with money instead of with my privacy.

But this is the reality of being a citizen of the digital/IoT era: we’re forced to interact with more and more entities and vendors that collect more and more data about us. We’re even welcoming sensors, microphones, and cameras into our homes (from Amazon’s Alexa to Mattel’s Hello Barbie smart doll), which creates even more privacy concerns.

But again, the question isn’t whether or not we have privacy, but rather whether we trust companies like Amazon and Mattel with our personal private information. Unfortunately, it’s impossible to check and trust each of the devices/services/vendors – there are now simply far too many.
So here’s the big opportunity for the big players: convince me that I can trust you, and provide an extensive ecosystem of devices, services, and vendors for which you are willing to vouch. These players can be the Googles, Amazons, Microsofts, and Facebooks of today, but they can also be the communication service providers (CSPs). The competitive differentiator is who can create trust and provide a large ecosystem.

To create trust, CSPs need to verify that the digital ecosystem is functioning flawlessly and that it’s protected from classic fraud and security attacks as well as ever-changing threats. Doing this in a dynamic, adaptive way that keeps up with the fast pace of change takes more than human analysts and supporting systems from the last decade – CSPs need to use advanced artificial intelligence, machine learning, and robotic automation to ensure flawless operations and to continuously adapt the protection of the data. This is a key reason we’re seeing these technologies used more and more in the cyber-fraud protection and revenue-assurance domains; one example is the use of behavior analysis to protect citizens as part of the TM Forum Connected Citizen: Life in a Green, Clean, Smart City catalyst.

By the way, what is the price of your privacy – is it less than the cost of a flashlight app?
This article by Dr. Gadi Solotorevsky was originally published in TM Forum’s Inform. Gadi will be taking part in the panel debate on The reality of security & privacy concerns at TM Forum Live! in May in Nice, France.

Author: Dr. Gadi Solotorevsky doesn’t just enjoy creating blog articles – he’s got multiple patents to his name too. With years of experience in developing and deploying solutions, methodologies, and consulting in telecommunications and revenue analytics, he’s also one of the founders (and the Chair) of the TM Forum Revenue Assurance team, a TMF Distinguished Fellow and Ambassador, and one of the authors of TMF’s Revenue Assurance TR131 and GB941 – the de-facto standards and best practices widely used by the telecom industry. Formerly cVidya’s CTO, he’s now Chief Evangelist – Revenue Guard, at Amdocs.

Summary

Instead of privacy, perhaps we should talk about trust – whether we can trust the entities that have our private information to use it wisely, according to our contractual agreements and implicit expectations.