Machine Learning: the New Champion of Data Privacy

We live in a world where sharing personally identifiable information is an obligation, not a choice. Globally we face difficult challenges daily, from the recent COVID-19 pandemic to human trafficking, and more routinely in everyday services, healthcare and government administration. What we know for certain is that, whatever the purpose, people deserve to know that the data they share is as secure as possible and that only the organisations entitled or authorised to use it have access.

Commonly, data is held by multiple data controllers: users share the same data over and over with different parties just to access services. Yet from a user's perspective the purpose of sharing is usually the same - "I simply want access to a service". Wouldn't it be good if user data could be accessed in a way that allowed trusted services to use it responsibly, without dozens of companies holding different copies of the same data?

Putting Machine Learning between essential services and your user data

Surprisingly, it turns out that a technology often thought of as "privacy invasive" holds the answer to the data-sharing nightmare: Machine Learning (ML). ML is the building block of many automated systems, some of which are popularly called Artificial Intelligence (AI), though that label is often more myth than fact. Toridion have developed a new kind of data-sharing technology, powered by autonomous ML, which acts as a 'guardian' of personal data. It offers a unified memory that can recognise users and key identifying facts without holding any of the data in the traditional sense.

"Our approach makes it impossible for data controllers to access your data out of context"

Beyond the obvious and frankly major problem of simply keeping data secure, another obstacle is knowing where the user data you require is located. If an organisation needs to access information it didn't create, it first needs to know where to look, right? In fact, it first has to know that the data exists at all. Once the data source is located, there is the further challenge of determining whether that external source is authentic or even up to date.

Instead of allowing multiple companies to share your data, the Toridion system "teaches - not stores" data to a centralised ML system, which then learns to recognise user data in a particular context (or several).

In a future scenario where a company needs to validate a user's identity or presented documents, for example, rather than submitting multiple data-sharing requests to perhaps multiple agencies, the company simply asks the ML system whether it recognises the user in the presented context.

"data is not made "accessible" in a traditional sense, rather the ML behaves more like a trusted human advocate able to vouch for the data presented"

By entrusting the user's personal identity to the ML, and allowing the guardian to decide whether the company asking the question is entitled to access it in the current context, data is not made "accessible" in a traditional sense. Instead, the ML behaves more like a trusted human advocate, able to vouch for the data presented - but only if it determines that the context of the request is valid.
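To make the "teach, don't store" idea concrete, here is a minimal sketch of the guardian pattern described above. Everything in it is an illustrative assumption, not Toridion's actual implementation: a real system would use a learned model, whereas this toy uses context-bound one-way fingerprints as a stand-in, purely to show the query shape - a company never reads the data, it only asks "do you recognise this user in this context?"

```python
import hashlib
import hmac

class GuardianSketch:
    """Toy stand-in for the ML 'guardian'. It never keeps raw user data,
    only one-way fingerprints bound to a context, and it answers yes/no
    verification queries. (Hypothetical names; hashing replaces the
    learned model purely for illustration.)"""

    def __init__(self, secret: bytes):
        self._secret = secret
        self._fingerprints = set()

    def _fingerprint(self, user_data: str, context: str) -> str:
        # Bind the data to its context, so it cannot be vouched for
        # out of context.
        msg = f"{context}|{user_data}".encode()
        return hmac.new(self._secret, msg, hashlib.sha256).hexdigest()

    def teach(self, user_data: str, context: str) -> None:
        # "Teach, don't store": only the derived fingerprint is kept.
        self._fingerprints.add(self._fingerprint(user_data, context))

    def vouch(self, presented_data: str, context: str) -> bool:
        # A company asks: "do you recognise this user in this context?"
        return self._fingerprint(presented_data, context) in self._fingerprints

guardian = GuardianSketch(secret=b"demo-key")
guardian.teach("alice@example.com", context="healthcare-onboarding")

print(guardian.vouch("alice@example.com", "healthcare-onboarding"))  # True
print(guardian.vouch("alice@example.com", "marketing"))              # False
```

Note the design point this illustrates: because the context is folded into what the guardian learns, a data controller asking the same question in a different context gets nothing - exactly the "impossible to access your data out of context" property the article claims.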

We have recently launched our authID API in beta via IBM Cloud®. Click here to learn more.