Systems that can’t lie: ‘Honest Computing’ ensures transparent data practices

‘Honest Computing’ offers a new approach to building computer systems, guaranteeing that they operate transparently, reliably, and ethically.

Data forms the basis of all scientific, industrial and commercial processes, and managing it effectively, while preventing manipulation and unethical practices, is of increasing importance.

A team of researchers and practitioners led by Imperial’s Florian Guitton from the Data Science Institute has proposed a new technical foundation, known as ‘Honest Computing’, to enable the creation of auditable, enforceable and easily validated rules for the protection of confidential and private data.

This work helps empower policymakers to establish rule-based data protection regulations underpinned by robust technical protocols. It will be presented at the Data for Policy Conference, held at Imperial College London from 9 to 11 July 2024.

Lead author Mr Guitton said: “This work has been a long time in the making, and the technologies needed to build Honest Computing systems have only been commercially accessible for a few years. Systems built upon this framework have the potential to change the way we handle trust-based applications in healthcare, finance, supply chain management and beyond, with demonstrability of evidence at the core.”

Addressing regulatory gaps in data protection 

In this work, the researchers address the difficulty of regulating technology: advancements often outpace regulators’ ability to respond effectively.

At the same time, current regulatory frameworks in the data protection landscape lack the detailed technical protocols required to ensure that methods of data protection and processing are robust and, most importantly, verifiable.

As a result, there is a pressing need for innovative solutions that can address these challenges and enable regulators to establish auditable, validatable and enforceable rules for data processing and protection.

Through the integration of technologies like Trustless Computing, Confidential Computing, Distributed Computing, Cryptography, and AAA security concepts, Honest Computing offers a scalable solution that can bridge these regulatory gaps, helping policymakers to navigate the complexities of data life cycles while ensuring compliance, fairness, and ethical conduct. 

The solution comprises an architecture built from mature technologies at Technology Readiness Level 7, meaning they have been demonstrated in an operational environment and are approaching production readiness.
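
The paper targets demonstrable data lineage and provenance rather than publishing reference code, but the core idea of a record that cannot be silently rewritten can be sketched in a few lines. The sketch below is illustrative only: the class and method names are ours, not part of the authors’ design, and it uses an ordinary in-memory list where a real deployment would rely on confidential hardware.

```python
import hashlib
import json
import time

def _digest(record: dict) -> str:
    """Canonical SHA-256 digest of a record (keys sorted for determinism)."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class ProvenanceLog:
    """Append-only, hash-chained log: each entry commits to its predecessor,
    so any retroactive edit breaks every later link and is detectable."""

    def __init__(self):
        self.entries = []

    def append(self, event: str, payload: dict) -> dict:
        prev = self.entries[-1]["digest"] if self.entries else "0" * 64
        record = {"event": event, "payload": payload,
                  "timestamp": time.time(), "prev": prev}
        record["digest"] = _digest(record)
        self.entries.append(record)
        return record

    def verify(self) -> bool:
        """Re-derive every link; returns False if any entry was altered."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "digest"}
            if entry["prev"] != prev or _digest(body) != entry["digest"]:
                return False
            prev = entry["digest"]
        return True

log = ProvenanceLog()
log.append("ingest", {"dataset": "clinical_records", "rows": 1204})
log.append("transform", {"step": "anonymise", "fields_removed": 3})
assert log.verify()                      # chain is intact
log.entries[0]["payload"]["rows"] = 999  # tamper with history...
assert not log.verify()                  # ...and the chain breaks
```

In the framework proper, confidential computing would keep such a log, and the code that appends to it, inside a trusted execution environment, so that even the system’s operator could not rewrite history.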

Focus on sustainability 

Unlike blockchains, homomorphic encryption or game-theory-based solutions, which can each provide partial answers to these issues, the Honest Computing framework described in the work selects technologies that combine low power consumption with high scalability, making the development of applications more sustainable and environmentally friendly.

Rule-based approaches 

Policymakers currently face significant challenges in moving data protection regulation away from an abstract, principle-based approach towards one that is rule-based and more easily validated.

While principle-based approaches set broad standards for companies or individuals to adhere to, rule-based approaches provide specific, detailed directives. Rule-based approaches are therefore often considered more useful in policymaking: they leave little room for interpretation, making compliance more straightforward and monitoring easier.
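
As a purely illustrative sketch, not taken from the paper, the difference is easy to see in code: a principle such as “handle data responsibly” cannot be checked mechanically, whereas a rule fixes concrete conditions that a machine can verify. The rules and field names below are hypothetical.

```python
# Hypothetical rule-based policy: every condition is concrete and
# machine-checkable, unlike an open-ended principle.
RULES = {
    "max_retention_days": 30,
    "allowed_regions": {"eu-west-1", "eu-central-1"},
    "encryption_required": True,
}

def check_compliance(record: dict) -> list[str]:
    """Return a list of violations for one data-handling record."""
    violations = []
    if record["retention_days"] > RULES["max_retention_days"]:
        violations.append("retention period exceeds 30 days")
    if record["region"] not in RULES["allowed_regions"]:
        violations.append("data stored outside approved regions")
    if RULES["encryption_required"] and not record["encrypted"]:
        violations.append("data stored unencrypted")
    return violations

print(check_compliance(
    {"retention_days": 45, "region": "us-east-1", "encrypted": False}))
# ['retention period exceeds 30 days',
#  'data stored outside approved regions',
#  'data stored unencrypted']
```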

Mr Guitton explains: “When we talk about regulating data usage, it is even more crucial that regulations are rule-based in order to provide demonstrable evidence of correct handling rather than a belief-based statement so that we can safeguard any information effectively. Honest computing exemplifies this by creating a framework that ensures a system cannot lie about the data or the algorithms it runs by default and by design.” 
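
One way to read “a system cannot lie about the data or the algorithms it runs” is that every computation should emit evidence binding the exact code, input and output together. The sketch below is a minimal illustration of that idea, not the authors’ protocol; it uses an HMAC with an in-process key as a stand-in for the hardware-rooted attestation keys that confidential computing provides.

```python
import hashlib
import hmac
import json

# Stand-in for a hardware-protected attestation key. In confidential
# computing the signing key never leaves the trusted execution environment,
# and verification would use an asymmetric public key, not a shared secret.
ATTESTATION_KEY = b"demo-only-secret"

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def run_with_receipt(code: str, data: bytes) -> tuple[bytes, dict]:
    """Run a transformation and emit a signed receipt binding the exact
    code, input and output that were used."""
    output = data.upper()  # placeholder for executing `code`
    claim = {
        "code_sha256": sha256_hex(code.encode()),
        "input_sha256": sha256_hex(data),
        "output_sha256": sha256_hex(output),
    }
    msg = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(ATTESTATION_KEY, msg, hashlib.sha256).hexdigest()
    return output, claim

def verify_receipt(receipt: dict) -> bool:
    """Check the receipt without seeing the underlying data: correct
    handling is demonstrated, not merely asserted."""
    claim = {k: v for k, v in receipt.items() if k != "signature"}
    msg = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(ATTESTATION_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, receipt["signature"])

out, receipt = run_with_receipt("to_upper_v1", b"patient data")
assert verify_receipt(receipt)
receipt["output_sha256"] = sha256_hex(b"something else")  # lie about output
assert not verify_receipt(receipt)                        # the lie is caught
```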

Developing Honest Computing systems 

The next step for the team is to provide an implementation of an Honest Computing system, as defined in this work, that is accessible to all. There will also be a need to understand how to integrate Honest Computing systems with legacy infrastructure while preserving as many of the guarantees they aim to offer as possible.

Honest Computing will also make new types of data research crucial, for example blind data quality checking and data transformation chains that incorporate visualisation.

‘Honest Computing: Achieving demonstrable data lineage and provenance for driving data and process-sensitive policies’ by Guitton et al., published as part of the Data for Policy Conference 2024.

Image from Nicolas Arnold - Source: Unsplash.

Reporter

Gemma Ralton
Faculty of Engineering

Contact details

Email: gemma.ralton@imperial.ac.uk

Tags:

Big-data, Artificial-intelligence, Engineering-Computing, Global-challenges-Data, Engineering-AI-and-machine-learning