
What is Confidential Computing?


Confidential Computing is a new approach to data security that protects data while it is being processed by an application. Before Confidential Computing, encryption could protect data at rest (in storage) or in transit (on the network), but once an application loaded data into memory for processing, the data had to be in cleartext, leaving it vulnerable to attack. Confidential Computing closes this gap by processing data inside a hardware-isolated, attested Trusted Execution Environment (TEE).
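
To make that gap concrete, here is a minimal Python sketch of the traditional model, using hypothetical data and the widely available cryptography package: the record is protected while it sits as ciphertext, but the application has to decrypt it into ordinary memory before it can compute on it.

```python
# A minimal sketch of encryption at rest versus data in use.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()
f = Fernet(key)

# Data at rest: the stored ciphertext is unreadable without the key.
record = b"salary=250000"
ciphertext = f.encrypt(record)

# Data in use: to process the record, the application must decrypt it,
# so the plaintext now sits in ordinary RAM, where privileged software
# (OS, hypervisor, cloud administrator) could read it.
plaintext = f.decrypt(ciphertext)
salary = int(plaintext.split(b"=")[1])
print(salary * 1.05)  # the computation happens on cleartext in memory
```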

What is a Trusted Execution Environment (TEE)?

A Trusted Execution Environment (TEE), also referred to as a secure enclave, is an area within the main processor that is isolated at the hardware level, where code is executed and data is processed in a way that is invisible and inaccessible to unauthorized entities, including those with root access to the compute layer, such as insiders or cloud providers. This ensures that even if an attacker gains access to the system or network, they cannot access confidential data.

How Does Confidential Computing Work?

During the manufacturing process, chipmakers embed secure keys in the Confidential Computing processor. When requested, the processor establishes a TEE, or secure enclave, that is sealed off from the rest of the system and runs the application inside it. Data processed in the enclave is encrypted and decrypted using the keys embedded in the processor: it stays encrypted in memory until the moment it is needed, and the decrypted data is inaccessible to the operating system, other processes, and the cloud provider, making it invisible to potential attackers. The TEE ensures that only the application running inside it can access the data.
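
Because the memory encryption itself is handled transparently by the processor, application code typically only needs to confirm that it is running on confidential-computing-capable hardware. Below is a minimal detection sketch, assuming Linux on x86; the flag names and device path are what current kernels commonly expose for Intel SGX and AMD SEV, and exact names vary by platform and kernel version.

```python
# Minimal TEE hardware detection sketch (assumes Linux on x86).
import os

def cpu_flags() -> set:
    """Collect the feature tokens from the 'flags' lines of /proc/cpuinfo."""
    flags = set()
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                flags.update(line.split(":", 1)[1].split())
    return flags

flags = cpu_flags()
print("Intel SGX support advertised:", "sgx" in flags)
print("AMD SEV support advertised:  ", bool({"sev", "sev_es", "sev_snp"} & flags))
# Inside an AMD SEV-SNP confidential VM, the guest attestation interface is
# typically exposed as /dev/sev-guest.
print("SEV guest device present:    ", os.path.exists("/dev/sev-guest"))
```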

Attestation is the other critical component of Confidential Computing. Attestation is the process by which one system proves to another that a specific application or piece of code is running on genuine confidential computing hardware and has not been tampered with. Without attestation, a compromised system could deceive others into trusting it by claiming to run specific software in a TEE, compromising the confidentiality and integrity of the data being processed as well as the integrity of the code itself.
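
The sketch below illustrates the shape of that exchange using hypothetical keys and measurements and only standard-library crypto. Real TEEs sign reports with hardware-rooted keys and use vendor-specific evidence formats (e.g. SGX quotes or SEV-SNP reports), but the verifier's checks follow the same pattern: validate the signature, confirm the freshness nonce, and compare the code measurement against an expected value before trusting the workload with secrets.

```python
# An illustrative attestation flow with hypothetical keys and measurements.
import hashlib, hmac, json, os

HARDWARE_KEY = os.urandom(32)  # stands in for a key fused into the processor
EXPECTED_MEASUREMENT = hashlib.sha256(b"approved-application-binary").hexdigest()

def enclave_generate_report(nonce: bytes) -> dict:
    """TEE side: measure the loaded code, bind the verifier's nonce,
    and sign the result with the hardware-held key."""
    measurement = hashlib.sha256(b"approved-application-binary").hexdigest()
    payload = json.dumps({"measurement": measurement, "nonce": nonce.hex()}).encode()
    signature = hmac.new(HARDWARE_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verifier_accepts(report: dict, nonce: bytes) -> bool:
    """Relying-party side: verify the signature, confirm the nonce is fresh,
    and compare the measurement to the expected value before releasing secrets."""
    expected_sig = hmac.new(HARDWARE_KEY, report["payload"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected_sig, report["signature"]):
        return False
    claims = json.loads(report["payload"])
    return claims["nonce"] == nonce.hex() and claims["measurement"] == EXPECTED_MEASUREMENT

nonce = os.urandom(16)
print(verifier_accepts(enclave_generate_report(nonce), nonce))  # True only for untampered code
```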

By combining data-in-use encryption with code identity attestation, Confidential Computing hardware guarantees the confidentiality of data in use, the integrity of that data, and the integrity of the code operating on it.

Why is Confidential Computing Important?

Confidential Computing is a cutting-edge solution to a long-standing problem in data protection and cybersecurity: protecting data in use. While encryption has long been used to secure data in transit and at rest, there was no practical way to protect data inside a processor or memory until the emergence of Confidential Computing, leaving data vulnerable to being viewed or modified while it was being processed.

Previously, security efforts focused primarily on protecting the perimeter and the servers that organizations owned and operated in private data centers. In that model, it was acceptable for the infrastructure, operating system, and other applications to have access to running data and code.

With the rise of cloud computing, however, organizations operate under a shared responsibility model, running their workloads on infrastructure owned by a third party: the cloud service provider. When data in use is unencrypted, anyone with root access, including attackers, insiders, or cloud provider administrators, can exfiltrate code, data, and secrets from a running process and then move laterally to disrupt the entire system, rendering other security mechanisms, such as encryption for data at rest and in transit, futile. Consequently, despite cloud providers' best efforts to prove they can secure their environments, the inherent risk of this shared responsibility model has kept many organizations from moving their most sensitive workloads to the public cloud. Confidential Computing fundamentally eliminates that risk, enabling enterprises to embrace the cloud, migrate sensitive workloads, and innovate without the threat of attackers, insiders, or third parties eavesdropping on or tampering with code and data.

Benefits of Using Confidential Computing

The benefits of confidential computing are numerous:

  1. Confidently move workloads to the cloud: With Confidential Computing, organizations can maintain control over their data and move workloads to the cloud without sacrificing security.
  2. Protect sensitive data while in use: With Confidential Computing, data can be protected even while it is being processed, ensuring it is never exposed to unauthorized parties. 
  3. Secure valuable intellectual property (IP): Confidential Computing can help safeguard IP by ensuring that sensitive data and code (e.g. AI or ML models) are never exposed to anyone who is not authorized to access them.
  4. Enable collaboration with partners on secure, confidential cloud solutions: Confidential Computing allows for secure collaboration between partners, as data can be processed without the risk of exposure.

Confidential Computing: A Major Advance but Complex to Deploy

While Confidential Computing offers powerful data protection, cloud service providers focus on supplying the infrastructure building blocks, making the latest CPUs capable of confidential computing available as part of their services. For customers, this means that Confidential Computing can still be complex to implement, since deploying applications inside secure enclaves requires re-architecting them. This demands specialized engineers, which can drive operating expenses up to impractical levels. Furthermore, each chip manufacturer and cloud provider has created proprietary Confidential Computing solutions and services, including Intel SGX, AMD SEV, AWS Nitro Enclaves, Azure confidential computing, and Google Confidential VMs. The result is a large and heterogeneous array of choices for organizations trying to manage on-premises, hybrid, and multi-cloud environments.

Anjuna Seaglass: A Simple and Scalable Approach to Confidential Computing

Anjuna Seaglass is designed to make Confidential Computing easy, hardened and scalable, enabling enterprises to accelerate their path to run their workloads in the cloud with trust and confidence.

If you want to learn more about Anjuna Seaglass, visit the product page.
