Anjuna’s Analysis of the Uber Security Incident

Published on Nov 1, 2022

Breaches cannot be stopped, but Anjuna’s Confidential Computing makes sensitive assets, code, and data invisible and inaccessible to attackers already inside the network.

First and foremost, the recent Uber breach once again shows that the weakest links will be exploited: people and clear, accessible data. We can expect this to happen again and again. It also reminds us that well-intentioned companies with brilliant products can still suffer high-impact incidents. The good news is that there are new ways to mitigate the consequences far more thoroughly than ever before. It takes both a shift in thinking and the application of new technology, however, to turn a weak link into an everyday weakness rather than a path to something much more valuable and sensitive: enterprise and customer data.

Exploiting the Weakest Link

If the reports are accurate, the Uber attacker obtained a username and password, socially engineered their way past multi-factor authentication, and was then free to scan Uber’s network, where they found a PowerShell script containing administrative credentials. Using this elevated privilege, the attacker gained access to Uber’s secrets and Privileged Access Management (PAM) system, exposing cryptographic secrets, application keys, encryption keys, and downstream systems - including Slack, VPNs, and key management servers - and reaching deeper into the network to databases and source code. This is potentially a game-over moment: when a privileged access management solution is breached, nothing is safe.

Attack flow diagram (credit: Michael Koczwara)

Don't Fight on Their Terms

Attackers will always find a way into your network and systems - that’s a given. The attack surface of an IT environment that processes sensitive data and code is often massive, especially with modern microservice architectures. But what if you could dramatically reduce the attack surface of your PAM or secrets manager, APIs, and CI/CD pipeline - and even make data and applications invisible and inaccessible to the wrong parties with a hardware-enforced shield? That is exactly what Confidential Computing solutions do, so why isn’t the technology more ubiquitous? The answer is unsurprising: most Confidential Computing solutions are inconvenient to deploy, often consuming considerable resources in a DIY effort or requiring applications to be re-architected for commercial offerings.

Anjuna’s Confidential Computing Platform allows workloads and applications, including key management systems (KMS) and PAM systems, to remain truly isolated even when attackers are already moving through IT systems and cloud infrastructure with root-level access. This is protection enforced at the CPU level, not just in software, and it cannot be manipulated.

Solving for Secret Zero

In the Uber case, even if the key management server itself had not been breached, the attacker could eventually have reached sensitive data and secrets simply by acquiring root access and using memory-scraping or process-dumping techniques. That is a longer and somewhat harder path, but it would have been the logical next step had they not acquired privileged credentials to the KMS and PAM systems right away.

The truth is, there is a critical industry risk that until now has not been fully solved: managing human and machine credentials in software, especially secrets that must be present during start-up phases without human intervention, such as a container booting or an app scaling up new instances. It is an intractable problem without a completely different compute trust model. The industry calls it the "Secret Zero" problem.

Here's why. Organizations fundamentally struggle with “Secret Zero” machine credentials in particular: there is no autonomous, secure machine brain to remember them independently of the host or code, and no business wants a human to get out of bed and type complex credentials into a server at boot or every time a service launches. That would be crazy - though it does occasionally happen to the unfortunate DevOps engineer on call that night. So organizations "work around" the issue, leaving credentials exposed and pouring substantial investment into ultimately ineffective monitoring controls and other bolt-on security. They put secrets into files or environment variables, perhaps encrypting them with another key that is itself left sitting in a file, in the open. The problem simply turtles down to some other point of compromise. Putting credentials in a secrets vault moves them out of direct exposure, which is an improvement, but there still has to be another secret to authenticate to the vault or decrypt the file - and that is the machine equivalent of the post-it note with the admin password stuck to the root admin's monitor, still open to attack.
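To make the pattern concrete, here is a minimal illustrative sketch of those workarounds; the file paths, environment variable, and token names are hypothetical, not taken from any specific deployment.

```python
# Hypothetical illustration of the "Secret Zero" problem: each workaround
# still bottoms out in a credential sitting in the clear on the host.
import os

# Workaround 1: the secret lives in an environment variable or file,
# readable by anyone (or any process) with access to the host.
db_password = os.environ.get("DB_PASSWORD")

# Workaround 2: the secret is encrypted, but the decryption key is just
# another file on the same host - the problem has moved, not gone away.
with open("/etc/app/master.key", "rb") as f:
    master_key = f.read()

# Workaround 3: secrets sit in a vault, but the token that authenticates
# to the vault is itself "Secret Zero" - the machine's post-it note.
with open("/etc/app/vault-token", "r") as f:
    vault_token = f.read().strip()

# An attacker with root access can read every one of these.
```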

The Clear Answer to Credentials in Cleartext

The dirty secret in the industry, pun intended, is that keys, secrets, API keys, and other sensitive data are often left readable in the clear - files with names like “credentials.js,” “tls.key,” or “privatekey.txt” - so that a service can boot and then retrieve more secrets. On a compromised host, these files are low-hanging fruit. The same material may also live in memory, offering another vector for compromise. For example, an application might boot normally and pull a key out of a secure HSM - a hardware security module designed to protect stored keys with robust controls. The key moves into memory, but to get it, the process must authenticate with a secret or token. If that token is stolen or that memory is compromised, which isn’t too hard with root access, it can be game over for the key or for the HSM credential, allowing complete recovery of sensitive data, with dire consequences.
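In practice, the boot-time flow looks roughly like the sketch below; the HSMClient class, method names, and file paths are hypothetical placeholders rather than any vendor's actual API.

```python
# A hedged sketch of the boot-time pattern described above: the service reads
# an authentication token from disk, uses it to pull a key from an HSM/KMS,
# and ends up with both the token and the key exposed to a root-level attacker.

class HSMClient:
    """Hypothetical stand-in for an HSM/KMS client that releases key material
    to a caller presenting a valid authentication token."""

    def __init__(self, auth_token: str) -> None:
        self._token = auth_token

    def fetch_key(self, key_id: str) -> bytes:
        # A real client would make an authenticated network call here;
        # this placeholder just returns dummy key material.
        return b"\x00" * 32


# The token that unlocks the HSM sits in a file so the service can boot
# unattended - low-hanging fruit on a compromised host.
with open("/etc/app/hsm-token", "r") as f:
    hsm_token = f.read().strip()

client = HSMClient(hsm_token)
tls_key = client.fetch_key("tls-signing-key")

# At this point the token (on disk) and the key (in process memory) are both
# recoverable by anyone with root access, e.g. via a core or process dump.
```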

To remove this risk, the industry has embraced a machine-based root of trust with Confidential Computing. In addition to protecting data in use with the new memory-encrypting CPUs, that protection can extend to data at rest and in motion. Anjuna brings machine-level trust to the CPU layer through a process called attestation. A workload can have “Secret Zero” cryptographically bound to it during attestation, so it is absolutely and provably the only workload permitted to unlock that secret at boot time. No external access is required, and no extra “secrets” in files (which are never secret) are needed. What's more, because the workload runs with its data encrypted during use, even a core dump, a root user, malware, an attacker, or a memory-dumping error condition won't reveal it. Secrets, data, code - everything runs privately, isolated from the cloud itself, from root access, and from attackers.
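Conceptually, attestation-bound secret release works along the following lines. This is a simplified sketch of the general Confidential Computing flow, not Anjuna's product API; every function name here is a hypothetical placeholder.

```python
# Conceptual sketch: "Secret Zero" is released only to a workload whose
# hardware-signed attestation report matches an approved measurement,
# so no token or key file ever needs to exist on disk.

def enclave_report(workload_measurement: bytes) -> dict:
    """Inside the confidential environment: produce a report, signed by the
    CPU, that cryptographically identifies exactly this workload."""
    return {"measurement": workload_measurement, "signature": b"<cpu-signed>"}

def verify_report(report: dict, expected_measurement: bytes) -> bool:
    """On the secret-release side: validate the CPU signature (elided here)
    and confirm the measurement matches the workload approved for the secret."""
    return report["measurement"] == expected_measurement

def release_secret(report: dict, expected_measurement: bytes) -> bytes | None:
    """Hand the secret to the attested workload only; it is delivered
    straight into memory that stays encrypted while in use."""
    if verify_report(report, expected_measurement):
        return b"secret-zero"
    return None

# Example: the secret is unlocked at boot without any credential on disk.
measurement = b"sha384-of-approved-workload"
report = enclave_report(measurement)
secret = release_secret(report, expected_measurement=measurement)
```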

Diagram: Uber hack flow

Anjuna’s Confidential Computing Platform brings a new way to keep workloads and secrets management private and isolated from an attacker who is already inside the network looking for data: our software makes the confidential CPUs now widely available across clouds and servers easy to use. Breaches cannot be stopped, and defense in depth is better than nothing, but for a new level of protection and isolation, applying Anjuna’s technology to everyday workloads offers a new kind of risk reduction and a new way to run and transform applications that process valuable and sensitive data. Attackers may still get inside the network, but they will likely find only unprotected general IT with no intrinsic theft value or bragging rights - and move on to more valuable, more easily compromised victims.

Schedule a demo to learn more about Anjuna’s platform and see how simple it is to deploy.
