Apple adds ‘black magic’ AI security to Swift

Homomorphic encryption is a phrase that might never make it to everybody’s lips, but the technology will become part of what we do each day, thanks to Apple, Swift and the need for artificial intelligence (AI) in the cloud. It’s a privacy-protecting technology that lets you secure data in the cloud, work on that data using cloud services, and do all that without anyone other than you knowing what your data is. 

Why does this matter?

Think about it this way. When we store data in the cloud, we already use technology to lock that data down so no one, including the people running the servers, can access it. It’s like putting your data in a safety deposit box only you can open. 

But what we do with cloud data is changing rapidly as a multitude of services appear that let you use powerful AI systems to work with it. To do so, the servers must access your information — they need to open that safety deposit box to work with the information it contains, which makes your data less secure.

What can be done to make it possible to use AI services while keeping data secure? Homomorphic encryption seems to be the answer. 

MIT professor Vinod Vaikuntanathan calls that process “black magic” in a video that clearly explains some of the intricacies of homomorphic encryption. That’s because the encryption tech makes it possible for the server to put its hands inside the safety deposit box and work with data without ever accessing or even knowing what it is working with.

Leaving that data encrypted unlocks the power of cloud-based AI while also building in privacy. I expect the tech will see use in Private Cloud Compute, though it is not yet clear to what extent it can handle large and complex tasks.

How is Apple boosting homomorphic encryption?

Apple already uses homomorphic encryption. Now, it has introduced a new open-source Swift package for homomorphic encryption.
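For anyone who wants to experiment, the package can be pulled into a Swift Package Manager project like any other dependency. Here is a minimal sketch of what a manifest might look like; the project name, version requirement, and platform settings are assumptions for illustration, so check the repository for the current requirements.

```swift
// swift-tools-version: 5.10
// Minimal sketch of a manifest that depends on Apple's open-source package.
// The demo name, version, and platform values below are assumptions.
import PackageDescription

let package = Package(
    name: "HEDemo",                         // hypothetical demo project
    platforms: [.macOS(.v14), .iOS(.v17)],  // assumed minimum platforms
    dependencies: [
        // Apple's open-source homomorphic encryption package.
        .package(
            url: "https://github.com/apple/swift-homomorphic-encryption",
            from: "1.0.0"
        ),
    ],
    targets: [
        .executableTarget(
            name: "HEDemo",
            dependencies: [
                .product(
                    name: "HomomorphicEncryption",
                    package: "swift-homomorphic-encryption"
                ),
            ]
        ),
    ]
)
```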

The rationale here is obvious. Unlike so many in the industry, Apple prioritizes user privacy, which it sees as a human right. It is quite plausible that one of the challenges it faced on its road to generative AI has been the need for more complex cloud-based computations to access core data, which conflicts with the company’s privacy goal. The deployment of homomorphic encryption reconciles those two conflicting aims.

Apple isn’t going quite so far as to say that’s what this is about, even though it evidently is. Instead, it talks about how it uses the tech in its Live Caller ID Lookup feature, which provides caller ID and spam-blocking services. In use, this lets Lookup interrogate a server for information about a phone number without that server ever actually accessing the number itself.

What is a typical workflow?

On its GitHub page for the package, Apple explains what a typical homomorphic encryption workflow might be (a toy Swift sketch of the same flow appears after the steps):

1. The client encrypts sensitive data and sends the resulting ciphertext to the server.

2. The server performs computation without learning what any ciphertext decrypts to.

3. The server sends the resulting ciphertext response to the client.

4. The client decrypts to learn the response.
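To make those steps concrete, here is a deliberately toy Swift sketch of the same four-step flow. It does not use Apple’s package and offers no real security — the “encryption” is just masking each value with a secret offset modulo a public modulus, which happens to let the server add ciphertexts — and names such as ToyHE and serverSum are invented for this illustration.

```swift
// Toy, insecure additively homomorphic scheme, purely to illustrate
// the four-step workflow above. This is NOT Apple's library.
struct ToyHE {
    static let modulus: UInt64 = 1_000_003   // public parameter
    let secretKey: UInt64                    // known only to the client

    init() {
        self.secretKey = UInt64.random(in: 1..<Self.modulus)
    }

    // Step 1: the client encrypts a value (assumed smaller than the
    // modulus) before sending it to the server.
    func encrypt(_ value: UInt64) -> UInt64 {
        (value + secretKey) % Self.modulus
    }

    // Step 4: the client decrypts the server's response. `maskCount` is
    // how many ciphertexts the server summed, so we know how many copies
    // of the secret offset to strip off.
    func decrypt(_ ciphertext: UInt64, maskCount: UInt64) -> UInt64 {
        let mask = (secretKey * maskCount) % Self.modulus
        return (ciphertext + Self.modulus - mask) % Self.modulus
    }
}

// Steps 2 and 3: the server adds ciphertexts without ever seeing the
// plaintexts or the key, then returns the still-encrypted sum.
func serverSum(_ ciphertexts: [UInt64]) -> UInt64 {
    ciphertexts.reduce(0) { ($0 + $1) % ToyHE.modulus }
}

let client = ToyHE()
let values: [UInt64] = [12, 30, 58]
let encrypted = values.map(client.encrypt)                              // step 1
let encryptedSum = serverSum(encrypted)                                 // steps 2 and 3
let sum = client.decrypt(encryptedSum, maskCount: UInt64(values.count)) // step 4
print(sum) // 100
```

The structural point is the one Apple describes: the server computes a useful result — here, a sum — while only ever handling ciphertext, and only the client holding the key can read the answer.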

Apple also provides its own explanation of homomorphic encryption:

“Homomorphic encryption (HE) is a cryptographic technique that enables computation on encrypted data without revealing the underlying unencrypted data to the operating process. It provides a means for clients to send encrypted data to a server, which operates on that encrypted data and returns a result that the client can decrypt. During the execution of the request, the server itself never decrypts the original data or even has access to the decryption key. Such an approach presents new opportunities for cloud services to operate while protecting the privacy and security of a user’s data, which is obviously highly attractive for many scenarios.”

Empowering next-generation AI — securely

Of course, there are challenges here around performance and speed, but it is plausible that Apple’s own servers are already more than capable, given their computational capacity and low energy requirements. And because the tech is also thought to protect data against attacks from quantum computers, homomorphic encryption looks set to become an important force empowering AI on Apple’s platforms down the road.

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.