Original article by Advait (Leo) Jayant

Compiled by: LlamaC

“Recommendation: Fully homomorphic encryption (FHE) is often hailed as the holy grail of cryptography. This article explores the prospects of applying FHE to artificial intelligence, points out its current limitations, and surveys crypto projects dedicated to using FHE for AI applications. For cryptocurrency enthusiasts, it offers a deeper understanding of fully homomorphic encryption. Enjoy!”

Person A wants highly personalized recommendations on Netflix and Amazon. Person B doesn’t want Netflix or Amazon to know their preferences.

In today’s digital age, we enjoy the convenience of personalized recommendations from services like Amazon and Netflix that are precisely tailored to our tastes. However, these platforms’ penetration into our private lives is causing increasing unease. We crave customization without sacrificing privacy. In the past, this seemed like a paradox: how to achieve personalization without sharing large amounts of personal data with cloud-based AI systems. Fully homomorphic encryption (FHE) offers a solution that allows us to have our cake and eat it too.

Artificial Intelligence as a Service (AIaaS)

Artificial intelligence (AI) now plays a key role in addressing complex challenges in multiple fields including computer vision, natural language processing (NLP), and recommender systems. However, the development of these AI models has brought significant challenges to ordinary users:

1. Data volume: Building accurate models often requires huge data sets, sometimes even reaching the scale of petabytes.

2. Computational power: Complex models such as transformers require the computing power of dozens of GPUs, often running continuously for weeks.

3. Domain Expertise: Fine-tuning these models requires deep domain expertise.

These obstacles make it difficult for most users to develop powerful machine learning models independently.

AI-as-a-Service pipeline in action

Enter AI as a Service (AIaaS), which overcomes these barriers by providing cloud services managed by tech giants (including FAANG members) that give users access to state-of-the-art neural network models. Users simply upload their raw data to these platforms, where it is processed to generate insightful inferences. AIaaS effectively democratizes access to high-quality machine learning models, opening up advanced AI tools to a wider audience. However, today's AIaaS delivers these conveniences at the expense of our privacy.

Data Privacy in AI-as-a-Service

Currently, data is only encrypted during transmission from the client to the server. The server has access to the input data and the predictions made based on that data.

In the AI-as-a-Service pipeline, servers have access to both input and output data. This makes it problematic for ordinary users to share sensitive information, such as medical and financial data. Regulations such as the GDPR and CCPA raise the stakes further: they require users to give explicit consent before data is shared and guarantee users the right to know how their data is used. The GDPR additionally mandates that data be encrypted and protected in transit. These regulations set strict standards for user privacy and rights, demanding clear transparency and control over personal information. Given these requirements, AI-as-a-Service pipelines need strong privacy mechanisms to maintain trust and compliance.

FHE solves the problem

By encrypting a and b, we ensure that the input data remains private.

Fully homomorphic encryption (FHE) offers a solution to the data privacy issues of cloud computing. FHE schemes support operations such as addition and multiplication directly on ciphertexts. The idea is simple: the sum of two encrypted values decrypts to the sum of the two plaintext values, and the same holds for multiplication.

In practice, this works as follows: the user performs an addition on the plaintext values a and b locally. The user then encrypts a and b and sends the ciphertexts to the cloud server. The server performs the addition on the encrypted values (homomorphically) and returns the result. The decrypted result from the server matches the local plaintext sum of a and b. This process keeps the data private while still allowing the computation to be performed in the cloud.
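Real FHE schemes (BGV, CKKS, TFHE) are considerably more involved, but the additive-homomorphism property described above can be illustrated with a toy implementation of the Paillier cryptosystem, which is additively (though not fully) homomorphic. This is a minimal sketch with demonstration-sized keys, not a secure implementation:

```python
import math
import random

def keygen(bits=256):
    # Generate two random primes p, q and set n = p*q.
    # Real deployments use 2048-bit or larger moduli.
    def rand_prime(b):
        while True:
            c = random.getrandbits(b) | (1 << (b - 1)) | 1
            if all(pow(a, c - 1, c) == 1 for a in (2, 3, 5, 7, 11, 13)):
                return c
    p, q = rand_prime(bits // 2), rand_prime(bits // 2)
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                 # standard choice of generator
    mu = pow(lam, -1, n)      # valid because g = n + 1
    return (n, g), (lam, mu, n)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)                 # fresh randomness per ciphertext
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(sk, c):
    lam, mu, n = sk
    x = pow(c, lam, n * n)
    return (((x - 1) // n) * mu) % n           # L(x) = (x - 1) / n, then * mu

def add_ciphertexts(pk, c1, c2):
    # Multiplying Paillier ciphertexts adds the underlying plaintexts.
    n, _ = pk
    return (c1 * c2) % (n * n)

pk, sk = keygen()
a, b = 1234, 5678
c_sum = add_ciphertexts(pk, encrypt(pk, a), encrypt(pk, b))
print(decrypt(sk, c_sum))  # prints 6912, i.e. a + b
```

The server only ever sees the ciphertexts and the product `c1 * c2`; decryption of that product yields a + b, exactly the workflow described above.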

Deep Neural Network (DNN) based on fully homomorphic encryption

In addition to basic addition and multiplication operations, significant progress has been made in utilizing fully homomorphic encryption (FHE) for neural network processing in AI-as-a-service processes. In this context, users can encrypt raw input data into ciphertext and transmit only these encrypted data to the cloud server. The server then performs homomorphic computations on these ciphertexts, generates encrypted output, and returns it to the user. The key is that only the user holds the private key, enabling them to decrypt and access the results. This builds an end-to-end FHE encrypted data flow, ensuring user data remains private throughout the process.

Neural networks based on fully homomorphic encryption provide users with significant flexibility in AI as a service. Once the ciphertext is sent to the server, the user can go offline since frequent communication between client and server is not required. This feature is particularly beneficial for IoT devices, which often operate under constrained conditions where frequent communication is often impractical.

However, it is worth noting the limitations of fully homomorphic encryption (FHE). Its computational overhead is huge; FHE schemes are inherently time-consuming, complex, and resource-intensive. In addition, FHE currently has difficulty supporting nonlinear operations effectively, which poses a challenge for the implementation of neural networks. This limitation may affect the accuracy of neural networks built on FHE, as nonlinear operations are crucial to the performance of such models.
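In practice, the nonlinearity problem is usually handled by replacing activations such as sigmoid or ReLU with low-degree polynomial approximations, since FHE can only evaluate additions and multiplications. The sketch below (plain Python, no encryption involved) fits an odd cubic a·x + b·x³ to sigmoid(x) − 0.5 by least squares over the interval [−8, 8]; the interval and degree are illustrative choices:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Sample points over the approximation interval [-8, 8].
xs = [-8 + 0.05 * i for i in range(321)]
ys = [sigmoid(x) - 0.5 for x in xs]  # the odd part of sigmoid

# Least-squares fit of y ~ a*x + b*x^3: solve the 2x2 normal equations.
s2 = sum(x * x for x in xs)
s4 = sum(x ** 4 for x in xs)
s6 = sum(x ** 6 for x in xs)
t1 = sum(x * y for x, y in zip(xs, ys))
t3 = sum(x ** 3 * y for x, y in zip(xs, ys))
det = s2 * s6 - s4 * s4
a = (t1 * s6 - t3 * s4) / det
b = (s2 * t3 - s4 * t1) / det

def poly_sigmoid(x):
    # FHE-friendly stand-in: only additions and multiplications.
    return 0.5 + a * x + b * x ** 3

max_err = max(abs(poly_sigmoid(x) - sigmoid(x)) for x in xs)
```

Under FHE, the polynomial would be evaluated on ciphertexts. The nonzero approximation error of such low-degree substitutes is precisely the source of the accuracy loss mentioned above, and it grows if inputs drift outside the fitted interval.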

K.-Y. Lam, X. Lu, L. Zhang, X. Wang, H. Wang, and S. Q. Goh, "Privacy-Enhanced Neural Networks with Efficient Fully Homomorphic Encryption for AI-as-a-Service," Nanyang Technological University (Singapore) and Chinese Academy of Sciences (China).

(Lam et al., 2024) describe a privacy-enhanced neural network protocol for AI-as-a-service. The protocol first defines the parameters of the input layer using learning with errors (LWE). LWE is a cryptographic primitive that protects data by encrypting it in such a way that computations can be performed on the ciphertext without first decrypting it. For the hidden and output layers, the parameters are defined using ring-LWE (RLWE) and ring-GSW (RGSW), two advanced cryptographic techniques that extend LWE to achieve more efficient cryptographic operations.

The public parameters include the decomposition bases used by the scheme. Given an input vector x of length N, a set of N LWE ciphertexts (a_i, b_i) is generated, one for each element x[i], under the LWE secret key s. Evaluation keys are generated for the indices where s[i] > 0 and where s[i] < 0, and a set of LWE key-switching keys is also set up. These keys support efficient switching between different encryption schemes.

The input layer is designated as layer 0 and the output layer as layer L. For each layer ℓ from 1 to L, the number of neurons n_ℓ is determined, along with the weight matrix W_ℓ and the bias vector b_ℓ. For each neuron h from 0 to n_ℓ − 1, the LWE ciphertexts from layer ℓ − 1 are evaluated under homomorphic encryption: the computation is performed on the encrypted data to compute the linear function for the h-th neuron of layer ℓ, combining the weight matrix and the bias vector. A lookup table (LUT) is then evaluated at the h-th neuron, followed by a switch from the larger parameter set back to a smaller one; the result is rounded and rescaled and added to the set of LWE ciphertexts for layer ℓ.

Finally, the protocol returns the LWE ciphertexts to the user. The user can then decrypt all ciphertexts using the private key s to recover the inference results.
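The LWE encryptions used for the input layer can be illustrated with a toy symmetric scheme: a ciphertext of message m is a pair (a, b) with b = ⟨a, s⟩ + e + Δ·m mod q, and decryption recovers m by rounding away the small noise e. The parameters below are illustrative and far too small to be secure:

```python
import random

N, Q, T = 16, 2 ** 16, 16   # toy dimension, ciphertext modulus, plaintext modulus
DELTA = Q // T              # scaling factor that embeds the message

def keygen():
    return [random.randrange(2) for _ in range(N)]  # binary secret key s

def encrypt(s, m):
    a = [random.randrange(Q) for _ in range(N)]     # uniformly random mask
    e = random.randrange(-4, 5)                     # small noise
    b = (sum(ai * si for ai, si in zip(a, s)) + e + DELTA * m) % Q
    return a, b

def decrypt(s, ct):
    a, b = ct
    noisy = (b - sum(ai * si for ai, si in zip(a, s))) % Q
    return round(noisy / DELTA) % T                 # rounding removes the noise e

def add(ct1, ct2):
    # LWE ciphertexts are additively homomorphic component-wise;
    # the noises add as well, which is why deep circuits need noise management.
    a1, b1 = ct1
    a2, b2 = ct2
    return [(x + y) % Q for x, y in zip(a1, a2)], (b1 + b2) % Q

s = keygen()
ct = add(encrypt(s, 5), encrypt(s, 9))
print(decrypt(s, ct))  # prints 14
```

The linear layers of the protocol are built from exactly this kind of homomorphic addition (plus multiplication by known weights), while the LUT evaluation handles the nonlinear activation.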

This protocol efficiently implements privacy-preserving neural network inference by leveraging fully homomorphic encryption (FHE). FHE allows computation on encrypted data without leaking the data itself to the processing server, ensuring data privacy while providing the advantages of AI as a service.

Application of Fully Homomorphic Encryption in AI

FHE (fully homomorphic encryption) makes it possible to perform secure computing on encrypted data, which not only opens up many new application scenarios but also ensures the privacy and security of the data.

Consumer Privacy in Advertising: (Armknecht et al., 2013) proposed an innovative recommendation system that leverages fully homomorphic encryption (FHE). This system is able to provide personalized recommendations to users while ensuring that these recommendations are completely confidential to the system itself. This ensures that user preference information remains private, effectively addressing a major privacy issue in targeted advertising.

Medical applications: (Naehrig et al., 2011) proposed a compelling solution for the healthcare industry. They proposed using fully homomorphic encryption (FHE) to continuously upload patients’ medical data to service providers in an encrypted form. This approach ensures that sensitive medical information remains confidential throughout its lifecycle, enhancing patient privacy protection while enabling seamless data processing and analysis by healthcare organizations.

Data mining: Mining large datasets can yield significant insights, but often at the expense of user privacy. (Yang, Zhong, and Wright, 2006) addressed this problem by applying functional cryptography in the context of fully homomorphic encryption (FHE). This approach makes it possible to extract valuable information from large datasets without compromising the privacy of the individuals whose data is being mined.

Financial privacy: Imagine a scenario where a company has sensitive data and proprietary algorithms that must be kept confidential. (Naehrig et al., 2011) proposed the use of homomorphic encryption to address this problem. By applying fully homomorphic encryption (FHE), the company is able to perform necessary computations on encrypted data without exposing the data or algorithms, thereby ensuring financial privacy and protection of intellectual property rights.

Forensic Image Recognition: (Bosch et al., 2014) describe a method for outsourcing forensic image recognition using fully homomorphic encryption (FHE). This technology is particularly beneficial to law enforcement agencies. By applying FHE, police and other agencies can detect illegal images on hard drives without revealing the content of the images, thereby protecting the integrity and confidentiality of the data in the investigation.

From advertising and healthcare to data mining, financial security, and law enforcement, fully homomorphic encryption has the potential to revolutionize how we handle sensitive information in every field. As we continue to develop and improve these technologies, the importance of protecting privacy and security in an increasingly data-driven world cannot be overstated.

Limitations of Fully Homomorphic Encryption (FHE)

Despite its potential, FHE still has some key limitations that need to be addressed:

  • Multi-user support: Fully homomorphic encryption (FHE) allows computation on encrypted data, but the complexity increases exponentially in scenarios involving multiple users. Typically, each user's data is encrypted with a unique public key. Managing these different datasets, especially at scale given the computational demands of FHE, becomes impractical. To this end, researchers such as Lopez-Alt et al. proposed a multi-key FHE framework in 2013 that allows simultaneous operations on datasets encrypted with different keys. While promising, this approach introduces an additional layer of complexity and requires careful coordination in key management and system architecture to ensure privacy and efficiency.

  • Massive computational overhead: The core of fully homomorphic encryption (FHE) lies in its ability to perform computations on encrypted data. However, this ability comes with a huge price tag: FHE operations are dramatically slower than their unencrypted counterparts. The overhead is typically polynomial, but with high-degree polynomials, inflating runtimes and making FHE unsuitable for real-time applications. Hardware acceleration for FHE represents a significant market opportunity, aiming to reduce computational complexity and increase execution speed.

  • Limited operations: Recent advances have indeed broadened the scope of fully homomorphic encryption to support a wider variety of operations. However, it is still primarily applicable to linear and polynomial computations, which is a significant limitation for AI applications involving complex nonlinear models such as deep neural networks. The operations required by these AI models are challenging to perform efficiently under the current fully homomorphic encryption framework. Although we are making progress, the gap between the operational capabilities of fully homomorphic encryption and the requirements of advanced AI algorithms remains a key obstacle that needs to be overcome.


Fully Homomorphic Encryption in the Context of Encryption and Artificial Intelligence

Here are some of the companies working in the crypto space to leverage fully homomorphic encryption (FHE) for AI applications:

  • Zama offers Concrete ML, a suite of open source tools designed to simplify the use of fully homomorphic encryption (FHE) for data scientists. Concrete ML converts machine learning models into their homomorphic equivalents, enabling confidential computation on encrypted data. Zama's approach enables data scientists to take advantage of FHE without requiring deep cryptography knowledge, which is particularly useful in fields such as healthcare and finance where data privacy is critical. Zama's tools facilitate secure data analytics and machine learning while keeping sensitive information encrypted.

  • Privasee is focused on building a secure AI computing network. Their platform leverages fully homomorphic encryption (FHE) technology to enable multiple parties to collaborate without revealing sensitive information. By using FHE, Privasee ensures that user data remains encrypted throughout the AI computing process, thereby protecting privacy and complying with strict data protection regulations such as GDPR. Their system supports a variety of AI models, providing a versatile solution for secure data processing.

  • Octra combines cryptocurrency with artificial intelligence to improve digital transaction security and data management efficiency. By combining fully homomorphic encryption (FHE) and machine learning technology, Octra is committed to enhancing the security and privacy protection of decentralized cloud storage. Its platform uses blockchain, cryptography and artificial intelligence technology to ensure that user data is always encrypted and secure. This strategy builds a solid framework for digital transaction security and data privacy in the decentralized economy.

  • Mind Network combines fully homomorphic encryption (FHE) with AI to enable secure encrypted computations during AI processing without decryption. This promotes a privacy-preserving, decentralized AI environment that seamlessly blends cryptographic security with AI capabilities. This approach not only protects the confidentiality of data, but also enables a trustless, decentralized environment where AI operations can be performed without relying on a central authority or exposing sensitive information, effectively combining the cryptographic strength of FHE with the operational needs of AI systems.

The number of companies operating at the intersection of fully homomorphic encryption (FHE), artificial intelligence (AI), and cryptocurrency remains limited. This is primarily due to the huge computational overhead of implementing FHE effectively, which demands massive processing power for the underlying cryptographic calculations.

Conclusion

Fully homomorphic encryption (FHE) offers a promising approach to enhancing privacy in AI by allowing computations to be performed on encrypted data without decryption. This capability is particularly valuable in sensitive fields such as healthcare and finance, where data privacy is critical. However, FHE faces significant challenges, including high computational overhead and limitations in processing nonlinear operations necessary for deep learning. Despite these obstacles, advances in FHE algorithms and hardware acceleration are paving the way for more practical applications in AI. Continued developments in this area are expected to greatly enhance secure, privacy-preserving AI services that balance computational efficiency with strong data protection.