CKKS Bootstrapping: Observed Numerical Error Reduction Between Bootstraps

Hello everyone,

I am currently using the CKKS scheme to perform multiple computations on ciphertexts and have noticed an interesting characteristic related to CKKS bootstrapping. In the image below, you can observe sharp spikes in numerical errors, which correspond to the moments when bootstrapping occurs. What’s particularly interesting is that, between bootstrapping events, the numerical error gradually decreases until the next bootstrapping takes place.

Between bootstrapping events, I perform many ciphertext-plaintext/scalar multiplications (with small values around 1) and ciphertext-ciphertext additions; a simplified sketch of this loop is shown below. From the graph, it seems that exhausting all multiplication levels by multiplying with a factor of 1.0 could help reduce the numerical error before decryption.
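To make the setup concrete, here is a minimal, self-contained sketch of the loop structure (not my actual code; the parameters are toy values in the style of OpenFHE's simple-ckks-bootstrapping example, and I assume a recent OpenFHE version):

```cpp
#include "openfhe.h"

using namespace lbcrypto;

// Illustrative sketch only: small-value ciphertext-scalar multiplications and
// ciphertext-ciphertext additions between bootstraps. Parameter values are
// placeholders for demonstration, not production choices.
int main() {
    CCParams<CryptoContextCKKSRNS> parameters;
    parameters.SetSecretKeyDist(UNIFORM_TERNARY);
    parameters.SetSecurityLevel(HEStd_NotSet);  // toy parameters, demo only
    parameters.SetRingDim(1 << 12);
    parameters.SetScalingModSize(59);
    parameters.SetFirstModSize(60);

    std::vector<uint32_t> levelBudget = {4, 4};
    uint32_t levelsAfterBootstrap = 10;
    uint32_t depth = levelsAfterBootstrap +
                     FHECKKSRNS::GetBootstrapDepth(levelBudget, UNIFORM_TERNARY);
    parameters.SetMultiplicativeDepth(depth);

    CryptoContext<DCRTPoly> cc = GenCryptoContext(parameters);
    cc->Enable(PKE);
    cc->Enable(KEYSWITCH);
    cc->Enable(LEVELEDSHE);
    cc->Enable(ADVANCEDSHE);
    cc->Enable(FHE);

    uint32_t numSlots = cc->GetRingDimension() / 2;
    cc->EvalBootstrapSetup(levelBudget);

    auto keys = cc->KeyGen();
    cc->EvalMultKeyGen(keys.secretKey);
    cc->EvalBootstrapKeyGen(keys.secretKey, numSlots);

    std::vector<double> x(numSlots, 0.5);
    std::vector<double> d(numSlots, 0.005);
    auto ct      = cc->Encrypt(keys.publicKey, cc->MakeCKKSPackedPlaintext(x));
    auto ctDelta = cc->Encrypt(keys.publicKey, cc->MakeCKKSPackedPlaintext(d));

    for (int step = 0; step < 200; ++step) {
        ct = cc->EvalMult(ct, 0.99);    // ciphertext-scalar mult: one level used
        ct = cc->EvalAdd(ct, ctDelta);  // ciphertext-ciphertext add: level-free
        // Refresh once the multiplicative budget is nearly exhausted;
        // this is where the error spikes in the plot occur.
        if (ct->GetLevel() >= depth - 1)
            ct = cc->EvalBootstrap(ct);
    }
    return 0;
}
```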

Is this reduction in numerical error between bootstrapping events a general property of CKKS bootstrapping, or could it be specific to my case?

I would really appreciate any insights or explanations regarding this behavior.

Thank you for your help!

Best regards,
Arseniy

[Figure: error norm over the course of the computation; sharp spikes at each bootstrapping event, with the error gradually decreasing in between]

cc @sloede

This is most likely due to a combination of how the numerical error is computed and the evaluated circuit having self-correcting properties, since performing operations with small values will not decrease the error itself but only limit its growth.

Thank you for your insights. However, I am not fully convinced that this explains the observed behavior. We have implemented the same circuit, using the exact same arithmetic operations, without FHE encryption, and there we do not see such periodic behavior. Furthermore, the errors shown here are computed by comparing the ciphertext results to the unencrypted results in the discrete $L^\infty$ or $L^2$ norms (a simplified sketch of this computation is at the end of this post), and they are of $\mathcal{O}(10^{-6})$. In comparison, the actual numerical error of the scheme is much higher, at $\mathcal{O}(10^{-3})$, so I don't think that the numerical scheme is responsible for such an error distribution.

Therefore, I still think that the root cause for the observed behavior lies somewhere within the CKKS scheme or its implementation in the OpenFHE library.
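For completeness, this is essentially how the plotted errors are computed (simplified; the names are illustrative, not our actual code):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Compare the decrypted CKKS result against the unencrypted reference
// solution in the discrete L^inf and L^2 norms.
double errorLinf(const std::vector<double>& fhe, const std::vector<double>& ref) {
    double maxErr = 0.0;
    for (std::size_t i = 0; i < ref.size(); ++i)
        maxErr = std::max(maxErr, std::abs(fhe[i] - ref[i]));
    return maxErr;  // discrete L^inf error
}

double errorL2(const std::vector<double>& fhe, const std::vector<double>& ref) {
    double sum = 0.0;
    for (std::size_t i = 0; i < ref.size(); ++i)
        sum += (fhe[i] - ref[i]) * (fhe[i] - ref[i]);
    return std::sqrt(sum / ref.size());  // discrete L^2 error (RMS convention)
}
```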

Just like @Pro7ech noted, bootstrapping may reduce the precision, but the self-correcting properties of the computation will recover some of that precision. This is very common in many machine learning applications and often justifies using lower-precision arithmetic in GPU-based computations in the clear; see the toy illustration below. In other words, this behavior happens quite often.
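As a plain-arithmetic illustration of this effect (no FHE involved; the fixed-point iteration here is just a stand-in for any circuit with contractive behavior):

```cpp
#include <cstdio>

// A contractive iteration x <- a*x + b with |a| < 1 damps a one-time error
// injection, just as a self-correcting circuit gradually recovers precision
// after a noisy bootstrap.
int main() {
    const double a = 0.9;   // |a| < 1 makes the iteration a contraction
    const double b = 0.05;
    double exact = 0.5;
    double noisy = 0.5 + 1.0e-3;  // inject a "bootstrap-like" error once

    for (int k = 0; k < 50; ++k) {
        exact = a * exact + b;
        noisy = a * noisy + b;
        std::printf("step %2d  error % .3e\n", k, noisy - exact);
    }
    // The error shrinks by a factor of |a| per step, i.e., the computation
    // itself regains the precision lost at the injection point.
    return 0;
}
```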

You could also try another experiment with two iterations of CKKS bootstrapping, which will roughly double the precision of bootstrapping: see [iterative-ckks-bootstrapping.cpp](https://github.com/openfheorg/openfhe-development/blob/main/src/pke/examples/iterative-ckks-bootstrapping.cpp) in openfheorg/openfhe-development for an example. The periodic behavior may then go away, or the jumps may become much less noticeable.
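The call would look roughly like this, assuming a CryptoContext `cc` and Ciphertext `ct` set up for bootstrapping exactly as in the linked example (the precision value below is a placeholder; the example measures it empirically from a preliminary single-iteration bootstrap):

```cpp
// Sketch only: cc and ct are assumed to be configured as in
// iterative-ckks-bootstrapping.cpp.
uint32_t precision = 17;  // placeholder; measure it from a first bootstrap,
                          // as done in the linked example

// Two-iteration bootstrapping: roughly doubles the precision compared to a
// single EvalBootstrap call.
auto ctRefreshed = cc->EvalBootstrap(ct, 2, precision);
```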

Thank you very much for your explanation :pray: