Questions on CKKS bootstrapping with some computations after bootstrapping

Several questions/clarifications.

  • What parameters are you using for bootstrapping?
    • Check the scaling factor and first modulus size. Are they set the same way as in the bootstrapping examples? (A parameter sketch is given below this list.)
    • Are you running it with NATIVE_SIZE=128 or 64? NATIVE_SIZE=128 gives higher precision after bootstrapping.
  • Are you using single- or double-precision CKKS bootstrapping? (Double-precision is mainly useful for NATIVE_SIZE=64 and achieves higher precision.)
    • See the iterative bootstrapping example for double-precision bootstrapping
  • An example with a large depth after bootstrapping is the logistic regression training example @iquah mentioned. It performs a 14-level computation (one iteration of logistic regression) between bootstrapping calls.
  • The level parameter of `MakeCKKSPackedPlaintext` sets the level of the ciphertext after encryption. For example, level 0 means a fresh encryption with the full multiplicative depth available [full depth = bootstrapping depth + depth of the leveled computation after bootstrapping (before the next bootstrapping)]. See the second sketch below this list.
    • If you use level 0 (the full multiplicative depth is available), then OpenFHE does not need to call bootstrapping because the remaining number of levels is already sufficient; a leveled computation is used instead. Bootstrapping is only called once the level budget is exhausted.
  • A large message magnitude is a bad configuration for CKKS bootstrapping (this appears to be the problem in your most recent example). You always want to scale the message down to 1 (or below 1) before bootstrapping. Under the hood, CKKS bootstrapping uses a sine wave to approximate modular reduction, so the scaled message has to stay in an interval where the sine wave is well approximated by a linear function. A scaling sketch is given at the end of this post.
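
For reference, here is a minimal parameter-setup sketch along the lines of the simple-ckks-bootstrapping example. The concrete values (59/60-bit moduli, level budget {4, 4}, ring dimension, number of slots, and the 10 levels reserved after bootstrapping) are illustrative assumptions, not a recommendation for your workload.

```cpp
#include "openfhe.h"

using namespace lbcrypto;

int main() {
    // Secret-key distribution affects the bootstrapping depth.
    SecretKeyDist secretKeyDist = UNIFORM_TERNARY;

    CCParams<CryptoContextCKKSRNS> parameters;
    parameters.SetSecretKeyDist(secretKeyDist);
    parameters.SetSecurityLevel(HEStd_NotSet);  // toy setting for quick tests only
    parameters.SetRingDim(1 << 12);

    // Scaling factor and first modulus size: with NATIVE_SIZE=64 the examples
    // use 59/60; with NATIVE_SIZE=128 larger values (e.g., 78/89) are typical.
    parameters.SetScalingModSize(59);
    parameters.SetFirstModSize(60);
    parameters.SetScalingTechnique(FLEXIBLEAUTO);

    // Level budget for the linear transforms inside bootstrapping (assumed {4, 4}).
    std::vector<uint32_t> levelBudget = {4, 4};

    // Reserve levels for the leveled computation between bootstrapping calls (assumed 10).
    uint32_t levelsAfterBootstrap = 10;
    uint32_t depth = levelsAfterBootstrap + FHECKKSRNS::GetBootstrapDepth(levelBudget, secretKeyDist);
    parameters.SetMultiplicativeDepth(depth);

    CryptoContext<DCRTPoly> cc = GenCryptoContext(parameters);
    cc->Enable(PKE);
    cc->Enable(KEYSWITCH);
    cc->Enable(LEVELEDSHE);
    cc->Enable(ADVANCEDSHE);
    cc->Enable(FHE);

    uint32_t numSlots = 8;
    cc->EvalBootstrapSetup(levelBudget, {0, 0}, numSlots);

    auto keyPair = cc->KeyGen();
    cc->EvalMultKeyGen(keyPair.secretKey);
    cc->EvalBootstrapKeyGen(keyPair.secretKey, numSlots);

    return 0;
}
```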

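To illustrate the level parameter mentioned above, here is a small sketch (assuming `cc`, `keyPair`, `depth`, and `numSlots` from the previous snippet). Encoding at level 0 leaves the full depth available, while encoding at `depth - 1` produces a ciphertext with essentially no levels left, which is what the bootstrapping examples do to exercise `EvalBootstrap` right away.

```cpp
// Assumes cc, keyPair, depth, and numSlots from the setup sketch above.
std::vector<double> x = {0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8};

// level = 0: fresh ciphertext with the full multiplicative depth available.
Plaintext ptxtFresh = cc->MakeCKKSPackedPlaintext(x, 1, 0, nullptr, numSlots);
auto ctFresh        = cc->Encrypt(keyPair.publicKey, ptxtFresh);

// level = depth - 1: almost no levels left, so bootstrapping is needed before
// any further multiplications (this is what the bootstrapping examples do).
Plaintext ptxtDepleted = cc->MakeCKKSPackedPlaintext(x, 1, depth - 1, nullptr, numSlots);
auto ctDepleted        = cc->Encrypt(keyPair.publicKey, ptxtDepleted);

// EvalBootstrap refreshes the ciphertext so that roughly levelsAfterBootstrap
// levels remain for the leveled computation that follows.
auto ctRefreshed = cc->EvalBootstrap(ctDepleted);
std::cout << "Level after bootstrapping: " << ctRefreshed->GetLevel() << std::endl;
```
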
If you want me to look at why your example does not produce correct results, please post the full example, including all parameters for the CryptoContext.
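
In the meantime, regarding the message-magnitude point: a common pattern (sketched below, assuming an upper bound B on the message magnitude is known in advance; B = 256 is just an illustrative value) is to multiply by 1/B before bootstrapping and by B afterwards. Each multiplication by a constant consumes one level.

```cpp
// Assumes cc, keyPair, depth, and numSlots from the sketches above, and that
// a (hypothetical) bound B on the message magnitude is known in advance.
double B = 256.0;
std::vector<double> big = {100.0, -50.0, 200.0, 0.5, -128.0, 30.0, 7.0, -200.0};

Plaintext ptxt = cc->MakeCKKSPackedPlaintext(big, 1, depth - 3, nullptr, numSlots);
auto ct        = cc->Encrypt(keyPair.publicKey, ptxt);

// Scale down to [-1, 1] before bootstrapping (consumes one level).
ct = cc->EvalMult(ct, 1.0 / B);

// Bootstrapping now operates on a small-magnitude message, where the sine
// approximation of modular reduction is accurate.
ct = cc->EvalBootstrap(ct);

// Scale back up after bootstrapping (consumes one level of the refreshed budget).
ct = cc->EvalMult(ct, B);

Plaintext result;
cc->Decrypt(keyPair.secretKey, ct, &result);
result->SetLength(numSlots);
```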
