Coming from PALISADE, I noticed there is a new CKKS bootstrapping example in OpenFHE. Can this procedure be used to build an arbitrarily deep (approximate) arithmetic circuit?

Found a good conceptual video explanation, so I’m assuming this probably applies to OpenFHE as well?

> Can this procedure be used to build an arbitrarily deep (approximate) arithmetic circuit?

Yes, you can! Having said that, it'll be slow (as far as I know). @saroja, since you wrote the script, can you give any insights?

Yes, you can use bootstrapping to build an arbitrarily deep approximate arithmetic circuit. In the *simple-ckks-bootstrapping* example, we set the initial multiplicative depth on L:106: *parameters.SetMultiplicativeDepth(depth)*

This sets the initial ciphertext level to be *depth*. After performing a multiplication, the ciphertext level goes down by one. When the ciphertext level is 0, we can no longer perform any more computations. However, before the level reaches 0, we can run *EvalBootstrap* to bump the level back up, which gives us room to perform more multiplications. We can do this as many times as needed, which allows us to run arbitrarily deep arithmetic circuits.

However, as @iquah mentioned, to get the best performance, it is best to design your circuit and FHE computations in a way that requires as few bootstrap operations as possible.

Ah ok, thanks for the explanation! So for practical purposes, the improvement here is that if we (the user) design a circuit that would reach the depth limit, we can refactor the circuit to include a bootstrapping layer to add a handful of extra levels, whereas we would otherwise be completely out-of-luck (if bootstrapping was not supported).

> whereas we would otherwise be completely out-of-luck (if bootstrapping was not supported).

You could do a re-encrypt (decrypt, then encrypt), which would refresh the number of levels, but it's not always feasible since you'd need access to the private key.

But for all intents and purposes, yes, you are correct.

I would like to add a note on the precision of the result after bootstrapping. In CKKS, bootstrapping is approximate, so there is some loss of precision, primarily after the first bootstrapping. However, many machine learning applications are approximate in nature, so this loss of precision does not significantly affect the result. A good example is logistic regression training. We were able to run hundreds of logistic regression iterations, with hundreds of bootstrapping calls, while still getting 4-5 decimal digits of accuracy.

When you need to do deep computations, I suggest using the 128-bit CKKS, i.e., setting NATIVE_SIZE=128, since the 64-bit CKKS is limited to 3-4 decimal digits of precision (after the first bootstrapping call).
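For reference, NATIVE_SIZE is a CMake option chosen at build time, so switching to 128-bit CKKS means rebuilding the library. A minimal sketch (illustrative; run from a build directory inside your OpenFHE checkout):

```shell
# Configure OpenFHE with 128-bit native integers for higher CKKS precision.
cmake -DNATIVE_SIZE=128 ..
# Rebuild the library with the new setting.
make -j
```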