Hi, I’m currently running computations with CKKS bootstrapping and evaluating their performance. I’d like to observe the performance differences when the precision of the fractional part is fixed in advance. For example, given the number 5.7811581611561, I want to compare the outcomes when the most significant bit (MSB) is at position 2, 4, 6, or 8. Does anyone know how to realize this? Thank you very much.
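To make the question concrete: one way I could imagine pre-quantizing the input, if "MSB at position k" is read as keeping k bits of fractional precision before encoding, is the plain C++ sketch below (the helper name and this interpretation are just mine):

```cpp
#include <cmath>
#include <cstdio>

// Round x so that only `fracBits` bits remain below the binary point
// (illustrative helper; one possible reading of the "MSB position" above).
double QuantizeFractionalBits(double x, int fracBits) {
    double scale = std::ldexp(1.0, fracBits);  // 2^fracBits
    return std::round(x * scale) / scale;
}

int main() {
    double x = 5.7811581611561;
    for (int b : {2, 4, 6, 8})
        std::printf("%d fractional bits: %.13f\n", b, QuantizeFractionalBits(x, b));
    return 0;
}
```

The quantized values would then be encoded and bootstrapped as usual, so any precision or runtime difference comes purely from the chosen bit budget.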
The input parameter sets I’m evaluating for my benchmark are as follows (a setup sketch follows the list):
- Rescaling Mode (FIXEDAUTO, FLEXIBLEAUTO, FLEXIBLEAUTOEXT)
- Batch size (8, 16, 32, 64, 128 slots)
- ScaleModSize (49, 54, 59)
- FirstModSize (50, 55, 60)
- Ring Dimension (1024, 2048, 4096, 8192, 16384)
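For reference, this is roughly how one point of the sweep is set up, assuming the OpenFHE CKKS bootstrapping API (the parameter names above match its `CCParams`; the secret-key distribution, level budget, and depth below are illustrative choices, not part of the sweep):

```cpp
#include "openfhe.h"

using namespace lbcrypto;

int main() {
    // One configuration from the sweep above; the other listed values
    // are swapped in per benchmark run.
    CCParams<CryptoContextCKKSRNS> parameters;
    parameters.SetSecretKeyDist(UNIFORM_TERNARY);
    parameters.SetScalingTechnique(FLEXIBLEAUTO);  // or FIXEDAUTO / FLEXIBLEAUTOEXT
    parameters.SetScalingModSize(59);
    parameters.SetFirstModSize(60);
    parameters.SetBatchSize(16);
    parameters.SetSecurityLevel(HEStd_NotSet);     // needed to force small ring dimensions
    parameters.SetRingDim(4096);

    // Illustrative bootstrapping budget; the multiplicative depth must cover it
    // plus the levels that should remain available after EvalBootstrap.
    std::vector<uint32_t> levelBudget = {4, 4};
    uint32_t levelsAfterBootstrap = 10;
    parameters.SetMultiplicativeDepth(
        levelsAfterBootstrap +
        FHECKKSRNS::GetBootstrapDepth(levelBudget, parameters.GetSecretKeyDist()));

    CryptoContext<DCRTPoly> cc = GenCryptoContext(parameters);
    cc->Enable(PKE);
    cc->Enable(KEYSWITCH);
    cc->Enable(LEVELEDSHE);
    cc->Enable(ADVANCEDSHE);
    cc->Enable(FHE);

    uint32_t numSlots = parameters.GetBatchSize();
    cc->EvalBootstrapSetup(levelBudget, /*dim1=*/{0, 0}, numSlots);

    auto keys = cc->KeyGen();
    cc->EvalMultKeyGen(keys.secretKey);
    cc->EvalBootstrapKeyGen(keys.secretKey, numSlots);

    // Encrypt the (pre-quantized) input and time the bootstrap.
    std::vector<double> input = {5.7811581611561};
    Plaintext pt  = cc->MakeCKKSPackedPlaintext(input);
    auto ct       = cc->Encrypt(keys.publicKey, pt);
    auto ctBoot   = cc->EvalBootstrap(ct);
    return 0;
}
```

Each benchmark run instantiates one combination of the listed values and times `EvalBootstrap`.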