Converting OpenFHE parameters to lattice parameters

Several posts in the forum suggest using the lattice estimator to estimate the security level, but how do OpenFHE parameters relate to the ones in the estimator?

- `n`: the dimension of the LWE sample vector in (Z/qZ)^n.
- `q`: the modulus of the space Z/qZ of integers the LWE samples are in.
- `Xs`: the distribution on Z/qZ from which the LWE secret is drawn.
- `Xe`: the distribution on Z/qZ from which the error term is drawn.
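To make these four roles concrete, here is a toy LWE sample in plain Python (illustrative only: the dimension, modulus, and the use of a rounded continuous Gaussian are hypothetical toy choices, far smaller and looser than real parameters):

```python
import random

# Toy LWE sample (a, b) with b = <a, s> + e mod q
n, q = 8, 97                                         # toy dimension and modulus
s = [random.choice([-1, 0, 1]) for _ in range(n)]    # Xs: uniform ternary secret
a = [random.randrange(q) for _ in range(n)]          # uniform public vector
e = round(random.gauss(0, 3.19))                     # Xe: rounded Gaussian error
b = (sum(ai * si for ai, si in zip(a, s)) + e) % q   # the LWE sample value
print(0 <= b < q)
```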

I assume n is BatchSize, q is ModSize, and Xs is related to SecretKeyDist? How can I extract the correct parameters from OpenFHE?

n corresponds either to the ring dimension (for RingLWE schemes, like BGV, BFV, and CKKS) or to the lattice dimension (for TFHE/FHEW).

q is the maximum modulus used for encryption. It is either the ciphertext modulus Q or the product QP (when hybrid key switching is used, for the schemes in the pke module). See openfhe-development/src/pke/examples/advanced-real-numbers.cpp at v1.2.0 in the openfheorg/openfhe-development GitHub repository for more explanation of Q and QP and how to output them in OpenFHE.

\chi_e is a discrete Gaussian that by default uses 3.19 as its standard deviation. This parameter can be changed (but doing so is uncommon in OpenFHE applications).

You are correct about \chi_s.

Thank you for your reply.

My understanding is that OpenFHE uses the Homomorphic Encryption Standard, and the Standard uses this estimator, so the two should be consistent?

I set these parameters in OpenFHE:

parameters.SetSecretKeyDist(UNIFORM_TERNARY);
parameters.SetMultiplicativeDepth(17);
parameters.SetScalingModSize(33);
parameters.SetScalingTechnique(FLEXIBLEAUTO);
parameters.SetSecurityLevel(HEStd_128_classic);
parameters.SetNumLargeDigits(3);
parameters.SetKeySwitchTechnique(HYBRID);

Which gives me:

ring dimension of 32768
Q = 8811260159874699672336486180796396406243869077359646715352840075916900504413241269164695376404013045805703256871370210283196107117288933764274398464130413565444208105643075409269856534529 (bit length: 622)
P = 1532495540823465179097654646454320532079704299659657217 (bit length: 180)
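For reference, since hybrid key switching is used, the modulus passed to the estimator should be the product Q*P; its size can be checked directly from the values above (a quick sketch, copying the two integers verbatim):

```python
# Moduli reported by OpenFHE (values copied from the output above)
Q = 8811260159874699672336486180796396406243869077359646715352840075916900504413241269164695376404013045805703256871370210283196107117288933764274398464130413565444208105643075409269856534529
P = 1532495540823465179097654646454320532079704299659657217

print(Q.bit_length(), P.bit_length())  # 622 and 180, matching the OpenFHE output
print((Q * P).bit_length())            # log2 of the modulus fed to the estimator
```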

But when I run this through the estimator:

n=2**15,
q=[Q above]*[P above],
Xs=estimator.nd.NoiseDistribution.UniformMod(3),
Xe=estimator.nd.NoiseDistribution.DiscreteGaussian(stddev=3.19),

I get less than 128 bits of security:

usvp                 :: rop: ≈2^106.9, red: ≈2^106.9, δ: 1.004234, β: 366, d: 63121, tag: usvp
dual_hybrid          :: rop: ≈2^106.9, red: ≈2^106.9, guess: ≈2^53.2, β: 366, p: 3, ζ: 0, t: 0, β': 366, N: ≈2^43.0, m: ≈2^15.0

Any idea why?

Thank you.

This does not seem right. According to Table 1 of https://homomorphicencryption.org/wp-content/uploads/2018/11/HomomorphicEncryptionStandardv1.1.pdf, one can use n = 2^{15} up to \log q = 881. Even more refined estimates using the latest lattice estimator (and the MATZOV cost model) suggest one can use n = 2^{15} up to \log q = 868. So I suspect you are not using the lattice estimator correctly.

I am not an expert in the lattice estimator, but I typically specify the uniform ternary distribution not as Xs=estimator.nd.NoiseDistribution.UniformMod(3) but using Xs = ND.Uniform(-1,1,n) instead.
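To make the comparison concrete (using only the figures quoted in this thread), the modulus in question sits well inside both budgets, so roughly 128-bit security is the expected outcome:

```python
# Bounds quoted above for n = 2^15 at 128-bit classical security
max_logq_hestd = 881    # HE Standard, Table 1
max_logq_matzov = 868   # refined estimate with the MATZOV cost model
logq = 622 + 180        # approximate log2(Q*P) from the parameters in this thread

print(logq <= max_logq_matzov <= max_logq_hestd)  # prints True
```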

I would run the test using the following command (I started it on my machine, but it is taking too long - I will add the result here if I get one later).

n = 2**15
params1 = LWE.Parameters(n=n, q=13503216904043430132322375130323520686726597922127692329036916273626067969416451343520467627876120344481929085215823190265452525616489335894869259632143756391813518830548262609950752986418758718499434014407188181502236786143292803727664545793, Xs=ND.Uniform(-1,1,n), Xe=ND.DiscreteGaussian(3.19))
LWE.estimate(params1)