ML using OpenFHE: Logistic regression training examples are now available

Check out GitHub - openfheorg/openfhe-logreg-training-examples, which shows examples of logistic regression training using the Nesterov Accelerated Gradient Descent (NAG) method. The examples were developed using CKKS in OpenFHE. @iquah was the main developer.

Thank you for sharing! Definitely needed that :slight_smile:

Edit: I'll take advantage of this post to ask: in this line, the depth is increased by one because of the DCRTPolyImpl's towers are not initialized issue. Is this something that will be fixed in the future, or can bootstrapping simply not be performed at the last level?

Thank you for reporting this. The comment can be improved. In 64-bit CKKS bootstrapping, we need one extra level (for an extra scaling). We will fix the comment.

Does this happen in general in 64-bit bootstrapping, or only in the case of 2 iterations?

This is specific to the use of 2 iterations in the 64-bit case. The requirement is illustrated in iterative-ckks-bootstrapping.cpp at v1.0.3 of openfheorg/openfhe-development on GitHub (search for numIterations).
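To make the extra-level requirement concrete, here is a hedged sketch of the depth computation, patterned after the iterative-ckks-bootstrapping.cpp example; the specific values (level budget, levels after bootstrapping, secret-key distribution) are illustrative placeholders, not a drop-in program.

```cpp
#include "openfhe.h"

using namespace lbcrypto;

std::vector<uint32_t> levelBudget = {3, 3};          // illustrative value
SecretKeyDist secretKeyDist       = UNIFORM_TERNARY; // illustrative value
uint32_t levelsAfterBootstrap     = 10;              // illustrative value
uint32_t numIterations            = 2;               // two-iteration bootstrapping

// The (numIterations - 1) term is the one extra level (extra scaling)
// needed for two-iteration bootstrapping in the 64-bit case.
uint32_t depth = levelsAfterBootstrap +
                 FHECKKSRNS::GetBootstrapDepth(levelBudget, secretKeyDist) +
                 (numIterations - 1);

CCParams<CryptoContextCKKSRNS> parameters;
parameters.SetMultiplicativeDepth(depth);
// ... generate the context and keys, then later refresh a ciphertext with:
// auto refreshed = cc->EvalBootstrap(ciphertext, numIterations, precision);
// where `precision` is obtained from a prior single-iteration calibration run.
```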

Hello there, another question here :smiley:

In this line, you justify the use of a level budget equal to {2, 2}. Could you please elaborate a little bit more? Thank you :pray:

Level budget = {e, d} controls the computational complexity of the homomorphic encoding and decoding steps in CKKS bootstrapping; both are homomorphic evaluations of linear transforms. A higher budget allows faster computation of these steps at the expense of a deeper circuit (consuming more levels). Conversely, a lower budget is slower but uses a shallower circuit.

Recommended values, found experimentally, are e, d ∈ {1, 2, 3, 4} (they do not need to be equal). You can run a few experiments to find the best configuration for your use case.
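As a hedged sketch, the level budget is passed to the bootstrapping setup call like this; `cc`, `keyPair`, and `numSlots` are assumed to come from the usual context and key generation steps, and the budget value is the {2, 2} discussed above.

```cpp
// {encoding budget e, decoding budget d}: higher = faster but deeper circuit,
// lower = slower but shallower circuit.
std::vector<uint32_t> levelBudget = {2, 2};

// Precompute the bootstrapping transforms for this budget, then generate
// the corresponding evaluation keys.
cc->EvalBootstrapSetup(levelBudget);
cc->EvalBootstrapKeyGen(keyPair.secretKey, numSlots);
```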


Hello,
In the data directory, we not only have “pvalue.txt” and “result.txt” files but also 7 other files. Could you please provide more information about those files and their purpose?

Those seem to be for the genomic examples rather than the logistic regression one.

Thank you for your answer. By recommended values, do you mean values that offer a good trade-off between circuit depth and complexity reduction for bootstrapping?

Yes, going higher than four consumes more levels.

Hello
I have built the logistic regression example "openfhe-logreg-training-examples". In lr_nag, when I disable bootstrapping, it works well, but when I enable it, I get a "killed" message. Do you have any recommendations about the parameters? Which ones should I modify, and what values should I choose?

Can you provide more information about the error message? I suspect you may not have enough memory to hold the evaluation keys. How much RAM does your machine have?

The examples should work out of the box. In addition to what @Caesar mentioned, are you running in 64-bit or 128-bit mode?

I am running it in 64-bit mode.

I have 12 GB of RAM, I think, but I don't know how to configure the parameters to run the code on my machine. I've tried some values, but I always encounter the same problem.

I’m not currently at my desktop, but one thing you can do to test whether the issue is your machine’s RAM is to drop the ring size (and the number of data points you read). 2<<16 perhaps?

12 GB may not be enough; 32 GB should be safe. Also, please compile using `NATIVE_SIZE=64`, which should be the default - most likely this is what you are already using. `NATIVE_SIZE=128` will take twice as much memory.

Trying a twice smaller ring dimension, as suggested by @iquah, will also help.
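For reference, a hedged sketch of how the ring dimension can be lowered in the OpenFHE CKKS parameters; the exact value is illustrative (2<<16 = 131072 matches the suggestion above), and forcing a ring dimension below what the security estimator requires means disabling the security check, so this should be used for debugging memory issues only.

```cpp
CCParams<CryptoContextCKKSRNS> parameters;
parameters.SetRingDim(2 << 16);            // smaller ring = smaller keys/ciphertexts
parameters.SetSecurityLevel(HEStd_NotSet); // debugging only: skips the security check
```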