OpenFHE CKKS out of memory issue

Hi guys,

I’m working on a project with OpenFHE where I’m importing a database with 100,000 rows. Each row holds parameters such as the security level (lambda), ring dimension, scaling modulus size, first modulus size, and multiplicative depth. I use the CKKS scheme to generate ciphertexts based on the parameters in each row, compute the precision of the decrypted plaintext, and store it back in the database.

I’m running my code in WSL, but it always gets killed after processing around 1,200 rows because it runs out of memory. I initially tried batching to free up memory, but it didn’t help much.

Is there any way to free OpenFHE memory to prevent this out of memory issue?

Please provide details on how you run the experiment. Is this a single OpenFHE process that creates/deletes multiple CKKS crypto contexts, or a separate OpenFHE process for each configuration (DB row)?

A simple experimental methodology (a separate OpenFHE process for each configuration) that resolves this is as follows:

Query database for records 
Set record_count to -1
For each record: 
   Increment record_count 
   Extract configuration: configuration[record_count] = DB_row[record_count]
   Start OpenFHE process with configuration[record_count]
   Log OpenFHE output 
   Terminate OpenFHE process

You can easily do this in a shell script.
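As a rough illustration of that loop, here is a minimal POSIX-sh sketch. Everything in it is an assumption for illustration: it pretends the DB rows were already exported to a CSV (written inline here as sample data), and it uses `echo` as a stand-in for a hypothetical standalone experiment binary `./ckks_experiment` that would take the five parameters as arguments.

```shell
#!/bin/sh
# Hypothetical per-row driver: one fresh OS process per configuration,
# so every OpenFHE allocation (crypto context, keys, ciphertexts) is
# returned to the OS when that process exits.
EXPERIMENT="${EXPERIMENT:-echo}"   # stand-in; replace with ./ckks_experiment
ROWS="${ROWS:-rows.csv}"
LOG="${LOG:-results.log}"

# Stand-in for "Query database for records"; columns are
# lambda,ring_dim,scale_mod,first_mod,mult_depth (sample values only).
cat > "$ROWS" <<EOF
128,8192,40,60,2
128,16384,45,60,3
EOF

: > "$LOG"
record_count=0
while IFS=, read -r lambda ring_dim scale_mod first_mod depth; do
    # Start the OpenFHE process with this row's configuration and log
    # its output; the process terminates before the next row starts.
    "$EXPERIMENT" "$lambda" "$ring_dim" "$scale_mod" "$first_mod" "$depth" \
        >> "$LOG" 2>&1
    record_count=$((record_count + 1))
done < "$ROWS"
echo "processed $record_count rows"
```

With the real binary in place of `echo`, the kernel reclaims all memory between rows, so no in-process cleanup of OpenFHE objects is needed.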
