Hello. In some FHE schemes, like BFV, encoding is performed before encryption: a vector of size 2^15 is encoded into one message. I think (and have read in some papers) that this encoding could be used to reduce the size of data sent over the network. In other words, I don't want to use encryption; I only want to send a vector over the network. Can I use the BFV encoding algorithm to reduce communication overhead?
I use a Python library. When I get the size of the encoded message with sys.getsizeof(encoded_message), the size is 64 bytes. However, before sending the message over the network we need serialization, and when I serialize it and measure the size again, it is about 250 KB. Meanwhile, the raw array we want to send over the network is also about 250 KB.
I think there is an issue in interpreting the return value of sys.getsizeof(encoded_message). I do not think it returns the actual data size, and here is why:
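To illustrate why sys.getsizeof can be misleading here: it reports only the shallow size of the Python object itself, not the data it references. The sketch below uses a hypothetical EncodedMessage wrapper (not the actual library class) holding 2^15 coefficients; the wrapper measures tiny while the serialized form carries the full payload:

```python
import pickle
import sys

class EncodedMessage:
    """Hypothetical stand-in for the library's encoded-plaintext object:
    a thin Python wrapper referencing a large coefficient vector."""
    def __init__(self, coeffs):
        self.coeffs = coeffs

# 2^15 coefficients, as in the question.
msg = EncodedMessage(list(range(2 ** 15)))

shallow = sys.getsizeof(msg)         # size of the wrapper object only (tens of bytes)
serialized = len(pickle.dumps(msg))  # size of the fully serialized data (much larger)

print(shallow, serialized)
```

If the library's encoded object is a wrapper around native (C/C++) memory, sys.getsizeof would similarly report only the wrapper, which would explain the 64-byte figure.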
Encoding a vector of length 2^15 using BFV results in a polynomial of degree ≤ 2^15 − 1, which is represented as a vector of 2^15 coefficients. Assuming you are using 64-bit words, the size of this vector is 64 · 2^15 / (8 · 1024) = 256 KB, which is close to what you are getting after serialization. I would investigate further what sys.getsizeof(encoded_message) is actually measuring.
I am not sure I understand the question in this thread. Encoding is done before encryption, so if the encoded vector is sent over the network, it will not be encrypted. Are you talking about a scenario where the vector does not need to be encrypted and is only used in ciphertext-plaintext computation?
The purpose of encoding is to convert the vector supplied by the user into an FHE-compatible polynomial (in Double-CRT form) so that it can be encrypted.