Approximation error is too high

Hi!
I’ve been using OpenFHE to implement logistic regression training. I split the plaintext into blocks and pack each block into a ciphertext. But during training I ran into a problem. It says: "lbcrypto::math_error: The decryption failed because the approximation error is too high. Check the parameters." Which parameters are causing the problem, and what changes should I make? Here are the parameters I set.

usint dcrtBits = 56;                        // scaling modulus size in bits
usint firstMod = 60;                        // first modulus size in bits
std::vector<uint32_t> dim1 = {0, 0};        // baby-step/giant-step dimensions for bootstrapping (0 = auto)
std::vector<uint32_t> levelBudget = {4, 4}; // level budget for the encoding/decoding steps of bootstrapping

uint32_t levelsRemaining = 19;              // levels left for the computation on top of the bootstrapping depth
usint depth = levelsRemaining + FHECKKSRNS::GetBootstrapDepth(8, levelBudget, secretKeyDist);
parameters.SetMultiplicativeDepth(depth);
parameters.SetRingDim(1 << 16);
long slots = 256;
cc->EvalBootstrapSetup(levelBudget, dim1, slots);
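
For completeness, the full setup around these lines looks roughly like the standard OpenFHE bootstrapping example; this is a minimal sketch rather than my full program, it reuses the variables declared above, and the secret key distribution, security level, and key-generation calls are assumptions:

#include "openfhe.h"
using namespace lbcrypto;

SecretKeyDist secretKeyDist = UNIFORM_TERNARY;   // assumption
CCParams<CryptoContextCKKSRNS> parameters;
parameters.SetSecretKeyDist(secretKeyDist);
parameters.SetSecurityLevel(HEStd_NotSet);       // assumption: toy setting for experiments
parameters.SetScalingModSize(dcrtBits);          // 56
parameters.SetFirstModSize(firstMod);            // 60
parameters.SetMultiplicativeDepth(levelsRemaining + FHECKKSRNS::GetBootstrapDepth(8, levelBudget, secretKeyDist));
parameters.SetRingDim(1 << 16);

CryptoContext<DCRTPoly> cc = GenCryptoContext(parameters);
cc->Enable(PKE);
cc->Enable(KEYSWITCH);
cc->Enable(LEVELEDSHE);
cc->Enable(ADVANCEDSHE);
cc->Enable(FHE);

cc->EvalBootstrapSetup(levelBudget, dim1, slots);   // precompute the bootstrapping transforms for 256 slots
auto keyPair = cc->KeyGen();
cc->EvalMultKeyGen(keyPair.secretKey);
cc->EvalBootstrapKeyGen(keyPair.secretKey, slots);  // rotation keys for bootstrapping at this slot count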

I’d recommend 1) checking out [openfheorg/openfhe-logreg-training-examples: OpenFHE-Based Examples of Logistic Regression Training using Nesterov Accelerated Gradient Descent](https://github.com/openfheorg/openfhe-logreg-training-examples) and 2) posting a minimal reproducible example.

Thanks for your advice. I have some problems running the lr_nag program in [openfheorg/openfhe-logreg-training-examples](https://github.com/openfheorg/openfhe-logreg-training-examples). When the code runs to

cc->EvalBootstrapSetup(levelBudget, bsgsDim, numSlotsBoot);

it stops, but no error message is returned. I have not modified the program code; why is this happening?
The following is the detailed error message:
11

How much RAM does your machine have? CKKS bootstrapping typically requires large keys. You can try using a smaller ring dimension (half the current one).
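
If memory turns out to be the limit, a hypothetical illustration of that suggestion (not a patch against the lr_nag sources; the variable names are the ones from the snippet above) would be:

// Try half the ring dimension; bootstrapping/rotation key sizes scale with it.
// Note: with a fixed SecurityLevel, a smaller ring dimension also caps the total
// modulus size, so the multiplicative depth may need to be reduced as well.
parameters.SetRingDim(1 << 15);                  // instead of 1 << 16
cc = GenCryptoContext(parameters);               // regenerate the context with the new parameters
cc->EvalBootstrapSetup(levelBudget, bsgsDim, numSlotsBoot);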

Generally speaking, "approximation error is too high" means that the decryption result is incorrect. This can happen for various reasons, e.g., incorrect use of keys, among other causes.
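
As a quick diagnostic, decryption can be wrapped in a try/catch, and the reported precision inspected when it does succeed. A rough sketch, assuming a CKKS context cc, a key pair keyPair, a ciphertext ct, and the slot count slots from the thread above:

Plaintext result;
try {
    cc->Decrypt(keyPair.secretKey, ct, &result);
    result->SetLength(slots);
    // GetLogPrecision() reports the estimated number of correct bits in the result.
    std::cout << "Estimated precision: " << result->GetLogPrecision() << " bits" << std::endl;
}
catch (const std::exception& e) {
    // This is where "approximation error is too high" surfaces, e.g. after exceeding
    // the multiplicative depth or decrypting with mismatched keys.
    std::cerr << "Decryption failed: " << e.what() << std::endl;
}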