Can't find EvalMult() defined

I found the function defined here:
Ciphertext<Element> EvalMult(ConstCiphertext<Element> ciphertext1, ConstCiphertext<Element> ciphertext2) const {
    TypeCheck(ciphertext1, ciphertext2);

    const auto evalKeyVec = GetEvalMultKeyVector(ciphertext1->GetKeyTag());
    if (!evalKeyVec.size()) {
        OPENFHE_THROW(type_error, "Evaluation key has not been generated for EvalMult");
    }

    return GetScheme()->EvalMult(ciphertext1, ciphertext2, evalKeyVec[0]);
}

but I am still not sure what it is doing. Maybe I am misreading the code, but I expected to see some actual multiplication operations somewhere. Can someone explain where the multiplication really happens?
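My current (possibly wrong) mental model, pieced together from browsing the source, is that GetScheme()->EvalMult() dispatches into the scheme implementation, and the actual arithmetic happens much further down on the RNS polynomials, with the core step being a component-wise product in the NTT/evaluation representation. Something like this hypothetical sketch (my own simplification, not real OpenFHE code):

#include <cstdint>
#include <vector>

// Hypothetical simplified type: one RNS residue polynomial already in
// evaluation (NTT) form, so multiplication is just element-wise.
using RnsPoly = std::vector<uint64_t>;

RnsPoly MultiplyNttPolys(const RnsPoly& a, const RnsPoly& b, uint64_t modulus) {
    RnsPoly out(a.size());
    for (size_t i = 0; i < a.size(); ++i) {
        // widen to 128 bits (GCC/Clang __int128 extension) so the product
        // does not overflow before the modular reduction
        out[i] = static_cast<uint64_t>(
            (static_cast<unsigned __int128>(a[i]) * b[i]) % modulus);
    }
    return out;
}

Is that roughly right, and is that inner loop the place where GPU acceleration would plug in?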

Again, I am hoping to implement CUDA or some other form of hardware acceleration. I know Duality is working on the DARPA DPRIVE program, which is doing work in exactly this area.

I also see a discussion from November 2023 (Homomorphic operations in CUDA kernel) that asks about this.

Is this something that can be done by an individual, or is it much more complicated than I am anticipating? My computations take minutes to run, and I am looking for ways to speed them up.
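For reference, this is roughly the shape of my computation (a minimal sketch based on OpenFHE's simple CKKS example; my actual workload is much deeper, which is presumably why it takes minutes):

#include <iostream>
#include <vector>
#include "openfhe.h"

using namespace lbcrypto;

int main() {
    CCParams<CryptoContextCKKSRNS> parameters;
    parameters.SetMultiplicativeDepth(1);
    parameters.SetScalingModSize(50);

    CryptoContext<DCRTPoly> cc = GenCryptoContext(parameters);
    cc->Enable(PKE);
    cc->Enable(KEYSWITCH);
    cc->Enable(LEVELEDSHE);

    auto keys = cc->KeyGen();
    // without this call, EvalMult throws the "Evaluation key has not been
    // generated for EvalMult" error from the snippet above
    cc->EvalMultKeyGen(keys.secretKey);

    std::vector<double> x = {1.0, 2.0, 3.0};
    Plaintext pt = cc->MakeCKKSPackedPlaintext(x);
    auto c1 = cc->Encrypt(keys.publicKey, pt);
    auto c2 = cc->Encrypt(keys.publicKey, pt);

    auto cMul = cc->EvalMult(c1, c2);  // the call I want to accelerate

    Plaintext result;
    cc->Decrypt(keys.secretKey, cMul, &result);
    result->SetLength(x.size());
    std::cout << result << std::endl;
    return 0;
}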