I’m trying to understand whether it’s possible to use a custom data type with a PyTorch model that was trained in Python and then exported to C++. I’ve got my model up and running in C++ when the input data comes from a standard double array, like so:
double sim_state[8]; // this gets populated with some data
std::vector<torch::jit::IValue> inputs;
inputs.push_back(torch::from_blob(sim_state, {1, 8}, at::dtype(at::kDouble)));
at::Tensor prediction = module.forward(inputs).toTensor();
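For completeness, module is loaded beforehand in the usual way and cast to double so that it matches the kDouble inputs (the file name is just a placeholder):

#include <torch/script.h>

// Load the TorchScript module exported from Python ("model.pt" is a
// placeholder) and convert its parameters to double to match the inputs.
torch::jit::script::Module module = torch::jit::load("model.pt");
module.to(torch::kDouble);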
Now I want to use a different data type instead of kDouble. My type is DACE::DA, from the dace.h header; essentially, each value is a polynomial rather than a number. DACE::DA comes with operator overloads for addition, subtraction, multiplication, and so on, and the inputs can be declared as
DACE::AlgebraicVector<DACE::DA> sim_state(8); // this gets populated with some data
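For context, the vector is populated along these lines (a sketch based on the DACE tutorials; the expansion order and the nominal values are placeholders), so each component carries a polynomial rather than a plain number:

#include "dace.h"

// Sketch: initialize DACE for order-2 polynomials in 8 variables
// (the order is a placeholder) and build the state vector as
// nominal value + i-th DA identity variable.
DACE::DA::init(2, 8);
double nominal[8] = { /* nominal state values */ };
DACE::AlgebraicVector<DACE::DA> sim_state(8);
for (unsigned int i = 0; i < 8; ++i) {
    sim_state[i] = nominal[i] + DACE::DA(i + 1);  // DA variables are 1-indexed
}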
As expected, the torch::from_blob call fails before we even reach module.forward, since from_blob expects a raw memory buffer of one of the supported scalar types. Is there a way to use custom data types like this with my PyTorch model? I should also note that I do not want to convert my DACE::DA values to double, feed those through the model, and then convert the double outputs back to DACE::DA. I need to be able to propagate my DACE::DA type through my model.
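To make “propagate” concrete: I know that in principle I could pull the trained weights out of the scripted module and re-implement the forward pass by hand over DACE::DA, roughly like the sketch below (a plain MLP is assumed, and the layer name "fc1" is hypothetical), but I’m hoping libtorch offers something less manual than this:

// Hand-rolled linear layer over DACE::DA. The weights come out of the
// module as ordinary doubles; only the inputs/outputs are polynomials.
DACE::AlgebraicVector<DACE::DA> linear(const at::Tensor& W, const at::Tensor& b,
                                       const DACE::AlgebraicVector<DACE::DA>& x) {
    auto Wa = W.accessor<double, 2>();  // [out_features, in_features]
    auto ba = b.accessor<double, 1>();
    DACE::AlgebraicVector<DACE::DA> y(Wa.size(0));
    for (int64_t i = 0; i < Wa.size(0); ++i) {
        DACE::DA acc(ba[i]);            // bias becomes the constant part
        for (int64_t j = 0; j < Wa.size(1); ++j)
            acc += Wa[i][j] * x[j];     // double * DA stays a polynomial
        y[i] = acc;
    }
    return y;
}

// Usage: extract the doubles once, then evaluate over the DA state.
auto fc1 = module.attr("fc1").toModule();  // "fc1" is a hypothetical name
at::Tensor W1 = fc1.attr("weight").toTensor().to(torch::kDouble);
at::Tensor b1 = fc1.attr("bias").toTensor().to(torch::kDouble);
auto hidden = linear(W1, b1, sim_state);
for (auto& h : hidden) h = tanh(h);        // DACE's tanh intrinsic on DA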
Thanks in advance for any feedback on this!