I have a C library with a defs.h containing a custom enum typedef:
typedef enum _CustomParam
{
    OptionOne, /// Default 0
    OptionTwo, /// Default 1
} CustomParam;
and a lib.h declaring the functions:
extern "C"
{
    /// @param[in] param Parameter to retrieve the value for
    /// @return the parameter value
    /// @retval -1 if the retrieval could not be made
    LIB_EXPORT double getParam(CustomParam param);

    /// @param[in] param Parameter to set
    /// @param[in] value the value to set the parameter to
    /// @retval 0 if setting the parameter was successful
    /// @retval -1 if setting the parameter failed
    LIB_EXPORT int setParam(CustomParam param, double value);
}
I now implemented a Python script to use these functions:
# defs.py (named defs rather than def, since def is a reserved word)
from ctypes import *

class CustomParam(c_int):
    OptionOne = 0
    OptionTwo = 1
# main.py
from .lib import Lib
library = Lib("lib/lib.dll")
library.init()
# lib.py
from defs import *

class Lib:
    def __init__(self, libpath):
        super().__init__()
        self.lib = CDLL(libpath)

    def init(self):
        self.lib.getParam.argtypes = [CustomParam]
        self.lib.getParam.restype = c_double
        result = self.lib.getParam(CustomParam.OptionOne)
        print("Parameter:", result)
result is always -1, no matter whether I set or get a parameter.
In general: all functions that take CustomParam as an input fail, while all functions with generic input types (e.g. c_int) work correctly. That's why I suspect my input for getParam is in an unexpected format. I also tried passing c_int(0) or a plain 0 instead of CustomParam.OptionOne (which should be equivalent).
Any suggestions on how I can use a C typedef enum as an input when calling a C library function from Python?
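For reference, one pattern I have seen suggested for mapping C enums in ctypes is an IntEnum with a from_param hook instead of subclassing c_int. Here it is as a self-contained sketch (no DLL needed to run it; the class and member names mirror my defs above, the from_param part is the pattern in question) — is this the recommended approach?

```python
from ctypes import c_int
from enum import IntEnum

class CustomParam(IntEnum):
    OptionOne = 0
    OptionTwo = 1

    @classmethod
    def from_param(cls, obj):
        # ctypes calls this when CustomParam is listed in argtypes;
        # returning a plain int lets ctypes marshal it as a C int.
        return int(obj)

# IntEnum members are real ints, so the c_int conversion is lossless:
print(int(CustomParam.OptionTwo))          # 1
print(c_int(CustomParam.OptionOne).value)  # 0
```

With this mapping, lib.getParam.argtypes = [CustomParam] should accept CustomParam.OptionOne directly, since from_param reduces it to an int before the call.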