C++ API¶
Native¶
Implements the functions for interacting with Proteus in the native C++ API.
-
namespace proteus¶
Functions
-
void initialize()¶
Initialize Proteus.
-
void terminate()¶
Shut down Proteus.
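As a minimal sketch, a native client brackets all other calls between these two functions. The header path below is an assumption; this page does not name the include file.

    #include "proteus/proteus.hpp"  // assumed umbrella header for the native API

    int main() {
      proteus::initialize();  // must be called before any other proteus:: function
      // ... load workers and enqueue inference requests here ...
      proteus::terminate();   // shut down Proteus and release its resources
      return 0;
    }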
-
void startHttpServer(int port)¶
Start the HTTP server for collecting metrics. This is a no-op if Proteus is compiled without HTTP support.
- Parameters
port – port for the HTTP server to listen on
-
void stopHttpServer()¶
Stop the HTTP server. This is a no-op if Proteus is compiled without HTTP support.
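As a sketch, the metrics server can be started right after initialization and stopped before termination. The port number 8998 is only illustrative, and both calls are safe to leave in unconditionally since they are no-ops in builds without HTTP support.

    proteus::initialize();
    proteus::startHttpServer(8998);  // expose metrics over HTTP; no-op without HTTP support
    // ... make inference requests ...
    proteus::stopHttpServer();       // also a no-op without HTTP support
    proteus::terminate();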
-
std::string load(const std::string &worker, RequestParameters *parameters)¶
Load a worker.
- Parameters
worker – name of the worker to load
parameters – any load-time parameters to pass to the worker
- Returns
std::string the qualified name of the worker, used when making inference requests to it
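A sketch of loading a worker inside a running client follows. The worker name "echo", the assumption that a null parameters pointer is accepted when there are no load-time parameters, and the put() setter on RequestParameters are illustrative rather than documented on this page.

    // Load a worker with no load-time parameters (nullptr is assumed to be accepted).
    std::string endpoint = proteus::load("echo", nullptr);

    // Or pass load-time parameters; put() is assumed to be the RequestParameters setter.
    proteus::RequestParameters parameters;
    parameters.put("max_buffer_num", 50);
    std::string endpoint2 = proteus::load("echo", &parameters);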
-
InferenceResponseFuture enqueue(const std::string &workerName, InferenceRequestInput request)¶
Enqueue an inference request to Proteus.
- Parameters
workerName – name of the worker to make the request to
request – the request to make
- Returns
InferenceResponseFuture a future to get the results of the request
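A sketch of making a request against the endpoint string returned by load() is shown below. How the InferenceRequestInput is populated is worker-specific and elided here, and the std::future-like get() on the returned future is an assumption.

    proteus::InferenceRequestInput request;
    // ... populate the request with the data, shape and datatype expected by the worker ...

    auto future = proteus::enqueue(endpoint, request);
    auto response = future.get();  // assumed to block until the inference result is ready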
-
std::string getHardware()¶
Get a string that lists the available kernels and how many of each exist, in the form “<name>:i,<name>:j…”.
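Since the returned string is a comma-separated list of name:count pairs, it can be parsed with ordinary string splitting, for example:

    #include <iostream>
    #include <sstream>
    #include <string>

    // Print each kernel name and count from the "<name>:i,<name>:j,..." string.
    void printKernels() {
      const std::string hardware = proteus::getHardware();
      std::istringstream stream{hardware};
      std::string entry;
      while (std::getline(stream, entry, ',')) {
        const auto pos = entry.find(':');
        if (pos != std::string::npos) {
          std::cout << entry.substr(0, pos) << ": " << entry.substr(pos + 1) << "\n";
        }
      }
    }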
-
bool hasHardware(const std::string &kernel, size_t num)¶
Check whether a particular kernel exists on the server. The string-splitting logic is adapted from https://stackoverflow.com/a/14266139.
- Parameters
kernel – name of the kernel to check for. An empty string is a special value that matches any kernel name.
num – minimum number of instances of the kernel that should be present
- Returns
bool true if at least num instances of the specified kernel are present, false otherwise
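For example, a client can guard loading a hardware-backed worker on kernel availability. The kernel name "DPUCADF8H" and the worker name "xmodel" below are placeholders, not names documented on this page.

    // Only load the worker if at least one instance of the required kernel exists.
    if (proteus::hasHardware("DPUCADF8H", 1)) {
      auto endpoint = proteus::load("xmodel", nullptr);  // placeholder worker name
      // ... enqueue requests against endpoint ...
    }

    // An empty kernel name matches any kernel, so this checks that any kernel is present.
    const bool anyKernel = proteus::hasHardware("", 1);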