Inference Service Base Classes


AbstractInferenceService Class

The AbstractInferenceService class implements the basic infrastructure required for serving inference requests. It has the following member variables:

| Name | Type | Description | Developer Guidelines |
| --- | --- | --- | --- |
| model | Dictionary | Stores the model for the service | Developers should implement the load_model method to load the model from the file path provided as a parameter |
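Assuming Python, the base class described above might look like the following minimal sketch. The method names and the `model` member come from the tables in this section; the `_fetch_from_versioning_system` helper is hypothetical, standing in for whatever client the real versioning system provides.

```python
from abc import ABC, abstractmethod


class AbstractInferenceService(ABC):
    """Sketch of the inference service base class (illustrative, not the real source)."""

    def __init__(self):
        # Holds the loaded model; populated by load_model.
        self.model = None

    def load(self):
        # Entry-point hook: fetch credentials, retrieve the model files from
        # the versioning system, then load the model from the local file system.
        creds = self.get_credentials()
        path = self._fetch_from_versioning_system(creds)
        self.load_model(path)

    def _fetch_from_versioning_system(self, creds):
        # Hypothetical helper: a real implementation would download the model
        # files using creds["uid"], creds["pwd"] and creds["env"], and return
        # the local path to them.
        raise NotImplementedError

    @abstractmethod
    def get_credentials(self):
        """Return a dict with "uid", "pwd" and "env" keys."""

    @abstractmethod
    def load_model(self, path):
        """Load the model from `path` and assign it to self.model."""

    @abstractmethod
    def predict(self, request):
        """Make a prediction with self.model on the request DataFrame."""

    def transform_input(self, request):
        # Default: pass the request through unchanged; override if needed.
        return request

    def transform_output(self, response):
        # Default: return the response unchanged; override if needed.
        return response
```

A concrete service would subclass this and implement the three abstract methods; `load` then wires them together in the right order.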

The class has the following methods:

| Name | Parameters | Description | Developer Guidelines |
| --- | --- | --- | --- |
| load | None | Loads the model from the model versioning system | This method should be called from the inference service entry point. Developers must implement the get_credentials method to fetch the credentials for the versioning system, and the load_model method to load the model from the file system after the files have been fetched from the model versioning system |
| get_credentials | None | Gets user credentials for the model versioning system | Should return a dictionary with the elements "uid" (user ID), "pwd" (password), and "env" (environment from which to fetch the model) |
| load_model | Path of the model files (string) | Loads the model from the path provided | Should set the model variable to the loaded model |
| transform_input | request (pandas DataFrame) | Transforms the request if required | Should return a pandas DataFrame with the transformed data |
| predict | request (pandas DataFrame) | Makes a prediction using the loaded model | |
| transform_output | response | Transforms the output response if required | |
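The method contract above can be illustrated with a hypothetical concrete service. Everything specific here is invented for illustration: the class name, the column names, the model path, and the toy "model" (a function that doubles a column) stand in for a real loaded artifact, and the hard-coded credentials stand in for a secret store.

```python
import pandas as pd


class ExampleInferenceService:
    """Hypothetical service following the method contract in the table above."""

    def __init__(self):
        self.model = None

    def get_credentials(self):
        # Illustration only: real credentials would come from a secret store.
        return {"uid": "svc-user", "pwd": "secret", "env": "prod"}

    def load_model(self, path):
        # Illustration only: stand in a trivial callable for the loaded model.
        self.model = lambda df: df["x"] * 2

    def transform_input(self, request):
        # Example transformation: rename the incoming column to what the
        # model expects, returning a new DataFrame.
        return request.rename(columns={"feature": "x"})

    def predict(self, request):
        return self.model(request)

    def transform_output(self, response):
        # Example: wrap the raw predictions in a DataFrame with a named column.
        return pd.DataFrame({"prediction": response})


# Request flow: transform the input, predict, then transform the output.
svc = ExampleInferenceService()
svc.load_model("/models/example")  # hypothetical path
raw = pd.DataFrame({"feature": [1, 2, 3]})
out = svc.transform_output(svc.predict(svc.transform_input(raw)))
# out["prediction"] holds [2, 4, 6]
```

Note that `transform_input` and `transform_output` bracket `predict`: the serving entry point calls them in that order so that subclasses can adapt request and response formats without touching the prediction logic.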