Invoke Existing InferenceService

In this section, you will query an existing KFServing InferenceService for predictions using Kale helpers.

What You'll Need

  • An Arrikto EKF or MiniKF deployment with the default Kale Docker image.
  • An InferenceService deployed in your namespace, for example, the one created after completing the Serve Model from Notebook guide.

Procedure

Note

The procedure below serves as an example for the InferenceService you deployed in the Serve Model from Notebook guide. If you want to use your own InferenceService, adjust the code accordingly.

  1. Navigate to the Models UI to retrieve the name of the InferenceService. In our example, this will be kale-serving.

    ../../../_images/mwa.png
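
    Alternatively, if your deployment grants you kubectl access, you can list the InferenceServices in your namespace from a terminal. The command below is a sketch; replace <YOUR_NAMESPACE> with your own namespace:

    kubectl get inferenceservices -n <YOUR_NAMESPACE>
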
  2. Create a new Notebook server using the default Kale Docker image. The image will have the following naming scheme:

    gcr.io/arrikto/jupyter-kale-py36:<IMAGE_TAG>
    

    Note

    The <IMAGE_TAG> varies based on the MiniKF or Arrikto EKF release.

  3. Create a new Jupyter Notebook (that is, an IPYNB file):

    ../../../_images/ipynb1.png
  4. Copy and paste the following import statements into the first code cell, then run it:

    import json
    
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    
    from kale.common.serveutils import KFServer
    

    This is what your notebook cell will look like:

    ../../../_images/kfserverimports.png
  5. In a different code cell, load the data you will use to get predictions. Then, run the cell:

    # load the data
    x, y = make_classification(random_state=42)
    _, x_test_raw, _, _ = train_test_split(x, y, test_size=0.1)
    

    This is what your notebook cell will look like:

    ../../../_images/kfserverload.png
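
    Optionally, you can sanity-check the split before querying the server. With the make_classification defaults (100 samples, 20 features), the 10% test split holds 10 rows:

    # Optional sanity check: expect (10, 20) with the defaults above.
    print(x_test_raw.shape)
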
  6. In a different code cell, initialize a Kale KFServer object using the name of the InferenceService you retrieved during the first step. Then, run it:

    kfserver = KFServer(name="kale-serving")
    

    Note

    When initializing a KFServer, you can also pass the namespace of the InferenceService. If you do not provide one, Kale assumes the namespace of the Notebook server.

    This is what your notebook cell will look like:

    ../../../_images/kfserverinit.png
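
    If the InferenceService lives in a different namespace than the Notebook server, you can pass it explicitly. This is a sketch that assumes the argument is named namespace; the namespace value below is a placeholder:

    # "my-namespace" is a placeholder; point it at the namespace where
    # the InferenceService actually runs.
    kfserver = KFServer(name="kale-serving", namespace="my-namespace")
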
  7. In a different code cell, invoke the server to get predictions. Then, run the cell:

    data = json.dumps({"instances": x_test_raw.tolist()})
    predictions = kfserver.predict(data)
    

    This is what your notebook cell will look like:

    ../../../_images/predict.png
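
    To inspect the result, you can normalize the response into a dictionary. This is a sketch: depending on the Kale version, predict() may return either a raw JSON string or an already parsed response, so the cell below handles both cases:

    # Normalize the response: handle both a raw JSON string and a
    # parsed response; KFServing's v1 protocol wraps the results in
    # a "predictions" key.
    if isinstance(predictions, str):
        predictions = json.loads(predictions)
    print(predictions["predictions"])
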

Summary

You have successfully used Kale's KFServer to invoke a model that is already being served.

What's Next

Check out the rest of the Kale user guides for serving.