Invoke Existing InferenceService

In this section, you will query an existing InferenceService for predictions using Kale’s Endpoint API.

What You’ll Need

  • An Arrikto EKF or MiniKF deployment with the default Kale Docker image.
  • An InferenceService deployed in your namespace, for example, the one created after completing the Serve Scikit Learn Models guide.

Procedure

Note

The procedure below serves as an example for the InferenceService you deployed in the Serve Scikit Learn Models guide. If you want to use your own InferenceService, adjust the code accordingly.

  1. Navigate to the Models UI to retrieve the name of the InferenceService. In our example, this will be sklearn-tutorial.

    ../../../_images/sklearn-endpoint.png
  2. Create a new notebook server using the default Kale Docker image. The image will have the following naming scheme:

    gcr.io/arrikto/jupyter-kale-py38:<IMAGE_TAG>

    Note

    The <IMAGE_TAG> varies based on the MiniKF or Arrikto EKF release.

  3. Create a new Jupyter notebook (that is, an IPYNB file):

    ../../../_images/ipynb2.png
  4. Copy and paste the following import statements into the first code cell, and run it:

    import json
    from sklearn.datasets import fetch_20newsgroups
    from sklearn.model_selection import train_test_split
    from kale.serve import Endpoint

    This is how your notebook cell will look:

    ../../../_images/endpoint-import.png
  5. In a different code cell, fetch the dataset, and split it into train and test sets. Copy and paste the following code, and run it:

    # download dataset
    newsgroups_dataset = fetch_20newsgroups(random_state=42)

    # create the dataset
    x = newsgroups_dataset.data
    y = newsgroups_dataset.target

    # split the dataset
    x_train, x_test, y_train, y_test = train_test_split(
        x, y, test_size=.2, random_state=42)

    This is how your notebook cell will look:

    ../../../_images/endpoint-dataset.png
  6. In a different code cell, initialize a Kale Endpoint object using the name of the InferenceService you retrieved in step 1. Then, run it:

    endpoint = Endpoint(name="sklearn-tutorial")

    Note

    When initializing an Endpoint, you can also pass the namespace of the InferenceService. If you do not provide one, Kale assumes the namespace of the notebook server.

    This is how your notebook cell will look:

    ../../../_images/sklearn-endpoint-define.png
  7. Prepare the data payload for the prediction request. Copy and paste the following code in a new cell, and run it:

    # convert the test sample into JSON format
    index_test = 2
    data = {"instances": [x_test[index_test]]}

    This is how your notebook cell will look:

    ../../../_images/endpoint-example.png
  8. Invoke the server to get predictions. Copy and paste the following snippet in a different code cell, and run it:

    # get and print the prediction
    res = endpoint.predict(json.dumps(data))
    print(f"The prediction is {res['predictions']}")

    This is how your notebook cell will look:

    ../../../_images/sklearn-pred.png
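The request and response bodies in steps 7 and 8 follow the `{"instances": [...]}` / `{"predictions": [...]}` JSON shape of the V1 inference protocol. The following is a minimal sketch of that payload round-trip, independent of any cluster; the sample text and the class index in it are hypothetical placeholders, not real model output:

```python
import json

# Build a request body in the "instances" format used in step 7.
# sample_text is a hypothetical stand-in for a document from x_test.
sample_text = "NASA launched a new satellite into orbit today."
request_body = json.dumps({"instances": [sample_text]})

# A server response in this protocol carries a "predictions" list.
# The class index below is a hypothetical value for illustration.
response_body = '{"predictions": [11]}'
predictions = json.loads(response_body)["predictions"]
print(f"The prediction is {predictions}")
```

Inspecting the serialized `request_body` before calling `endpoint.predict` is a quick way to confirm your payload matches what the server expects.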

Summary

You have successfully used Kale’s Endpoint to invoke an already served model.

What’s Next

Check out how you can serve a model from a notebook.