Build Custom Transformer Image

This section will guide you through building a custom transformer image.

Procedure

  1. In your local environment, create a new folder to place the image:

    user@local:~$ mkdir custom_images && cd custom_images
  2. Inside the custom_images folder, create another folder, and name it custom_transformer:

    user@local:~$ mkdir custom_transformer && cd custom_transformer
  3. Inside the custom_transformer folder, create two files named transformer.py and requirements.txt:

    user@local:~$ touch transformer.py requirements.txt
  4. Copy and paste the custom transformer’s code inside transformer.py:

    custom_transformer_image.py
    # Copyright © 2022 Arrikto Inc. All Rights Reserved.

    """This script defines a custom transformer."""

    import io
    import base64
    import argparse

    from typing import Dict

    import kserve

    from PIL import Image
    from torchvision import transforms


    DEFAULT_MODEL_NAME = "custom-model"
    DEFAULT_PROTOCOL_VERSION = "v1"


    class Transformer(kserve.Model):
        """Define custom transformer."""

        def __init__(self, model_name: str, predictor_host: str, protocol: str):
            super().__init__(model_name)
            self.predictor_host = predictor_host
            self.protocol = protocol

        def preprocess(self, request: Dict):
            inputs = request["instances"]
            data = inputs[0]["image"]["b64"]

            raw_img_data = base64.b64decode(data)
            input_image = Image.open(io.BytesIO(raw_img_data))

            preprocess = transforms.Compose([
                transforms.Resize(256),
                transforms.CenterCrop(224),
                transforms.ToTensor(),
                transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                     std=[0.229, 0.224, 0.225]),
            ])

            input_tensor = preprocess(input_image)
            return {"instances": [input_tensor.numpy().tolist()]}

        def postprocess(self, request: Dict) -> Dict:
            return request


    if __name__ == "__main__":
        parser = argparse.ArgumentParser(parents=[kserve.model_server.parser])
        parser.add_argument('--model_name', default=DEFAULT_MODEL_NAME,
                            help='The name that the model is served under.')
        parser.add_argument('--predictor_host',
                            help='The URL for the model predict function',
                            required=True)
        parser.add_argument('--protocol', default=DEFAULT_PROTOCOL_VERSION,
                            help='The serving protocol for the predictor')
        args, _ = parser.parse_known_args()

        server = kserve.ModelServer()
        model = Transformer(model_name=args.model_name,
                            predictor_host=args.predictor_host,
                            protocol=args.protocol)
        server.start(models=[model])

    Note

    The above transformer code creates mini-batches of 3-channel RGB images of shape (3 x 224 x 224). The images are loaded into a range of [0, 1], and then normalized using mean = [0.485, 0.456, 0.406] and std = [0.229, 0.224, 0.225]. You can read more about the image preprocessing and the AlexNet model here.
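For reference, a client would send this transformer a KServe v1 request whose instances carry a base64-encoded image. The sketch below shows one way to build such a payload; it is a minimal illustration using dummy bytes in place of a real image file, and the helper name build_payload is ours, not part of the procedure.

```python
# Build a KServe v1 request payload with a base64-encoded image,
# matching the {"instances": [{"image": {"b64": ...}}]} shape that
# the transformer's preprocess() method reads.
import base64
import json


def build_payload(image_bytes: bytes) -> str:
    """Return the JSON body for a prediction request."""
    b64_image = base64.b64encode(image_bytes).decode("utf-8")
    return json.dumps({"instances": [{"image": {"b64": b64_image}}]})


# Dummy bytes stand in for the contents of a real JPEG file here.
image_bytes = b"\xff\xd8\xff\xe0 fake jpeg bytes"
payload = build_payload(image_bytes)

# The transformer decodes the payload back to the original bytes:
request = json.loads(payload)
decoded = base64.b64decode(request["instances"][0]["image"]["b64"])
```

In a real request, image_bytes would be read from an image file and the JSON body POSTed to the model's predict endpoint.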

  5. Copy and paste the custom transformer’s dependencies inside requirements.txt:

    custom_transformer_requirements.txt
    kserve==0.8.0
    torchvision
    ray<1.7.0
    protobuf==3.19.3
    tornado==6.1
  6. Inside the custom_images folder, create the following Dockerfile for building the image:

    user@local:~$ cd .. && touch transformer.Dockerfile
  7. Copy and paste the following code inside transformer.Dockerfile:

    custom_transformer_image.Dockerfile
    FROM python:3.7-slim

    COPY custom_transformer custom_transformer

    WORKDIR custom_transformer
    RUN pip install --upgrade pip && pip install -r requirements.txt

    ENTRYPOINT ["python", "transformer.py"]
  8. Build the custom transformer image:

    user@local:~$ docker build -t custom-transformer-image:latest -f transformer.Dockerfile .
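To sanity-check the image before pushing it to a registry, you could run it locally. This is a sketch, not part of the procedure: the predictor host my-predictor.example.com is a placeholder, and the port mapping assumes the server's default of 8080.

```shell
# Run the transformer container locally, pointing it at a (placeholder)
# predictor host; the remaining arguments are forwarded to transformer.py.
docker run --rm -p 8080:8080 custom-transformer-image:latest \
    --model_name custom-model \
    --predictor_host my-predictor.example.com
```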

Summary

You have successfully built a custom transformer server image.

What’s Next

Check out how you can serve a custom model.