got unexpected datatype None from numpy array, expected BYTES" from mlserver infer #1210

Closed
RafalSkolasinski opened this issue Jun 12, 2023 · 0 comments · Fixed by #1213

RafalSkolasinski (Contributor) commented on Jun 12, 2023

The mlserver infer ... subcommand does not support BYTES inputs - this can be reproduced with the mock V2 echo model below.

I used a simple Echo Model for testing:

from mlserver import MLModel, types


class EchoModel(MLModel):
    async def load(self) -> bool:
        print("Echo Model Initialized")
        return await super().load()

    async def predict(self, payload: types.InferenceRequest) -> types.InferenceResponse:
        return types.InferenceResponse(
            id=payload.id,
            model_name=self.name,
            model_version=self.version,
            outputs=[
                types.ResponseOutput(
                    name=input.name,
                    shape=input.shape,
                    datatype=input.datatype,
                    data=input.data,
                    parameters=input.parameters,
                )
                for input in payload.inputs
            ],
        )

with settings.json

{
    "debug": "true"
}

and model-settings.json

{
    "name": "echo-model",
    "implementation": "model.EchoModel"
}

The model served locally with mlserver start properly processes a request of the form:

{
  "inputs": [
    {
      "name": "fp32-input",
      "shape": [1, 1],
      "datatype": "FP32",
      "data":  [ 1.23 ]
    },
    {
      "name": "int32-input",
      "shape": [1, 1],
      "datatype": "INT32",
      "data":  [ 1.23 ]
    },
    {
      "name": "bytes-input",
      "shape": [1, 1],
      "datatype": "BYTES",
      "data":  [ "some bytes" ]
    }
  ],
  "outputs": [
    {
      "name": "predict"
    }
  ]
}
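
For reference, a quick way to exercise that request against the locally running server is with the requests library. This is only a sketch: it assumes the default HTTP port 8080, the V2 inference endpoint for the echo-model, and that the payload above is saved as request.json.

import json

import requests

with open("request.json") as f:
    payload = json.load(f)

# POST the V2 inference request to the locally served echo model.
response = requests.post(
    "http://localhost:8080/v2/models/echo-model/infer",
    json=payload,
)
print(response.status_code)
print(json.dumps(response.json(), indent=2))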

However, when processing the same request as an input batch, with input.txt containing

{"inputs":[{"name":"fp32-input","shape":[1,1],"datatype":"FP32","data":[1.23]},{"name":"int32-input","shape":[1,1],"datatype":"INT32","data":[1.23]},{"name":"bytes-input","shape":[1,1],"datatype":"BYTES","data":["some bytes"]}],"outputs":[{"name":"predict"}]}
{"inputs":[{"name":"fp32-input","shape":[1,1],"datatype":"FP32","data":[1.23]},{"name":"int32-input","shape":[1,1],"datatype":"INT32","data":[1.23]},{"name":"bytes-input","shape":[1,1],"datatype":"BYTES","data":["some bytes"]}],"outputs":[{"name":"predict"}]}
{"inputs":[{"name":"fp32-input","shape":[1,1],"datatype":"FP32","data":[1.23]},{"name":"int32-input","shape":[1,1],"datatype":"INT32","data":[1.23]},{"name":"bytes-input","shape":[1,1],"datatype":"BYTES","data":["some bytes"]}],"outputs":[{"name":"predict"}]}
{"inputs":[{"name":"fp32-input","shape":[1,1],"datatype":"FP32","data":[1.23]},{"name":"int32-input","shape":[1,1],"datatype":"INT32","data":[1.23]},{"name":"bytes-input","shape":[1,1],"datatype":"BYTES","data":["some bytes"]}],"outputs":[{"name":"predict"}]}
{"inputs":[{"name":"fp32-input","shape":[1,1],"datatype":"FP32","data":[1.23]},{"name":"int32-input","shape":[1,1],"datatype":"INT32","data":[1.23]},{"name":"bytes-input","shape":[1,1],"datatype":"BYTES","data":["some bytes"]}],"outputs":[{"name":"predict"}]}

and running it via

mlserver infer -u localhost:8080 -m echo-model -i input.txt -o output.txt --workers 10 -v

I am getting

2023-06-12 20:52:04,787 [mlserver] ERROR - consumer 0: failed to preprocess items: got unexpected datatype None from numpy array, expected BYTES
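
My guess at the failure mode, illustrated with plain numpy below (this is only a sketch of the symptom, not mlserver's actual batch-processing code): string data round-tripped through a numpy array ends up with a unicode dtype, and a numeric-only dtype-to-datatype mapping would resolve that to None instead of BYTES.

import numpy as np

# Hypothetical dtype-kind -> V2 datatype lookup that only covers numeric kinds.
KIND_TO_DATATYPE = {"f": "FP32", "i": "INT32", "u": "UINT32", "b": "BOOL"}

arr = np.array(["some bytes"])            # dtype is '<U10', kind 'U'
datatype = KIND_TO_DATATYPE.get(arr.dtype.kind)

print(arr.dtype, datatype)                # <U10 None
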
RafalSkolasinski changed the title from "mlserver infer subcommand does not support BYTES inputs" to the current title on Jun 12, 2023