
[BUG] Unsupported type_id conversion to cudf #813

Closed
oyilmaz-nvidia opened this issue May 15, 2021 · 1 comment · Fixed by #839
Assignees
Labels
bug Something isn't working Inference P0

Comments

@oyilmaz-nvidia
Contributor

When a cuDF DataFrame with string column(s) is converted with the convert_df_to_triton_input() function and sent to the inference server, the following error is raised:

---------------------------------------------------------------------------
InferenceServerException                  Traceback (most recent call last)
<ipython-input-3-e0e01a3ad320> in <module>
      6 # make the request
      7 with grpcclient.InferenceServerClient("localhost:8001") as client:
----> 8     response = client.infer("rossmann", inputs, request_id="1",outputs=outputs)
      9 
     10 end = timer()

/conda/envs/merlin/lib/python3.8/site-packages/tritonclient/grpc/__init__.py in infer(self, model_name, inputs, model_version, outputs, request_id, sequence_id, sequence_start, sequence_end, priority, timeout, client_timeout, headers)
   1059             return result
   1060         except grpc.RpcError as rpc_error:
-> 1061             raise_error_grpc(rpc_error)
   1062 
   1063     def async_infer(self,

/conda/envs/merlin/lib/python3.8/site-packages/tritonclient/grpc/__init__.py in raise_error_grpc(rpc_error)
     59 
     60 def raise_error_grpc(rpc_error):
---> 61     raise get_error_grpc(rpc_error) from None
     62 
     63 

InferenceServerException: [StatusCode.INTERNAL] in ensemble 'rossmann', GRPC Execute Failed, message: Traceback (most recent call last):
  File "/opt/tritonserver/backends/python/startup.py", line 275, in Execute
    responses = self.model_instance.execute(inference_requests)
  File "/model/rossmann/rossmann_nvt/1/model.py", line 89, in execute
    input_df_2[name] = _convert_tensor(get_input_tensor_by_name(request, name))
  File "/conda/envs/merlin/lib/python3.8/contextlib.py", line 75, in inner
    return func(*args, **kwds)
  File "/conda/envs/merlin/lib/python3.8/site-packages/cudf/core/dataframe.py", line 777, in __setitem__
    self.insert(len(self._data), arg, value)
  File "/conda/envs/merlin/lib/python3.8/contextlib.py", line 75, in inner
    return func(*args, **kwds)
  File "/conda/envs/merlin/lib/python3.8/site-packages/cudf/core/dataframe.py", line 3162, in insert
    value = column.as_column(value)
  File "/conda/envs/merlin/lib/python3.8/site-packages/cudf/core/column/column.py", line 1890, in as_column
    data = as_column(
  File "/conda/envs/merlin/lib/python3.8/site-packages/cudf/core/column/column.py", line 1749, in as_column
    col = ColumnBase.from_arrow(arbitrary)
  File "/conda/envs/merlin/lib/python3.8/site-packages/cudf/core/column/column.py", line 410, in from_arrow
    return libcudf.interop.from_arrow(data, data.column_names)._data[
  File "cudf/_lib/interop.pyx", line 167, in cudf._lib.interop.from_arrow
RuntimeError: cuDF failure at: /opt/conda/envs/rapids/conda-bld/libcudf_1615843445425/work/cpp/src/interop/from_arrow.cpp:80: Unsupported type_id conversion to cudf

After checking model.py, I noticed that string columns are not recognized by the _convert_tensor(t) function: out.dtype.kind returns 'O'. This was working before, and I am guessing something might have changed in Triton's grpcclient.InferInput() function. Or I am missing something :)
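A minimal NumPy illustration (not from the issue itself) of the two dtype kinds involved: fixed-width byte strings have kind "S", which the original check in model.py matches, while Python-object arrays have kind "O", which it misses:

```python
import numpy as np

# Fixed-width byte strings: dtype kind is "S" (e.g. "|S5"),
# which the original check in model.py matches.
fixed = np.array([b"alpha", b"beta"])
print(fixed.dtype.kind, fixed.dtype.str)  # S |S5

# Variable-length strings stored as Python objects: dtype kind is "O",
# which the original check misses.
objs = np.array(["alpha", "beta"], dtype=object)
print(objs.dtype.kind)  # O
```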

@oyilmaz-nvidia oyilmaz-nvidia added the bug Something isn't working label May 15, 2021
@oyilmaz-nvidia oyilmaz-nvidia self-assigned this May 15, 2021

oyilmaz-nvidia commented May 15, 2021

After changing this line in model.py:

if out.dtype.kind == "S" and out.dtype.str.startswith("|S"):
    out = out.astype("str")

to

if out.dtype.kind == "O":
    out = out.astype("str")

it worked. I am not sure we should assume all the object dtypes here are strings, though.
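One hedged way to address that concern (a hypothetical sketch, not the fix that was merged): only cast an object array when its elements are actually str/bytes, and otherwise leave it untouched. The helper name maybe_cast_strings is made up for illustration:

```python
import numpy as np

def maybe_cast_strings(out: np.ndarray) -> np.ndarray:
    """Defensive variant of the check discussed above: cast object
    arrays to str only when every element really is str or bytes."""
    if out.dtype.kind == "O" and all(
        isinstance(v, (str, bytes)) for v in out.ravel()
    ):
        return out.astype("str")
    return out

strings = np.array(["a", "b"], dtype=object)
mixed = np.array(["a", 1], dtype=object)  # not all strings, left alone
print(maybe_cast_strings(strings).dtype.kind)  # U (unicode string)
print(maybe_cast_strings(mixed).dtype.kind)    # O (unchanged)
```

This keeps the kind == "O" fix for genuine string columns while avoiding a blind cast of arbitrary object data.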

@benfred benfred self-assigned this May 17, 2021
@benfred benfred added the P0 label May 20, 2021
@benfred benfred linked a pull request May 21, 2021 that will close this issue
@viswa-nvidia viswa-nvidia added this to the NVTabular v0.6 milestone Jun 2, 2021