
NonZero inaccuracies #111

Open
attila-dusnoki-htec opened this issue Sep 11, 2023 · 3 comments

@attila-dusnoki-htec
Failing tests:

  • test_nonzero_example_cpu
@attila-dusnoki-htec

======================================================================
FAIL: test_nonzero_example_cpu (__main__.OnnxBackendNodeModelTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/onnx/backend/test/runner/__init__.py", line 290, in device_test_func
    return test_func(*args, device=device, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/onnx/backend/test/runner/__init__.py", line 467, in run
    self.assert_similar_outputs(
  File "../test/py/onnx_backend_test.py", line 59, in assert_similar_outputs
    np.testing.assert_allclose(ref_outputs[i],
  File "/usr/local/lib/python3.8/dist-packages/numpy/testing/_private/utils.py", line 1530, in assert_allclose
    assert_array_compare(compare, actual, desired, err_msg=str(err_msg),
  File "/usr/local/lib/python3.8/dist-packages/numpy/testing/_private/utils.py", line 763, in assert_array_compare
    raise AssertionError(msg)
AssertionError: 
Not equal to tolerance rtol=0.001, atol=1e-05

Program =
module: "main"
condition = @param:condition -> bool_type, {2, 2}, {2, 1}, target_id=0
@1 = nonzero(condition) -> int64_type, {2, 4}, {4, 1}, target_id=0
@2 = @return(@1), target_id=0


Compiled program =
module: "main"
@0 = check_context::migraphx::gpu::context  -> float_type, {}, {}, target_id=0
@1 = hip::hip_allocate_memory[shape=int8_type, {96}, {1},id=main:scratch] -> int8_type, {96}, {1}, target_id=0
condition = @param:condition -> bool_type, {2, 2}, {2, 1}, target_id=0
@3 = load[offset=64,end=68](@1) -> bool_type, {2, 2}, {2, 1}, target_id=0
@4 = hip::copy_to_gpu(condition,@3) -> bool_type, {2, 2}, {2, 1}, target_id=0
@5 = load[offset=0,end=64](@1) -> int64_type, {2, 4}, {4, 1}, target_id=0
@6 = gpu::nonzero(@4,@5) -> int64_type, {2, 4}, {4, 1}, target_id=0
@7 = hip::copy_from_gpu(@6) -> int64_type, {2, 4}, {4, 1}, target_id=0
@8 = hip::sync_stream(@7) -> int64_type, {2, 4}, {4, 1}, target_id=0
@9 = @return(@8), target_id=0


(shapes (2, 3), (2, 4) mismatch)
 x: array([[0, 1, 1],
       [0, 0, 1]])
 y: array([[0, 1, 1, 0],
       [0, 0, 1, 0]], dtype=int64)

@gyulaz-htec

This is caused by NonZero's output not having a dynamic shape: the number of nonzero elements is only known at runtime, so the backend allocates the maximum-size output and pads it, which no longer matches the reference output.
This is a known issue in AMDMIGraphX: ROCm#948
Another related issue: ROCm#1886
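
The shape mismatch in the failure above can be reproduced with plain NumPy. This is a minimal sketch: the `condition` input is reconstructed from the expected output shown in the assertion (nonzero elements at (0, 0), (1, 0) and (1, 1)), not taken from the test data itself.

```python
import numpy as np

# Hypothetical input, reconstructed from the reference output in the failure.
condition = np.array([[True, False],
                      [True, True]])

# ONNX NonZero returns indices with shape (rank, num_nonzero).
# NumPy's nonzero gives the same data as a tuple of per-axis index arrays.
expected = np.stack(np.nonzero(condition)).astype(np.int64)
print(expected.shape)  # (2, 3) -- depends on how many elements are nonzero
print(expected)        # [[0 1 1]
                       #  [0 0 1]]

# A backend without dynamic shape support must instead allocate the worst
# case, (rank, total_elements) = (2, 4), and pad the unused columns with
# zeros -- producing the (2, 4) array seen in the failing comparison.
```

The padded column `(0, 0)` duplicates a real index, so the results cannot be reconciled by trimming alone without knowing the true nonzero count.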

@gyulaz-htec commented Oct 9, 2023

Moving this to blocked because of the missing dynamic shape support.

Status: 🚧 Blocked