Coerce bool to int in unpack #287

Merged 1 commit on Oct 6, 2023

Conversation

drubinstein
Contributor

@drubinstein drubinstein commented Oct 5, 2023

While trying to use `torch.compile` with `backend=inductor` for a model that uses `unpack` as

```
unpack(x, ps, "b d *")[0]
```

I received the error

```
torch._dynamo.exc.TorchRuntimeError: Failed running call_function <function unpack at 0x7f5c30806200>(*(FakeTensor(..., device='cuda:0', size=(1, 128, (s1//2))), [((s1//2),)], 'b d *'), **{}):
unsupported operand type(s) for +: 'int' and 'SymBool'
```

tracing back to the line changed in this PR. Coercing the boolean to an int, so `sum` does not try to sum a boolean and an int, fixed my issue locally.

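The idea behind the fix can be sketched in plain Python (a hypothetical helper, not einops' actual code): coerce each element to `int` before summing, so `sum` never has to add an `int` to a boolean (or, under tracing, a `SymBool`).

```python
# Hypothetical sketch of the fix: coerce booleans to ints before summing.
# In plain Python this changes nothing semantically, since bool already
# behaves as 0/1 in arithmetic; under torch.compile tracing it avoids
# the unsupported `int + SymBool` operation reported above.
def coerce_sum(values):
    return sum(int(v) for v in values)

# Booleans count as 1 and 0 after coercion, same as before:
assert coerce_sum([2, True, 3]) == 6
assert coerce_sum([2, False, 3]) == 5
```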
@arogozhnikov
Owner

Did you report this to the torchdynamo team? Using booleans in arithmetic is perfectly legal in Python.
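The point above can be checked directly in plain Python (illustrative only, unrelated to einops internals): `bool` is a subclass of `int`, so booleans participate in arithmetic as 1 and 0.

```python
# bool is a subclass of int in Python, so boolean arithmetic is legal:
assert issubclass(bool, int)
assert True + 1 == 2            # True acts as 1
assert sum([1, True, False]) == 2  # sum() mixes ints and bools fine
```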

@arogozhnikov arogozhnikov merged commit e3082f2 into arogozhnikov:master Oct 6, 2023
2 of 4 checks passed
@arogozhnikov
Owner

Reported to torch as well:
pytorch/pytorch#110738

@drubinstein
Contributor Author

drubinstein commented Oct 6, 2023

Thanks for reporting it to pytorch. The issue came up specifically when using `torch._dynamo.explain` while trying to find the reason for a graph break; I should have been more specific.

I will add that information to the issue you reported. Thank you for merging the PR.

@drubinstein drubinstein deleted the patch-1 branch October 6, 2023 18:37