Update CI python #118
Merged
2 commits merged into kmkurn:master from leejuyuu:update-ci-python on Jun 6, 2024

Conversation

@leejuyuu (Contributor) commented on Jun 1, 2024

Update the Python version in CI to fix the CI failure. Also fixes the test breakage due to package updates.

Fixes #117.

@kmkurn (Owner) left a comment

Thank you for the PR! It looks good overall. I've left some comments and a request before I can merge it.

.github/workflows/run_tests.yml (outdated; resolved)
requirements-test.txt (resolved)
if HAS_ASSERT_CLOSE:
torch.testing.assert_close(llh, manual_llh, atol=1e-12, rtol=1e-6)
else:
torch.testing.assert_allclose(llh, manual_llh, atol=1e-12, rtol=1e-6)
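For context, HAS_ASSERT_CLOSE is presumably a module-level guard along these lines (a guess based on the discussion, not necessarily the PR's exact code):

import torch

# True when this torch build provides the newer assert_close API (torch >= 1.9).
HAS_ASSERT_CLOSE = hasattr(torch.testing, "assert_close")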
@kmkurn (Owner) commented:
Can you explain what assert_close and assert_allclose do that approx doesn't?

@leejuyuu (Contributor, Author) commented:
The problem is not with approx itself, but that it requires converting the torch tensor to a Python scalar, likely through numpy. I started to see `RuntimeError: Can't call numpy() on Tensor that requires grad. Use tensor.detach().numpy() instead.` when using torch 1.4. Instead of adding .detach().numpy().item() all over the place, it feels simpler and more readable to use the dedicated testing utilities provided by torch itself.
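
To make the comparison concrete, here is a minimal sketch of the two styles under discussion (illustrative only; the tensor values and names are made up and differ from the PR's actual tests):

import pytest
import torch

# A value that still tracks gradients, like the log-likelihoods in the tests.
llh = torch.tensor(0.5, requires_grad=True) * 2
manual_llh = torch.tensor(1.0)

# With pytest.approx, the tensor must first be detached and converted to a Python scalar:
assert llh.detach().numpy().item() == pytest.approx(manual_llh.item())

# torch.testing.assert_close compares the tensors directly, no conversion needed:
torch.testing.assert_close(llh, manual_llh, atol=1e-12, rtol=1e-6)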

@leejuyuu (Contributor, Author) commented:
I'll change the commit message to make it clearer. The error mentioned in the commit message seemed to happen only in my local environment.

@kmkurn (Owner) commented:
I see. I agree that adding .detach().numpy().item() all over the place is less readable. However, I believe Pytest's approx has a nicely formatted error message when the test fails, which is also important in terms of readability. Does PyTorch's assert_close have it too? Can you please compare them and share it here?

@leejuyuu (Contributor, Author) commented:
Yes. The following code

import torch
from pytest import approx


def test_approx():
    a = 1/3
    b = 0.3
    assert a == approx(b)


def test_torch_assert_close():
    a = torch.tensor(1/3)
    b = torch.tensor(0.3)
    torch.testing.assert_close(b, a)

gives the failure output shown in the attached screenshot (image not reproduced here).

@kmkurn (Owner) commented on Jun 5, 2024:
Thanks! The PyTorch one is readable too, which is good!

Now, to improve readability further, can you please move the calling logic to a separate function? Something like:

import torch
from packaging.version import Version  # or any equivalent version-comparison helper

def assert_close(value, /, expected):
    fn_name = "assert_close" if Version(torch.__version__) >= Version("1.9.0") else "assert_allclose"
    assert_fn = getattr(torch.testing, fn_name)
    return assert_fn(value, expected, atol=1e-12, rtol=1e-6)

Then you can use assert_close in all the tests to replace approx.

Also, if approx is no longer used anywhere, please delete its import statement (line 5).

@leejuyuu (Contributor, Author) commented:
The requested changes have been added. I did not use string literals for the function names because they make editor hovering and completion impossible.
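
For illustration, a minimal sketch of the variant described above, selecting the function object directly instead of looking it up by name (the code actually merged in the PR may differ in its details):

import torch
from packaging.version import Version  # assumed version-comparison helper

# Pick the function object once, so editors can resolve and auto-complete it.
if Version(torch.__version__) >= Version("1.9.0"):
    _torch_assert_close = torch.testing.assert_close
else:
    _torch_assert_close = torch.testing.assert_allclose


def assert_close(value, expected):
    # Thin wrapper applying the tolerances used throughout the test suite.
    return _torch_assert_close(value, expected, atol=1e-12, rtol=1e-6)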

@coveralls commented on Jun 3, 2024

Coverage Status: 100.0% (+100.0%) from 0.0% when pulling 07c06d3 on leejuyuu:update-ci-python into c885fba on kmkurn:master.

@kmkurn (Owner) left a comment
Thanks for your clarifications. I've left further requests to be addressed.

.github/workflows/run_tests.yml (outdated; resolved)

Commit message:
Because Python 3.6 has reached EOL and was dropped from the
ubuntu-latest image (22.04), the CI fails when setting up the
environment.
See kmkurn#117

This updates the Python version in the CI matrix to use the oldest
supported minor Python version.
The torch version is also updated to match the available build on PyPI.
Testing on the oldest torch 2.x is also added.
@kmkurn (Owner) left a comment
Thanks for responding to my requests. Please see my further comments below. Just a bit more to change before I can merge this!

tests/test_crf.py (outdated; resolved)
Commit message:
The tests output `RuntimeError: Can't call numpy() on Tensor that
requires grad. Use tensor.detach().numpy() instead.` for torch above
v1.4. The error seems to come from trying to convert a tensor to a scalar.

Replace pytest's approx() with `torch.testing.assert_close()`
to let torch handle the conversion.
Use `assert_allclose()` for torch versions before 1.9.0.
@kmkurn (Owner) left a comment
Thanks for addressing the requests. All good! I can merge this PR now.

@kmkurn merged commit cc449eb into kmkurn:master on Jun 6, 2024
3 checks passed
@leejuyuu deleted the update-ci-python branch on June 7, 2024 15:59
Successfully merging this pull request may close these issues:
Github workflow fails for python 3.6 (#117)