timeout: fix subprocess is never terminated #4315

Merged · 2 commits · Mar 15, 2023

Conversation

@miles170 (Contributor) commented Feb 1, 2023

Closes #4176.

This should also fix #4073.

Following GNU timeout's behavior, the signal is now sent to the process group so that subprocesses are cleaned up as well, and SIGCONT is sent to the child and to the whole process group afterwards.
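
For reference, a minimal sketch of that approach using the `nix` crate, assuming the child has been made its own process-group leader (e.g. via `setpgid(0, 0)` before `exec`); this is illustrative and not necessarily the exact code in the PR:

```rust
use nix::sys::signal::{kill, killpg, Signal};
use nix::unistd::Pid;

// Hypothetical helper: signal the child's whole process group, then send
// SIGCONT so stopped processes can still handle the signal and exit,
// mirroring GNU timeout's behavior.
fn send_signal_to_child_group(child_pid: i32, signal: Signal) -> nix::Result<()> {
    // The child is assumed to be its own process-group leader,
    // so its pid doubles as the process-group id.
    let child = Pid::from_raw(child_pid);
    killpg(child, signal)?;        // terminate the subprocesses, not just the child
    kill(child, Signal::SIGCONT)?; // wake the child if it was stopped
    killpg(child, Signal::SIGCONT) // wake the rest of the group
}
```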

@sylvestre (Sponsor, Contributor) commented:

would it be possible to add a test for this?
Thanks

@miles170 miles170 marked this pull request as draft February 3, 2023 01:03
@miles170 miles170 force-pushed the issue-4176-fix-timeout branch 2 times, most recently from f490aaf to 7f03e29 on February 3, 2023 03:35
@miles170 miles170 marked this pull request as ready for review February 3, 2023 03:35
@miles170 (Contributor, Author) commented Feb 3, 2023

> would it be possible to add a test for this? Thanks

Added a test_kill_subprocess test.
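
For context, a test along these lines can be sketched roughly as below, using helpers from the uutils test framework (`new_ucmd!` etc.); the exact arguments and timeout value in the PR may differ. The idea is that `sh` only runs its `trap` handler after the foreground `sleep` terminates, so `inside_trap` appearing on stdout shows the signal reached the subprocess:

```rust
// Rough sketch for tests/by-util/test_timeout.rs (details may differ from the PR).
#[test]
fn test_kill_subprocess() {
    new_ucmd!()
        .args(&[
            "2",
            "sh",
            "-c",
            // `sh` waits on `sleep`; the trap can only run (and print)
            // once the signal also reaches `sleep` in the process group.
            "trap 'echo inside_trap' TERM; sleep 30",
        ])
        .fails()
        .code_is(124) // timeout's exit status when the command times out
        .stdout_contains("inside_trap");
}
```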

@miles170 miles170 marked this pull request as draft February 3, 2023 04:00
@miles170 miles170 force-pushed the issue-4176-fix-timeout branch 2 times, most recently from 49fc1ac to edd81ed on February 3, 2023 05:50
@miles170 miles170 marked this pull request as ready for review February 3, 2023 05:57
@miles170 miles170 force-pushed the issue-4176-fix-timeout branch 2 times, most recently from 2ac1cb7 to 5dadb70 on February 3, 2023 06:30
@miles170 (Contributor, Author) commented Feb 8, 2023

I'm not sure why the CI is failing intermittently.
I have increased the test's timeout from 2 seconds to 5 seconds. (Everything works fine on my local Ubuntu 22.10.)

github-actions bot commented Feb 8, 2023

GNU testsuite comparison:

Congrats! The gnu test tests/tail-2/inotify-dir-recreate is no longer failing!

@miles170 (Contributor, Author) commented Feb 8, 2023

I have increased the timeout from 5 seconds to 10 seconds, and the failing tests are not related to this PR.

github-actions bot commented Mar 5, 2023

GNU testsuite comparison:

Congrats! The gnu test tests/tail-2/inotify-dir-recreate is no longer failing!

@tertsdiepraam (Member) left a comment

I reran the CI because of some unrelated linting issues that should be fixed by now, but this PR looks good.

@miles170 (Contributor, Author) commented Mar 15, 2023

I checked the GNU test logs and found that this should also fix #4073, which was caused by split's filter subprocess not being killed by timeout.

@tertsdiepraam (Member) commented:

Indeed, this reduces the log size from 86 MB to just 2 MB. That's great! Let's hope the CI becomes a bit more stable as a result.
