
Recover non-running pretend tests in tests/test_proxy_functional.py #6031

Closed · 1 task done
webknjaz opened this issue Oct 2, 2021 · 5 comments

Labels: bug, infra, meta, regression (something that used to work stopped working "as before" after upgrade)

webknjaz (Member) commented Oct 2, 2021

Describe the bug

While working on #6002 and #5992, I noticed several functions that are prefixed with `xtest_` and marked with `@pytest.mark.xfail`. Clearly @asvetlov wanted them to run without influencing the test session outcome. But by default, pytest only collects functions whose names start with `test_`, so these were never picked up or executed.
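
For illustration, here is a minimal, self-contained sketch of the collection rule at play (the function names are hypothetical, not taken from the aiohttp suite). Under pytest's default `python_functions = test_*` setting, only the first function below is collected:

```python
import pytest


def test_collected() -> None:
    # Matches the default ``test_*`` naming pattern, so pytest collects it.
    assert True


@pytest.mark.xfail(reason="demonstration only")
def xtest_invisible() -> None:
    # Does NOT match ``test_*``: pytest never collects this function,
    # so the xfail mark is moot -- this body never executes.
    raise AssertionError("never runs")
```

Running `pytest --collect-only` against such a file lists only `test_collected`, which is exactly the symptom described above.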

We need to drop that `x` and let them run. We also need to audit them to see whether they are still relevant to today's codebase. It sounds like they were originally added under these names back in 2017, and some of them were added earlier that year prefixed with `_test_` (which wouldn't work either). The `@pytest.mark.xfail` decorator was added later (also the same year), but the names were never corrected, so it had no effect. Further archeology revealed that it was @fafhrd91 who changed `test_` to `_test_` (in Feb 2017); before that, the tests had proper names and a regular xfail mark. The original tests (those that existed at the time) were added in 2016 by @kserhii.

There's no record of why those tests were disabled and forgotten, so I'm posting these findings here for posterity.

To Reproduce

Run `pytest -q --no-cov tests/test_proxy_functional.py --collect-only` and see that none of the functions that start with `xtest_` are listed there.

Expected behavior

The functions in question are also marked with `@pytest.mark.xfail`, meaning humans perceive them as tests, but because of a small naming oversight pytest never picks them up or runs them. The expectation is that, once the `x` prefix is dropped, they are collected and run like any other test.
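
As a hypothetical sketch of the intended end state (the name below is invented for illustration), a renamed test keeps its mark and participates in the run without breaking the session:

```python
import pytest


@pytest.mark.xfail(reason="known gap in proxy support")
def test_proxy_behavior() -> None:
    # Now collected: a failure here is reported as XFAIL and does not
    # fail the session; an unexpected pass is reported as XPASS.
    raise AssertionError("known-broken path")
```

Passing `strict=True` to the mark would instead turn an unexpected pass into a hard failure, which is useful once a real fix lands.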

Logs/tracebacks

N/A

Python Version

N/A

aiohttp Version

master branch

multidict Version

N/A

yarl Version

N/A

OS

N/A

Related component

Client

Additional context

No response

Code of Conduct

  • I agree to follow the aio-libs Code of Conduct

kserhii (Contributor) commented Oct 7, 2021

Here is what I know about those tests.

Back in 2016, there was an issue #1413 in aiohttp related to proxy usage, which I fixed.
In addition, there was a request #1218 from @asvetlov to write functional tests for proxy support.
I solved that as well by monkey-patching `asyncio.selector_events.BaseSelectorEventLoop._make_ssl_transport`. It was not the best solution, but it was better than what came before.
You can read the whole pull request discussion in #1421.
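
For context, a minimal sketch of what such a patch can look like with pytest's `monkeypatch` fixture (the fixture name and the plain-transport fallback are illustrative assumptions, not the exact code from #1421):

```python
import asyncio.selector_events

import pytest


@pytest.fixture
def no_ssl_transport(monkeypatch: pytest.MonkeyPatch) -> None:
    """Pretend every TLS connection is plain TCP (illustration only)."""

    def _make_ssl_transport(self, rawsock, protocol, sslcontext,
                            waiter=None, **kwargs):
        # Skip the TLS handshake and hand back a plain socket transport
        # so a non-TLS test server can stand in for an HTTPS peer.
        return self._make_socket_transport(rawsock, protocol, waiter)

    monkeypatch.setattr(
        asyncio.selector_events.BaseSelectorEventLoop,
        "_make_ssl_transport",
        _make_ssl_transport,
    )
```

Any test that requests this fixture can then exercise HTTPS code paths against a plain-TCP test server.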

But I did not follow the changes after that, so I have no idea why those tests were disabled.

webknjaz (Member, Author) commented

@kserhii thanks for your input! I think that now that we have a fixture for actual end-to-end integration testing, it should be easier to resurrect the test coverage that got lost over time.

asvetlov (Member) commented

Please feel free to drop these tests, or make them pass.
I was trying to write reliable functional tests without monkey patching, but it took much more effort than I had time to invest in this particular part of aiohttp.

Dreamsorcerer (Member) commented

I've renamed these while doing the typing PRs, so all of these tests should be running now (though still xfailing).
