Review Code with Jason webinar (for testing) and make a list of refactor suggestions #149
Awesome, thanks @Eduardo06sp
I'd like to be assigned to this! So just to clarify, the first step is to create a list of things that need to be refactored. As I have time, I'll watch through the video and create the list, then go from there!
Yes, I think that's a good high-level approach. I'm not sure how much work this might end up being, as it could surface some more fundamental design decisions that affect all or many of the test files. So maybe it would be good to develop a couple of bullets on testing design principles that could be added to the Readme, and the next step could be to create a file-by-file summary of changes (as many as you feel like taking on, one or many, no pressure). Then we can create an issue for each file that outlines a summary of changes. What are your thoughts on this?
I was thinking this could turn into a "Test Refactoring (Jason Swett)" project, and the issues could be grouped as one project on this page: The same approach can be used for the other issue (#121), as it may be easier to track progress like this. I don't think just anyone can make a new project, however. I'm not sure what I would be adding to the README. I'll definitely start by making the overview/list, then let you know here when I'm done!
Ok tracking on a project page sounds good: https://github.com/users/kasugaijin/projects/4/views/1 |
@Eduardo06sp the project linked above should now be available. I just realised it was private and set it to public. https://github.com/users/kasugaijin/projects/4 |
I don't have the ability to add issues there. However, it's not a major issue honestly. I will work on the list then just create new issues as needed. I didn't want to add more work than is necessary! |
General Points
There was A LOT to unpack from the video, but here is a revised draft of my notes. The structure essentially follows the video chronologically (except the general points at the very top). Anywhere you see a checkbox, it indicates something that can be directly addressed; all other bullet points are things I found important or that could potentially become a change. I found it difficult to point to any specific part where I could say "this is what needs to change in this file," especially since some of the points are broader and apply to the entire test suite. So right off the bat, we could just work on the file that Jason focused on for a large part of the video. Alternative tasks/issues could focus on the items under "General Points," which will require reviewing the rest of the test files and checking what is applicable.
Awesome work @Eduardo06sp! I will have time to review your notes above this weekend. I agree with your suggestion to start with the org dogs test file. How do you want to approach this? Perhaps refactor a test and submit a PR, then we can review together? After a couple of iterations of that, the chunks of work could get bigger if desired. I'd like to keep the overhead low if possible, so a test at a time would help me at first.
@kasugaijin Do you want me to just start with one of the tests and then create a PR, skipping the part where I create issues altogether? In that case, we can use this issue as the "mega-thread" for all the work, and each PR thread can contain a more specific review/discussion as needed. I wanted to create issues to make it easier to track what I'm assigned to and to potentially have others take some if desired. Honestly, I wouldn't mind chipping away at it, creating PRs one at a time to simplify the whole process (with no issues created). I would check off the list I made here as the work gets done. I definitely would love to review any changes together; the video was really informative, and we could ensure nothing gets missed.
If that works for you, I would say just chip away at it and make PRs as you go. Then we can review! Once we have a few down, and if you're feeling like spreading the load, we can make some issues then. I agree that the above can be used to track work. Perhaps we can add a new checkbox with a specific description of the work completed each time you make a PR, or at least a link to the PR, and then we can check them off as we go to keep track of things? I am happy for that to be done after the fact instead of detailing everything up front (as we all know those things will be subject to change as work progresses).
Sounds good! I'll start knocking them out as time allows, and just make PRs. I'll broadly track my work accordingly. :} |
Our conversation about using more general selectors throughout the tests: Just dropping this here to keep track of things! |
@kasugaijin Hey Ben, I apologize for the delay in communicating my next steps; I was a bit busy these past few days. I think the individual tests in Jason Swett mentioned he would do those a lot differently (as noted in my long list of notes above). However, I think they are fine as is. If they get refactored, it would involve creating new Dogs of each type, then making sure they show up on the page. I think we can instead keep the technique you used, which is checking the number of elements on the page against the total count of dogs displayed / being tested. He also mentioned that we should keep system tests limited to what APPEARS on the page, instead of what appears in the db, e.g. assertions such as Since these are integration tests, it is my understanding that "reaching" into the database is okay. I certainly agree with Jason when it comes to system tests, where you are essentially mimicking how a user navigates and "sees" your website; in system tests, there should be no mixing of levels of abstraction. One thing left is ensuring the test files are named for the features being tested, instead of the general names they currently have. This is going to change the file structure of your integration tests a bit, so we can also leave the files as is if you don't think that's necessary. Alternatively, I can start looking through the rest of the integration tests and see if anything needs some "simplification"/refactoring like I've done with I think these would be the last two items for this issue, or the last one if you want to disregard one of the suggestions.
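As a toy sketch of the counting technique described above — asserting that the number of rendered elements matches the expected total, rather than asserting on each record individually. All names here (`DogListingSketchTest`, `dog-card`) are hypothetical; in the real suite the HTML would come from a Capybara-driven request and the expected total from something like `Dog.count`.

```ruby
require "minitest/autorun"

# Minimal, self-contained sketch: a raw HTML string stands in for a
# rendered page, and a literal integer stands in for a database count.
class DogListingSketchTest < Minitest::Test
  PAGE_HTML = <<~HTML
    <div class="dog-card">Buddy</div>
    <div class="dog-card">Rex</div>
    <div class="dog-card">Luna</div>
  HTML

  def test_one_card_per_dog
    expected_total = 3 # stand-in for a db count such as Dog.count
    # Count how many dog cards appear on the page.
    cards_on_page = PAGE_HTML.scan('class="dog-card"').size
    assert_equal expected_total, cards_on_page
  end
end
```

The advantage of this shape is that the assertion survives changes to individual dog fixtures: the test only cares that every dog in the data set produced one card on the page.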
Thank you @Eduardo06sp, no problem at all! I agree with your assessments here. I think that the remaining refactors you mention above, while perhaps arguably the 'best' approach, don't really necessitate the effort and won't add much value right now. However, it's certainly a good exercise to identify what could be better, as you have done, so it is noted for next time. For example, I definitely didn't put a ton of thought into the file names, but I am OK with them remaining as is for now. If we were to refactor this, it would likely create a number of additional test files, as I was creating test files by route rather than by feature, and a route may have several features. From here on, that should be the way it is handled. As far as the db query within an integration test goes, I agree, this is acceptable. These are not expensive queries, either, so there is no need to invest time considering other approaches for these. Thanks again for the above and the PRs. This was a good exercise and I will refer to your notes periodically.
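To illustrate the route-vs-feature naming distinction mentioned above (the file names below are hypothetical, not the repo's actual files): a route-based file covers everything on one page, while feature-based naming splits that page's behaviors into one file each.

```
test/integration/
  dogs_index_test.rb           # route-based: everything on the dogs index page
  # vs. feature-based, where the same route yields several files:
  dog_filtering_test.rb        # filtering the list of dogs
  dog_adoption_status_test.rb  # showing adoption status on each dog
```

This is why refactoring toward feature-based names would likely increase the number of test files.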
@kasugaijin No problem! Thank you for confirming some of the thoughts I had and sharing some of your own. So just to clarify, do you think we should close this Issue at this point? I'm still working on my personal projects, but I am interested in contributing to this repo one way or another, whether it's this specific Issue or one of the other open ones. I would like to know if I should start focusing my efforts on another open Issue instead, as time permits.
Yes, let's close this issue now! Though I will link to this issue in the Readme to serve as reference material for anyone working on tests in the future.
https://www.youtube.com/watch?v=AD5Sg-CKgww
Similar to #121. I think Jason provided very valuable feedback and some of those changes could definitely be incorporated into this project.
I am very interested, so I will keep an eye on this issue over the next 7 days. If no one takes it, I will request to be assigned to it.
I would love to give someone a chance to work on this before me. :}