
Add a limit for the number of scanned URLs #582

Open
pjoulot opened this issue Dec 11, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@pjoulot

pjoulot commented Dec 11, 2023

Hello,

First, I love this package, thank you so much for your hard work.
I would love to see a feature where you can limit the number of scans. Some sites are so big that the scan lasts forever and usually ends up crashing. It would be great if we could set a limit so the scan stops after X URLs.

For example, a new argument:
--limit=500
Linkinator would then scan the first 500 URLs it finds, stop, and report the results for just those 500 URLs (roughly sketched below).
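To make the idea concrete, here is a minimal sketch using linkinator's existing Node API. The `LinkChecker` class, its `link` event, and the `recurse` option are real; the `LIMIT` constant stands in for the proposed flag, and the early-stop behavior is exactly what's missing today (the event handler can only observe the count, not abort the scan):

```ts
import { LinkChecker } from 'linkinator';

// Hypothetical value, standing in for the proposed --limit flag.
const LIMIT = 500;

async function main() {
  const checker = new LinkChecker();
  let scanned = 0;

  // The 'link' event fires once per checked URL, so we can at least
  // observe how many URLs have been scanned so far.
  checker.on('link', () => {
    scanned++;
    if (scanned === LIMIT) {
      // No supported way to stop the scan here today; a real --limit would.
      console.warn(`Hit ${LIMIT} scanned URLs; a --limit flag would stop here.`);
    }
  });

  const result = await checker.check({
    path: 'https://example.com',
    recurse: true,
  });

  console.log(`Scanned ${result.links.length} links; passed: ${result.passed}`);
}

main();
```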

@pjoulot pjoulot changed the title Add a limit Add a limit for the number of scanned URLs Dec 11, 2023
@JustinBeckwith JustinBeckwith added the enhancement New feature or request label Jul 1, 2024
@JustinBeckwith
Owner

Hey, thanks for the kind words! For a project like this, I'm usually really looking for comprehensiveness. I'm fine with it scanning for a long time as long as it can tell me nothing is broken.

Taking a step back - how are you trying to use it? The recurse option is meant to kinda limit the damage by keeping things shallow, but even with that enabled it's supposed to only scan within the same domain really. Tell me more about the site you're trying to scan?
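For reference, here's roughly how that shallow-vs-recursive distinction looks through the Node API (the URL is illustrative; `--recurse` is the CLI equivalent):

```ts
import { LinkChecker } from 'linkinator';

async function main() {
  const checker = new LinkChecker();
  // recurse: true follows discovered links, but only within the root domain;
  // leaving it off (the default) checks just the links on the starting page.
  const result = await checker.check({
    path: 'https://example.com',
    recurse: true,
  });
  console.log(`Scanned ${result.links.length} links.`);
}

main();
```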
