Tknott95/quickNDirty_CNFTCollectionScraper


Scrapers

(sloppy speed code, never refactored)

(MY OTHER SCRAPER REPOS ARE BETTER - this just publicizes older Haskell scraper code.) I made this fast to train some AI around 8 months ago. It is a quick and dirty setup: a bash script curls all the URLs, then Haskell takes over. It is ugly, and I would recommend using Haskell alone for everything; I was moving fast and just grabbing data. It is also better to use mapConcurrently than nested concurrency, which makes for ugly code. I just wanted to batch and run requests concurrently as fast as possible, since I was speed coding to get data to train a GAN that generates new rabbits/NFTs from a collection.
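The mapConcurrently approach recommended above would look roughly like the sketch below. This is a minimal, hypothetical example rather than the repo's actual code: it assumes the async, split, and http-conduit packages, and the URL list, batch size of 25, and images/ output directory are placeholders.

```haskell
-- Minimal sketch of the "batch and fetch concurrently" idea using
-- mapConcurrently_ instead of nested concurrency. Assumes the async,
-- split, and http-conduit packages; URLs, batch size, and output
-- directory are placeholders, not values from this repo.
module Main where

import Control.Concurrent.Async (mapConcurrently_)
import qualified Data.ByteString as BS
import Data.List.Split (chunksOf)
import Network.HTTP.Simple (getResponseBody, httpBS, parseRequest)
import System.Directory (createDirectoryIfMissing)

-- Download one URL and write the body to disk, named by its index.
fetchOne :: (Int, String) -> IO ()
fetchOne (i, url) = do
  req  <- parseRequest url
  resp <- httpBS req
  BS.writeFile ("images/" ++ show i ++ ".png") (getResponseBody resp)

-- Split the collection into fixed-size batches; each batch runs fully
-- concurrently, so at most `batchSize` requests are in flight at once.
fetchAll :: Int -> [String] -> IO ()
fetchAll batchSize urls =
  mapM_ (mapConcurrently_ fetchOne) (chunksOf batchSize (zip [0 ..] urls))

main :: IO ()
main = do
  createDirectoryIfMissing True "images"
  -- Placeholder URLs; the real scraper would read these from the collection metadata.
  let urls = ["https://example.com/rabbits/" ++ show n ++ ".png" | n <- [1 .. 100 :: Int]]
  fetchAll 25 urls
```

Chunking keeps the number of in-flight requests bounded without nesting concurrency combinators; a semaphore or unliftio's pooledMapConcurrentlyN would be another way to cap parallelism.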

My other scraper has some fun util funcs under the hood for running mass scrapes super fast.
