parklet-crawl


This is a web scraper. It populates our database with data from DataSF covering all of the parklets in San Francisco.
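
DataSF publishes its datasets through the Socrata SODA API at data.sfgov.org. As a rough sketch only (the dataset identifier below is a placeholder, not the real parklet dataset ID, and this is not the crawler's actual code), the raw data can be previewed with curl:

curl -o parklets.json 'https://data.sfgov.org/resource/<dataset-id>.json?$limit=1000'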

Usage

Development

Localhost

  1. Install dependencies (npm install)
  2. Start the server in development mode (npm run start:dev)
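
Assuming Node.js and npm are already installed locally, the full command sequence is:

npm install
npm run start:dev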

Container

  1. npm run deploy:develop builds and runs the latest local changes in development mode
  2. npm run deploy:staging builds and runs the latest local changes in staging mode
  3. docker-compose stop takes down the crawler container
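
These deploy scripts are presumably thin wrappers around docker-compose (their contents are not shown in this README), so a typical local session, with log inspection added for illustration, might look like:

npm run deploy:develop        # build and run the latest local changes in dev mode
docker-compose logs -f        # follow the crawler container's output
docker-compose stop           # take the container down when finished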

Production

  1. Clone the repo and enter the project directory
git clone https://github.com/hangoutsidesf/parklet-crawl.git crawler

cd crawler

  2. Start the production container
npm run deploy:prod
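
Once deploy:prod has finished, standard docker-compose commands (run from the crawler directory, where the compose file is assumed to live) can confirm the container is running:

docker-compose ps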

TODO

  1. Edit the start script so the crawler exits automatically once data collection is complete