
SDC-Description

Description

The goal of this project is to build a scalable RESTful API service for an existing e-commerce platform and optimize it to handle web-scale traffic. The dataset was grown from 100 unique records to 10 million unique records, generated into a PostgreSQL database, which was chosen over MongoDB after comparing query latency with B-tree indexing. The service was then scaled horizontally from 1 instance at 600 RPS to 4 instances at 2,000 RPS while maintaining 20 ms average latency and a <1% error rate, using least-connections load balancing in Nginx. Finally, reverse proxy caching brought average latency down to 11 ms at 10,000 RPS with a single service. All services are deployed on AWS EC2 t2.micro instances (1 CPU, 1 GB memory). More details below.
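As a sketch of the B-tree indexing step described above (the table and column names are assumptions; the project's actual schema may differ), the index that turns a 10-million-row lookup into an index scan would look roughly like:

```sql
-- Hypothetical: index the lookup column used by GET /one/:id so reads
-- resolve via a B-tree index scan instead of a sequential scan over
-- 10 million rows. PostgreSQL indexes are B-tree by default.
CREATE INDEX idx_one_id ON one (id);

-- Verify the index is used (look for "Index Scan" in the plan):
EXPLAIN ANALYZE SELECT * FROM one WHERE id = 5000000;
```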

Tech Stack

  • Node.js
  • Express
  • PostgreSQL
  • Nginx

CRUD API using REST

| CRUD   | REST   | PATH     | DATA TO PASS |
|--------|--------|----------|--------------|
| CREATE | POST   | /one     | States (STRING), Farms (STRING), Camps (STRING), minimumNights (STRING), acceptsBookings (STRING), checkIn (STRING), checkOut (STRING), onArrival (STRING), costs (NUMBER), description (STRING), responses (STRING), recommended (STRING), Parks (STRING), Lodging (STRING), Essentials (STRING), Amentities (STRING), Owners (STRING), photosofResponsers (STRING), cancellation (STRING), checkmark (BOOLEAN) |
| READ   | GET    | /one/:id | — |
| UPDATE | PUT    | /one/:id | `<name of data column you want to change>: <new data>` |
| DELETE | DELETE | /one/:id | — |
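The method-and-path mapping in the table above can be sketched as a plain dispatcher (shown without Express so the mapping is easy to see; handler names are placeholders, and the real service would query PostgreSQL in each branch):

```javascript
// Minimal sketch of the CRUD routing table: maps an HTTP method and
// path onto one of the four actions. Not the project's actual router.
function route(method, path) {
  const idMatch = path.match(/^\/one\/(\d+)$/);
  if (method === 'POST' && path === '/one') return { action: 'create' };
  if (idMatch) {
    const id = Number(idMatch[1]);
    if (method === 'GET') return { action: 'read', id };
    if (method === 'PUT') return { action: 'update', id };
    if (method === 'DELETE') return { action: 'delete', id };
  }
  return { action: 'not_found' };
}

console.log(route('GET', '/one/42')); // → { action: 'read', id: 42 }
console.log(route('POST', '/one'));   // → { action: 'create' }
```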

Stress Testing on Local Machine

| Requests Per Second | Optimization | Average Response Time |
|---|---|---|
| 500  | Indexing | 32 ms |
| 1000 | Indexing | 270 ms |
| 1000 | Indexing, Max Connections | 185 ms |
| 1000 | Indexing, Max Connections, Static | 170 ms |
| 1000 | Indexing, Max Connections, Static, Async | 57 ms |
| 1000 | Indexing, Max Connections, Static, Async, Closed system processes | 38 ms |
500 client requests per second

Loader.io graph

1000 client requests per second

Loader.io graph

1000/RPS | Increase DB max connections from 100 to 1000

Loader.io graph
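The max-connections change above refers to PostgreSQL's connection limit. One way to apply it (an assumption about how it was done, shown for illustration; requires a server restart to take effect):

```sql
-- Raise PostgreSQL's connection cap so concurrent request handlers
-- are not queued waiting for a free connection.
ALTER SYSTEM SET max_connections = 1000;
-- Equivalent to editing postgresql.conf: max_connections = 1000
-- Then restart the PostgreSQL server for the change to apply.
```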

1000/RPS | Move Express static serving of index.html below the API routes

Loader.io graph

1000/RPS | Convert database request functions to asynchronous

Loader.io graph

1000/RPS | Close unnecessary processes running on the machine (Slack, Zoom, etc.)

Loader.io graph

Load Testing on AWS EC2 T2 Micro Instances

Requests on random IDs

400 client requests per second

Loader.io graph

600 client requests per second

Loader.io graph

1000 client requests per second

Loader.io graph

Requests on One ID

1000/RPS | 2 services

Loader.io graph

2000/RPS | 2 services

Loader.io graph

2000/RPS | 3 services

Loader.io graph

2000/RPS | 4 services

Loader.io graph

2000/RPS | 5 services

Loader.io graph
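The multi-service runs above are balanced by Nginx using least connections, which routes each request to the instance with the fewest active connections. A hypothetical upstream block (ports and addresses are assumptions, not the project's actual config):

```nginx
# Hypothetical least-connections load balancing across 4 Node services.
upstream api_servers {
    least_conn;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
    server 127.0.0.1:3003;
    server 127.0.0.1:3004;
}

server {
    listen 80;
    location / {
        proxy_pass http://api_servers;
    }
}
```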

Reverse Proxy Caching | Least Connections

2000/RPS | 2 services

Loader.io graph

5000/RPS | 2 services

Loader.io graph

10000/RPS | 1 service

Loader.io graph
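Reverse proxy caching lets Nginx answer repeated GETs from its own cache without touching Node or PostgreSQL, which is how one service sustains 10,000 RPS at ~11 ms in the run above. A hypothetical configuration (paths, zone name, and TTL are assumptions):

```nginx
# Hypothetical reverse-proxy cache in front of a single Node service.
proxy_cache_path /tmp/nginx_cache levels=1:2 keys_zone=api_cache:10m max_size=100m;

server {
    listen 80;
    location / {
        proxy_cache api_cache;
        proxy_cache_valid 200 1m;  # serve cached 200s for up to 1 minute
        proxy_pass http://127.0.0.1:3001;
    }
}
```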

Instructions

  1. To create the database, copy and paste the contents of the schema.js file into your PostgreSQL client (psql).
  2. Add your password to config.example.js, then rename the file to config.js.
  3. To seed the database, run "npm run seed".
  4. In package.json, change "nodemon server/index.js" to "node server/index.js" and "webpack -d --watch" to "webpack".
  5. Run "npm run start" in the terminal to start the server.
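The config.js referenced in the instructions above would be shaped roughly like this (every field name here is an assumption based on the instructions and the PostgreSQL setup; the real file depends on the server code):

```javascript
// config.js (renamed from config.example.js) — a sketch, not the
// project's actual file. Holds database credentials for the server.
const config = {
  user: 'postgres',          // database user (assumption)
  password: 'YOUR_PASSWORD', // the password the instructions ask you to add
  database: 'sdc',           // database name (assumption)
  host: 'localhost',
  port: 5432,                // default PostgreSQL port
};

module.exports = config;
```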
