A pipeline that consumes Twitter data to extract meaningful insights on a variety of topics, built with the Twitter API, Kafka, MongoDB, and Tableau.
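The core of such a pipeline is the transform step between the Kafka consumer and the MongoDB writer. A minimal sketch of that step, assuming tweet payloads shaped like Twitter API v2 tweet objects (the field names below are illustrative, not taken from the repository):

```python
import json

def extract_insight(raw_tweet: str) -> dict:
    """Reduce a raw tweet payload to the fields worth analyzing downstream.

    In the full pipeline this would sit between a Kafka consumer (reading
    a topic fed by the Twitter API) and the MongoDB collection that
    Tableau later visualizes.
    """
    tweet = json.loads(raw_tweet)
    return {
        "id": tweet["id"],
        "text": tweet["text"],
        # Hashtags live under entities.hashtags in v2 tweet objects.
        "hashtags": [h["tag"] for h in tweet.get("entities", {}).get("hashtags", [])],
        "created_at": tweet.get("created_at"),
    }

# Example payload in the shape of a Twitter API v2 tweet object:
raw = json.dumps({
    "id": "1",
    "text": "Kafka is great #streaming",
    "entities": {"hashtags": [{"tag": "streaming"}]},
    "created_at": "2021-08-02T00:00:00Z",
})
print(extract_insight(raw))
```

The Kafka and MongoDB I/O around this function would use a consumer loop and an insert call; only the pure transform is shown so it stays self-contained.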
Artifician is an event-driven framework designed to simplify and accelerate the process of preparing datasets for Artificial Intelligence models.
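"Event-driven" here means dataset-preparation steps run in response to data-arrival events rather than in a fixed batch order. A minimal publish/subscribe sketch of that idea (not Artifician's actual API, which the repository documents):

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal publish/subscribe hub: prep steps subscribe to events
    and run only when new data arrives."""

    def __init__(self) -> None:
        self._handlers = defaultdict(list)

    def subscribe(self, event: str, handler: Callable) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload) -> None:
        for handler in self._handlers[event]:
            handler(payload)

features = []
bus = EventBus()
# A tokenization step that fires whenever a raw sample arrives.
bus.subscribe("sample_added", lambda text: features.append(text.lower().split()))
bus.publish("sample_added", "Hello World")
print(features)  # [['hello', 'world']]
```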
A simple, general-purpose pipeline framework.
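A general-purpose pipeline framework at its smallest is function composition: stages chained so each consumes the previous stage's output. A sketch under that assumption (not this repository's API):

```python
from functools import reduce

def pipeline(*stages):
    """Compose callables left-to-right into a single pipeline function."""
    return lambda value: reduce(lambda acc, stage: stage(acc), stages, value)

# Stages are plain callables, so any function can join the pipeline.
clean = pipeline(str.strip, str.lower, lambda s: s.replace("  ", " "))
print(clean("  Hello  World "))  # "hello world"
```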
Tools for navigating and cleaning TAHMO weather station data for ML development.
convtools is a specialized Python library for dynamic, declarative data transformations with automatic code generation.
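The idea behind "declarative transformations with code generation" is that a transformation spec is compiled once into plain Python, so per-record work is cheap. A stdlib-only sketch of that technique (convtools' real API is richer and differs from this):

```python
def gen_converter(spec: dict):
    """Compile a field-mapping spec into a plain Python function.

    Mimics the concept behind convtools: a declarative spec is turned
    into generated source code once, up front, instead of being
    re-interpreted for every record.
    """
    lines = ["def convert(d):", "    return {"]
    for out_key, in_key in spec.items():
        lines.append(f"        {out_key!r}: d[{in_key!r}],")
    lines.append("    }")
    namespace = {}
    exec("\n".join(lines), namespace)  # compile the generated source
    return namespace["convert"]

to_point = gen_converter({"x": "lon", "y": "lat"})
print(to_point({"lon": 1.0, "lat": 2.0}))  # {'x': 1.0, 'y': 2.0}
```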