
Dynamic Realtime Animation Control

Our project aims to build an application that dynamically detects the user's expressions and gestures and feeds them to animation software, which renders a 2D/3D animation in real time for live broadcast. In its skeletal state, the project essentially merges facial keypoint detection, keypoint meshing, and emotion and gesture tracking with animation. The final rendered animation can be projected or broadcast to any application that requires webcam access.

(Figure: the 3-step pipeline.)
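
As a concrete illustration of the first stage, here is a minimal sketch of per-frame facial keypoint detection from the webcam. It assumes MediaPipe Face Mesh alongside OpenCV; the repository itself names only OpenCV and TensorFlow, so the MediaPipe dependency and the on-screen drawing are illustrative assumptions, not the project's actual pipeline.

```python
import cv2
import mediapipe as mp

# Sketch of the detection stage: grab webcam frames and extract face-mesh
# landmarks. In the full pipeline these keypoints would drive a 2D/3D rig
# instead of being drawn back onto the camera frame.
mp_face_mesh = mp.solutions.face_mesh

cap = cv2.VideoCapture(0)  # default webcam
with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            h, w, _ = frame.shape
            for lm in results.multi_face_landmarks[0].landmark:
                # Landmarks are normalized to [0, 1]; scale to pixel coords.
                cv2.circle(frame, (int(lm.x * w), int(lm.y * h)), 1, (0, 255, 0), -1)
        cv2.imshow("facial keypoints", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

cap.release()
cv2.destroyAllWindows()
```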

MOTIVATION

To look presentable on any video call at any time.
To protect yourself and maintain your privacy on the internet.
To replace raw webcam video streaming, which consumes more bandwidth, with a lower-bandwidth animated feed.

OBJECTIVES

To build a pose-detection model using OpenCV and TensorFlow.
To build an emotion-detection model (a minimal sketch follows this list).
To animate a render driven by these models in live rendering software such as Blender or Three.js.
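
For the emotion-detection objective, the sketch below shows how a per-frame classifier could sit on top of OpenCV face detection and a TensorFlow/Keras model. The model file `emotion_model.h5` and the label set are hypothetical placeholders (a FER-2013-style CNN trained on 48x48 grayscale crops), not artifacts shipped with this repository.

```python
import cv2
import numpy as np
import tensorflow as tf

# Hypothetical artifacts: a small CNN trained on 48x48 grayscale face crops
# (FER-2013-style) and its label order. Neither ships with this repository.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
model = tf.keras.models.load_model("emotion_model.h5")

# Classical OpenCV face detector, used only to locate the face crop.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(frame_bgr):
    """Return (label, confidence) for the first detected face, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
    probs = model.predict(crop.reshape(1, 48, 48, 1), verbose=0)[0]
    idx = int(np.argmax(probs))
    return EMOTIONS[idx], float(probs[idx])

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(classify_emotion(frame))
    cap.release()
```

The predicted emotion, together with the facial keypoints, could then be pushed to the renderer (for example over a local WebSocket to a Three.js scene, or into Blender via its Python API) to drive the animated character; the exact transport is an open design choice, not something specified by this repository.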

CONTRIBUTORS

Harsh-Avinash
Seshank-k
Nishita-Varshney
Aaryan Bhatiya Ghosh
