AI Emotion Detector
This is an ongoing project built on Google's TensorFlow platform. The aim of this CNN is to run on live camera footage from a trade show or marketing event.
The AI detects human faces and then runs the CNN on each detected, cropped face to determine the person's emotional state. In the example image, the network is being run on comedy video footage as a test. For deployment, the idea is to mount a camera above the exit doors at large trade shows or events.
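The detect-then-classify loop described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the model file name `emotion_cnn.h5`, the 48x48 grayscale input size, and the emotion label order are all assumptions (FER-2013 style), and face detection here uses OpenCV's bundled Haar cascade.

```python
# Sketch: detect faces in each frame, crop them, and run a CNN emotion
# classifier on the crop. Model file, input size, and labels are assumed.

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def top_emotion(probs):
    """Return the emotion label with the highest softmax probability."""
    best = max(range(len(probs)), key=lambda i: probs[i])
    return EMOTIONS[best]

def run_camera():
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    model = load_model("emotion_cnn.h5")  # assumed pre-trained model file
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # live camera feed
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
            # Crop the detected face and normalise for the CNN
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
            probs = model.predict(face.reshape(1, 48, 48, 1) / 255.0)[0]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, top_emotion(probs), (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
        cv2.imshow("emotion", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    run_camera()
```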
As people walk under the camera, the AI builds an emotional-state profile for each person and outputs whether they are leaving with a positive or negative emotion. This data is then written to a CSV file, allowing the event organizer to gather real-time feedback on the event.
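The per-person logging step might look like the sketch below: each visitor's frame-level emotion predictions are collapsed to a positive or negative verdict and appended to the feedback CSV. The column layout and the choice of which emotions count as positive are illustrative assumptions, not the project's actual schema.

```python
# Sketch: collapse a visitor's frame-level emotions to one verdict
# and append it to a CSV file the organizer can monitor live.
import csv
from collections import Counter
from datetime import datetime

POSITIVE = {"happy", "surprise", "neutral"}  # assumed valence mapping

def valence(emotions):
    """Collapse a list of frame-level emotion labels to 'positive'/'negative'."""
    counts = Counter(emotions)
    pos = sum(counts[e] for e in POSITIVE)
    return "positive" if pos >= len(emotions) - pos else "negative"

def log_visitor(path, visitor_id, emotions):
    """Append one visitor's timestamped verdict to the feedback CSV."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now().isoformat(timespec="seconds"),
             visitor_id, valence(emotions)])

# Example: a visitor seen mostly smiling across five frames
# log_visitor("feedback.csv", 17, ["happy", "happy", "neutral", "sad", "happy"])
```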
This kind of deep data is gold dust for automated feedback. It can be fed into an interactive dashboard that generates graphs and visual statistics, allowing the organizer to evaluate the success of the event.
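The dashboard's aggregation step could be as simple as the sketch below, which assumes a CSV of the form timestamp, visitor ID, verdict (an assumed schema). The resulting counts and percentage are what the dashboard would render as bar charts or gauges.

```python
# Sketch: aggregate the feedback CSV into verdict counts and a
# positive-feedback percentage for the dashboard to plot.
import csv
from collections import Counter

def summarize(path):
    """Return (verdict counts, percentage of positive verdicts)."""
    with open(path, newline="") as f:
        verdicts = Counter(row[2] for row in csv.reader(f) if row)
    total = sum(verdicts.values()) or 1  # avoid division by zero
    return verdicts, 100.0 * verdicts["positive"] / total
```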
The system can run on edge hardware directly at the event location, or a simple edge device can stream the video to the AI running in the cloud.
The project has since expanded to include gender classification and age prediction for detected people, enabling more detailed analytics on attendees.
This Project Included:
- TensorFlow
- Python
- CNN deep-learning model development
- Custom-trained AI model and dataset
Here’s a short video clip: