Emotion detection based on webcam


Emotion detector

I'd like to share a JavaScript library for fitting facial models to faces in videos or images.
It is an implementation of constrained local models fitted by regularized landmark mean-shift, as described in Jason M. Saragih’s paper.
It tracks a face and outputs the coordinate positions of the face model as an array, following the numbering of the model below:
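As a sketch of how that output array might be consumed, here is a small helper that derives a simple "mouth openness" value from the landmark coordinates. The landmark indices are hypothetical examples, not the library's actual numbering; check them against the model numbering image:

```javascript
// Euclidean distance between two [x, y] landmark points.
function distance(a, b) {
  const dx = a[0] - b[0];
  const dy = a[1] - b[1];
  return Math.sqrt(dx * dx + dy * dy);
}

// positions: the array of [x, y] coordinates output by the tracker.
// upperLip, lowerLip, leftEye, rightEye: landmark indices (assumed
// here for illustration; look them up in the model numbering).
function mouthOpenness(positions, upperLip, lowerLip, leftEye, rightEye) {
  // Normalize by inter-ocular distance so the value is scale-invariant
  // (independent of how close the face is to the webcam).
  const eyeDist = distance(positions[leftEye], positions[rightEye]);
  return distance(positions[upperLip], positions[lowerLip]) / eyeDist;
}
```

A value like this, thresholded or fed into a classifier, is one simple way to turn raw landmark positions into an emotion-style signal.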


The script can be tested here:


For those who want to play with this script and develop further applications (for example, driving avatar emotions; unfortunately it will not work in VR, because the HMD covers the user's face), the files can be downloaded from here:
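If you do drive avatar emotions from the tracker, the raw per-frame values are usually too jittery to apply directly to a blend shape. A minimal sketch of one common fix, exponential smoothing (the `alpha` parameter here is a hypothetical tuning knob, not part of the library):

```javascript
// Returns a function that smooths a noisy per-frame value before it
// is applied to an avatar blend shape. alpha in (0, 1]: higher means
// more responsive, lower means steadier.
function makeSmoother(alpha) {
  let state = null;
  return (value) => {
    // First sample initializes the state; later samples blend in.
    state = state === null ? value : alpha * value + (1 - alpha) * state;
    return state;
  };
}

// Usage: create one smoother per tracked quantity and feed it every frame.
const smoothMouth = makeSmoother(0.5);
```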



I've been after someone able to make this work again in here for ages.
We need it!
@Menithal was playing with a script to manually drive the face.