Cornell researchers created an earphone that can track facial expressions – Engadget
The team says C-Face works even when the subject is wearing a mask.

The earphone uses two RGB cameras that are positioned below each ear. They can record changes in cheek contours when the wearer’s facial muscles move.
Once the cheek images are captured, they are reconstructed using computer vision and a deep learning model: a convolutional neural network analyzes the 2D images and translates them into 42 facial feature points representing the position and shape of the wearer’s mouth, eyes and eyebrows.
C-Face can translate those expressions into eight emoji, including…
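As a rough illustration of that pipeline (not the Cornell team's actual architecture), the sketch below shows how a small convolutional network could regress 42 (x, y) landmark points from a cheek image, and how a second head could map those points to one of eight emoji classes. The layer sizes, the 64x64 input resolution, and the class names are assumptions made for the example.

```python
# Hypothetical sketch, not C-Face's published model: a small CNN regresses
# 42 (x, y) facial landmark points from a grayscale cheek image, and a
# second network maps the landmark vector to one of eight emoji classes.
import torch
import torch.nn as nn

NUM_LANDMARKS = 42   # mouth, eye and eyebrow feature points (per the article)
NUM_EMOJI = 8        # eight emoji classes (per the article)

class LandmarkRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Assumes a 64x64 input image -> 64 channels of 8x8 after three poolings.
        self.regress = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 256), nn.ReLU(),
            nn.Linear(256, NUM_LANDMARKS * 2),   # (x, y) for each landmark
        )

    def forward(self, x):
        return self.regress(self.features(x)).view(-1, NUM_LANDMARKS, 2)

class EmojiClassifier(nn.Module):
    """Maps a flattened landmark vector to logits over eight emoji classes."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_LANDMARKS * 2, 64), nn.ReLU(),
            nn.Linear(64, NUM_EMOJI),
        )

    def forward(self, landmarks):
        return self.net(landmarks.flatten(start_dim=1))

if __name__ == "__main__":
    cheek_image = torch.randn(1, 1, 64, 64)        # dummy 64x64 grayscale crop
    landmarks = LandmarkRegressor()(cheek_image)   # -> (1, 42, 2)
    emoji_logits = EmojiClassifier()(landmarks)    # -> (1, 8)
    print(landmarks.shape, emoji_logits.shape)
```

In practice, both stages would be trained on paired data (cheek images with ground-truth landmarks, and landmarks with emoji labels); the two-stage split here simply mirrors the landmark-then-expression flow the article describes.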