Learn how to use RealEye.io - webcam eye-tracking platform.

Written by Beata Lewandowska

Facial coding and K-coefficient


RealEye provides analysis based on facial landmarks using the Facial Action Coding System (FACS), a widely used method popularized by Paul Ekman.

Facial Action Coding System (FACS) is a system for taxonomizing human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö.[1] It was later adopted by Paul Ekman and Wallace V. Friesen and published in 1978.[2] Ekman, Friesen, and Joseph C. Hager published a significant update to FACS in 2002.

There are 3 emotions available in RealEye:

  • happy, based on cheek raising and lip corner pulling (i.e., the smile),
  • surprise, based on brow raising, slightly raising the upper lid, and jaw dropping,
  • neutral, based on the lack of lip pulling, eyebrow movement, and mouth opening.
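The rules above can be sketched as a simple mapping from FACS Action Units (AU6 cheek raiser, AU12 lip corner puller, AU1/AU2 brow raisers, AU5 upper lid raiser, AU26 jaw drop). The thresholds and the `classify_emotion` helper below are hypothetical illustrations, not RealEye's actual classifier:

```python
def classify_emotion(au):
    """Toy emotion classifier over FACS Action Unit intensities.

    au: dict mapping Action Unit name -> intensity in [0, 1].
    Thresholds are illustrative assumptions, not RealEye's model.
    """
    # happy: cheek raise (AU6) + lip corner pull (AU12), i.e. a smile
    if au.get("AU06", 0) > 0.5 and au.get("AU12", 0) > 0.5:
        return "happy"
    # surprise: brow raise (AU1 + AU2), upper lid raise (AU5), jaw drop (AU26)
    if (au.get("AU01", 0) > 0.5 and au.get("AU02", 0) > 0.5
            and au.get("AU05", 0) > 0.3 and au.get("AU26", 0) > 0.3):
        return "surprise"
    # neutral: no lip pulling, no eyebrow movement, no mouth opening
    return "neutral"
```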

On the same plot, there is one more parameter available: K-coefficient and its mean value. This parameter indicates attention level, where:

  • K > 0 means relatively long fixations followed by short saccade amplitudes, indicating focal processing,
  • K < 0 means relatively short fixations followed by relatively long saccades, indicating ambient processing.

This parameter is based on the following article:

Krzysztof Krejtz, Andrew Duchowski, Izabela Krejtz, Agnieszka Szarkowska, and Agata Kopacz. 2016. Discerning Ambient/Focal Attention with Coefficient K. ACM Trans. Appl. Percept. 13, 3, Article 11 (May 2016), 20 pages. https://doi.org/10.1145/2896452
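The per-fixation coefficient defined in that article can be sketched as follows: each fixation's duration and the amplitude of the saccade that follows it are z-scored over the whole recording, and their difference gives K for that fixation. This is a minimal NumPy illustration of the published definition, not RealEye's implementation:

```python
import numpy as np

def coefficient_k(fix_durations, saccade_amplitudes):
    """Per-fixation coefficient K (after Krejtz et al., 2016).

    fix_durations: durations of the n fixations (e.g., in ms).
    saccade_amplitudes: amplitudes of the n-1 saccades, where saccade i
    follows fixation i (e.g., in degrees of visual angle).

    Returns an array of K values, one per fixation that has a
    following saccade: positive K -> focal, negative K -> ambient.
    """
    d = np.asarray(fix_durations, dtype=float)
    a = np.asarray(saccade_amplitudes, dtype=float)
    z_d = (d - d.mean()) / d.std()  # z-scored fixation durations
    z_a = (a - a.mean()) / a.std()  # z-scored saccade amplitudes
    # pair fixation i with the saccade that follows it
    return z_d[:-1] - z_a
```

Long fixations followed by short saccades yield K > 0 (focal processing); short fixations followed by long saccades yield K < 0 (ambient processing).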


If you want to learn more about the K-coefficient (how it works and how it can be used), please read the article on our blog.


The data is collected at a rate of ~30 Hz (assuming a 30 Hz webcam).

The data is available for each participant separately and also in an aggregated way.

Please also remember that talking during the study can influence facial features and thus affect the emotion readings.

Here you can find a case study showing how these additional insights can be used, based on the happiness (entertainment) level.

You can also download the emotion data as a CSV file.
