A new way to perform A/B tests and UX studies

Jun 26, 2016 01:55 GMT

Three researchers from Brown University have created a JavaScript library that estimates where a person is looking on a Web page, using nothing more than the user's webcam and JavaScript code.

Called WebGazer.js, the library requires only one thing from users: permission to access their webcam.

Once access is granted, the library uses three different computer vision algorithms to detect the user's face and eyes in the webcam feed, paying special attention to the eyes and their features.

WebGazer is accurate to around 100 pixels

Based on this data, WebGazer then trains a module that maps the position of the user's eyes to mouse movements, mouse clicks, and page scrolling, and uses that mapping to predict the general area of the screen the user is looking at. The researchers say WebGazer's accuracy is around 100 pixels.
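The core idea can be sketched without the library itself: every click gives a training pair (the detected eye position, plus the known screen coordinate the user clicked), and a regression model learned from those pairs maps new eye positions to predicted gaze points. This is a minimal illustration of that self-calibration principle, not WebGazer's actual code, which uses richer eye-image features and regression models.

```javascript
// Ordinary least squares fit for y = a * x + b over paired samples.
function fitLine(xs, ys) {
  const n = xs.length;
  const mx = xs.reduce((s, v) => s + v, 0) / n;
  const my = ys.reduce((s, v) => s + v, 0) / n;
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    den += (xs[i] - mx) ** 2;
  }
  const a = num / den;
  return { a, b: my - a * mx };
}

// Toy gaze model: clicks provide labeled samples, prediction is a
// per-axis linear map from eye position to screen position.
class GazeModel {
  constructor() { this.samples = []; }
  // Called on each click: eye = [ex, ey] from the tracker,
  // screen = [sx, sy] where the click landed.
  addSample(eye, screen) { this.samples.push({ eye, screen }); }
  predict(eye) {
    const fx = fitLine(this.samples.map(s => s.eye[0]),
                       this.samples.map(s => s.screen[0]));
    const fy = fitLine(this.samples.map(s => s.eye[1]),
                       this.samples.map(s => s.screen[1]));
    return [fx.a * eye[0] + fx.b, fy.a * eye[1] + fy.b];
  }
}
```

With two clicks at opposite corners, the model interpolates intermediate eye positions to intermediate screen points; more clicks over time refine the fit, which is why no explicit calibration step is needed.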

WebGazer can store data between sessions using localStorage, and if the library's processing affects browser performance, it can be paused and resumed on command.
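A page might expose that pause/resume control through a small wrapper like the one below. This is a hypothetical sketch: the method names on the gazer object (begin, pause, resume) follow the project's documentation at the time, but should be verified against the current WebGazer API.

```javascript
// Hypothetical wrapper around a gazer object exposing a single toggle,
// so users can suspend processing when it slows the browser down.
function createGazeController(gazer) {
  let running = false;
  return {
    start() { gazer.begin(); running = true; },   // start tracking
    toggle() {                                    // pause <-> resume
      if (running) gazer.pause();
      else gazer.resume();
      running = !running;
      return running;
    },
    isRunning() { return running; }
  };
}
```

Binding toggle() to a button gives users the on-command control the researchers describe.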

The research team points out that, compared to alternatives, WebGazer is a plug-and-play solution: it needs no special hardware beyond a standard webcam and no special calibration.

No calibration required

Alternative solutions usually require the user to look at and click each screen corner in turn, in order to define boundaries and to detect special eye positional cue points.

Additionally, the library's modular design allows developers to swap any of the three eye detection algorithms with their own.
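Such a swap might look like the configuration sketch below. Both the setter names (setTracker, setRegression) and the module names passed to them are assumptions based on the project's documentation, not a guaranteed API.

```javascript
// Hypothetical configuration helper: chainable setters choose which
// detection (tracker) and mapping (regression) modules the gazer uses.
function configureGazer(gazer, trackerName, regressionName) {
  return gazer.setTracker(trackerName)
              .setRegression(regressionName);
}
```

A developer's own algorithm would plug in at the same seam, replacing one of the bundled modules.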

Researchers say WebGazer works in real time and supports Google Chrome 47+, Microsoft Edge 13+, Mozilla Firefox 44+, and Opera 36+. The library's code has been open-sourced under the GPL license and published on GitHub.

In practice, the library offers a cheaper and simpler way of carrying out A/B tests and UX/UI studies, since it can estimate where users are actually looking without dedicated eye-tracking hardware.