April 2024

AdTracker

A neuromarketing tool that combines eye tracking and breathing-rate analysis to gauge consumer engagement with digital advertisements.

Tech Stack: React, JavaScript, CSS, Flask

01.

Project Motivation

In the dynamic realm of digital marketing, insights into consumer behavior are akin to gold dust, fueling strategic decisions and competitive edge. However, these insights have historically been the domain of deep-pocketed corporations, with the costs of neuromarketing tools acting as a barrier to entry for smaller players and creative individuals.

AdTracker emerged from the vision that every business, regardless of size or budget, should have access to sophisticated tools that decode consumer behavior. The foundational ethos was to level the playing field, empowering small companies and creative minds with the same arsenal of insights previously reserved for industry giants.

02.

Key Application Functions

Eye Tracking

Using WebGazer.js, a third-party JavaScript library, AdTracker maps each user's eye position to coordinates on the laptop screen. A short calibration step provides training data for the underlying machine learning model; once the advertisement starts playing, the model predicts exactly where on the screen the user is looking (indicated by a red dot). These coordinates are stored and returned to users afterwards for exploratory data analysis.
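
As a rough illustration, the gaze-recording flow could look like the sketch below, which uses WebGazer.js's public API (setGazeListener, begin, showPredictionPoints, pause). The gazeLog array and the /api/gaze endpoint are illustrative assumptions, not AdTracker's actual code.

```javascript
// Assumes the WebGazer.js script has been loaded, exposing a global `webgazer` object.
const gazeLog = [];

webgazer
  .setGazeListener((data, elapsedTime) => {
    if (data == null) return; // no prediction yet (e.g. during calibration)
    // data.x / data.y are the predicted screen coordinates in pixels.
    gazeLog.push({ x: data.x, y: data.y, t: elapsedTime });
  })
  .begin();

// Show the red prediction dot described above.
webgazer.showPredictionPoints(true);

// Once the advertisement ends, stop tracking and persist the samples
// (here, to a hypothetical Flask endpoint) for later exploratory analysis.
function finishAdSession() {
  webgazer.pause();
  fetch('/api/gaze', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(gazeLog),
  });
}
```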

Breath Rate Monitoring

While the advertisement is playing, we emit high-frequency audio chirps from the laptop speakers. These chirps reflect off the user's chest and travel back to the laptop's microphone. From the recorded round-trip time and the speed of sound, we calculate the distance from the laptop to the user's chest. As the user breathes, their chest rises and falls, so this distance oscillates; the frequency of that oscillation gives the breathing rate. This information is displayed to the user at the conclusion of the advertisement.
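
The distance and rate calculations can be illustrated with the short sketch below. It assumes the app already has a series of chest-distance samples at a known sample rate; the constants, breathing band limits, and function names are illustrative assumptions, not AdTracker's actual implementation.

```javascript
const SPEED_OF_SOUND = 343; // m/s in air at room temperature

// Convert a round-trip echo delay (seconds) into a one-way chest distance (meters).
function chestDistance(roundTripSeconds) {
  return (SPEED_OF_SOUND * roundTripSeconds) / 2;
}

// Estimate breaths per minute from chest-distance samples taken at `sampleRateHz`,
// by finding the strongest oscillation in a plausible breathing band
// (roughly 0.1–0.7 Hz, i.e. 6–42 breaths per minute).
function estimateBreathingRate(distances, sampleRateHz) {
  const n = distances.length;
  const mean = distances.reduce((a, b) => a + b, 0) / n;
  const centered = distances.map((d) => d - mean);

  let bestFreq = 0;
  let bestPower = -Infinity;

  // Naive discrete Fourier scan restricted to the breathing band.
  for (let freq = 0.1; freq <= 0.7; freq += 0.01) {
    let re = 0;
    let im = 0;
    for (let i = 0; i < n; i++) {
      const angle = (2 * Math.PI * freq * i) / sampleRateHz;
      re += centered[i] * Math.cos(angle);
      im += centered[i] * Math.sin(angle);
    }
    const power = re * re + im * im;
    if (power > bestPower) {
      bestPower = power;
      bestFreq = freq;
    }
  }
  return bestFreq * 60; // breaths per minute
}
```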