To identify a correlation between IMU and microphone data in earable computing with regard to chewing
Abstract
This research explores the correlation between IMU and microphone data recorded during chewing and non-chewing activities using an Arduino microcontroller. Chewing samples are recorded by attaching the microcontroller to the back of the jaw, underneath the ear. The microcontroller collects audio data of chewing sounds from a microphone, as well as motion and orientation data of jaw movements from an Inertial Measurement Unit (IMU).
The collected data is processed with signal processing techniques to extract relevant features: intensity and frequency content from the microphone data, and acceleration, orientation, and jaw-movement patterns from the IMU data. Statistical analysis using correlation metrics, namely the Pearson correlation coefficient and Spearman's rank correlation coefficient, then determines the correlation between the features extracted from the microphone and IMU data.
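As a minimal sketch of the correlation step described above, the two coefficients can be computed with `scipy.stats`. The feature series below are synthetic stand-ins (hypothetical per-window audio energy and gyroscope sums, correlated by construction); in the actual study they would come from the recorded microphone and IMU data.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical per-window feature series; correlated by construction
# to stand in for real extracted microphone and IMU features.
rng = np.random.default_rng(0)
gyro_sum = rng.normal(size=100)
audio_energy = 0.8 * gyro_sum + 0.2 * rng.normal(size=100)

r, r_p = pearsonr(audio_energy, gyro_sum)       # linear correlation
rho, rho_p = spearmanr(audio_energy, gyro_sum)  # rank (monotonic) correlation
print(f"Pearson r = {r:.2f} (p = {r_p:.3g}), Spearman rho = {rho:.2f}")
```

Pearson captures linear relationships, while Spearman only assumes a monotonic one, which makes it more robust to the non-linear coupling between chewing motion and sound intensity.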
Conclusions from the analysis indicate that pre-processing and feature extraction are needed to establish meaningful correlations between the IMU and microphone data. A sliding-window approach shows promising results, particularly in correlating the summed audio energy with the summed gyroscope data in the y- and z-axes. The accelerometer data does not exhibit significant correlations, but its zero crossings can serve as a threshold for detecting the start of chewing events.
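A plain-NumPy sketch of the windowed features mentioned above, under assumed window and hop parameters (the function names and signatures are illustrative, not taken from the study):

```python
import numpy as np

def sliding_windows(x, win, hop):
    """Split a 1-D signal into overlapping windows of length `win`, stepped by `hop`."""
    n = 1 + (len(x) - win) // hop
    return np.stack([x[i * hop : i * hop + win] for i in range(n)])

def audio_energy(audio, win, hop):
    """Short-time energy: sum of squared audio samples per window."""
    return (sliding_windows(audio, win, hop) ** 2).sum(axis=1)

def gyro_sum(gyro_axis, win, hop):
    """Sum of absolute angular rate per window, for one gyroscope axis."""
    return np.abs(sliding_windows(gyro_axis, win, hop)).sum(axis=1)

def zero_crossings(accel):
    """Count sign changes in the accelerometer signal; a rise above a
    baseline count can flag the start of a chewing event."""
    return int(np.sum(np.signbit(accel[:-1]) != np.signbit(accel[1:])))
```

The per-window series from `audio_energy` and `gyro_sum` (computed on the y- or z-axis) can then be fed to the correlation metrics described earlier.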
Furthermore, the findings reveal that food texture and density play a larger role than anticipated in determining the correlation between chewing patterns and sensor data. The research outcomes contribute to various fields, including dentistry, nutrition, and human-computer interaction.