PMID- 38186565
OWN - NLM
STAT- PubMed-not-MEDLINE
LR  - 20240109
IS  - 1530-437X (Print)
IS  - 1530-437X (Linking)
VI  - 23
IP  - 23
DP  - 2023 Dec
TI  - Fine-Grained Intoxicated Gait Classification Using a Bilinear CNN.
PG  - 29733-29748
LID - 10.1109/jsen.2023.3248868 [doi]
AB  - Consuming excessive amounts of alcohol causes impaired mobility and judgment and driving accidents, resulting in more than 800 injuries and fatalities each day. Passive methods to detect intoxicated drivers beyond the safe driving limit can facilitate Just-In-Time alerts and reduce Driving Under the Influence (DUI) incidents. Popularly-owned smartphones are not only equipped with motion sensors (accelerometer and gyroscope) that can be employed for passively collecting gait (walk) data but also have the processing power to run computationally expensive machine learning models. In this paper, we advance the state-of-the-art by proposing a novel method that utilizes a Bi-linear Convolution Neural Network (BiCNN) for analyzing smartphone accelerometer and gyroscope data to determine whether a smartphone user is over the legal driving limit (0.08) from their gait. After segmenting the gait data into steps, we converted the smartphone motion sensor data to a Gramian Angular Field (GAF) image and then leveraged the BiCNN architecture for intoxication classification. Distinguishing GAF-encoded images of the gait of intoxicated vs. sober users is challenging as the differences between the classes (intoxicated vs. sober) are subtle, also known as a fine-grained image classification problem. The BiCNN neural network has previously produced state-of-the-art results on fine-grained image classification of natural images. To the best of our knowledge, our work is the first to innovatively utilize the BiCNN to classify GAF-encoded images of smartphone gait data in order to detect intoxication. Prior work had explored using the BiCNN to classify natural images or explored other gait-related tasks but not intoxication. Our complete intoxication classification pipeline consists of several important pre-processing steps carefully adapted to the BAC classification task, including step detection and segmentation, data normalization to account for inter-subject variability, data fusion, GAF image generation from time-series data, and a BiCNN classification model. In rigorous evaluation, our BiCNN model achieves an accuracy of 83.5%, outperforming the previous state-of-the-art and demonstrating the feasibility of our approach.
FAU - Li, Ruojun
AU  - Li R
AD  - Department of Optical Information, Huazhong University of Science and Technology, Wuhan, China.
AD  - Department of Electrical and Computer Engineering, Worcester Polytechnic Institute (WPI), Worcester, MA, USA.
FAU - Agu, Emmanuel
AU  - Agu E
AD  - Computer Science Department, Worcester Polytechnic Institute, Worcester, MA, USA.
FAU - Sarwar, Atifa
AU  - Sarwar A
AD  - computer science with Magna Cum Laude.
FAU - Grimone, Kristin
AU  - Grimone K
AD  - Ohio State University.
FAU - Herman, Debra
AU  - Herman D
AD  - Department of Psychiatry and Human Behavior and a Research Psychologist in the Behavioral Medicine and Addictions Research group at Butler Hospital.
FAU - Abrantes, Ana M
AU  - Abrantes AM
AD  - Behavioral Medicine and Addictions Research at Butler Hospital and a Professor in the Department of Psychiatry and Human Behavior at the Alpert Medical School of Brown University.
FAU - Stein, Michael D
AU  - Stein MD
AD  - Chair of Health Law, Policy & Management at Boston University.
LA  - eng
GR  - R21 AA025193/AA/NIAAA NIH HHS/United States
PT  - Journal Article
DEP - 20230407
PL  - United States
TA  - IEEE Sens J
JT  - IEEE sensors journal
JID - 101212357
PMC - PMC10769125
MID - NIHMS1948738
OTO - NOTNLM
OT  - Blood Alcohol Content (BAC)
OT  - Convolutional Neural Networks (CNNs)
OT  - Gait Analysis
OT  - Neural Networks
EDAT- 2024/01/08 06:42
MHDA- 2024/01/08 06:43
PMCR- 2024/12/01
CRDT- 2024/01/08 04:26
PHST- 2024/12/01 00:00 [pmc-release]
PHST- 2024/01/08 06:43 [medline]
PHST- 2024/01/08 06:42 [pubmed]
PHST- 2024/01/08 04:26 [entrez]
AID - 10.1109/jsen.2023.3248868 [doi]
PST - ppublish
SO  - IEEE Sens J. 2023 Dec;23(23):29733-29748. doi: 10.1109/jsen.2023.3248868. Epub 2023 Apr 7.
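
Note: the abstract above describes converting each segmented gait step into a Gramian Angular Field (GAF) image before BiCNN classification. The following is a minimal Python sketch of the GAF (summation) encoding only, assuming a 1-D accelerometer-magnitude window per detected step; the function and variable names are illustrative and do not reproduce the authors' pipeline (step detection, normalization, sensor fusion, and the BiCNN itself are omitted).

    # Hedged sketch: Gramian Angular Summation Field (GASF) encoding of one
    # gait-step window, as a standard formulation of the GAF transform.
    # Names are illustrative, not the authors' code.
    import numpy as np

    def gramian_angular_field(x: np.ndarray) -> np.ndarray:
        """Encode a 1-D time series as a GASF image."""
        # Rescale the window to [-1, 1] so arccos is defined.
        x_min, x_max = x.min(), x.max()
        x_scaled = 2.0 * (x - x_min) / (x_max - x_min) - 1.0
        # Polar encoding: sample value -> angle.
        phi = np.arccos(np.clip(x_scaled, -1.0, 1.0))
        # GASF: G[i, j] = cos(phi_i + phi_j).
        return np.cos(phi[:, None] + phi[None, :])

    # Example: one hypothetical step window of 128 accelerometer samples.
    step = np.random.default_rng(0).standard_normal(128)
    gaf_image = gramian_angular_field(step)   # shape (128, 128)

The resulting 2-D image is the kind of input the abstract says is fed to the bilinear CNN for the fine-grained intoxicated-vs-sober classification.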