Featured

CAN-STRESS: A Real-World Multimodal Dataset for Understanding Cannabis Use, Stress, and Physiological Responses

The CAN-STRESS dataset provides multimodal physiological and self-reported data from 82 participants (39 cannabis users and 43 non-users) collected in real-world conditions using Empatica E4 wristbands. Preliminary analysis shows that machine learning models can distinguish cannabis users from non-users with high accuracy, with electrodermal activity and heart rate emerging as key predictors.

Time-Aware Cross-Attention for Multi-Modal Sensor-Based Blood Glucose Forecasting

This paper introduces a multimodal blood glucose forecasting framework that combines time-aware cross-attention with an LSTM to predict glucose levels from continuous glucose monitoring (CGM) data and complementary wearable signals (heart rate, EDA, accelerometry, and diet).
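To illustrate the core idea, here is a minimal sketch of time-aware cross-attention: standard dot-product attention between glucose-history queries and wearable-signal keys, with scores penalized by the time gap between samples. The decay constant `tau` and the exact penalty form are illustrative assumptions, not the paper's published formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def time_aware_cross_attention(q, k, v, t_q, t_k, tau=30.0):
    """Cross-attention whose scores are down-weighted by time gaps.

    q:        (Lq, d) glucose-history queries
    k, v:     (Lk, d) wearable-signal keys/values
    t_q, t_k: timestamps (minutes) for each query/key position
    tau:      decay constant in minutes (hypothetical choice)
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)              # (Lq, Lk) content similarity
    gap = np.abs(t_q[:, None] - t_k[None, :])  # pairwise time gaps
    scores = scores - gap / tau                # penalize temporally distant samples
    weights = softmax(scores, axis=-1)         # rows sum to 1
    return weights @ v                         # (Lq, d) fused context for the LSTM
```

With identical content scores, the penalty makes each query attend mostly to the wearable samples recorded closest in time, which is the motivation for making attention time-aware on irregularly sampled sensor streams.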

LLM-Powered Prediction of Hyperglycemia and Discovery of Behavioral Treatment Pathways from Wearables and Diet

We developed GlucoLens, a framework that takes sensor-driven inputs and uses advanced data processing, large language models, and explainable machine learning models to predict postprandial area under the curve (AUC) and hyperglycemia from diet, physical activity, and recent glucose patterns.
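For context, the postprandial AUC target that GlucoLens predicts can be computed from CGM readings with the trapezoidal rule. This is a generic sketch of that quantity, not the paper's pipeline; the optional baseline subtraction (incremental AUC) is a common variant and an assumption here.

```python
import numpy as np

def postprandial_auc(times_min, glucose_mgdl, baseline=None):
    """Trapezoidal area under the glucose curve after a meal.

    times_min:    minutes since meal start
    glucose_mgdl: CGM readings (mg/dL) at those times
    baseline:     optional fasting level; if given, computes the
                  incremental AUC above baseline (a common variant)
    """
    t = np.asarray(times_min, dtype=float)
    g = np.asarray(glucose_mgdl, dtype=float)
    if baseline is not None:
        g = np.clip(g - baseline, 0.0, None)  # area above baseline only
    # trapezoidal rule: sum of mean segment height * segment width
    return float(np.sum((g[1:] + g[:-1]) / 2.0 * np.diff(t)))
```

For readings of 100, 140, and 120 mg/dL at 0, 30, and 60 minutes, the total AUC is 7500 mg/dL·min, and the incremental AUC above a 100 mg/dL baseline is 1500 mg/dL·min.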