A large sensor foundation model pre-trained on continuous glucose monitor data for diabetes management

Abstract

Continuous glucose monitoring (CGM) combined with AI offers new opportunities for proactive diabetes management through real-time glucose forecasting. However, most existing models are task-specific and lack generalization across patient populations. Inspired by the autoregressive paradigm of large language models, we introduce CGM-LSM, a Transformer decoder-based Large Sensor Model (LSM) pretrained on 1.6 million CGM records from patients with different diabetes types, ages, and genders. We model patients as sequences of glucose time steps to learn latent knowledge embedded in CGM data and apply it to the prediction of glucose readings over a 2-hour horizon. Compared with prior methods, CGM-LSM significantly improves prediction accuracy and robustness, achieving a 48.51% reduction in root mean square error for 1-hour-horizon forecasting and consistent zero-shot prediction performance across held-out patient groups. We analyze model performance variations across patient subgroups and prediction scenarios and outline key opportunities and challenges for advancing CGM foundation models.
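
The autoregressive rollout the abstract describes can be sketched as follows. This is a hypothetical toy, not the CGM-LSM model: a simple weighted-average next-step predictor stands in for the Transformer decoder, and the 5-minute CGM sampling interval is an assumption, but the feedback loop (each prediction is appended to the input sequence) is the same mechanism used to extend a one-step model to a 2-hour horizon.

```python
STEP_MIN = 5                         # assumed CGM sampling interval (minutes)
HORIZON_MIN = 120                    # 2-hour horizon from the abstract
N_STEPS = HORIZON_MIN // STEP_MIN    # 24 autoregressive steps

def next_step(history, k=6):
    """Toy next-reading predictor: recency-weighted average of the last k
    readings. A real LSM would replace this with a decoder forward pass."""
    window = history[-k:]
    weights = list(range(1, len(window) + 1))  # recent readings weigh more
    return sum(w * g for w, g in zip(weights, window)) / sum(weights)

def forecast(history, n_steps=N_STEPS):
    """Roll the one-step model forward, feeding each prediction back in."""
    seq = list(history)
    preds = []
    for _ in range(n_steps):
        g = next_step(seq)
        preds.append(g)
        seq.append(g)                # autoregression: prediction becomes input
    return preds

past = [110, 112, 115, 120, 126, 133]   # recent glucose readings, mg/dL
preds = forecast(past)
print(len(preds))                       # 24 predicted readings over 2 hours
```

The design choice to predict one step at a time and recurse, rather than emit all 24 readings jointly, is what lets a single pretrained next-token model serve arbitrary horizons.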

Date
Feb 4, 2026 12:00 PM — 12:30 PM
Event
EMIL Spring'26 Seminars
Location
Online (Zoom)
Saman Khamesian
Graduate Research Associate

I am a Ph.D. researcher at Arizona State University, specializing in Artificial Intelligence with a focus on health applications. As a Graduate Research Assistant in the EMIL Lab under Dr. Hassan Ghasemzadeh, I work on developing advanced machine learning solutions for Type 1 diabetes management, including personalized glucose forecasting and automated insulin delivery systems.