Data collected through smartwatches can reveal insulin resistance (IR) before it shows up in routine clinical tests, which could enable lifestyle interventions that prevent progression to type 2 diabetes.
A study in Nature showed how patterns in data continuously collected by wearable devices during routine activities could reveal early signs of metabolic dysfunction.
When this data was integrated with routine blood biomarkers and demographic information using a machine learning model, it predicted IR in a way that routine clinic visits might miss.
The findings suggest that collecting data in this way can reveal early physiological stress that intermittent assessments are poorly suited to detect.
“By drawing on continuous signals from daily life, the authors’ approach highlights physiological strain that is invisible to episodic testing,” explained Christopher Hartshorn, PhD, from the National Institutes of Health in an accompanying News & Views article.
“The work raises the possibility that identifying insulin resistance—a key early feature of type 2 diabetes—earlier could enable simpler interventions and, ultimately, reduce the downstream burden of metabolic disease.”
Blood sugar levels can remain in the normal range even as the physiological effort required to maintain them increases, leaving IR under the radar of isolated clinical assessments performed under standardized conditions.
In contrast, smartwatch data captures fluctuations in activity, sleep and cardiovascular function that can reflect, over time, the cumulative demands of metabolic regulation.
To investigate further, a team led by Ahmed Metwally, PhD, who heads the metabolic health AI research program at Google, conducted the WEAR-ME study in 1165 people with a mean body mass index of 28 kg/m², a median age of 45 years and a median hemoglobin A1c of 5.5%.
Using longitudinal signals collected from Fitbit and Google Pixel Watch devices, integrated with demographic characteristics and readily available blood biomarkers, the researchers trained a computational model against the homeostatic model assessment of IR (HOMA-IR).
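HOMA-IR, the study's reference standard, is a simple arithmetic index derived from fasting glucose and fasting insulin. A minimal sketch of the standard formula (the example values are illustrative, not from the study):

```python
def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    """Homeostatic model assessment of insulin resistance (HOMA-IR).

    Standard formula: (fasting glucose [mg/dL] * fasting insulin [uU/mL]) / 405,
    equivalent to (glucose [mmol/L] * insulin [uU/mL]) / 22.5.
    """
    return (fasting_glucose_mg_dl * fasting_insulin_uU_ml) / 405.0

# Illustrative example: fasting glucose 95 mg/dL, fasting insulin 12 uU/mL
value = homa_ir(95, 12)
print(round(value, 2))  # 2.81, just below the 2.9 cut-off used in the study
```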
The initial model identified stable patterns linked to IR and improved predictive performance compared with any single data source alone.
Using a HOMA-IR cut-off of 2.9, the multimodal model, which combined data from the wearable devices with demographic and routine blood biomarker data, detected IR robustly, with an area under the receiver operating characteristic curve (AUROC) of 0.80, a sensitivity of 76%, and a specificity of 84%.
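To make those metrics concrete, sensitivity (the share of true IR cases flagged) and specificity (the share of non-IR cases correctly cleared) follow directly from a confusion matrix at a chosen threshold. A minimal sketch with made-up labels, not the study's data:

```python
def sensitivity_specificity(y_true, y_pred):
    """Compute sensitivity (true-positive rate) and specificity
    (true-negative rate) from binary labels and binary predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical labels (1 = IR by HOMA-IR >= 2.9) and model predictions
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(round(sens, 2), round(spec, 2))  # 0.75 0.83
```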
A wearable foundation model (WFM) was then pretrained on 40 million hours of sensor data and tested on an independent validation cohort of 72 individuals.
A model integrating WFM-derived information with demographic data was more accurate than a demographics-only baseline, with an AUROC of 0.75 versus 0.66.
Adding WFM-derived data to a model that included demographics, fasting glucose and a lipid panel substantially improved performance compared with an identical model without wearable data, raising the AUROC from 0.76 to 0.88.
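At a structural level, the multimodal setup described above amounts to concatenating feature blocks, demographics, blood biomarkers and wearable-derived embeddings, into one input vector before classification. A minimal sketch, with illustrative feature names and values that are not taken from the paper:

```python
def build_feature_vector(demographics, blood_panel, wearable_embedding):
    """Concatenate three modalities into a single input vector,
    mirroring the multimodal setup described in the text."""
    return list(demographics) + list(blood_panel) + list(wearable_embedding)

demo = [45.0, 28.0]           # e.g. age, BMI (illustrative)
blood = [95.0, 180.0, 55.0]   # e.g. fasting glucose, total and HDL cholesterol (illustrative)
wfm = [0.12, -0.40, 0.90]     # hypothetical wearable-derived embedding values
x = build_feature_vector(demo, blood, wfm)
print(len(x))  # 8
```

Dropping the `wearable_embedding` block yields the comparison baseline: the same classifier trained on demographics and blood biomarkers alone.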
“In this study, we present a method for predicting IR using signals derived from a consumer smartwatch, demographics and routinely measured blood biomarkers,” the authors concluded.
“This method has the potential to be scaled to millions of people, and to enable widespread identification of IR.”
