
What my continuous glucose monitor can never capture

by Mark E. Paull

A continuous glucose monitor (CGM) can tell you when my glucose spikes, but not that I delayed correcting it because I was in a high-stakes meeting. It can record the number after I under-bolused for lunch, but not that I did it because I was running on three nights of broken sleep from overnight CGM alarms. It can show a perfect flatline, but not the pre-emptive vigilance, the mental load, the constant micro-calculations it took to get there.

Right now, our technology sees the aftermath of a decision but none of the logic that produced it.

Diabetes is not only metabolic; it’s emotional, cognitive, and situational. Until our systems acknowledge that, they will keep missing the why behind every what — and with it, the real path to adherence and improved clinical outcomes. The gap between success and failure in chronic care often has more to do with a patient’s context than their physiology.

Algorithms, as they exist, are pattern readers, not pattern understanders. They predict the curve but not the cause. They assume I am a consistent, rational actor in a static environment. I am not.

I am a human being making dozens of micro-decisions a day under shifting conditions — weighing hypoglycemia, embarrassment, fatigue, frustration, fear — often in milliseconds. Every data point hides a decision, and every decision hides a state of mind.


We have spent decades teaching machines to model physiology while ignoring psychology. We’ve trained them to learn from glucose curves and insulin units, but not from hesitation, doubt, intuition, or fear. The result is a built-in limitation that makes predictive artificial intelligence fundamentally incomplete.

And that limitation cannot be fixed with more sensors.

The API for human behavior

What’s missing is behavioral intelligence as structured, machine-readable context — the cognitive, emotional, and environmental metadata that drives adherence. This is the API for human behavior, the layer required to close the loop on personalization. It allows a system to move beyond static correction factors and finally account for cognitive load, stress hormones, social pressure, and fatigue — all of which alter the way a person doses, absorbs, or trusts insulin. It would record:

Why did I make that correction?

Why did I wait?

Why did I override the bolus suggestion?

These aren’t random acts. They’re behavioral patterns shaped by emotion, environment, and past experience. Right now, these patterns vanish because our systems aren’t designed to capture them.

But they could be.

Imagine a CGM spike at 2 p.m. Instead of logging only the number, your device — via voice prompt or tap menu — asks a single simple question: “What’s happening?” And you select a tag or record a freeform answer: “Stressful meeting,” “Forgot pre-bolus,” “Pump site feels off,” “Chose to wait — low earlier today.” Each freeform answer would be saved as a new tag, ready to reuse next time.
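A minimal sketch of what such a tagging layer might look like. The class names, fields, and starter tags here are illustrative assumptions, not any vendor’s API — the point is only how little machinery “frictionless tagging” requires:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional, Set

@dataclass
class TagVocabulary:
    """Known context tags; freeform answers grow the vocabulary."""
    tags: Set[str] = field(default_factory=lambda: {
        "stressful meeting", "forgot pre-bolus",
        "pump site feels off", "chose to wait - low earlier today",
    })

    def record(self, answer: str) -> str:
        tag = answer.strip().lower()
        self.tags.add(tag)  # each freeform answer becomes a reusable tag
        return tag

@dataclass
class GlucoseEvent:
    timestamp: datetime
    mg_dl: int
    context_tag: Optional[str] = None

def tag_spike(event: GlucoseEvent, answer: str,
              vocab: TagVocabulary) -> GlucoseEvent:
    """Attach the answer to 'What's happening?' to a spike event."""
    event.context_tag = vocab.record(answer)
    return event
```

One tap or one spoken phrase, and the glucose number carries its context with it — the same record the curve alone could never hold.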

That one tag transforms the data. Repeat it over time, and patterns emerge that no glucose curve alone could reveal: You under-correct on Mondays after weekend lows. You over-bolus in anxious moments. You delay corrections during meetings.

This isn’t journaling. It’s frictionless tagging — the behavioral equivalent of counting steps. The technology exists. What’s missing is the recognition that this data is as clinically meaningful as the glucose reading itself.

Until diabetes technology understands why I do what I do, it will never personalize treatment. It will only personalize numbers.

When I talk about behavioral intelligence, I’m not talking about replacing human judgment. I’m talking about finally letting our lived logic — the real data of self-management — become part of the system.

My six months of lived logic

To demonstrate what this looks like, I built my own dataset — machine-readable behavioral context. For six months, I tagged every bolus and correction with structured metadata: cognitive state (focused, foggy, overwhelmed), emotional state (calm, anxious, frustrated), environmental context (alone, public setting, high-stakes situation), and physiological background (sleep-deprived, well-rested, recovering from a low).
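In code, that structured metadata can be as simple as a record with controlled vocabularies for each dimension. This is a sketch under my own naming assumptions — the field names and class are not a published schema — but it shows why tags, not freetext journals, stay machine-readable:

```python
from dataclasses import dataclass

# Illustrative controlled vocabularies for the four dimensions described
# above; the names are assumptions, not a standard.
COGNITIVE = {"focused", "foggy", "overwhelmed"}
EMOTIONAL = {"calm", "anxious", "frustrated"}
ENVIRONMENT = {"alone", "public setting", "high-stakes situation"}
PHYSIOLOGY = {"sleep-deprived", "well-rested", "recovering from a low"}

@dataclass(frozen=True)
class BolusRecord:
    units: float        # insulin delivered
    cognitive: str
    emotional: str
    environment: str
    physiology: str
    delayed: bool       # was the correction postponed?

    def __post_init__(self):
        # Reject tags outside the controlled vocabulary so the
        # dataset stays analyzable across months of entries.
        assert self.cognitive in COGNITIVE, self.cognitive
        assert self.emotional in EMOTIONAL, self.emotional
        assert self.environment in ENVIRONMENT, self.environment
        assert self.physiology in PHYSIOLOGY, self.physiology
```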

The correlations are unmistakable. When my cognitive load is high and my emotional tag is “anxious” in a public setting, my rate of delayed bolusing increases by 40%. That’s a behavioral biomarker — invisible to the CGM alone.
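A figure like that is just a conditional rate: of all boluses carrying a given tag combination, what fraction were delayed? Assuming records stored as plain dicts with illustrative field names, the computation is filtering and counting:

```python
def delayed_rate(records, **conditions):
    """Fraction of records matching the given tags where the bolus was delayed."""
    matching = [r for r in records
                if all(r.get(key) == value for key, value in conditions.items())]
    if not matching:
        return 0.0
    return sum(r["delayed"] for r in matching) / len(matching)

records = [
    {"emotional": "anxious", "environment": "public setting", "delayed": True},
    {"emotional": "anxious", "environment": "public setting", "delayed": False},
    {"emotional": "calm", "environment": "alone", "delayed": False},
]
rate = delayed_rate(records, emotional="anxious",
                    environment="public setting")  # → 0.5
```

Comparing that rate against a baseline (the same fraction with no conditions) is what turns a pile of tags into a behavioral biomarker.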

The CGM sees a long high. Behavioral intelligence sees a patient delaying correction because of the cognitive demand of a public environment.

The CGM sees a lighter bolus. Behavioral intelligence sees protective behavior after multiple nights of broken sleep.


My data shows that my “cautious correction” pattern appears only when I’m cognitively fatigued — not when I’m anxious. And that I consistently over-bolus in public, driven by fear of visible hyperglycemia. No glucose algorithm could infer that on its own.

This is the kind of intelligence that could make closed-loop systems genuinely adaptive. A pump could learn why I override suggestions — not label them “non-compliance.” It could predict when my 3 a.m. correction is a reasonable choice based on my pattern of morning rebounds.

Clinical and regulatory imperatives

For device makers and investors, behavioral intelligence is not “soft data.” It is the missing variable required for real precision and scalable personalization. Without it, companies will keep hitting ceilings in time-in-range (TIR), because AI cannot learn from any decision it cannot interpret.

Behavioral intelligence is not a feature. It is the risk-mitigation layer. It is the value driver.

For regulators, any device claiming personalization should be required to demonstrate that it incorporates structured behavioral metadata and can show measurable TIR improvement. Without that, “personalized AI” remains a marketing phrase, not clinical reality.

This is the next frontier in digital health — not a new sensor, not a better dashboard, but systems that can interpret behavior as data.

We’ve mastered the science of measurement. Now we need the science of meaning.

Behind every glucose graph is a person doing calculus with their own survival. Until our technology can see the reasons behind the numbers, it will continue to fail the human being behind the machine.

Mark E. Paull is a CME-certified diabetes educator and peer reviewer for Diabetes Care. He has lived with type 1 diabetes since 1967 and writes about the intersection of behavior, data, and care.
