🩸Notion-Based Glucose Dashboard
A visualization of 14 days of my Glucose Data Collection (or yours if you want)
View the progress of your glucose journey at different points with my dashboard:

My earliest childhood memories include pricking my grandfather’s fingers, reading his glucose levels, and grabbing insulin from the fridge to administer before meals when he needed it. This simple task he handed to me made me feel important: I was directly improving his quality of life. I realized healthcare is fundamental to human connection. My grandfather passed away in 2016 from diabetes-related complications. Great guy, not well equipped at managing diabetes.
Recently, Abbott launched the Abbott Lingo on the Amazon digital marketplace: I wrote about some of my thoughts regarding the pricing and volume behind Abbott’s Diabetes sector and its product mix here.
This technology is something that would have changed the way Pop-Pop and I would have interacted. He would have preferred other treats from the fridge.

Clinical researchers, data scientists, and others in the life sciences space stand to benefit greatly if they’re interested in:
- Blood glucose statistics
- Diabetes device utilization
- Blood glucose readings
- Demographics information
- Biometrics information
- Lifestyle data (TBD from Apple Watch)
In November 2024, I purchased an Abbott Lingo. I was excited to have this piece of over-the-counter technology after what seemed like a lifetime of waiting.
Every day I log into LinkedIn and see another “AI-first, no-code/low-code healthcare solution”. So I built this Glucose Dashboard after requesting my data from Abbott (interesting to note that they use Zendesk for their CRM):

This was a simple data request to Abbott’s Lingo Support; they were happy to assist and successfully returned a CSV with the following:
Reading date,Measurement,Unit of Measurement,Lingo Count Targets
2024-11-19T16:30:18.000Z,0.0,mg/dL,60
2024-11-19T16:35:18.000Z,0.0,mg/dL,60
2024-11-19T16:40:18.000Z,0.0,mg/dL,60
2024-11-19T16:45:18.000Z,0.0,mg/dL,60

Not really the formatting I’m looking for, nor particularly friendly with Notion.
It’s a common data cleaning procedure with clinical data to fix timestamp discrepancies to match the preferred visualization output. But how do we ensure that everyone’s on the same page? The Clinical Data Interchange Standards Consortium (CDISC) has implemented the use of the International Organization for Standardization (ISO) format, ISO 8601, for datetimes in Study Data Tabulation Model (SDTM) domains to alleviate the confusion. This has been adopted as an industry standard.
However, converting datetimes between a raw data source’s format and ISO 8601 and back is annoying. I imagine LLMs fixing these issues in codebases being a solution area.
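As a small illustration (a sketch of the round trip, not part of the Abbott pipeline), Python’s standard library handles these ISO 8601 strings directly; the trailing “Z” is the only wrinkle on older Python versions:

```python
from datetime import datetime

# Raw Lingo timestamps arrive in ISO 8601 UTC form, e.g. "2024-11-19T16:30:18.000Z".
raw = "2024-11-19T16:30:18.000Z"

# Python 3.11+ parses the trailing "Z" directly; swapping it for an
# explicit offset keeps this working on older versions too.
dt = datetime.fromisoformat(raw.replace("Z", "+00:00"))

print(dt.date())             # the date component
print(dt.strftime("%H:%M"))  # the 24-hour time component
print(dt.isoformat())        # back to an ISO 8601 string
```

Going the other way, `isoformat()` always emits a compliant string, which is why CDISC’s choice makes round-tripping at least mechanically simple.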
Let’s generate a script to fix this to separate out the date and time:
import csv
from datetime import datetime

def convert_date_to_24hr_time():
    # Update file paths if needed
    input_file_path = "/Users/prahlaad/Desktop/GlucoseDataPR.csv"
    output_file_path = "/Users/prahlaad/Desktop/GlucoseDataPR_new.csv"

    with open(input_file_path, mode='r', newline='', encoding='utf-8') as infile:
        reader = csv.DictReader(infile)

        # Final CSV columns:
        fieldnames = [
            "Reading date (original)",
            "Date only",
            "Time only",
            "Measurement"
        ]

        with open(output_file_path, mode='w', newline='', encoding='utf-8') as outfile:
            writer = csv.DictWriter(outfile, fieldnames=fieldnames)
            writer.writeheader()

            for row in reader:
                original_datetime = row["Reading date"]  # Keep as-is
                measurement_value = row["Measurement"]   # Keep as-is

                # Try to parse the original date/time string
                try:
                    dt = datetime.strptime(original_datetime, "%Y-%m-%dT%H:%M:%S.%fZ")
                    # Format the date as MM-DD-YYYY
                    date_str = dt.strftime("%m-%d-%Y")
                    # Format the time in 24-hour HH:MM
                    time_str = dt.strftime("%H:%M")
                except ValueError:
                    # If parsing fails, leave them blank or fallback
                    date_str = ""
                    time_str = ""

                # Construct the new row
                new_row = {
                    "Reading date (original)": original_datetime,
                    "Date only": date_str,
                    "Time only": time_str,
                    "Measurement": measurement_value
                }
                writer.writerow(new_row)

    print(f"✅ Successfully created: {output_file_path}")

if __name__ == "__main__":
    convert_date_to_24hr_time()

Perfect, now it shows up in a better format.
Now we have something that works better with Notion, with a separate column for time:
Reading date (original),Date only,Time only,Measurement
2024-11-19T16:30:18.000Z,11-19-2024,16:30,0.0
2024-11-19T16:35:18.000Z,11-19-2024,16:35,0.0
2024-11-19T16:40:18.000Z,11-19-2024,16:40,0.0
2024-11-19T16:45:18.000Z,11-19-2024,16:45,0.0

The key is being able to represent these data points in an insightful and accessible way. This is a presentation of my own glucose monitor data, collected from my Abbott Lingo:

The Lingo reads your glucose levels every 5 minutes. From this data file, the dashboard displays:
- 14 Day Average from 11/19 - 12/3: Your average glucose level in mg/dL per day
- Week 1 Average view from 11/19 - 11/26: Your average Week 1 glucose level in mg/dL per day
- Week 2 Average view from 11/26 - 12/3: Your average Week 2 glucose level in mg/dL per day
- AM View: Precise glucose level from 12 AM to 8 AM (30 min increments to fit more time)
- PM View: Precise glucose level from 8 PM to 8 AM (30 min increments to fit more time)
- 1 Hour View: Precise glucose level from start of the hour to the end of the hour exclusive (Starts at 8 AM)
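Notion’s chart blocks compute these averages inside the dashboard, but if you want the same numbers outside Notion, a minimal sketch might look like this (`daily_averages` is my own helper operating on the cleaned CSV above, and skipping the sensor’s 0.0 warm-up rows is an assumption on my part):

```python
import csv
from collections import defaultdict
from statistics import mean

def daily_averages(path):
    """Average the 'Measurement' column per 'Date only' in the cleaned CSV."""
    by_day = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            value = float(row["Measurement"])
            if value > 0:  # the sensor logs 0.0 while warming up; skip those
                by_day[row["Date only"]].append(value)
    return {day: mean(values) for day, values in by_day.items()}

# Example: averages = daily_averages("GlucoseDataPR_new.csv")
```

Grouping the same dictionary by week instead of by day gives the Week 1 / Week 2 views.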
Try My CGM Dashboard Below:
But this is just my own glucose data. I find myself frequently visiting and revisiting ideas and concepts related to giving patients more insights into their own health data for micro-decision making.
Maybe you’d like to get a chance to upload your own and view it here in my Notion Glucose Dashboard. Try it out on your own!
You can import a CSV to the page here and then use the Notion chart below to view your averages across different scopes of time.
My glucose data is currently filled in the database as a placeholder, but you can paste yours in after applying the script above to your Abbott Lingo CSV.
Check out a visualization in the variance of my day to day glucose: https://matrices.com/share/graph/84f45057-c2c0-4ce8-aa28-1f49dbeb8bcb
Matrices is a quick way to explore, visualize, and share large datasets.
This is TBD pending my Apple Watch data pull and more testing, but the idea here is visualizing how exercise and diet flatten the variance over time to improve A1c. (I exercised on some days and ate poorly on others; can you guess which?)
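One common way to quantify that “flattening” is the coefficient of variation, %CV = (standard deviation / mean) × 100, which CGM consensus guidance often cites with a stability target of roughly 36% or below. A quick sketch with made-up readings (not my actual data):

```python
from statistics import mean, pstdev

def coefficient_of_variation(readings):
    """%CV = (standard deviation / mean) * 100 for a list of mg/dL readings."""
    return pstdev(readings) / mean(readings) * 100

steady_day = [95, 100, 105, 98, 102]   # flat curve: low %CV
spiky_day = [60, 180, 70, 190, 80]     # big swings: much higher %CV
print(coefficient_of_variation(steady_day))
print(coefficient_of_variation(spiky_day))
```

Running this per day over the cleaned CSV would make the exercise-versus-poor-diet days easy to spot at a glance.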
- You can either use a tool like Excel and sacrifice speed with hundreds of thousands of rows, or you can use SQL or code and settle for a glacial iteration speed between queries. More enterprisey tools like Tableau help with this, but they focus primarily on reporting, and exploration isn't easy.
- Matrices seeks to address these problems by...
- It's free to use, all of your data is stored locally until you hit 'publish'.
Do we need to keep building and maintaining new data infrastructure when a good analyst could do it with one SQL query in a structured environment?
I fear we’ve downplayed the importance of understanding who can get useful data in a pragmatic manner. I haven’t even been in the industry for long, and yet I constantly see “let’s create a new standard infrastructure where data folks can find their stuff more easily.”
We then end up back at square one (or zero), where you still need someone who understands the data, digests it, and displays it intelligently.
Let’s chat.