We’ve already seen standalone apps like SNAQ, which lets you snap a photo of your food and learn how it might affect your glucose levels. But now, these tools are making their way directly into the automated insulin delivery (AID) systems people with diabetes use every day. Loop, the open-source DIY AID system that works with Dexcom CGMs and Omnipod Dash insulin pumps, now has an experimental AI-powered food search built right into the app.
The new feature, developed by Loop user Taylor Patterson, uses large language models from Anthropic (Claude) or OpenAI to process images, barcodes, and even text or voice prompts. In seconds, it identifies foods, estimates portion sizes, and provides nutritional breakdowns with diabetes-oriented notes on glycemic index, fat-protein units, and absorption time. For people who never received in-depth carb counting education, or who simply get stumped at restaurants, this kind of real-time feedback could be a game-changer for confidence and time in range.
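To make that workflow concrete, here is a minimal Python sketch of how such a pipeline might be structured: a diabetes-oriented prompt goes to a vision-capable LLM, and the model's JSON reply is parsed into typed nutrition fields. The prompt wording, field names, and response shape here are my own illustrative assumptions, not Patterson's actual implementation.

```python
import json
from dataclasses import dataclass

# Hypothetical prompt asking the model for a structured, diabetes-oriented
# breakdown. A real integration would attach the food photo or barcode data
# alongside this text in the API request.
FOOD_PROMPT = (
    "Identify the food in this image. Reply only with JSON containing: "
    "name, estimated_carbs_g, glycemic_index, fat_protein_units, "
    "absorption_time_hours."
)

@dataclass
class FoodEstimate:
    """Structured result parsed from the model's reply (illustrative fields)."""
    name: str
    carbs_g: float
    glycemic_index: int
    fat_protein_units: float
    absorption_hours: float

def parse_food_estimate(reply: str) -> FoodEstimate:
    """Parse the model's JSON reply into a typed estimate."""
    data = json.loads(reply)
    return FoodEstimate(
        name=data["name"],
        carbs_g=float(data["estimated_carbs_g"]),
        glycemic_index=int(data["glycemic_index"]),
        fat_protein_units=float(data["fat_protein_units"]),
        absorption_hours=float(data["absorption_time_hours"]),
    )

# Example of the kind of reply a model might return for a slice of pizza:
sample = (
    '{"name": "cheese pizza slice", "estimated_carbs_g": 35, '
    '"glycemic_index": 60, "fat_protein_units": 2, '
    '"absorption_time_hours": 4}'
)
estimate = parse_food_estimate(sample)
print(estimate.name, estimate.carbs_g)
```

Keeping the model's output constrained to JSON is what makes this kind of feature practical inside an app like Loop: the free-form reasoning of the LLM gets reduced to a handful of numbers the bolus calculator can actually use.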
I dove into this project on the latest episode of the Diabetech Podcast with Taylor Patterson himself. We talked about how he built the feature in just 30 minutes using AI, what it can do today, and why AI could reshape how we learn about food and diabetes. You can listen to the full episode up top or watch the video version at the bottom of this page.
Want more?
For the latest diabetes tech, join our free newsletter.
If you’re enjoying our content, consider joining Diabetech All Access—our premium membership with exclusive stories, Live Q&As, and industry analysis. Your support helps sustain our independent journalism and keeps this platform thriving.
Disclaimer: Diabetech content is not medical advice—it’s for educational purposes only. Always consult with a physician before making changes to your healthcare.