CONTEXT
Carbs & Cals is a nutrition app that helps people, especially those managing diabetes, track their meals with visual portion guides and accurate nutritional data.
But as our user base grew, so did a common pain point: logging food was often slow, repetitive, and incomplete.
3 main problems
Even with a database of over 270,000 foods, users were hitting friction at nearly every step. Through research, we identified three recurring challenges:
Logging Fatigue
Entering or searching for meals multiple times a day felt repetitive and discouraging.
Database Gaps
Despite the scale of our food library, packaged products from certain regions or smaller brands were still missing, frustrating users who rely on precise tracking.
Irrelevant Recents
The existing “Recents” section simply listed viewed items in order, ignoring context like time of day, frequency, or meal type.
02
Outcomes → 3 AI-Powered Features, Mindfully Crafted
Rather than adding AI for the sake of it, we focused on where it could make a real difference: reducing friction, saving time, and personalising the experience.
By combining behavioural insights with intelligent automation, we reimagined three key moments of the user journey, resulting in three new features.
FEATURES SHIPPED
🎯
Label Scanner
Using AI to extract nutritional info from packaged foods
🎯
Smart Suggestions
Transforming “Recents” into personalised meal recommendations.
🎯
Meal Scanner
Logging meals with a photo. ✍️ WIP: User Testing Phase
outcomes
Each feature tackled a specific user frustration, but together they reshaped the entire food-logging experience, making it smarter, faster, and more human.
Users report that logging meals feels quicker and more intuitive.
Feedback suggests less frustration when foods are missing from the database.
The AI features help users explore new foods and stay consistent with tracking.
Internal testing indicates that time saved per meal is noticeable, and early adoption of these features is encouraging.
The Features in depth
Label Scanner: Logging Packaged Food Instantly
CONTEXT & PROBLEM
Carbs & Cals users relied on a Barcode Scanner to log packaged foods, and it worked well → as long as the product existed in our provider’s database.
User research showed that missing data was one of the top reasons users dropped off. People living outside the UK struggled the most, since our database primarily covered British brands.
We explored integrating open food databases, but they had trade-offs in accuracy and scalability.
That’s when we decided to explore extracting nutritional data directly from the source: the food label itself.
“I can’t find half of my local supermarket’s items.”
Anonymous User
process
Nutrition labels are tricky. They’re not just text: they mix layouts, numbers, and units that shift meaning based on context.
The first proposal used a compressed model (LiLT) that understood visual layouts and text simultaneously. In lab testing, it achieved 86% accuracy, but once real-world photos entered the mix (poor lighting, blurry text...), OCR quality dropped, reducing accuracy to around 70%. Instead of spending months debugging OCR noise, we pivoted to a larger off-device LLM, using prompt engineering to handle complex nutritional layouts.
We achieved 95% accuracy, multilingual support, and 1–2 second response times.
The user confirms the name and optionally uploads a photo of the product.
It’s simple, intuitive, and gives users control without requiring them to type or search.
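To make the flow above concrete, here is a minimal Python sketch of how an LLM’s structured reply for a scanned label might be validated before the user confirms it. The response shape, field names, and sanity checks are hypothetical illustrations, not the production pipeline:

```python
import json

# Hypothetical shape of the LLM's structured reply for a scanned label.
SAMPLE_RESPONSE = """
{
  "product_name": "Oat Biscuits",
  "basis": "per_100g",
  "nutrients": {"energy_kcal": 450, "carbs_g": 62.0, "fat_g": 18.5, "protein_g": 7.2}
}
"""

REQUIRED = ("energy_kcal", "carbs_g", "fat_g", "protein_g")

def parse_label(raw: str) -> dict:
    """Validate the extracted label and reject implausible values."""
    data = json.loads(raw)
    nutrients = data["nutrients"]
    for key in REQUIRED:
        value = nutrients.get(key)
        if value is None or value < 0:
            raise ValueError(f"missing or invalid nutrient: {key}")
    # Sanity check: macros per 100 g cannot exceed 100 g in total.
    if data["basis"] == "per_100g":
        macros = nutrients["carbs_g"] + nutrients["fat_g"] + nutrients["protein_g"]
        if macros > 100:
            raise ValueError("macros exceed 100 g per 100 g of product")
    return data

label = parse_label(SAMPLE_RESPONSE)
print(label["product_name"], label["nutrients"]["carbs_g"])
```

Only results that pass these checks would be shown to the user for confirmation; anything else falls back to a retry or manual entry.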
outcomes
In less than a quarter, we shipped a high-impact, low-risk feature:
✅
95% accuracy
in parsing nutrition labels
🌐
Multilingual Support
unlocked for global users
🚀
Time-to-market reduced by months
vs. database expansion
This project reinforced a key product lesson: AI only adds value when it reduces friction. Instead of chasing novelty, we designed a practical solution that scaled.
WHAT WE DID
Smart Suggestions was our first AI feature, replacing the old Recents list.
The challenge was to make recommendations that actually mattered to users. By analysing habits in different meal buckets (breakfast, lunch, dinner, and snacks), we created an algorithm that combines frequency, time of day, recent views, and individual behaviour. This replaced a static list with a dynamic, personalised experience. The result is faster, more relevant logging and a system that adapts to each user’s habits.
the challenge
Carbs & Cals users fall into two main behavioural groups:
📝
Food Loggers
who track meals daily across breakfast, lunch, dinner, and snacks.
👀
Food Viewers
who browse foods to learn about their carb or calorie content, without logging.
A single logic for both groups wouldn’t work: loggers needed context-aware recommendations, while viewers needed relevant reminders based on browsing patterns.
Behind the scenes
Our challenge was to design an algorithm (and UI) that felt personally useful to both types of users.
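As an illustration of the idea (the weights, signals, and bucket boundaries here are hypothetical, not the shipped algorithm), a Python sketch of blending frequency, time-of-day fit, and recency into a single suggestion score:

```python
from datetime import datetime, timedelta

# Hypothetical scoring weights; the real algorithm and signals differ.
W_FREQUENCY, W_BUCKET, W_RECENCY = 0.5, 0.3, 0.2

def meal_bucket(hour: int) -> str:
    """Map an hour of the day to a meal bucket."""
    if 5 <= hour < 11:
        return "breakfast"
    if 11 <= hour < 15:
        return "lunch"
    if 17 <= hour < 22:
        return "dinner"
    return "snacks"

def score(item: dict, now: datetime) -> float:
    """Blend how often, when, and how recently a food was used."""
    freq = min(item["uses_last_30_days"] / 30, 1.0)
    bucket = 1.0 if item["usual_bucket"] == meal_bucket(now.hour) else 0.0
    days_ago = (now - item["last_seen"]).days
    recency = max(0.0, 1 - days_ago / 14)  # signal fades over two weeks
    return W_FREQUENCY * freq + W_BUCKET * bucket + W_RECENCY * recency

now = datetime(2024, 5, 1, 8, 30)  # breakfast time
porridge = {"usual_bucket": "breakfast", "uses_last_30_days": 20,
            "last_seen": now - timedelta(days=1)}
pizza = {"usual_bucket": "dinner", "uses_last_30_days": 4,
         "last_seen": now - timedelta(days=10)}
ranked = sorted([("porridge", porridge), ("pizza", pizza)],
                key=lambda kv: score(kv[1], now), reverse=True)
print([name for name, _ in ranked])  # porridge ranks first at breakfast
```

The same scoring shape can serve both groups by swapping the frequency signal: logged meals for loggers, viewed foods for view-only users.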
outcomes
🎯
28% faster logging
Users logged meals faster compared to the old recents list
🎯
35% more interaction
“View-only” users interacted with food more often
Smart Suggestions became a subtle but powerful feature → one that showed how AI can enhance everyday usability without drawing attention to itself.
It set the foundation for later AI features such as the Meal Scanner and Label Scanner, proving that thoughtful personalisation starts with understanding different user intentions, not just adding complexity.
03
Key Learnings
These projects reinforced that AI should always solve real user problems, rather than being implemented for the sake of technology.
Iteration and testing were critical to ensuring that solutions aligned with user needs, business goals, and technical constraints.
Finally, close collaboration with cross-functional teams was essential to create technically feasible solutions that were also intuitive and user-friendly.