CONTEXT
Carbs & Cals is an app that helps people managing diabetes track their meals with visual portion guides and nutritional data.
But as our user base grew, so did a common pain point: logging food was slow, repetitive, and incomplete.
3 main problems
Even with a database of over 270,000 foods, users were still hitting friction. Through research (recurring surveys, diary studies and user interviews), we identified three main challenges:
Logging Fatigue
Entering or searching for meals multiple times a day felt repetitive and discouraging.
Database Gaps
Despite our extensive food library, foods from certain regions and smaller brands were still missing, which was frustrating for users, especially those who don’t live in the UK.
Irrelevant Recents
The existing “Recents” section simply listed viewed items in order, ignoring context like time of day, frequency, or meal type.
02
Solution: 3 Individual AI-Powered Features
Rather than adding AI for the sake of it, we focused on where it could make a real difference: reducing friction, saving time and personalising the experience.
Based on the 3 problems we identified, we used AI to reimagine key moments of the user journey. Our initial idea was a chatbot, but after a few rounds of user testing it evolved into 3 separate features.
FEATURES SHIPPED
🧃
Label Scanner
Using AI to extract nutritional info from packaged foods
⭐️
Smart Suggestions
Transforming “Recents” into personalised meal recommendations.
📷
Meal Scanner
Using AI to log meals with a photo. ✍️ WIP: User Testing Phase
outcomes
Together, these features changed the food-logging experience, making it smarter, faster and more human.
Users reported that logging meals feels quicker and more intuitive.
Helpdesk feedback showed less frustration when foods are missing from the database.
Beta testing of the Meal Scanner indicated that users saved more time per meal, and early adoption after shipping was encouraging.
03
Project Details
The Features in depth
Label Scanner: Logging Packaged Food Instantly
CONTEXT & PROBLEM
Carbs & Cals users relied on a Barcode Scanner to log packaged foods. It worked well, but only as long as the product existed in the database.
User research showed that missing data was one of the top reasons users dropped off. People living outside the UK were the ones struggling the most, as our database primarily covered British brands.
We explored integrating open food databases, but they came with trade-offs in accuracy and scalability.
That’s when we decided to explore extracting nutritional data directly from the source: the food label itself.
“I can’t find half of my local supermarket’s items.”
Anonymous User
process
However, nutrition labels are tricky. They’re not just text: they mix layouts, numbers and units whose meaning changes with context.
The first proposal used a compressed model (LiLT) that understood visual layouts and text simultaneously. In internal testing it achieved 86% accuracy, but once real-world photos entered the mix (poor lighting, blurry text...), accuracy dropped to around 70%. Instead of spending months debugging, the dev team pivoted and tested a larger off-device LLM, using prompt engineering to handle complex nutrition-label layouts.
We achieved: 95% accuracy, multilingual support and 1–2 second response times.
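The exact model and prompt aren’t public, so the snippet below is only a minimal sketch of the general pattern described above: sending a label photo to a vision-capable LLM with a tightly constrained prompt and parsing a structured result. The vendor, model name and field list are all illustrative assumptions.

```python
# Hypothetical sketch only: the team's actual model, vendor, prompt and schema
# are not disclosed. OpenAI's chat completions API is used here purely as an
# example of the "off-device LLM + prompt engineering" approach.
import base64
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = (
    "You are given a photo of a food nutrition label. "
    "Return ONLY a JSON object with these keys (values per 100g, numbers or null): "
    "energy_kcal, carbs_g, sugars_g, fat_g, saturates_g, protein_g, salt_g."
)

def extract_nutrition(image_path: str) -> dict:
    """Send the label photo to the model and parse the JSON it returns."""
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder for any vision-capable model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": PROMPT},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
        response_format={"type": "json_object"},  # force machine-readable output
        temperature=0,  # parsing, not creative writing
    )
    return json.loads(response.choices[0].message.content)

# e.g. nutrition = extract_nutrition("label_photo.jpg")
```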
After the scan, the user confirms the product name and can optionally upload a photo of the product.
It’s simple, intuitive, and gives users control without requiring them to type or search. (The image below is blurred for NDA reasons.) 👇
outcomes
In less than a quarter, we shipped a high-impact, low-risk feature:
✅
95% accuracy
in parsing nutrition labels
🌐
Multilingual Support
unlocked for global users
🚀
Time-to-market reduced by months
vs. database expansion
This project reinforced a key product lesson: AI only adds value when it reduces friction. Instead of chasing novelty and working for months on a complex AI feature, we designed a practical solution in one month.
WHAT WE DID
Smart Suggestions was our first AI feature, with the main goal of replacing the old Recents list.
The challenge was to make recommendations that actually mattered to users. By analysing habits in different meal buckets (breakfast, lunch, dinner, and snacks), we created an algorithm combining frequency, time of day, recent views, and individual behaviour. This replaced a static list with a dynamic, personalised experience: logging became faster and more relevant, and the list now adapts to each user’s habits.
the challenge
Carbs & Cals users fall into two main groups:
📝
Food Loggers
who track meals daily across breakfast, lunch, dinner, and snacks.
👀
Food Viewers
who browse foods to learn about their carb or calorie content, without logging.
A single logic for both groups wouldn’t work: loggers needed context-based recommendations, while viewers needed relevant reminders based on their browsing patterns.
Behind the scenes
Our challenge was to design an algorithm (and UI) that felt personally useful to both types of users.
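The production weights and data model aren’t shared here, but a minimal sketch of how such a ranking might blend frequency, recency, time of day and meal bucket, while still counting view-only behaviour, could look like this (every name and number below is an illustrative assumption):

```python
# Hypothetical sketch of a Smart Suggestions ranking score. The real algorithm,
# weights and data model are not public; this only illustrates how frequency,
# recency, time of day and meal bucket could be blended into one score.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class FoodHistory:
    food_id: str
    times_logged: int           # how often the user logged this food
    times_viewed: int           # how often the user only viewed it
    last_interaction: datetime  # most recent log or view
    usual_meal: str             # "breakfast" | "lunch" | "dinner" | "snack"

def current_meal_bucket(now: datetime) -> str:
    """Rough mapping from clock time to a meal bucket (illustrative thresholds)."""
    h = now.hour
    if h < 11:
        return "breakfast"
    if h < 15:
        return "lunch"
    if h < 21:
        return "dinner"
    return "snack"

def suggestion_score(item: FoodHistory, now: datetime) -> float:
    """Blend habit signals; higher scores surface earlier in Smart Suggestions."""
    # Loggers and viewers both count, but logging weighs more than browsing.
    frequency = item.times_logged + 0.3 * item.times_viewed
    # Recency decays smoothly as the last interaction gets older.
    days_ago = (now - item.last_interaction) / timedelta(days=1)
    recency = 1.0 / (1.0 + days_ago)
    # Boost foods the user usually has at this time of day.
    meal_match = 1.5 if item.usual_meal == current_meal_bucket(now) else 1.0
    return frequency * recency * meal_match

def rank_suggestions(history: list[FoodHistory], now: datetime, top_n: int = 10) -> list[FoodHistory]:
    return sorted(history, key=lambda i: suggestion_score(i, now), reverse=True)[:top_n]
```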
outcomes
🎯
28% faster logging
Users logged meals faster compared to the old Recents list
🎯
35% more interaction
“View-only” users interacted with food more often
Smart Suggestions became a subtle but powerful feature, one that showed how AI can improve everyday usability without drawing attention to itself.
Meal Scanner: Coming Soon! Currently in Beta Testing 🧪
04
Key Learnings
These projects confirmed that AI should always solve real user problems, rather than just being implemented for the sake of technology.
Iteration and testing were critical to make sure that solutions aligned with user needs, business goals and technical constraints.
Finally, close collaboration with devs and QA was essential to create technically feasible solutions that were also intuitive and user-friendly.