Judit Salvadó
💻 Currently @Pocketworks

Making food logging smarter with AI for people living with diabetes

AI-powered features that save time, reduce friction and make healthy habits easier for more than 70,000 active monthly users.
ROLE Lead Product Designer
CLIENT Carbs & Cals
2024 - present
TEAM PO, Flutter Devs, Growth Manager, QA & UX Designers
Smart Suggestions
Label Scanner
Meal Scanner

01

Starting Point: Why we needed AI

CONTEXT Carbs & Cals is an app that helps people managing diabetes track their meals with visual portion guides and nutritional data.

But as our user base grew, so did a common pain point: logging food was slow, repetitive, and incomplete.
3 MAIN PROBLEMS Even with a database of over 270,000 foods, users were hitting friction. Through research (recurring surveys, diary studies and user interviews), we identified three main challenges:

Logging Fatigue

Entering or searching for meals multiple times a day felt repetitive and discouraging.

Database Gaps

Despite our extensive food library, foods from certain regions or smaller brands were still missing. This was frustrating for some users, especially those living outside the UK.

Irrelevant Recents

The existing “Recents” section simply listed viewed items in order, ignoring context like time of day, frequency, or meal type.

02

Solution: 3 Individual AI-Powered Features

Rather than adding AI for the sake of it, we focused on where it could make a real difference: reducing friction, saving time and personalising the experience.

Based on the three problems we identified, we used AI to reimagine key moments of the user journey. Our initial idea was a chatbot, but after a few rounds of user testing it evolved into three new features.
FEATURES SHIPPED

🧃

Label Scanner

Using AI to extract nutritional info from packaged foods

⭐️

Smart Suggestions

Transforming “Recents” into personalised meal recommendations.

📷

Meal Scanner

Using AI to log meals with a photo. ✍️ WIP: User Testing Phase
OUTCOMES All these features changed the entire food-logging experience, making it smarter, faster and more human.

  • Users reported that logging meals feels quicker and more intuitive.
  • Feedback from Helpdesk showed less frustration when foods are missing from the database.
  • Beta testing of the Meal Scanner showed meaningful time savings per meal, and after shipping, early adoption of these features was encouraging.

03

Project Details

The Features in depth

Label Scanner: Logging Packaged Food Instantly

CONTEXT & PROBLEM
Carbs & Cals users relied on a Barcode Scanner to log packaged foods. It worked well, but only as long as the product existed in the database.

User research showed that missing data was one of the top reasons users dropped off. People living outside the UK were the ones struggling the most, as our database primarily covered British brands.

We explored integrating open food databases, but they had trade-offs in accuracy and scalability.

That’s when we decided to explore extracting nutritional data directly from the source: the food label itself.
“I can’t find half of my local supermarket’s items.”
Anonymous User
PROCESS However, nutrition labels are tricky. They’re not just text: they mix layouts, numbers and units whose meaning changes with context.

The first proposal used a compressed model (LiLT) that understood visual layouts and text simultaneously.
In internal testing it achieved 86% accuracy, but once real-world photos entered the mix (poor lighting, blurry text...), accuracy dropped to around 70%.

Instead of spending months debugging, the dev team pivoted and tested a larger off-device LLM, using prompt engineering to handle complex nutritional layouts.

We achieved:
95% accuracy, multilingual support and 1–2 second response times.

Behind the scenes

From a UX point of view, we designed a lightweight flow:

  1. User snaps a photo of a nutrition label.
  2. AI extracts key nutritional values (carbs, calories, fat, saturated fat, etc.).
  3. The user confirms the name and optionally uploads a photo of the product.

It’s simple, intuitive, and gives users control without requiring them to type or search.
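The confirmation step only feels effortless if the model returns clean, structured values. Here is a minimal Python sketch of what such a validation layer could look like; the JSON schema, field names and `parse_label_response` helper are hypothetical illustrations, not the production code:

```python
import json

# Fields the label reader is expected to return (hypothetical schema).
REQUIRED_FIELDS = ("carbs_g", "calories_kcal", "fat_g", "saturated_fat_g")

def parse_label_response(raw: str) -> dict:
    """Validate the LLM's JSON reply before showing it to the user.

    Returns the nutrition dict, or raises ValueError so the UI can ask
    the user to retake the photo instead of silently logging bad data.
    """
    data = json.loads(raw)
    nutrition = {}
    for field in REQUIRED_FIELDS:
        value = data.get(field)
        if not isinstance(value, (int, float)) or value < 0:
            raise ValueError(f"Missing or invalid value for {field!r}")
        nutrition[field] = float(value)
    # Basic sanity check: saturated fat can never exceed total fat.
    if nutrition["saturated_fat_g"] > nutrition["fat_g"]:
        raise ValueError("Saturated fat exceeds total fat")
    return nutrition
```

Guarding the model output this way keeps the user in control: a bad photo leads to a retry prompt rather than a wrong entry in the food diary.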
(the image below is blurred due to NDA reasons) 👇
OUTCOMES In less than a quarter, we shipped a high-impact, low-risk feature:

95% accuracy

in parsing nutrition labels

🌐

Multilingual Support

unlocked for global users

🚀

Time-to-market reduced by months

vs. database expansion
This project reinforced a key product lesson: AI only adds value when it reduces friction.
Instead of chasing novelty and spending months on a complex AI feature, we designed a practical solution in one month.

Smart Suggestions: Personalised Meal Recommendations

WHAT WE DID Smart Suggestions was our first AI feature, with the main goal of replacing the old Recents list.

The challenge was to make recommendations that actually mattered to users. By analysing habits in different meal buckets (breakfast, lunch, dinner, and snacks), we created an algorithm combining frequency, time of day, recent views, and individual behaviour.

This replaced a static list with a dynamic, personalised experience: faster, more relevant logging and a list that adapts to each user’s habits.
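A ranking that combines those signals can be sketched as a simple weighted score. This is an illustrative Python sketch under assumed weights and signals; the real algorithm and its tuning are not public:

```python
import math

# Illustrative weights; the production values were tuned separately.
W_FREQUENCY, W_RECENCY, W_MEAL_MATCH = 0.5, 0.3, 0.2

def suggestion_score(times_logged: int, days_since_last_view: float,
                     logged_in_current_meal_slot: bool) -> float:
    """Rank a food for the Smart Suggestions list (hypothetical sketch).

    Combines how often the user logs the item, how recently they viewed
    it, and whether it matches the current meal bucket (e.g. breakfast).
    """
    frequency = 1 - math.exp(-times_logged / 5)   # saturates for staples
    recency = 1 / (1 + days_since_last_view)      # decays over time
    meal_match = 1.0 if logged_in_current_meal_slot else 0.0
    return (W_FREQUENCY * frequency
            + W_RECENCY * recency
            + W_MEAL_MATCH * meal_match)
```

Under this scoring, a porridge logged every morning outranks a one-off item browsed last week, which is the behaviour the old chronological Recents list could never produce.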
THE CHALLENGE Carbs & Cals users fall into two main groups:

📝

Food Loggers

who track meals daily across breakfast, lunch, dinner, and snacks.

👀

Food Viewers

who browse foods to learn about their carb or calorie content, without logging.
A single logic for both groups wouldn’t work: Loggers needed context-based recommendations, while Viewers needed relevant reminders based on browsing patterns.

Behind the scenes

Our challenge was to design an algorithm (and UI) that felt personally useful to both types of users.
OUTCOMES

🎯

28% faster logging

Users logged meals faster compared to the old recents list

🎯

35% more interaction

“View-only” users interacted with food more often
Smart Suggestions became a subtle but powerful feature → one that showed how AI can have an impact on everyday usability without drawing attention to itself.

Meal Scanner: Coming Soon! Currently in Beta Testing 🧪

04

Key Learnings

These projects confirmed that AI should always solve real user problems, rather than just being implemented for the sake of technology.

Iteration and testing were critical to make sure that solutions aligned with user needs, business goals and technical constraints.

Finally,
close collaboration with devs and QA was essential to create technically feasible solutions that were also intuitive and user-friendly.
You’ve reached the end!

Thanks for your attention :)

Back to the top