AI nutrition tracking stinks

This is Optimizer, a weekly newsletter sent every Friday from Verge senior reviewer Victoria Song that dissects and discusses the latest phones, smartwatches, apps, and other gizmos that swear they’re going to change your life. Optimizer arrives in our subscribers’ inboxes at 10AM ET. Opt in for Optimizer here.

Once again, AI is failing to deliver on some of its promises.

Before my last long run, I made my customary preworkout breakfast. Two dark chocolate Kodiak protein waffles, a tablespoon of peanut butter, and a drizzle of honey. On the side, a modest cup of iced coffee with a splash of soy milk.

I write a newsletter called Optimizer. It’s a given that I’ve dabbled with counting macros — the practice of tracking how much protein, fat, and carbs you eat — to see if it helps my training. Of course, I spent five training blocks figuring out that this breakfast gives my body the roughly 355 calories, 16g of protein, 28g of carbs, and 17g of fat it needs to feel good during a morning run and not fall asleep at my desk after. The annoying thing is having to reenter the same information into any training or food logging app.
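If you're wondering how macro targets like these translate into a calorie figure, the standard Atwater factors (roughly 4 kcal per gram of protein, 4 per gram of carbohydrate, and 9 per gram of fat) give a quick sanity check. A minimal sketch, plugging in the breakfast figures above:

```python
# Rough calorie estimate from macronutrient grams using the standard
# Atwater factors: ~4 kcal/g for protein and carbs, ~9 kcal/g for fat.
ATWATER = {"protein": 4, "carbs": 4, "fat": 9}

def estimate_calories(protein_g, carbs_g, fat_g):
    """Return an approximate calorie count from macronutrient grams."""
    return (protein_g * ATWATER["protein"]
            + carbs_g * ATWATER["carbs"]
            + fat_g * ATWATER["fat"])

# The breakfast from this article: 16g protein, 28g carbs, 17g fat.
print(estimate_calories(16, 28, 17))  # 329
```

The raw Atwater sum (329 kcal) comes in a bit under the ~355 from the labels, which is typical: label calories fold in rounding and fiber differently, one more reason even careful manual logs are estimates.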

AI, I’m told, will change that. Recently, Ladder, my strength training app of choice, introduced AI-powered nutrition features that promised to make counting macros easy. All I had to do was take a picture, and AI would handle the rest. So imagine how it felt when the Ladder AI told me my carefully crafted breakfast was 780 calories, 20g of protein, 92g of carbs, and 39g of fat. Or how, when I specifically edited the entry to include the exact brands and amounts, it produced another, equally wrong number.
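To put that miss in perspective, here's a quick back-of-the-envelope comparison of the AI's estimate against the numbers from five training blocks of manual logging (both sets of figures are from this article):

```python
# Compare Ladder AI's estimate of the breakfast against the author's
# manually derived numbers; both sets of figures come from the article.
manual = {"calories": 355, "protein_g": 16, "carbs_g": 28, "fat_g": 17}
ai_guess = {"calories": 780, "protein_g": 20, "carbs_g": 92, "fat_g": 39}

for key in manual:
    pct_over = (ai_guess[key] - manual[key]) / manual[key] * 100
    print(f"{key}: AI overestimated by {pct_over:.0f}%")
# calories: ~120% over, carbs: ~229% over, fat: ~129% over
```

More than double on calories, and more than triple on carbs.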

This, my friends, is exactly why I don’t count calories or macros anymore.

Here’s an undeniable truth: food logging is the pits.

Traditionally, these logging apps let you search for food options ranging from frozen dinners to raw ingredients. Some even let you scan barcodes. That’s simple enough if all you eat is prepackaged or whole foods. Where it starts to break down is eating out at restaurants or, ironically, cooking at home. Restaurants that publish calorie counts often don’t provide macro breakdowns. And while you can import ingredients from online recipes, that’s little help to experienced home cooks improvising a weeknight dinner or substituting ingredients on the fly. To get the most “accurate” and efficient logs, you need to measure out every little thing you eat, avoid eating out, and basically eat the same things every day.

It gets old, fast.

[Image: Screenshot of Ladder AI’s nutrition feature]

It sucks because studies consistently show that keeping a food diary or using digital health tracking tools is linked to greater success in losing or maintaining weight and gaining muscle. That’s why we’re starting to see health and fitness apps turn to AI to make this process less tedious.

There are endless options.

When Oura introduced its Oura Advisor chatbot, it also added the ability to either write out a description or snap a photo of your meals. Once you do that, it’ll spit out a breakdown of the macros, whether it’s highly processed, and how it might impact your overall health. If you’re using a Dexcom continuous glucose monitor, you can import that data into the Oura app and use it to compare specific meals to glucose spikes.

[Image: Close-up of Oura Advisor’s analysis for a bowl of pasta]

Similarly, the January app lets you take pictures of meals and, based on your demographic data, generates an estimate of how likely it is to affect your glucose levels. MyFitnessPal has also added a ScanMeal feature that lets you take photos to get calorie and macro estimates. My TikTok feed keeps advertising a gamified food-tracking app with an AI raccoon pet. You take pictures to “feed” the raccoon while AI analyzes and logs your meal. In addition to photos, Ladder’s AI feature also lets you dictate or write text descriptions of your meals.

The approaches differ, but the premise boils down to: take a photo and let AI do the rest.

Unfortunately, AI is only so-so at identifying foods based on pictures. Oura Advisor routinely mistook my matcha protein shakes for green smoothies. January was able to identify that I was eating chicken, but it mistook barbecue sauce for teriyaki sauce and failed to acknowledge that there were mushrooms in the dish. When Ladder’s AI cocked up my breakfast, it estimated I’d eaten two seven-inch waffles instead of two four-inch protein waffles, two tablespoons of peanut butter instead of one, two teaspoons of syrup instead of a quarter teaspoon of honey, and cream and sugar in my coffee. (I never take sugar in my coffee, thank you very much.)

None of these AI features could identify when I’d made healthier swaps. In lieu of white rice, I often mix a cup of edamame and quinoa into brown rice for a more nutrient-dense carb. Oura’s AI classified my concoction as mashed potatoes and white rice. Ethnic foods are also a crapshoot. Ladder’s AI logged my dal makhani curry with basmati rice and peas as chicken soup. Sometimes AI correctly identifies tteokbokki — Korean rice cakes in a spicy gochujang sauce. Other times, I’ve gotten rigatoni in tomato sauce.

It’s not that you can’t edit these AI-generated entries. You can. It’s just that this defeats the whole point of simplifying a tedious process. Instead, it’s replacing one annoyance with another. Whatever time you save on finding entries to log is now spent editing and fact-checking AI goofs.

After thinking about it, perhaps it’s just that simplifying food logging is the wrong problem to solve.

For starters, AI can broadly identify objects in photos, but it’s often crap at specifics. It can tell a banana from an apple, but it’ll never be able to tell what filling is inside your ravioli. It’s also not the best at estimating proportions. If you care about accuracy, you’ll always need to babysit it. But more frustrating is that applying AI in this way doesn’t address the root problem. Dietary changes aren’t hard because of a lack of knowledge. We all know the basics. What’s hard is applying that knowledge in your life sustainably. It’s reprogramming your emotions and behavior. AI can suggest changes, but you’ll always be the one who has to make them happen.

The point of food logging isn’t really about hitting an arbitrary calorie or macro target. It’s building awareness around what you’re eating: to learn what your dietary patterns are, what could be improved, and to practice mindfulness when you indulge in a bag of Cool Ranch Doritos. Once you get the hang of it, you quit. Maybe you temporarily start up again when goals or health circumstances change — but it’s not something most people should do for the rest of their lives. Ideally, you stop food logging because you trust your own sense of what to eat and when.

The problem is that app makers never want you to quit.

[Image: Close-up of Oura Advisor’s analysis of a meal]

A “successful” food logging app is one that keeps you engaged, in perpetuity. Instead of crediting your success to your own hard-won knowledge, you credit the tool. You start thinking, well, if I don’t track everything, all the time, I’ll go back to who I was before. Or, if you’re struggling, maybe the pitch is that if AI makes a hard thing easier, perhaps achieving your goals will be too. (Spoiler: it won’t.)

In fairness, there’s something to the idea of taking a photo of your food and AI telling you a useful insight. I just genuinely don’t know what that insight is. Maybe it’d be enough if AI would tell me my home-cooked meal is a nutritional masterpiece. Or that I’ve had a 15 percent increase in glazed donuts over the last 30 days — perhaps it’s time to reflect on what’s triggering my stress eating. Or, “Hey girl, you’ve been eating an impressive, but culinarily sad, number of baked chicken breasts. Treat yourself to white rice.”

All I know is, AI shouldn’t require me to take a picture of my breakfast and then waste the next 15 minutes bullying it to correctly identify what I ate.
