Fitness apps can make users feel ashamed, research reveals. A study led by University College London (UCL) and Loughborough University has found that the very tools designed to motivate healthy behaviour often trigger feelings of guilt, irritation and demoralisation when users fail to meet algorithm-generated targets. The findings, published in the British Journal of Health Psychology, draw on artificial intelligence to analyse tens of thousands of social media posts, exposing a side of fitness technology that is rarely acknowledged by its developers.
Shame and Demotivation
Researchers identified a recurring pattern of negative emotions among users of the five most profitable fitness applications. Posts on X (formerly Twitter) revealed that people felt “shame” when they logged foods they considered unhealthy, “irritation” at the constant notifications sent by the apps, and “disappointment” when they were unable to meet the targets set for them. In some cases these experiences led to outright “demotivation”, with users seemingly giving up on their health goals altogether.
Dr Paulina Bondaronek, senior author of the paper and a researcher at the UCL Institute of Health Informatics, said: “In these posts, we found a lot of blame and shame, with people feeling they were not doing as well as they should be. These emotional effects may end up harming people’s motivation and their health.” She added: “We need to learn to be kinder to ourselves. We are good at blaming and shaming because we think it will help us to do better but actually it has the opposite effect.”
Co-author Dr Lucy Porter, from the UCL Division of Psychology and Language Sciences, noted: “Listening to users’ reports on social media has shown that fitness apps can sometimes leave users feeling demoralised and ready to give up – which is the exact opposite of what these tools are supposed to do.”
The study highlighted particular concern with the rigid, algorithm-generated targets that are often based solely on a person’s weight loss goals. One user reported being told they needed to “consume −700 (negative 700) calories a day” to reach their target weight – a physically impossible and dangerous recommendation. “These apps rely on algorithms that do not reflect the flexibility and messiness of real life, or account for individual circumstances and differences,” the researchers wrote.
Criticism also extended to the way app goals are derived from users' weight targets rather than from public health recommendations such as those issued by the NHS. Nottingham Business School has separately found that individuals with limited experience of physical activity are most vulnerable to the emotional and psychological harms of fitness trackers, as they become overly dependent on default targets and external validation, sometimes developing anxiety or a distorted relationship with their bodies. Concerns have also been raised about the potential for these apps to exacerbate disordered eating habits, particularly through an excessive focus on dietary restriction and weight loss.
How AI Uncovered User Sentiment
The study’s methodology represents a novel approach to understanding the real-world impact of fitness apps. Rather than relying on surveys or lab experiments, the researchers turned to the vast repository of unfiltered user feedback available on social media. They used artificial intelligence to analyse posts on X, initially identifying 58,881 posts that discussed the five most profitable fitness applications: MyFitnessPal, Strava, WW (formerly Weight Watchers), Workouts by Muscle Booster, and Fitness Coach & Diet (also listed as FitCoach).
From this pool, the AI filtered the posts to isolate those that expressed a “negative sentiment”. This left 13,799 posts for further analysis. The technique, known as Machine-Assisted Topic Analysis (MATA), combines AI-powered topic modelling with human qualitative analysis, allowing researchers to process huge volumes of data quickly while retaining the nuance needed for psychological insight. Dr Bondaronek explained: “Social media provides a huge amount of data that could help us understand these effects. By using AI, we were able to analyse this data more quickly.”
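The paper does not publish its MATA pipeline, but the two-stage shape it describes (machine filtering followed by human-reviewed topic grouping) can be sketched in miniature. The toy code below is purely illustrative: it uses invented word lists in place of the study's AI models, and none of the lexicon entries, topic names or example posts come from the research itself.

```python
# Illustrative two-stage sketch: (1) filter posts to negative sentiment,
# (2) group the remainder under coarse topics for human qualitative review.
# The study used AI models for both stages; these hand-written word lists
# are invented stand-ins, not the researchers' actual method or data.
from collections import defaultdict

# Assumed toy lexicon of negative-sentiment words (not from the paper).
NEGATIVE_WORDS = {"shame", "guilt", "irritating", "demoralised",
                  "disappointed", "pestering"}

# Assumed toy topic keywords (not from the paper).
TOPIC_KEYWORDS = {
    "notifications": {"notification", "pestering", "reminder"},
    "targets": {"target", "calorie", "goal"},
}

def is_negative(post: str) -> bool:
    """Crude sentiment filter: flag posts containing any negative-lexicon word."""
    return bool(set(post.lower().split()) & NEGATIVE_WORDS)

def assign_topics(posts):
    """Group negative posts under coarse topics for later human analysis."""
    grouped = defaultdict(list)
    for post in posts:
        words = set(post.lower().split())
        for topic, keys in TOPIC_KEYWORDS.items():
            if words & keys:
                grouped[topic].append(post)
    return grouped

# Invented example posts for demonstration only.
posts = [
    "this app keeps pestering me with reminders",
    "hit a new personal best today",
    "feel such shame logging my calorie target miss",
]
negative = [p for p in posts if is_negative(p)]
topics = assign_topics(negative)
```

In the real pipeline the filtering and topic-modelling stages were AI-driven and the final interpretive step was done by human researchers; the sketch only conveys the flow of data, not the models involved.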
The researchers found that users often reported aversive emotional responses to app notifications, describing them as "pestering" and irritating. Technical problems such as inaccurate data and data loss also disrupted the self-monitoring experience, while oversimplified algorithms made it difficult to quantify diet and physical activity accurately. The emphasis on rigid, quantitative goals was found to undermine intrinsic motivation – the inherent enjoyment or satisfaction that comes from being active – because success becomes narrowly defined by metrics such as weight lost. When users struggled to meet targets, they sometimes engaged in avoidant behaviours or abandoned the app, and their healthy habits, altogether.
Calls for a More Holistic Approach
In response to the findings, the researchers advocate a fundamental shift in how fitness apps are designed. Instead of focusing narrowly on calorie counting and exercise regimes, they argue that developers should adopt a more holistic approach that prioritises overall wellbeing and intrinsic motivation. “Instead of very narrow, rigid measures of success relating to amount of weight lost, health apps should prioritise overall wellbeing and focus on intrinsic motivation – ie, the inherent enjoyment or satisfaction in activities,” said Dr Bondaronek.
The call echoes broader trends in the wellness industry, where platforms are increasingly integrating mindfulness, nutrition and stress management alongside physical activity. Dr Porter added that the key question now is “how pervasive these effects on morale and emotional wellbeing are, and whether there is anything that can be done to adapt fitness apps so that they better meet people’s needs.” The researchers suggest that app design should become more user-centred and psychologically informed, incorporating elements such as self-compassion and social connection.
It is important to note that the study focused exclusively on negative posts, so it cannot assess the overall balance of benefit and harm. Dr Bondaronek acknowledged: “The apps may have a negative side, but they likely also provide benefits to many people.” Nevertheless, the evidence of shame, demotivation and even unsafe recommendations raises serious questions about an industry that has grown rapidly without adequate scrutiny of its psychological effects. As AI continues to be adopted in healthcare for data analysis, patient monitoring and communication, the findings serve as a cautionary tale: technology that tracks our bodies must also understand our minds.
