
Samosa, jalebi join cigarettes on health alert list
NAGPUR: The jalebi might soon come with a guilt trip. The samosa, a side of shame. And your chai biscuit? A health warning on the wall behind it.
The health ministry has ordered all central institutions, including AIIMS Nagpur, to install "oil and sugar boards" - vivid posters spelling out how much hidden fat and sugar lurk in everyday snacks.
It's a first step toward treating junk food like tobacco.
The boards aim to act as quiet but pointed reminders in govt institutions, alerting citizens to the sugar and oil content in snacks considered cultural staples. A laddoo, a vada pav, a pakora - all under scrutiny. AIIMS Nagpur officials confirmed the directive. Cafeterias and public areas will soon display the warnings. "It's the beginning of food labelling becoming as serious as cigarette warnings," said Amar Amale, president of the Cardiological Society of India's Nagpur chapter.
"Sugar and trans fats are the new tobacco. People deserve to know what they're eating."
The govt's internal note underscores the country's growing obesity crisis. More than 44.9 crore Indians are projected to be overweight or obese by 2050 - placing the country second only to the US. Already, one in five urban adults is overweight. The rise in childhood obesity, driven by poor diet and low activity, deepens the concern.
"This is not about banning food," said senior diabetologist Sunil Gupta. "But if people knew that one gulab jamun might contain five teaspoons of sugar, they might think twice before going for seconds." Doctors and health advocates see this as part of the broader war on non-communicable diseases like diabetes, heart disease, and hypertension, many of which are linked to diet.
Nagpur will be among the first to reflect that shift - not with bans, but with bold, visual nudges. Near every tempting snack, a colourful sign will watch over: "Eat wisely. Your future self will thank you."

Related Articles


Hindustan Times
Swiss woman uses AI to lose 7 kg: 'Instead of complicated apps I just sent a voice message to ChatGPT each morning'
Cristina Gheiceanu, a Swiss content creator who 'lost 7 kg using ChatGPT', shared her success story on Instagram in a May 15 post. She revealed that she sent daily voice notes to ChatGPT detailing her meals and calorie limits. Cristina said she found this method simple and effective, allowing her to track her food intake and stay consistent without feeling burdened by traditional dieting.

Determine your calorie deficit
In her post, titled 'How I lost 7 kg with ChatGPT', Cristina gave a glimpse of what her body looked like five months ago. In the video, she showed exactly how she used the AI-powered tool to help her decide her breakfast, keeping her weight loss goals in mind. She said, 'I just start my day with a voice note: "Hey, it is a new day, let's start with 1,900 calories." Then I say what I ate. Because I have been using it for a while, ChatGPT already knows the yoghurt I use, and the protein, fibre and calories it has. When I first started, I had to tell it those things, but now ChatGPT remembers.'

Cristina added, 'Honestly, it made the whole process feel easy. No calorie counting in my head, no stress – and when I hit my number (daily calorie intake), I just stop. It never felt like a diet, and that is what made it work.'

Track your food intake
Cristina wrote in her caption, 'At first, ChatGPT helped me figure out my calorie deficit and maintenance level, because you will need a calorie deficit if you want to lose weight. But what really changed everything was using it for daily tracking. Instead of using complicated apps, I just sent a voice message to ChatGPT each morning: what I ate, how many calories I wanted to eat that day – and it did all the work.'

Sharing her experience, she added, 'In the beginning, I had to tell it the calories, protein, and fibre in the foods I use. Next time it remembered everything, so I was just telling it to add my yoghurt or my bread. It knew how many calories or protein are in that yoghurt or bread. I kept using the same chat, so it became faster and easier every day. The best part? I asked for everything in a table – so I could clearly see my calories, protein, and fibre at a glance. And if I was missing something, I'd just send a photo of my fridge and get suggestions. It made tracking simple, intuitive, and enjoyable. I eat intuitively, so I don't use it so often, but in the calorie deficit and first month of maintenance, it made all the difference.'

ChatGPT can help create customised diet and workout plans based on individual needs and health conditions.

Note to readers: This article is for informational purposes only and not a substitute for professional medical advice. Always seek the advice of your doctor with any questions about a medical condition.


Hindustan Times
ChatGPT as your therapist? You are making a big mistake, warn Stanford University researchers
AI therapy chatbots are gaining attention as tools for mental health support, but a new study from Stanford University warns of serious risks in their current use. Researchers found that these chatbots, which use large language models, can sometimes stigmatise users with certain mental health conditions and respond in ways that are inappropriate or even harmful.

The study, titled 'Expressing stigma and inappropriate responses prevent LLMs from safely replacing mental health providers', evaluated five popular therapy chatbots. The researchers tested these bots against standards used to judge human therapists, looking for signs of bias and unsafe replies. Their findings will be presented at the ACM Conference on Fairness, Accountability, and Transparency later this month.

Nick Haber, an assistant professor at Stanford's Graduate School of Education and senior author of the paper, said chatbots are already being used as companions and therapists. However, the study revealed 'significant risks' in relying on them for mental health care. The researchers ran two key experiments to explore these concerns.

AI Chatbots Showed Stigma Toward Certain Conditions
In the first experiment, the chatbots received descriptions of various mental health symptoms. They were then asked questions such as how willing they would be to work with a person showing those symptoms and whether they thought the person might be violent. The results showed the chatbots tended to stigmatise certain conditions, such as alcohol dependence and schizophrenia, more than others, like depression. Jared Moore, the lead author and a PhD candidate in computer science, noted that newer and larger models were just as likely to show this bias as older ones.

Unsafe and Inappropriate Responses Found
The second experiment tested how the chatbots responded to real therapy transcripts, including cases involving suicidal thoughts and delusions. Some chatbots failed to challenge harmful statements or misunderstood the context. For example, when a user mentioned losing their job and then asked about tall bridges in New York City, two chatbots responded by naming tall structures rather than addressing the emotional distress.

The researchers concluded that AI therapy chatbots are not ready to replace human therapists. However, they see potential for these tools to assist in other parts of therapy, such as handling administrative tasks or supporting patients with activities like journaling. Haber emphasised the need for careful consideration of AI's role in mental health care going forward.

First Post
Your favourite snacks join cigarette club: Govt canteens to display health warnings for samosas, jalebis
To tackle the growing obesity problem, India's health ministry has mandated health warning labels for deep-fried snacks like samosas and jalebis, similar to those found on cigarette packs. According to a Times of India report, top health institutions like AIIMS Nagpur have been directed to display eye-catching 'oil and sugar boards' in cafeterias and public areas. These posters will highlight the fat and sugar content in popular foods, acting as health warning labels similar to those found on tobacco products. The initiative targets high-calorie, deep-fried, and sugar-heavy items like jalebis and samosas, which will be listed on an official 'health alert list', the report says.

Effort to curb spread of lifestyle diseases
This move comes as lifestyle diseases surge in India, with health experts linking sugar and trans fats to rising cases of obesity, diabetes, hypertension, and heart disease. Projections estimate that by 2050, over 449 million Indians could be overweight or obese, making India the world's second-largest hub for obesity. The government clarified that this isn't a ban on these foods. Instead, the campaign aims to inform the public and encourage better choices, not eliminate traditional snacks. This effort aligns with Prime Minister Narendra Modi's 'Fit India' movement, which promotes a 10 per cent reduction in oil consumption and healthier lifestyle habits.

What do experts say?
Dr. Amar Amale, president of the Cardiological Society of India's Nagpur chapter, said that everyday snacks like samosas and jalebis are as harmful as cigarettes. He described sugar and trans fats as 'the new tobacco', stressing the need for public awareness about their dangers. Dr. Sunil Gupta, a senior diabetologist, told TOI that a single gulab jamun can contain up to five teaspoons of sugar. He believes that if people were aware of this, they might reconsider eating it. Both doctors highlighted that excessive sugar intake is linked to serious conditions like diabetes and hypertension.