Swiggy launches high-protein food category across 30 cities in India
Swiggy's new offering lets users identify and choose from a set of curated high-protein dishes that fit their goals, without needing to sift through long menus or labels. Each dish listed under the High Protein category meets clearly defined nutritional standards: a minimum of 15 grams of protein per serving, a calorific value of at most 700 kcal, and protein contributing at least 10 per cent of the dish's total calories. These benchmarks ensure users are discovering meals that are not only protein-rich but also balanced and suitable for regular consumption.
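The three published thresholds can be expressed as a simple eligibility check. The sketch below is purely illustrative — the function and field names are hypothetical, not Swiggy's actual implementation — and it assumes the standard factor of 4 kcal per gram of protein to convert grams into protein calories:

```python
PROTEIN_KCAL_PER_GRAM = 4  # standard Atwater energy factor for protein

def qualifies_high_protein(protein_g: float, total_kcal: float) -> bool:
    """Return True if a dish meets all three stated benchmarks.

    Hypothetical helper, not Swiggy's actual code.
    """
    if protein_g < 15:      # at least 15 g of protein per serving
        return False
    if total_kcal > 700:    # at most 700 kcal per dish
        return False
    protein_kcal = protein_g * PROTEIN_KCAL_PER_GRAM
    return protein_kcal / total_kcal >= 0.10  # protein supplies >= 10% of calories

# A 20 g protein, 550 kcal dish: 80 protein kcal, ~14.5% of total -> qualifies
print(qualifies_high_protein(20, 550))  # True
# A 15 g protein, 700 kcal dish: 60/700 is ~8.6% -> fails the ratio test
print(qualifies_high_protein(15, 700))  # False
```

Note that the ratio condition is not redundant: as the second example shows, a dish can satisfy both the 15 g floor and the 700 kcal ceiling yet still fall short of the 10 per cent protein-calorie share.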
Consumers can simply search for terms like 'protein' or 'diet' on the Swiggy app to discover the High Protein section. Within it, dishes are categorised by protein content in grams, enabling users to select the options that best suit their dietary preferences. They can also filter dishes by protein source, choosing from options such as paneer and soya, or select a restaurant, where protein-rich items are listed at the top for easy selection.
According to a report by ICRISAT, the International Food Policy Research Institute (IFPRI), and the Centre for Economic and Social Studies (CESS), over two-thirds of households in India's semi-arid tropics consume less protein than recommended, with diets heavily reliant on staple grains like rice and wheat. A survey conducted by the Indian Market Research Bureau (IMRB) found that 73 per cent of Indians are deficient in protein, and only 10 per cent of the population consumes adequate protein from their daily diet.
Over 1.8 million customers discovered and appreciated high-protein options during the pilot phase of the High Protein category. Bengaluru, Mumbai, and Delhi are among the top cities in terms of orders, while cities like Chandigarh are also standing out for their high concentration of protein-rich food orders.

Related Articles


Indian Express
7 hours ago
Chef Sanjyot Keer claims his ‘liquid laddoo' is the perfect cure for cold and cough — we find out if it's true
Since time immemorial, our first line of defence against cold, cough, headache, sore throat, and everything in between has been our grandmother's kitchen. For Indians, it goes without saying that Chef Sanjyot Keer's new 'liquid laddoo' recipe took us down memory lane and filled our hearts with nostalgia. 'In Punjabi households, our grandmothers make this delicious recipe, which we call besan ka sheera. It feels as if someone has melted a besan laddoo into a cup; it's that delicious and comforting,' he captioned his latest Instagram post. But can this 'liquid laddoo' really serve as a miracle cure for your cough and cold this rainy season, or will it just end up as another viral comfort drink? We spoke to Dr Yashawant Kumar, founder and CEO of Benefic Nutrition, to find out.

Almonds: Packed with healthy fats, protein, and vitamin E, almonds add nutritional value and mild nourishment. They are a good source of energy but not a direct decongestant.

Ghee: Ghee can soothe and lubricate an irritated throat, calming mucosal membranes and offering symptomatic relief.

Besan: Besan is not directly therapeutic, but it contains vitamin B1 and antioxidants, which can help with fatigue during sickness.

Milk: Milk can comfort and calm the body. Taken along with spices and ghee, it can help clear mucus and aid sleep.

Turmeric powder: Turmeric is known to have antiviral, antiseptic, and antibacterial properties. The curcumin in haldi is anti-inflammatory and can help reduce inflammation.

Black pepper powder: Taken along with turmeric, black pepper helps the body absorb curcumin. It also has antibacterial properties and can help relax a sore throat.

Cardamom powder: Cardamom has a long history as an ingredient in Ayurvedic medicines for respiratory discomfort. It can help with digestion and congestion.

Ginger powder: Ginger is known for its anti-inflammatory and antimicrobial properties. It can soothe throat inflammation and suppress cough.

Kesar: There is no evidence that kesar relieves cough, though it may offer a warming thermogenic effect.

Jaggery powder: The sweet taste of jaggery improves palatability during illness. It may also help soothe the throat and clear mucus.

According to Dr Kumar, turmeric, black pepper, and ginger can help reduce inflammation and congestion. 'Almonds and besan can provide much-needed nutrients to the body and can relieve fatigue. They also aid digestion and improve energy levels during illness. Milk and ghee can help one relax and fall asleep better,' he explained. However, he cautioned that the remedy is not clinically proven and is not a scientific cure; its ingredients offer only temporary symptomatic relief and comfort. 'The "liquid laddoo" recipe by Chef Sanjyot Keer is not a magic bullet, but it's not false buzz either. Clearly, it's a clean home remedy with its ingredients rooted in Indian Ayurveda,' said Dr Kumar, adding that the drink does have the potential to provide a soothing, calming experience during cold and cough.

DISCLAIMER: This article is based on information from the public domain and/or the experts we spoke to. Always consult your health practitioner before starting any routine.


Hindustan Times
10 hours ago
‘Highest road tax, highest road torture': Bengaluru doctor's viral post reignites outrage over bad traffic
A Bengaluru-based doctor's scathing critique of the city's failing civic infrastructure has struck a chord with thousands of residents fed up with endless traffic snarls and neglected roads. Dr Nandita Iyer, visibly frustrated after being stuck in traffic for nearly three hours to cover just 15 km via Varthur, took to social media to call out what she described as 'total civic chaos' in India's tech capital.

'Bengaluru has the highest road tax in India — and also the highest suffering on the road,' Dr Iyer wrote in a strongly worded post. 'Giant pothole-ridden roads, endless bottlenecks, zero traffic policing, and rampant wrong-side driving. It's heartbreaking and disappointing.' Her post, shared on X, quickly resonated with citizens who face similar ordeals on a daily basis.

While narrating her ordeal, Dr Iyer also expressed frustration at what she termed the government's apathy towards law-abiding, tax-paying citizens. 'There's no incentive to follow the rules, and no consequences for those who break them with impunity,' she said, adding that she fears nothing will change despite speaking out. 'There is no accountability. Our taxes aren't improving our lives, they're just lining politicians' pockets,' she wrote, summing up the collective disappointment felt by many urban Indians.

Her remarks have triggered widespread debate online, with several users from other metros like Mumbai and Pune chiming in with their own grievances. 'You're not alone. The situation is equally bad in Pune and we prioritized flashy airports and stadiums over basic infrastructure like roads,' one user commented. Another pointed out the health risks such long commutes pose, especially for daily travelers navigating such road conditions.


Scroll.in
15 hours ago
As young Indians turn to AI ‘therapists', how confidential is their data?
This is the second of a two-part series. Read the first here.

Imagine a stranger getting hold of a mental health therapist's private notes – and then selling that information to deliver tailored advertisements to their clients. That is practically what many mental healthcare apps might be doing.

Young Indians are increasingly turning to apps and artificial intelligence-driven tools to address their mental health challenges, but they have limited awareness of how these digital tools process user data. In January, the Centre for Internet and Society published a study of 45 mental health apps – 28 from India and 17 from abroad – and found that 80% gathered user health data that they used for advertising and shared with third-party service providers. An overwhelming 87% of these apps shared the data with law enforcement and regulatory bodies.

The first article in this series reported that some of these apps are especially popular with young Indian users, who rely on them for quick and easy access to therapy and mental healthcare support. Users also told Scroll that they turned to AI-driven technology, such as ChatGPT, to discuss their feelings and get advice, however limited this may be compared to interacting with a human therapist. But they were not especially worried about data misuse. Keshav*, 21, reflected a common sentiment among those Scroll interviewed: 'Who cares? My personal data is already out there.'

The functioning of Large Language Models, such as ChatGPT, is already under scrutiny. LLMs are 'trained' on vast amounts of data, either from the internet or provided by their trainers, to simulate human learning, problem-solving and decision-making. Sam Altman, CEO of OpenAI, which built ChatGPT, said on a podcast in July that though users talk about personal matters with the chatbot, there are no legal safeguards protecting that information.
'People use it – young people, especially, use it – as a therapist, a life coach; having these relationship problems and [asking] what should I do?' he asked. 'And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT.' He added: 'So if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up.'

Therapists and experts said the ease of access of AI-driven mental health tools should not sideline privacy concerns. Clinical psychologist Rhea Thimaiah, who works at Kaha Mind, a collective that provides mental health services, emphasised that confidentiality is an essential part of the process of therapy. 'The therapeutic relationship is built on trust, and any compromise in data security can very possibly impact a client's sense of safety and willingness to engage,' she said. 'Clients have a right to know how their information is being stored, who has access, and what protections are in place.' This is more than mere data – it is someone's memories, trauma and identity, Thimaiah said. 'If we're going to bring AI into this space, then privacy shouldn't be optional, it should be fundamental.'

Srishti Srivastava, founder of the AI-driven mental health app Infiheal, said that her firm collects user data to train its AI bot, but users can access the app without signing up and can also ask for their data to be deleted. Dhruv Garg, a tech policy lawyer at the Indian Governance and Policy Project, said the risk lies not just in apps collecting data but in the potential downstream uses of that information.
'Even if it's not happening now, an AI platform in the future could start using your data to serve targeted ads or generate insights – commercial, political, or otherwise – based on your past queries,' said Garg. 'Current privacy protections, though adequate for now, may not be equipped to deal with each new future scenario.'

India's data protection law

For now, personal data processed by chatbots is governed by the Information Technology Act framework and the Sensitive Personal Data Rules, 2011. Section 5 of the sensitive data rules says that companies must obtain consent in writing before collecting or using sensitive information, and under the rules, information relating to health and mental health conditions is considered sensitive data. Specialised sectoral data protection rules also apply to regulated entities like hospitals.

The Digital Personal Data Protection Act, passed by Parliament in 2023, is expected to be notified soon. But it exempts publicly available personal data from its ambit if this information has voluntarily been disclosed by an individual. Given the black market of data intermediaries that publish large volumes of personal information, it is difficult to tell what personal data in the public domain has been made available 'voluntarily'.

The new data protection act does not set different regulatory standards for specific categories of personal data – financial, professional, or health-related, Garg said. This means that health data collected by AI tools in India will not be treated with special sensitivity under this framework. 'For instance, if you search for symptoms on Google or visit WebMD, Google isn't held to a higher standard of liability just because the content relates to health,' said Garg. (WebMD provides health and medical information.) It might be different for AI tools explicitly designed for mental healthcare, unlike general-purpose models like ChatGPT.
These, according to Garg, 'could be made subject to more specific sectoral regulations in the future'.

However, the very logic on which AI chatbots function – responding based on user data and inputs – could itself be a privacy risk. Nidhi Singh, a senior research analyst and programme manager at Carnegie India, an American think tank, said she has concerns about how tools like ChatGPT customise responses and remember user history, even though users may appreciate those functions. Singh said India's new data protection law is quite clear that any data made publicly available by putting it on the internet is no longer considered personal data. 'It is unclear how this will apply to your conversations with ChatGPT,' she said.

Without specific legal protections, there is no telling how an AI-driven tool will use the data it has gathered. According to Singh, without a specific rule designating conversations with generative AI as an exception, a user's interactions with these AI systems are unlikely to be treated as personal data and consequently will not fall under the purview of the act.

Who takes legal responsibility?

Technology firms have tried hard to evade legal liability for harm. In Florida, a lawsuit by a mother has alleged that her 14-year-old son died by suicide after becoming deeply entangled in an 'emotionally and sexually abusive relationship' with a chatbot. In case of misdiagnosis or harmful advice from an AI tool, legal responsibility is likely to be analysed in court, said Garg. 'The developers may argue that the model is general-purpose, trained on large datasets, and not supervised by a human in real time,' said Garg. 'Some parallels may be drawn with search engines – if someone acts on bad advice from search results, the responsibility doesn't fall on the search engine, but on the user.'
Highlighting the urgent need for a conversation on sector-specific liability frameworks, Garg said that for now the legal liability of AI developers will have to be assessed on a case-by-case basis. 'Courts may examine whether proper disclaimers and user agreements were in place,' he said. In another case, Air Canada was ordered to pay compensation to a customer who was misled by its chatbot regarding bereavement fares. The airline had argued that the chatbot was a 'separate legal entity' and therefore responsible for its own actions.

Singh of Carnegie India said that transparency is important and that user consent should be meaningful. 'You don't need to explain the model's source code, but you do need to explain its limitations and what it aims to do,' she said. 'That way, people can genuinely understand it, even if they don't grasp every technical step.'

AI, meanwhile, is here for the long haul. Until India can expand its capacity to offer mental health services to everyone, Singh said, AI will inevitably fill that void. 'The use of AI will only increase as Indic language LLMs are being built, further expanding its potential to address the mental health therapy gap,' she said.