
Silver Age hosts AI masterclass for senior citizens in Bhubaneswar
BHUBANESWAR: Artificial Intelligence (AI) can prove to be a powerful ally for senior citizens, playing a vital role in ensuring dignity, safety and enhanced quality of life for them, said experts here on Saturday.
Leading an online masterclass on AI, presented by Silver Age Foundation for Elders in association with CSM Technologies Limited, country director of the Tony Blair Institute for Global Change Vivek Agrawal spoke about the evolution and foundations of AI and how it can empower senior citizens for independent living by accessing information and services.
Chief technical officer at CSM Pradyut Mohan Das demonstrated how senior citizens can use AI in applications related to health, home security, entertainment and engagement. Features like fact-checking and tips for identifying deepfake videos were discussed.
Around 130 senior citizens from across the country were taken on a guided journey through the world of AI. The session demystified complex concepts such as machine learning, large language models, and generative AI. Around 50 per cent of the participants reported having used some form of AI tool, most commonly ChatGPT, citing its utility, simplicity and companionship.

Related Articles


Time of India
Mohandas Pai flags lack of domestic capital for Indian startups; urges policy overhaul; calls for stronger R&D support
NEW DELHI: Indian startups are struggling to grow due to limited domestic investment and restrictive government regulations, warned industry veteran and Aarin Capital chairman Mohandas Pai, calling for urgent policy reforms and increased R&D funding to boost the ecosystem. He cautioned that despite India's position as the world's third-largest startup ecosystem, the nation could lose ground in global innovation if existing issues remain unresolved.

"We have 1,65,000 registered startups, 22,000 are funded. They created USD 600 billion in value. We got 121 unicorns, maybe 250-300 soonicorns," Pai said in an interview to PTI. "The biggest issue for startups is the lack of adequate capital. For example, China invested USD 835 billion in startups and ventures between 2014 and 2024, US invested USD 2.32 trillion. We just put in USD 160 billion, out of which possibly 80 per cent came from overseas. So local capital is not coming in," he added.

He further highlighted that while American insurance firms and university endowments significantly fund startups, Indian regulations prevent endowments from making such investments, and insurance companies remain uninvolved due to an incomplete regulatory framework. He recommended regulatory adjustments to enable insurance companies' participation in fund-of-funds and suggested increasing the government's fund-of-funds programme from Rs 10,000 crore to Rs 50,000 crore. Additionally, he noted that India's pension funds, holding Rs 40-45 lakh crore, cannot invest in startups due to conservative policies and regulatory restrictions. Pai emphasised the need to increase research funding in Indian universities substantially and urged organisations like DRDO to share their technologies with the private sector.
He indicated that current research expenditure in public universities falls considerably short of global standards. "We need to remove barriers for startups to sell business to the government and public sector. Though the government has reformed it, it doesn't work in actual practice. It must be opened up, and I think that has to be a mind shift," the industry veteran continued.

Pai further criticised the prevailing business culture in India, stating, "The problem in India is that all the big companies try to beat down the small startups and give them less money, and force them to sell the technologies and use them, and often don't pay them on time. This culture of hurting the small people should change."


News18
Energy And Electrons: Could They Become Currency In A Decade? What Nikhil Kamath Says
Zerodha co-founder Nikhil Kamath predicts energy and electrons could become trade currency within a decade, driven by the rising energy consumption of data centers and AI. Kamath has shared a thought-provoking idea based on research: energy and electrons might be the currency of trade in a decade. In a series of tweets, he shared infographics explaining the research, highlighting the growing financial footprint of data centers and artificial intelligence.

The electricity consumption of data centers is staggering, with one new data center consuming more electricity in a year than 4 lakh electric vehicles. The research says electricity alone eats up 65 per cent of a data center's costs, primarily for computing and cooling.

Data centers are large facilities that store, process, and manage digital data. They are the backbone of the internet and digital services: every time you use a website, stream a video, store files in the cloud, or make a bank transaction, chances are it passes through a data center. The US leads with the most data centers (3,680), followed by Germany (424) and the UK (418). India ranks seventh, with 262 data centers. The more servers a data center has, the more energy it requires.

Earlier, OpenAI CEO Sam Altman explained how words like "please" and "thank you" cost the company tens of millions of dollars. The research adds that one ChatGPT query uses 10x the electricity of a regular Google search.

Data Centers To Consume 10% Of Global Energy By 2030

The research shows that data centers' consumption is expected to grow to 10 per cent of global energy by 2030, up from around 1.5 per cent today. "Just 5% of global internet searches using AI could consume enough energy to power over 1 million Indian homes for a year."
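The article's claims can be turned into a rough back-of-envelope estimate. The sketch below is illustrative only: the per-search energy figure, daily search volume, and household consumption are assumptions made for the sake of arithmetic, not numbers from the research Kamath cites; only the 10x multiplier and the 5 per cent scenario come from the article.

```python
# Back-of-envelope estimate of the "5% of searches using AI" scenario.
# Every constant below is an assumption for illustration, except
# AI_MULTIPLIER (10x) and AI_SHARE (5%), which come from the article.
SEARCH_WH = 0.3            # assumed Wh per conventional web search
AI_MULTIPLIER = 10         # article: one ChatGPT query ~10x a regular search
DAILY_SEARCHES = 8.5e9     # assumed global web searches per day
AI_SHARE = 0.05            # article's scenario: 5% of searches use AI
HOME_KWH_PER_YEAR = 1_100  # assumed annual electricity use of an Indian home

def extra_energy_kwh_per_year():
    """Extra annual energy (kWh) if AI_SHARE of searches cost
    AI_MULTIPLIER times the energy of a plain search."""
    extra_wh_per_query = SEARCH_WH * (AI_MULTIPLIER - 1)
    daily_extra_wh = DAILY_SEARCHES * AI_SHARE * extra_wh_per_query
    return daily_extra_wh * 365 / 1000  # Wh -> kWh

if __name__ == "__main__":
    homes = extra_energy_kwh_per_year() / HOME_KWH_PER_YEAR
    print(f"Roughly {homes:,.0f} Indian homes' worth of electricity per year")
```

Changing any assumed constant scales the answer proportionally, which is why such figures are best read as orders of magnitude rather than precise predictions.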
Energy To Become Asset

Kamath's idea is rooted in an emerging economic paradigm: as the demand for electricity grows, energy becomes a critical and tradable asset, much like money. If energy is as essential as cash in powering AI, streaming services, financial systems, and cloud operations, pricing and trading electricity akin to currency could be the next step.

This concept isn't just futuristic; it carries real-world relevance. Companies might begin to hedge not only currency risk but also energy risk. Imagine supermarkets trading kilowatt-hours, or data centers using energy derivatives in a similar fashion to forex trading. Over time, energy tokens or blockchain-based credits for electrons could emerge, giving rise to a sophisticated energy-backed financial ecosystem. If this shift happens, economies would have to rethink everything from inflation metrics to banking, as "energy currency" could fundamentally alter how value is transferred and stored, making energy both a means and a measure of wealth.

About the Author: Varun Yadav. First Published: June 08, 2025, 15:03 IST


The Hindu
Physics changed AI in the 20th century. Is AI returning the favour now?
Artificial intelligence (AI) is booming. AI algorithms are used in many scientific domains: to predict the structure of proteins, search for materials with particular properties, and interpret medical data to provide a diagnosis. People use tools like ChatGPT, Claude, NotebookLM, DALL-E, Gemini, and Midjourney to generate images and videos from text prompts, write text, and search the web. The question arises: can these tools prove useful in studies of the fundamental properties of nature, or is there a gap between human and artificial scientists that needs to be bridged first?

There is certainly some gap. Many current applications of AI in scientific research use AI models as a black box: the models are trained on some data and produce an output, but the relationship between the inputs and the output is not clear. This is considered unacceptable by the scientific community. Last year, for example, DeepMind faced pressure from the life sciences community to release an inspectable version of its AlphaFold model, which predicts protein structures. The black-box nature presents a similar concern in the physical sciences, where the steps leading up to a solution are as important as the solution itself.

Yet this hasn't dissuaded scientists from trying. In fact, they started early: since the mid-1980s, they have integrated AI-based tools into the study of complex systems. In 1990, high-energy physics joined the fold.

Astro- and high-energy physics

In astronomy and astrophysics, scientists study the structure and dynamics of celestial objects. Big-data analytics and image enhancement are two major tasks for researchers in this field. AI-based algorithms help with the first by looking for patterns, anomalies, and correlations. Indeed, AI has revolutionised astrophysical observations by automating tasks like capturing images and tracking distant stars and galaxies.
AI algorithms are able to compensate for the earth's rotation and atmospheric disturbances, producing better observations in a shorter span. They are also able to 'automate' telescopes that look for very short-lived events in the sky and record important information in real time.

Experimental high-energy physicists often deal with large datasets. For example, the Large Hadron Collider experiment in Europe generates more than 30 petabytes of data every year. A detector on the collider called the Compact Muon Solenoid alone captures 40 million 3D images of particle collisions every second. It is very difficult for physicists to analyse such data volumes rapidly enough to track subatomic events of interest. In response, researchers at the collider started using an AI model able to accurately identify a particle of interest in very noisy data. Such a model helped discover the Higgs boson over a decade ago.

AI in statistical physics

Statistical mechanics is the study of how a group of particles behaves together, rather than individually. It is used to understand macroscopic properties like temperature and pressure. For example, in the 1920s, Wilhelm Lenz and his student Ernst Ising developed a statistical model of magnetism, focusing on the collective behaviour of atomic spins interacting with their neighbours. In this model, there are higher and lower energy states for the system, and the material is more likely to exist in the lowest energy state. The Boltzmann distribution is an important concept in statistical mechanics, used to predict, say, the precise conditions in which ice will turn to water. Using this distribution, Lenz and Ising studied the temperature at which a material changes from magnetic to non-magnetic. Last year's physics Nobel laureates John Hopfield and Geoffrey Hinton developed a theory of neural networks in the same spirit, based on ideas from statistical mechanics.
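The ideas above, spin states, energies and the Boltzmann distribution, can be made concrete with a small simulation. The sketch below samples a toy 2D Ising lattice using the standard Metropolis algorithm; the lattice size, temperature and units (J = kB = 1) are choices made for this illustration, not details from the article.

```python
import math
import random

def metropolis_ising(n=10, temperature=1.5, sweeps=400, seed=42):
    """Sample a 2D Ising lattice from the Boltzmann distribution via
    the Metropolis algorithm (J = 1, k_B = 1, periodic boundaries)."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps * n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        # Energy change if spin (i, j) flips: dE = 2 * s_ij * (sum of neighbours)
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        d_e = 2 * spins[i][j] * nb
        # Metropolis rule: always accept moves that lower the energy;
        # accept uphill moves with Boltzmann probability exp(-dE / T).
        if d_e <= 0 or rng.random() < math.exp(-d_e / temperature):
            spins[i][j] *= -1
    return spins

def magnetisation(spins):
    """Average magnetisation per spin: 0 (disordered) to 1 (fully ordered)."""
    n = len(spins)
    return abs(sum(sum(row) for row in spins)) / (n * n)
```

Below the model's critical temperature, the sampled lattice tends toward an ordered, magnetised configuration, the 'lowest energy state' the article refers to.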
A neural network (NN) is a type of model in which nodes that receive data and perform computations on them are linked to each other in different ways. Overall, NNs process information in a manner loosely inspired by animal brains. For example, imagine an image made up of pixels, where some are visible and the rest are hidden. To determine what the image is, physicists have to consider all possible ways the hidden pixels could fit together with the visible pieces. The idea of most-likely states from statistical mechanics can help them in this scenario. Hopfield and Hinton developed a theory for NNs that considered the collective interactions of pixels as neurons, just like Lenz and Ising before them. A Hopfield network calculates the energy of an image by determining the least-energy arrangement of hidden pixels, similar to statistical physics.

AI tools apparently returned the favour by helping make advances in the study of Bose-Einstein condensates (BECs). A BEC is a peculiar state of matter that a collection of certain subatomic or atomic particles has been known to enter at very low temperatures. Scientists have been creating it in the lab since the mid-1990s. In 2016, scientists at the Australian National University tried to do so with AI's help in creating the right conditions for a BEC to form. The AI managed it with flying colours, and the tool was even able to help keep the conditions stable, allowing the BEC to last longer. 'I didn't expect the machine could learn to do the experiment itself, from scratch, in under an hour,' the paper's coauthor Paul Wigley said in a statement. 'A simple computer program would have taken longer than the age of the universe to run through all the combinations and work this out.'

Bringing AI to the quantum

In a 2022 paper, scientists from Australia, Canada, and Germany reported a simpler method to entangle two subatomic particles using AI.
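Before going further, the Hopfield picture described earlier, stored images as minima of an energy function, can be illustrated in a few lines of Python. The pattern, network size and update schedule here are arbitrary choices for this toy example, not the laureates' original formulation.

```python
def hopfield_weights(patterns):
    """Hebbian learning: strengthen links between pixels that agree
    across the stored patterns (no self-connections)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def energy(w, s):
    """Hopfield energy: stored patterns sit at the minima, like the
    low-energy spin arrangements of statistical mechanics."""
    n = len(s)
    return -0.5 * sum(w[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(n))

def recall(w, state, sweeps=5):
    """Flip each 'pixel' toward the state its neighbours vote for,
    descending the energy until the network settles."""
    n = len(state)
    s = list(state)
    for _ in range(sweeps):
        for i in range(n):
            field = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if field >= 0 else -1
    return s
```

Flipping a few 'pixels' of a stored pattern raises the energy, and the update rule walks back downhill to the stored minimum, which is how the network reconstructs a partially hidden image.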
Quantum computing and quantum technologies are of great research and practical interest today, with governments, including India's, investing millions of dollars in developing these futuristic technologies. A big part of their revolutionary power comes from achieving quantum entanglement. For example, quantum computers rely on a process called entanglement swapping, in which two particles that have never interacted become entangled via intermediate entangled particles. In the 2022 paper, the scientists reported a tool called PyTheus, 'a highly-efficient, open-source digital discovery framework … which can employ a wide range of experimental devices from modern quantum labs', to better achieve entanglement in quantum-optic experiments. Among other results, scientists have used PyTheus to make a breakthrough with implications for quantum networks used to securely transmit messages, making these technologies more feasible. More work remains to be done, but tools like PyTheus have demonstrated the potential to make this research more efficient.

From this vantage point, it seems like every subfield of physics will soon use AI and ML to help solve its toughest problems. The end goal is to make it easier to come up with more appropriate questions, test hypotheses faster, and understand results more gainfully. The next groundbreaking discovery may well come from collaborations between human creativity and machine power.

Shamim Haque Mondal is a researcher in the Physics Division, State Forensic Science Laboratory, Kolkata.