
Elon Musk says xAI is building 'Baby Grok' to deliver kid-friendly content

Elon Musk, the founder of xAI, has announced plans to launch a new application called Baby Grok, designed specifically to provide safe, kid-friendly content. The announcement, made via Musk's X account, marks xAI's first serious foray into dedicated children's AI tools. While details remain scarce, Musk emphasized that Baby Grok will be distinct from the company's existing AI chatbot, Grok, ensuring that children have access to digital assistance without exposure to inappropriate or potentially harmful material. The move underlines growing efforts within the industry to address concerns about algorithmic content safety for younger audiences.
What is the Baby Grok app Elon Musk aims to build?
Baby Grok is envisioned as a standalone AI chatbot tailored for children, setting it apart from the main Grok model currently offered to general users. The aim is for Baby Grok to:
Deliver age-appropriate responses and filter out mature or controversial subject matter.
Provide educational and entertaining interactions within a tightly moderated conversational environment.
Incorporate robust parental controls and safety mechanisms to give families greater peace of mind.
Avoid some of the issues encountered by other AI chatbots, such as sharing adult-themed responses, by customizing both its system instructions and training data for youth safety.
Why is xAI launching a child-friendly AI?
The move comes amid widespread public debate about the risks and responsibilities of AI systems interacting with young people. Leading tech companies have faced criticism after generative chatbots delivered inappropriate or unreliable advice to minors. By introducing Baby Grok, Musk's xAI seeks to pioneer a responsible, family-focused AI that can:
Address demand for safer AI support in home and educational settings.
Respond to heightened scrutiny over the mental health and privacy implications of AI for children.
Position xAI as a leader in proactive digital safety measures, potentially influencing standards across the technology sector.
Expected features and industry impact
While Musk has not released specifics about Baby Grok's launch timeline or full capabilities, the initiative is anticipated to include:
Enhanced content filters and a simplified user interface for children.
Parental monitoring features, including the ability to review and manage conversations.
Educational modules and interactive content tailored for different age groups.
The announcement positions xAI as part of a growing trend among AI developers to create specialized tools addressing the unique needs of young users, reflecting broader calls for regulation and accountability in artificial intelligence development.

Related Articles


Time of India, 8 hours ago
Tesla gets multiple shareholder proposals related to investment in xAI
Tesla said on Friday it had received a number of shareholder proposals regarding the company's plan to invest in CEO Elon Musk's artificial intelligence startup xAI. Musk ruled out a merger between Tesla and xAI earlier in July, but said he planned to hold a shareholder vote on an investment in the startup by the automaker.

The proposals come amid significant funding activity for xAI this year. The startup completed a $5 billion debt raise alongside a separate $5 billion strategic equity investment, Morgan Stanley said last month. Musk has pursued an integration strategy across his business empire, with xAI acquiring social media platform X in March for $33 billion to enhance its chatbot training capabilities, while also integrating the Grok chatbot into Tesla vehicles.

The potential investment discussion comes as Tesla faces various challenges, including Musk's political activities, which have dampened demand for its electric vehicles and triggered a 22% drop in its shares this year. "Shareholders are welcome to put forward any shareholder proposals they'd like," Musk said on Tesla's quarterly earnings call on Wednesday.

Tesla, which will hold its annual shareholder meeting on November 6, said it would include only one proposal on each topic in its proxy statement, in accordance with SEC rules. Earlier this month, the board set July 31 as the deadline for the submission of shareholder proposals to be included in the proxy statement.


Mint, 8 hours ago
The new chips designed to solve AI's energy problem
"I can't wrap my head around it," says Andrew Wee, who has been a Silicon Valley data-center and hardware guy for 30 years. The "it" that has him so befuddled—irate, even—is the projected power demands of future AI supercomputers, the ones that are supposed to power humanity's great leap forward.

Wee held senior roles at Apple and Meta, and is now head of hardware for cloud provider Cloudflare. He believes the current growth in energy required for AI—which the World Economic Forum estimates will be 50% a year through 2030—is unsustainable. "We need to find technical solutions, policy solutions and other solutions that solve this collectively," he says.

To that end, Wee's team at Cloudflare is testing a radical new kind of microchip from Positron, a startup founded in 2023 that has just announced a fresh round of $51.6 million in investment. These chips have the potential to be much more energy efficient than ones from industry leader Nvidia at the all-important task of inference, the process by which AI responses are generated from user prompts. While Nvidia chips will continue to be used to train AI for the foreseeable future, more efficient inference could collectively save companies tens of billions of dollars, and a commensurate amount of energy.

There are at least a dozen chip startups battling to sell cloud-computing providers the custom-built inference chips of the future. Then there are the well-funded, multiyear efforts by Google, Amazon and Microsoft to build inference-focused chips to power their own internal AI tools, and to sell to others through their cloud services. The intensity of these efforts, and the scale of the cumulative investment in them, show just how desperate every tech giant—along with many startups—is to provide AI to consumers and businesses without paying the "Nvidia tax": Nvidia's approximately 60% gross margin, the price of buying the company's hardware.

Nvidia is very aware of the growing importance of inference and of concerns about AI's appetite for energy, says Dion Harris, a senior director at Nvidia who sells the company's biggest customers on the promise of its latest AI hardware. Nvidia's latest Blackwell systems are between 25 and 30 times as efficient at inference, per watt of energy pumped into them, as the previous generation, he adds.

To accomplish their goals, makers of novel AI chips are using a strategy that has worked time and again: redesigning their chips from the ground up, expressly for the new class of tasks that is suddenly so important in computing. In the past, that task was graphics, and that's how Nvidia built its fortune. Only later did it become apparent that graphics chips could be repurposed for AI, and arguably it has never been a perfect fit.

Jonathan Ross, chief executive of chip startup Groq, previously headed Google's AI chip development program. He says he founded Groq (no relation to Grok, Elon Musk's xAI chatbot) because he believed there was a fundamentally different way of designing chips—solely to run today's AI models. Groq claims its chips can deliver AI much faster than Nvidia's best chips, using between one-third and one-sixth as much power, thanks to a unique design that embeds memory in the chip rather than keeping it separate. While the specifics of how Groq's chips perform depend on any number of factors, the company's claim that it can deliver inference at a lower cost than is possible with Nvidia's systems is credible, says Jordan Nanos, an analyst at SemiAnalysis who spent a decade working for Hewlett Packard Enterprise.

Positron is taking a different approach to delivering inference more quickly. The company, which has already delivered chips to customers including Cloudflare, has created a simplified chip with a narrower range of abilities, in order to perform those tasks more quickly.

Positron's latest funding round came from Valor Equity Partners, Atreides Management and DFJ Growth, and brings the total investment in the company to $75 million. Its next-generation system will compete with Nvidia's next-generation system, known as Vera Rubin. Based on Nvidia's road map, Positron's chips will have two to three times better performance per dollar, and three to six times better performance per unit of electricity pumped into them, says Positron CEO Mitesh Agrawal.

Competitors' claims about beating Nvidia at inference often don't reflect all of the things customers take into account when choosing hardware, says Harris. Flexibility matters, and what companies do with their AI chips can change as new models and use cases become popular. Nvidia's customers "are not necessarily persuaded by the more niche applications of inference," he adds.

Cloudflare's initial tests of Positron's chips were encouraging enough to convince Wee to put them into the company's data centers for longer-term tests, which are continuing. It's something only one other chip startup's hardware has warranted, he says. "If they do deliver the advertised metrics, we will open the spigot and allow them to deploy in much larger numbers globally," he adds.

By commoditizing AI hardware and allowing Nvidia's customers to switch to more-efficient systems, the forces of competition might bend the curve of future AI power demand, says Wee. "There is so much FOMO right now, but eventually, I think reason will catch up with reality," he says.

One truism of the history of computing is that whenever hardware engineers figure out how to do something faster or more efficiently, coders—and consumers—figure out how to use all of the new performance gains, and then some. Mark Lohmeyer, vice president of AI and computing infrastructure for Google Cloud, provides both Google's own custom AI chips and Nvidia's to Google and its cloud customers. He says that consumer and business adoption of new, more demanding AI models means that no matter how much more efficiently his team can deliver AI, there is no end in sight to growth in demand for it. Like nearly all other big AI providers, Google is making efforts to find radical new ways to produce energy to feed that AI, including both nuclear power and fusion.

The bottom line: while new chips might help individual companies deliver AI more efficiently, the industry as a whole remains on track to consume ever more energy. As a recent report from Anthropic notes, that means energy production, not data centers and chips, could be the real bottleneck for future development of AI.

Write to Christopher Mims at


Time of India, 11 hours ago
Tired of the DMs, Nothing India launches a phone giveaway on X with Grok's help: Here's how you can get a new phone for free
Nothing India has been getting one question a little too often: "Can I get a free phone?" So the brand finally decided to turn it into a moment. Just weeks after launching the sleek new Nothing Phone (3) on July 1, it took to X (formerly Twitter) to host a surprise giveaway.

Nothing India launches phone giveaway

Nothing India's X post read, "Too many DMs asking for free phones. Let's settle this." With that, the brand invited fans to follow the account and reply with the Nothing phone they want, with one reply to be picked after 48 hours.

Fans can choose from all Nothing models

The giveaway isn't limited to the new Phone (3). Fans can reply with any model they want, whether the Phone (1), Phone (2), CMF Phone 1, or Phone (3). It's unclear how many phones are being given away, but the response online has been massive. Some fans joked, "DMs work?" while others spammed the replies with photos and creative posts hoping to stand out.

How to participate in the Nothing India phone giveaway

To participate, comment your preferred Nothing phone model in the replies to the post. The post also tagged @grok, the AI chatbot integrated into X, which the brand says will randomly pick one winner after 48 hours. Grok acknowledged the task in a reply: "Understood, nothingindia. I'll review all replies in 48 hours and randomly select one winner for a free Nothing phone. Good luck, everyone! 🧞" That means if you're reading this and want a Nothing phone, you still have time to jump in and try your luck.

Nothing's giveaway proves once again that the brand knows how to keep things fun and community-driven. With fans going wild in the replies, this campaign might just be its smartest move post-launch. Now we wait for Grok's decision.