
Latest news with #TogetherAI

Quebec's Hypertec announces $5-billion program to build European data centres

Globe and Mail

12 hours ago

  • Business


Quebec technology company Hypertec Inc. has announced a $5-billion program to build a series of data centres across Europe capable of powering two gigawatts of AI computing capacity. Hypertec is partnering on the project with Montreal-based 5C Group, which it largely owns, and Together AI of San Francisco. The group expects to roll out the projects over the next three years with Britain, France, Italy and Portugal as the priority markets.

'We're in one of the largest revolutions today, possibly an unprecedented revolution with artificial intelligence,' said Hypertec chief executive officer Simon Ahdoot, who announced the expansion on Thursday at the VivaTech trade show in Paris. He added that 'when we see all those challenges brought together into one, we're seeing this unbelievable growth and an opportunity to help transform at the level of infrastructure, not just the IT industry, but all industries.'

Mr. Ahdoot said the $5-billion figure represents the cost of building two gigawatts of total capacity, which will be able to support up to 100,000 graphics processing units (GPUs), considered the backbone of AI processing. The money will come from a range of private investors as the projects progress, he added. It's not clear yet how many individual data centres will be built but it could be dozens. Mr. Ahdoot said Bell Canada's data centre in Montreal has capacity for around five megawatts of electric power. 'So for two gigawatts, we're talking about 400 Bell centres,' he said.

Hypertec, which is privately held by the Ahdoot family, was established in 1984 primarily as a maker of personal computers. The company evolved over time and expanded into building servers, computing systems, and infrastructure for data centres. It currently has around 700 employees and has built centres in Canada and around the world. The European expansion is the company's largest foray overseas. 'In terms of setting up a presence and developing a data center, this is very new,' he said.
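Mr. Ahdoot's 400-centre comparison is straightforward unit arithmetic, which a two-line check makes explicit (the 5 MW figure is his approximation for the Montreal centre):

```python
# Sanity check of the quoted comparison: 2 gigawatts of planned capacity
# versus a single data centre drawing roughly 5 megawatts.
total_mw = 2 * 1000        # 2 GW expressed in megawatts
per_centre_mw = 5          # approximate capacity of the Montreal centre cited
equivalent_centres = total_mw // per_centre_mw
print(equivalent_centres)  # 400
```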
Mr. Ahdoot said Hypertec has streamlined its construction and design processes and can build a data centre within six to nine months, less than half the time it normally takes.

Evan Solomon, Canada's minister of AI and digital technology, welcomed the company's announcement. 'This is a testament to Canadian innovation,' Mr. Solomon said Thursday at VivaTech. He added that the expansion highlighted 'the international reach of Canadian enterprises and their ability to capitalize transformative investments to power the economy of the future.'

Mr. Solomon was asked about the optics of a Canadian company investing so heavily in Europe instead of Canada. The federal government has created a $2-billion fund to encourage AI infrastructure development in Canada. So far only one announcement has been made: a $240-million commitment to Toronto-based Cohere Inc. to help fund its computer processing needs. Cohere will use the money to purchase capacity at a data centre opening later this year in Canada that will be operated by CoreWeave Inc., a U.S. company.

By contrast, France and other European countries have made a major push into AI infrastructure. During a visit to VivaTech this week, French President Emmanuel Macron said he wanted to 'build a computing power capacity installed in Europe with European solutions.' The French government has also announced €109-billion ($171-billion) worth of investments in AI infrastructure and a €20-billion project involving Canadian-controlled Brookfield Asset Management Ltd. 'Europe's GPU capacity will have tripled between 2024 and 2025 and increased tenfold between 2024 and 2026,' Mr. Macron said Wednesday.

Mr. Solomon highlighted Hypertec's Canadian roots and said the company will be investing more in its home country. 'They're investing in Canada and there will be more investment in Canada,' he said.
He added that 'Canadian companies that are strong around the world is an example of success. We don't want to inhibit that.'

Mr. Ahdoot was more circumspect about opportunities in Canada, and he said AI infrastructure has been lagging. 'I think part of the question is, in Canada who do you talk to?' he said. Decision-making tends to be more centralized than in the United States and other jurisdictions. While there are Canadian projects in the planning stages, Mr. Ahdoot said he expected it will take a couple of years before there will be major announcements. 'Until we have the direction at the government level, it's hard to move forward on strategic projects.'

Part of the challenge is sourcing the enormous power required for large data centres. The U.S. has a multitude of utilities that are willing and able to provide the power, but Mr. Ahdoot said Canadian utilities have not been as quick to pivot. However, he said that having a federal cabinet minister dedicated to AI should be helpful in spurring development. 'We're going to talk to Evan, and we're going to let him know. Look, we want to play together. We want to work together,' he said.

Sahara AI Launches Public SIWA Testnet, Establishing First On-Chain AI Ownership and Provenance Tracking Platform

Associated Press

19-05-2025

  • Business


AWS, Google Cloud, UC Berkeley, Quicknode and Together AI among 40+ new ecosystem partners

Data Services Platform goes public 5/27 to support distributed data collection and annotation at scale

LOS ANGELES, May 19, 2025 (Bitwire) -- Sahara AI, the first full-stack, AI-native blockchain platform where anyone on any chain can create, contribute to and monetize AI, today launched SIWA, its public testnet for decentralized AI development. SIWA serves as the first public gateway to the Sahara Blockchain, a foundational infrastructure layer purpose-built to enable the registration, licensing, and monetization of AI assets, such as datasets and models, through transparent, verifiable on-chain protocols. The SIWA testnet will allow developers and contributors to explore and validate these core protocols ahead of mainnet. During its private testnet, Sahara AI saw strong early traction with over 3.2 million total accounts, 1.4 million daily active accounts, and 200,000 users engaging with the Data Services Platform.

Phase 1 of the SIWA testnet focuses on decentralized data ownership, giving contributors the ability to register and tokenize their datasets with verifiable on-chain records. This means these foundational assets, which are often exploited in centralized systems without credit or compensation, can now be transparently tracked and monetized.

'AI currently runs on data from billions of people, but most contributors aren't credited, compensated, or even aware their data is being used,' said Sean Ren, Co-founder and CEO of Sahara AI. 'Sahara AI isn't just another blockchain - it's a call to action. We're already generating revenue through our Data Services Platform, and this value will directly accrue to our testnet users. With SIWA, we're supporting every developer and every chain that shares our vision of letting all AI contributors own their part of it.'
Sahara AI's protocol roadmap includes three additional phases before mainnet: licensing, revenue distribution, and royalty vaults that turn attribution into revenue; a permissionless testnet with open-source protocols; and pipeline registration, provenance tracking, and proof-of-contribution for automated revenue sharing.

Alongside current partners and clients such as Microsoft, Amazon, Snap, MIT, MyShell, USC, and UCLA, the SIWA testnet launches with over 40 new ecosystem partners and clients across AI, Web3, cloud infrastructure, and research organizations, including Amazon Web Services, Google Cloud, UC Berkeley, Together AI, and Quicknode.

'Sahara AI is opening a new chapter in AI development by making the process more open, equitable, and accessible,' said Vipul Ved Prakash, Co-founder and CEO of Together AI. 'Scalable compute is essential to realizing that vision, and Together AI is proud to help power their developer platform to ensure anyone, anywhere can access the resources they need to train and run meaningful AI workloads.'

Together with the launch of SIWA, Sahara AI also begins a phased public rollout of its flagship applications, providing chain-agnostic tools, infrastructure, and economic systems for builders and contributors across Web2 and Web3: the AI Developer Platform, AI Marketplace, and the Data Services Platform (DSP). On May 27, Sahara AI will begin public access to DSP, a first-of-its-kind service that leverages distributed contributors to perform data collection and annotation at scale. Users will be able to register datasets on-chain and mint ownership tokens, with additional public features rolling out in phases. Decentralized peer review, incentive mechanisms, and quality assurance processes ensure data integrity and reliability, with season one resulting in a 92% accuracy rate from internal QA with 289,000 approved datapoints.
The close of its second season saw more than 2.55 million approved datapoints with a 95% accuracy rate after internal QA.

The Sahara AI Developer Platform and AI Marketplace form a full-stack suite of tools and services that fill a major gap in the blockchain ecosystem. Until now, developers and public chains have lacked the infrastructure needed to support AI development in an open, secure, and collaborative way. Running on Sahara's testnet, the AI Developer Platform delivers end-to-end tooling for data, model, agent, and compute workflows, while the AI Marketplace adds on-chain ownership and revenue-sharing channels to monetize those assets across both Web2 and Web3. Access to the AI Marketplace will follow shortly.

Sahara AI was built on the belief that the people powering AI, whether by contributing data, building models, or deploying applications, should be the main beneficiaries of it. With SIWA, that future is finally taking shape. To be among the first to test Sahara AI's decentralized AI infrastructure, register datasets and mint ownership tokens, please visit the Sahara Labs Portal.

About Sahara AI

Sahara AI is the first full-stack, AI-native blockchain platform where anyone can create, contribute to and monetize AI development, making the future of AI more accessible, equitable and rewarding for all. Built on the Sahara blockchain, Sahara AI's comprehensive platform consists of a data services platform for data labeling and refinement, an AI Developer Platform for model creation, deployment, and tooling, and a decentralized AI Marketplace where you can buy and sell datasets, models, agents, and compute. Sahara AI is already trusted by leading tech innovators and research institutions including Microsoft, Amazon, MIT and Motherson Group. For more information or to join the waitlist for whitelists or the upcoming mainnet, please visit .
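The verifiable on-chain records described above generally rest on content fingerprinting: a chain stores a hash of the dataset plus ownership metadata, not the data itself. The sketch below is a generic illustration of that principle only, not Sahara AI's actual protocol; the record fields and placeholder owner address are invented for this example.

```python
import hashlib
import json

# Generic illustration of dataset provenance via content fingerprinting.
# This is NOT Sahara AI's actual protocol; the field names and owner
# address below are invented placeholders.

def fingerprint(records):
    """Deterministic SHA-256 digest over canonically serialized records."""
    blob = "\n".join(json.dumps(r, sort_keys=True) for r in records)
    return hashlib.sha256(blob.encode("utf-8")).hexdigest()

dataset = [
    {"text": "example datapoint 1", "label": "positive"},
    {"text": "example datapoint 2", "label": "negative"},
]

# What a chain would record: a hash of the data plus ownership metadata,
# rather than the dataset itself.
registration = {
    "owner": "0x0000000000000000000000000000000000000000",  # placeholder
    "dataset_hash": fingerprint(dataset),
}

# Verification: anyone holding the published dataset can recompute the
# fingerprint and compare it with the registered hash.
assert fingerprint(dataset) == registration["dataset_hash"]

# Any tampering changes the fingerprint, so the check fails.
tampered = dataset[:1]
assert fingerprint(tampered) != registration["dataset_hash"]
```

Because the digest is computed over a canonical serialization (sorted keys), any two parties derive the same fingerprint from the same records, which is what makes such a registration independently verifiable.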

Fine-Tune AI Models Like a Pro : No Supercomputer Needed

Geeky Gadgets

12-05-2025

  • Business


What if you could create your own custom AI model without needing a PhD in machine learning or access to a high-powered supercomputer? It might sound ambitious, but thanks to modern tools and platforms, this is no longer just a dream for tech giants. In fact, fine-tuning lightweight, pre-trained AI models has made it possible for developers, entrepreneurs, and even hobbyists to build specialized AI solutions tailored to their unique needs. Imagine training an AI to summarize dense reports, analyze customer sentiment, or even power a chatbot, all with minimal resources and maximum efficiency. With platforms like Together AI simplifying the process, the barriers to entry are lower than ever, and the potential for innovation is limitless.

In the video guide below, Mark Gadala-Maria walks you through the essentials of fine-tuning AI models, from preparing your dataset to optimizing performance with system prompts. You'll discover how to use open source models like Meta Llama 3.1B and harness powerful tools that make AI customization both accessible and cost-effective. Whether you're a business owner looking to streamline operations or a developer eager to explore the possibilities of AI, this guide will equip you with the knowledge to create models that are as precise as they are practical. By the end, you'll not only understand the process but also gain the confidence to bring your AI ideas to life. After all, the future of AI isn't just about what's possible; it's about what you can create.

Understanding Fine-Tuning

Fine-tuning is the process of adapting a pre-trained AI model to perform specialized tasks by training it on a smaller, task-specific dataset. Instead of building a model from scratch, you can use lightweight, open source models such as Meta Llama 3.1B.
These models are highly versatile, cost-effective, and particularly suited for applications like:

  • Chatbot development for customer service or user interaction
  • Sentiment analysis to gauge customer opinions or trends
  • Document summarization for efficient information processing

By fine-tuning, you can achieve focused performance while saving significant time and computational resources.

Why Choose Together AI for Fine-Tuning?

Together AI is a platform specifically designed to streamline the fine-tuning and deployment of AI models. It provides access to powerful GPU clusters, which are essential for efficient training. The platform operates on a pay-as-you-go model, with pricing based on the complexity and size of your model. This flexibility makes it suitable for both small-scale experiments and large-scale projects. Key benefits include:

  • Access to powerful computational resources that accelerate training
  • Scalable pricing tailored to your project's needs
  • An intuitive interface that simplifies the training and deployment process

These features make Together AI an accessible and efficient choice for developers and organizations aiming to fine-tune AI models.

How To Create Your Own Custom AI Models

Watch this video on YouTube.

Preparing and Structuring Your Dataset

Dataset preparation is a critical step in the fine-tuning process. A well-structured dataset ensures that your model learns effectively and performs accurately. You can source datasets from repositories like HuggingFace, which offers a wide range of pre-labeled datasets, or create your own using tools like Gemini or GPT. Key considerations for preparing your dataset include:

  • Relevance: Ensure the data is directly related to your specific use case.
  • Formatting: Structure the dataset correctly, often in JSONL (JSON Lines) format.
  • Specificity: For chatbots, include input-output pairs of user queries and responses.

Proper dataset preparation is the foundation for a successful fine-tuning process, making sure that your model can deliver accurate and reliable results.

Executing the Training Process

Once your dataset is ready, the next step is to train your model. Together AI simplifies this process with its user-friendly interface and robust tools. Here's how you can proceed:

  • Upload your dataset using Python scripts or the platform's built-in tools.
  • Configure training parameters, such as learning rate, batch size, and training epochs.
  • Authenticate your access with API keys provided by Together AI to initiate the training process.

After training, you can test your fine-tuned model directly on the platform to evaluate its performance. This step ensures that the model meets your expectations and is ready for deployment.

Enhancing Accuracy with System Prompts

System prompts are a powerful tool for optimizing the performance of your fine-tuned model. These prompts act as guidelines, shaping the model's behavior to align with your specific needs. For instance, if you're developing a customer service chatbot, a system prompt might instruct the model to prioritize clarity and empathy in its responses. By carefully crafting these prompts, you can ensure that your model delivers consistent, accurate, and contextually appropriate results. This step is particularly useful for applications requiring high levels of precision and reliability.

Applications and Advantages of Fine-Tuned Models

Fine-tuned models are designed for efficiency and precision, making them ideal for targeted applications.
Some common use cases include:

  • Business analytics: Generating insights and reports from large datasets
  • Customer support: Powering chatbots to handle user queries effectively
  • Process automation: Streamlining workflows in industries like healthcare, finance, and logistics

These models are faster and less resource-intensive than general-purpose AI models, reducing computational overhead and delivering results more quickly. This makes them a practical choice for businesses of all sizes, from startups to large enterprises.

Cost Efficiency and Scalability

One of the most significant advantages of fine-tuning lightweight models is their cost-effectiveness. Smaller models require fewer computational resources, which translates to lower training and deployment costs. Together AI further enhances cost efficiency by offering free credits for initial usage, allowing you to explore the platform's capabilities without upfront investment. As your project scales, the platform's flexible pricing ensures that you only pay for the resources you need. This scalability makes Together AI a viable solution for both short-term projects and long-term AI development, allowing organizations to adapt to changing requirements without incurring unnecessary expenses.

Unlocking the Potential of Fine-Tuned AI Models

Creating custom AI models is now more accessible and efficient than ever. By fine-tuning lightweight, open source models on platforms like Together AI, you can develop AI solutions tailored to your specific needs. With proper dataset preparation, efficient training processes, and the strategic use of system prompts, you can harness the full potential of AI to achieve your goals. Whether you're building a chatbot, automating workflows, or analyzing data, fine-tuned models offer a powerful, cost-effective, and scalable approach to solving complex challenges.
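The dataset-formatting and system-prompt ideas above can be made concrete with a short sketch. This is a minimal, hypothetical example rather than code from the video: the queries, responses, and file name are invented, and fine-tuning platforms may expect slightly different field names, so check your provider's documentation before uploading.

```python
import json

# Hypothetical sketch: build a small chat-style fine-tuning dataset in
# JSONL (JSON Lines) format. The example queries, responses, and file
# name are invented for illustration.

SYSTEM_PROMPT = "You are a customer service assistant. Prioritize clarity and empathy."

pairs = [
    ("Where is my order?",
     "I'm sorry for the wait! Could you share your order number so I can check its status?"),
    ("How do I reset my password?",
     "No problem: use the 'Forgot password' link on the login page, and I can walk you through it."),
]

def to_jsonl(pairs, system_prompt):
    """Serialize (user, assistant) pairs as one JSON object per line."""
    lines = []
    for user_msg, assistant_msg in pairs:
        record = {
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_msg},
                {"role": "assistant", "content": assistant_msg},
            ]
        }
        lines.append(json.dumps(record))
    return "\n".join(lines)

jsonl_text = to_jsonl(pairs, SYSTEM_PROMPT)
with open("train.jsonl", "w", encoding="utf-8") as f:
    f.write(jsonl_text + "\n")

# Every line must round-trip as standalone JSON -- the defining property
# of the JSONL format mentioned above.
for line in jsonl_text.splitlines():
    assert json.loads(line)["messages"][0]["role"] == "system"
```

Note that the system prompt is baked into every training example, so the fine-tuned model consistently adopts the desired tone; the same prompt would typically also be sent at inference time.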
Media Credit: Mark Gadala-Maria

Filed Under: AI, Top News

As AI Still Dominates Silicon Valley Funding, Emphasis On Real-World Impact Grows

Forbes

02-05-2025

  • Business


Arthur Mouratov, the Founder of Silicon Valley Investclub.

Silicon Valley unicorns delivered a remarkable performance in Q1 2025, continuing their upward trajectory and once again outpacing traditional market indices. The Silicon Valley Unicorn Index, a proprietary index based on our internal analysis of valuation trends across privately held unicorn companies in the region, surged by 14.18%, in stark contrast to declines in the Dow Jones, NASDAQ and S&P 500. This reaffirms the region's dominance in innovation-led growth. Fueled by major advancements in artificial intelligence and sustained investor enthusiasm, these high-growth companies are setting new standards for speed, scale and ambition in technological development.

According to our data, in Q1 2025, Silicon Valley-based unicorns raised a total of $52.97 billion across numerous high-profile funding events. The artificial intelligence sector remained the primary driver, accounting for the overwhelming majority of capital raised and reaffirming its position as the foundation of Silicon Valley's innovation engine. Artificial intelligence dominated the quarter with more than $51 billion raised, led by landmark funding rounds from companies including Databricks and Together AI. The scale of investment reflects growing confidence in artificial intelligence's ability to disrupt traditional workflows, automate complex processes and enable new business models across industries. Innovation in this area is moving beyond research and development, demonstrating clear scalability in real-world enterprise environments.

While modest in comparison, the enterprise software, fintech and infrastructure sectors continue to attract steady investor interest. Companies such as Mercury, Zeta and Verkada illustrate how innovation is expanding into essential areas including digital banking and physical security.
These firms are modernizing legacy systems through automation, data integration and cloud-native platforms, making critical services smarter, faster and more secure. This wave of funding reinforces the broader narrative that Silicon Valley is not only producing disruptive technologies but also bringing them to market at an unprecedented pace. The variety of unicorns gaining traction suggests a well-rounded ecosystem where artificial intelligence is a central driver, yet also an enabler of innovation in other high-impact sectors.

Q1 2025 saw several standout companies reach unicorn status, reflecting strong momentum in sectors where artificial intelligence is being applied to real-world problems with measurable impact. In healthcare, Hippocratic AI raised $141 million, reaching a valuation of $1.64 billion. The company develops safety-oriented language models for non-diagnostic tasks such as care navigation, highlighting how AI is being tailored to sensitive, high-trust environments. Its approach supports, rather than replaces, clinical expertise, a direction gaining traction in the health tech space. In the recruitment space, Mercor reached a $2 billion valuation following a $100 million raise. The platform uses AI to match job seekers to employers through a single, streamlined hiring process, an example of how automation is helping eliminate inefficiencies in legacy hiring systems.

In early Q2, Silicon Valley's unicorn ecosystem shows no signs of slowing down. The continued surge in funding, particularly in artificial intelligence, robotics, healthcare and digital infrastructure, highlights the region's position at the forefront of global technological transformation. AI is no longer confined to research labs or early-stage pilots. It has become an integral part of enterprise operations and consumer applications alike. There's a growing emphasis on real-world impact, where technological innovation is measured by its ability to solve concrete problems.
What's notable now is the scale and sophistication of deployment. Many startups are delivering mature, AI-powered products that are solving complex problems in sectors such as healthcare, finance and logistics.

The growing emphasis on real-world impact is raising expectations for startups. Investors are increasingly prioritizing companies that can demonstrate practical applications and early signs of market traction, rather than relying solely on promising technologies or future potential. As a result, startups may need to sharpen their focus on building viable products, clearly articulating their unique value proposition and outlining a credible path to scale. In the crowded AI market, practical execution has become just as critical as technological innovation.

The emergence of unicorns across diverse verticals also reflects a broader shift toward human-centered design. From intelligent hiring platforms to assistive robotics, new technologies are being developed with usability, accessibility and day-to-day relevance at their core. While the pace of change remains rapid, the direction is increasingly clear: Innovation is moving toward scalable, real-world impact. For companies, this means aligning technological development with practical use cases. For ecosystems like Silicon Valley, it means continuing to foster environments where cross-disciplinary talent, infrastructure and long-term vision can turn breakthrough ideas into everyday tools.

The information provided here is not investment, tax or financial advice. You should consult with a licensed professional for advice concerning your specific situation.

Startup Together AI Valued at $3.3 Billion as AI Demand Grows

Yahoo

20-02-2025

  • Business


(Bloomberg) -- Startup Together AI, which provides users access to artificial intelligence computing, has raised $305 million from investors in a deal led by General Catalyst, bringing the valuation of the company to $3.3 billion, a signal that corporate demand for AI is still strong.

Saudi Arabia's Prosperity7 Ventures co-led the round, which also included Salesforce Ventures, Nvidia Corp., Kleiner Perkins, Emergence Capital, Lux Capital and other investors. Together AI's platform allows developers to access open source AI models and the compute power necessary to build AI applications. 'We provide a very end-to-end service,' Chief Executive Officer Vipul Ved Prakash said. 'This includes data centers and computing clusters from Nvidia, and the software layers and platform services.' The result allows Together's customers to 'get up and running really quickly,' he said.

As demand for AI grows in the corporate world, Together AI is seeing rapid growth, the company said. The startup recently surpassed $100 million in annualized revenue, a jump from the $30 million in annualized revenue it had in February 2024, according to a person familiar with the matter who asked not to be identified discussing private information. The Information previously reported some details of the new funding.

Customers can use more than 200 open source models via Together AI's platform, including Meta Platforms Inc.'s Llama and those from Chinese AI startup DeepSeek. Nvidia is both an investor and a partner for Together AI, which has built its service on Nvidia hardware, Prakash said. The startup plans to deploy Nvidia's Blackwell GPUs in its data centers.
Together AI's latest valuation is a significant leap from its $1.25 billion valuation in March 2024. Prakash said the company plans to use the new funding in part to double its workforce of 160 by the end of 2025.

©2025 Bloomberg L.P.
