
Latest news with #VikramGupta

Thyrocare Tech spurts as Q1 PAT jumps 61% YoY to Rs 39 cr

Business Standard

24-07-2025



Thyrocare Technologies surged 11.30% to Rs 1,340.55 after the healthcare service provider reported a 61.07% increase in consolidated net profit to Rs 38.93 crore on a 23.02% rise in revenue from operations to Rs 193.03 crore in Q1 FY26 over Q1 FY25.

Profit before tax stood at Rs 50.48 crore in Q1 FY26, registering a growth of 50.46% from Rs 33.55 crore recorded in Q1 FY25. Total expenses rose 16.45% YoY to Rs 147.45 crore during the quarter. Employee benefits expenses stood at Rs 33 crore (up 14.9% YoY), while the cost of materials consumed was Rs 55.41 crore (up 24.1% YoY) during the period under review.

The company's revenue from Diagnostic Testing Services was Rs 178.33 crore (up 24.52% YoY), while revenue from Imaging Services stood at Rs 14.04 crore (up 7.67% YoY) in Q1 FY26. Reported EBITDA improved 37% YoY to Rs 57.46 crore in the June 2025 quarter from Rs 42.01 crore in the corresponding period the previous year. EBITDA margin rose to 30% in Q1 FY26 from 27% in the corresponding quarter of the last fiscal year.

Meanwhile, the company announced the appointment of Vikram Gupta as its chief financial officer (CFO), effective 24 July 2025.

Thyrocare Technologies operates in the healthcare industry, providing quality diagnostic services at affordable costs to patients, laboratories, and hospitals in India.
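As a quick sanity check, the reported growth rates and margin follow directly from the absolute figures quoted above; a short illustrative computation (figures from the article, the helper function is just for demonstration):

```python
# Cross-check the YoY growth and margin figures reported for Q1 FY26.
# All crore values are taken from the article above.

def yoy_growth(current, prior):
    """Year-over-year growth in percent."""
    return 100 * (current / prior - 1)

ebitda_q1_fy26 = 57.46    # Rs crore, June 2025 quarter
ebitda_q1_fy25 = 42.01    # Rs crore, June 2024 quarter
revenue_q1_fy26 = 193.03  # Rs crore, revenue from operations
pbt_q1_fy26, pbt_q1_fy25 = 50.48, 33.55  # Rs crore, profit before tax

print(round(yoy_growth(ebitda_q1_fy26, ebitda_q1_fy25)))  # 37 (% EBITDA growth)
print(round(100 * ebitda_q1_fy26 / revenue_q1_fy26))      # 30 (% EBITDA margin)
print(round(yoy_growth(pbt_q1_fy26, pbt_q1_fy25), 2))     # 50.46 (% PBT growth)
```

The computed values match the 37% EBITDA growth, 30% margin, and 50.46% PBT growth the company reported.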

How Federated Learning Moves AI Closer To The Edge

Forbes

16-07-2025



Vikram Gupta is Chief Product Officer, SVP & GM of the IoT Processor Business Division at Synaptics, a leading EdgeAI semiconductor company.

In my previous articles, I explored how the rapid growth of edge AI is calling for a new class of AI-native compute platforms, and how multimodal sensing, including vision and audio, is enabling more intuitive, context-aware user experiences. These trends mark a decisive shift from centralized cloud processing to intelligent, personalized and privacy-conscious computing at the edge. Building on that foundation, the next frontier is not just how AI runs at the edge, but how it learns and evolves there. This is where federated machine learning (FML) enters the picture.

Despite the ubiquity of AI in our everyday lives, the vast majority of AI model development and processing still happens in the cloud, often far away from where we interact with it. This approach has served us well, with centralized and powerful compute engines doing the heavy lifting involved in collecting data and training sophisticated learning models. But as AI proliferates and billions of connected devices generate data at the edge, traditional centralized model training is becoming increasingly impractical, constrained by privacy concerns, regulatory pressures and latency limitations.

At the same time, the push for more personalized, context-aware experiences is accelerating AI processing toward a fragmented landscape of "far edge" devices, such as smartwatches, wearables and industrial sensors, that rely on real-time, local understanding of their environments. This transformation is not only enabling today's intelligent experiences but also laying the groundwork for an entirely new class of cloud-agnostic, AI-driven applications, many of which we have yet to imagine, that will operate independently and become seamlessly woven into the fabric of everyday life. This shift has given rise to the new paradigm known as federated machine learning.

By enabling localized intelligence directly on the devices where data is created, FML introduces a wider range of more private, personalized and responsive alternatives to cloud-centric models. Realizing this vision means evolving the AI ecosystem from system architecture and silicon design to software tooling. It also extends into how data is collected, used and protected. The need for centralized computing resources won't necessarily go away, but a more federated future is driving this broader range of synergistic processing approaches.

One size does not fit all in the world of edge AI. Meeting the need for broader and more diverse deployment of AI, the rise of FML allows systems to become progressively more intelligent and autonomous by using on-device data and sharing only encrypted model updates. Devices that can benefit from FML are increasingly present in our everyday lives, such as smart home assistants learning speech patterns locally, wearables monitoring health metrics without cloud sync and industrial machines predicting failure based on their unique deployment environments.

Recent announcements from companies like Google and OpenAI point to a future where AI is moving beyond phones into a new generation of devices. The evolution of devices such as extended reality (XR) wearables raises questions: Do these devices need a cloud connection? A phone tether? Or can they operate independently, or even coordinate locally through a hub? FML introduces the idea of processing zones, which could range from on-device to near-edge aggregation hubs or the centralized cloud. This transition to the future of edge AI depends on flexible, multitiered intelligence.

Ecosystem Complexity At The Edge: Fragmentation, Tooling And Hardware Diversity

The adoption of AI at the edge is not without some unique challenges. Unlike the relatively structured centralized processing model of the current data center-centric approach, the edge is messy. It features different operating systems (such as RTOS, Linux and Android variants with proprietary firmware), chip architectures (such as Arm, RISC-V and x86) and AI toolkits. On top of that, many devices lack the optimal processing, memory or power for robust on-device inference, let alone training. Tooling for deploying and updating models is fragmented, particularly at scale. FML doesn't scale unless tools and hardware converge around modularity, efficiency and openness.

The Chip Supplier's New Role: A Scalable, Neutral Enabler

As this federated future of AI unfolds, success will hinge on delivering flexible, scalable solutions that span silicon, software and tools capable of adapting to diverse devices and dynamic ecosystems. By embracing openness, efficiency and intelligent decentralization, companies can unlock the full potential of edge AI. The shift toward distributed intelligence is redefining how we interact with technology, making it more private, responsive and relevant to real-world environments. Real progress in edge AI depends on open-source tools, accessible frameworks and broad collaboration across the ecosystem. By focusing on practical solutions and inclusive innovation, this transformation can bring smarter experiences closer to where they matter most.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?
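The FML loop the article describes (local on-device training, with only model updates shared for server-side aggregation) can be sketched in a few lines. This is a minimal illustrative simulation, not any vendor's implementation: the toy 1-D linear model, the client data, and the function names are all assumptions, and encryption of the updates is omitted for brevity.

```python
# Minimal federated-averaging sketch. Each simulated device trains a toy
# linear model y = w * x on its own private data and shares only its
# updated weight; raw data never leaves the "device". Secure aggregation,
# real models, and client sampling are omitted for brevity.

def local_update(w, data, lr=0.1):
    # One gradient-descent step on the device's local squared-error loss.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_average(weights):
    # Server-side aggregation: average the model updates, never the data.
    return sum(weights) / len(weights)

# Three edge devices, each holding private samples drawn from y ≈ 2x.
clients = [[(1, 2), (2, 4)], [(3, 6)], [(1, 2.1), (4, 8)]]
w = 0.0
for _ in range(50):  # communication rounds
    w = federated_average([local_update(w, data) for data in clients])

print(round(w, 2))  # → 2.0, close to the true slope shared by all clients
```

In a real deployment the averaging step would run on a near-edge aggregation hub or coordination server (one of the "processing zones" above), and secure aggregation or encryption would protect the individual updates in transit.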

Inside IvyCap's Tech Playbook

Entrepreneur

28-06-2025



Opinions expressed by Entrepreneur contributors are their own. You're reading Entrepreneur India, an international franchise of Entrepreneur Media.

IvyCap is a homegrown venture capital firm, and at the helm of it is founder and managing partner Vikram Gupta, whose tech-focused investment thesis is making careful bets in artificial intelligence (AI), healthtech, and other deep tech sectors. The firm's origin is unique: a venture capital fund backed by Indian institutional money and a strong IIT alumni trust. Through this, IvyCap is scaling up capital with the idea of investing in very early-stage startups. "And these new age technologies, which are in very early stages in their TRL (technology readiness level) 1, 2, 3 stages, to helping them build a very unique kind of technology stacks. And we've been quite fortunate to have made a lot of room and a lot of progress there," says Gupta.

Gupta says the idea is to help provide grants to such disruptive technologies. The firm is going into the deep pockets of all the IITs, IIMs, ISB, Indian Institute of Science, and others, leveraging these technologies and helping build centers of excellence. "That's how we can spot these technologies early on and even fund them for commercialization or even scale up."

The VC firm sees huge opportunities across deep tech and emerging tech areas. According to Gupta, there are quite a few examples of disruption; for example, AI, being an overarching theme, has multiple areas, including vertical AI, horizontal AI, and infrastructure AI. "Vertical AI is catering to sectors such as financial services, health care, or insurance. That is a unique opportunity building up. Horizontal AI, on the other hand, is building a lot of agentic AI tools and other AI disruptive technologies, which are catering to various models. And the infrastructure AI is working towards creating a lot of hardware support systems, like semiconductors, and setting up data centers and other things, which support the AI processing. So we are investing across all three areas," says Gupta.

Gupta adds that the firm is also looking at opportunities across areas such as space tech, defence tech, IoT devices, and other similar hardware technologies. "In addition, there are opportunities in blockchain and other areas as well. So we are going deeper in each of these areas through our collaborations with the IITs and identifying specific talent across each of these verticals and horizontals."

Gupta believes that India is sitting at a very unique place to leverage these areas, and a lot of talent is now getting involved. "And with our funding and a lot of grant capital that we are building as a large pool of capital are likely to build this further."

However, it is not easy to catch these technology trends early on; the firm looks at the people involved in solving specific problems. "We look for the passion they have in driving these technologies. We also look at the specific business models being targeted and the understanding of the commercial side in terms of the problem getting solved. And I think some of these ideas are very disruptive. So when they're very disruptive, the risk is also very high."

Sometimes these bets can become capital-intensive over a period of time, so the firm has to evaluate various factors. "But if you were to break it down across different buckets, so one bucket is the bucket of the entrepreneur and the team, which is looking at their backgrounds. The second bucket is the business model, geographical spread, etc. There are all those kinds of things that we evaluate from a business model perspective. And the third piece is about the scale potential in terms of how large this can be," says Gupta.

Factsheet:
  • Corpus size: INR 5,000 crore
  • Portfolio: 55 companies
  • Dragon exits: Purplle at INR 330 Cr

Home Equity Finance Study Reveals Growing Momentum in Originations, RMBS Issuance

Yahoo

19-05-2025



Full year and fourth quarter activity reviewed

LAGUNA BEACH, Calif., May 19, 2025 /PRNewswire/ -- Home Equity Lending News (HELN), the leading source for breaking news and statistics exclusively about home-equity finance, released the Q4 2024 Home Equity Finance Study, which shows origination growth last year. In addition, banks tightened their HELOC standards, even as RMBS issuers eased their requirements amid a potential doubling of securitization volume.

The study highlights how traditional banks' long-standing dominance in the home-equity lending space has eroded. "Although the HELOC market has historically been bank dominated and relatively insulated from competition, a growing list of non-bank HELOC lenders has clearly changed this paradigm over the last few years," HELN Director Vikram Gupta said. "Banks must proactively adapt if they intend to maintain competitive credibility within this increasingly dynamic segment."

HELOC and closed-end rates, yields and WACs have all tumbled, with credit union yields falling furthest. "Three years ago, I saw price compression at 100.50 to 101.50," HELN Director Ralph Armenta said. "Today, there is such a voracious appetite that I am seeing trades at the 105-106 handle, a 6-year life for both CES and HELOC, which are driving yield to around 8%. Time, patience, and investor demand have been kind to this asset."

Small financial institutions were responsible for a significant share of the growth in depository HELOC portfolios, while securitized loan performance improved, and internet searches for equity-sharing products outpaced those for lending products. RMBS issuance continued to gain momentum. HELN CEO Sam Garcia said, "So far in 2025, securitization volume has almost doubled compared to the same stretch last year. Given this accelerated start and the current deregulatory climate, it's conceivable that home-equity issuance could reach or even exceed 100 transactions for $35 billion by year-end."
The full Q4 2024 study is available to download for free at:

About Home Equity Lending News

HELN is the leading source of breaking news and statistics exclusively about home-equity finance, including loan products and equity-sharing products. Our editorial team writes for home-equity originators, servicers, investors and other stakeholders. HELN was founded in 2022 by its CEO, Sam Garcia. Read more about home-equity lending online at

Media Contact: editor@ | 949.773.9237

SOURCE Home Equity Lending News

Synaptics Extends Edge AI Portfolio with High-Performance Adaptive MCUs for Multimodal Context-Aware Computing

Associated Press

10-03-2025



NUREMBERG, Germany, March 10, 2025 (GLOBE NEWSWIRE) -- Synaptics® Incorporated (Nasdaq: SYNA) has extended its award-winning Synaptics Astra™ AI-Native platform with the SR-Series high-performance adaptive microcontroller units (MCUs) for scalable context-aware Edge AI. The series features three tiers of operation: performance (100 GOPS), efficiency, and ultra-low-power (ULP) always-on (AON), to deliver intelligence at every power level. Based on an Arm® Cortex®-M55 core and the Arm Ethos™-U55 neural processing unit (NPU), the SR-Series is supported by the Astra™ Machina Micro development kit and open-source SDK. It is optimized for multimodal consumer, enterprise, and industrial Internet of Things (IoT) workloads with accelerators and adaptive vision, audio, and voice algorithms. The small-form-factor MCUs have a rich set of peripherals, including multiple camera interfaces, to help minimize system cost, power, and footprint while enabling integration into a wide range of devices, such as battery-operated security cameras, sensors, appliances, point-of-sale, digital signage, and scanners.

At EW2025? Join us in Booth #4A-259 to learn about our advances in Edge AI, wireless connectivity, and automotive display technologies. Email [email protected] for an appointment.

Announced at EW2024 with the SL-Series MPUs, the Synaptics Astra AI-Native compute platform for the IoT combines scalable, low-power compute silicon for the device Edge with open-source, easy-to-use software and tools and Veros™ wireless connectivity. The platform was built upon Synaptics' foundation in neural networks, field-hardened AI hardware and compiler design expertise for the IoT, and refined, in-house support of a broad base of modalities.
"We believe we are at an inflection point in Edge AI where embedded developers have a unique opportunity to redefine human-machine interaction through multimodal processing and contextual awareness," said Vikram Gupta, Senior Vice President and General Manager of IoT Processors, Chief Product Officer at Synaptics. "Unlocking this potential requires a new class of embedded compute silicon. As part of our Astra family, the SR-Series extends our Edge AI processing roadmap with intelligence optimized for various power levels. It allows the development of cognitive IoT devices that seamlessly adapt to their surroundings, from ultra-low-power always-on sensing to high-performance edge inference."

"Enabling ultra-low-power AI processing at the edge will revolutionize emerging applications across various markets, including retail and smart home, where we are seeing greater performance demands," said Paul Williamson, Senior Vice President and General Manager, IoT Line of Business at Arm. "With the new SR-Series, built on the Arm compute platform, Synaptics is delivering the real-time intelligence and innovation needed to scale edge AI deployments."

"Meeting the needs of future intelligent edge devices requires solutions capable of multimodal processing to achieve situational awareness," said Jim McGregor, Principal Analyst at TIRIAS Research. "Synaptics' SR-Series of AI MCUs offers a scalable solution that maximizes IoT device awareness while simplifying integration and optimizing power and performance."

Technical highlights

The SR-Series comprises three MCUs, the SR110, SR105, and SR102, each with its respective features and benefits that cater to a range of multimodal application requirements. All three MCUs use a Cortex-M55 core with Arm Helium™ technology running up to 400 MHz. The SR110 also has a Cortex-M4 core and an Arm Ethos-U55 NPU and is sampling now; the SR105 has an Ethos-U55 NPU; and the SR102 is a single Cortex-M55 device.
Other SR-Series features include:

The Machina Micro kit maintains Astra's signature "out-of-the-box" AI development experience for beginners and experts alike.

For more information:

About Synaptics Incorporated

Synaptics (Nasdaq: SYNA) is driving innovation in AI at the Edge, bringing AI closer to end users and transforming how we engage with intelligent connected devices, whether at home, at work, or on the move. As a go-to partner for forward-thinking product innovators, Synaptics powers the future with its cutting-edge Synaptics Astra™ AI-Native embedded compute, Veros™ wireless connectivity, and multimodal sensing solutions. We're making the digital experience smarter, faster, more intuitive, secure, and seamless. From touch, display, and biometrics to AI-driven wireless connectivity, video, vision, audio, speech, and security processing, Synaptics is the force behind the next generation of technology enhancing how we live, work, and play. Follow Synaptics on LinkedIn, X, and Facebook, or visit

Synaptics and the Synaptics logo are trademarks of Synaptics in the United States and/or other countries. All other marks are the property of their respective owners.

For further information, please contact:

Media Contact
Patrick Mannion
Synaptics
+1-631-678-1015
