Latest news with #Kwak Noh-jung


Korea Herald | Business | 4 days ago
SK hynix heads to Silicon Valley to recruit AI talent
SK hynix is heading to Silicon Valley this week to host its annual SK Global Forum, aimed at recruiting top-tier tech talent for artificial intelligence development, the company said Thursday.

The world's leading memory chipmaker is hosting the three-day forum from Friday in Santa Clara, California, bringing together company executives and US-based engineers, researchers and graduate students to discuss the future of memory technology.

"It is time for us to strengthen our capabilities in computing system architecture as we expand our presence in the AI memory ecosystem," the company said. "We established a new session for system architecture in our forum to engage with experts in this field."

SK hynix said CEO Kwak Noh-jung will lead the forum, delivering the keynote to outline the company's strategy for navigating the rapidly evolving AI landscape. Other top executives, including Kim Joo-sun, the president in charge of AI infrastructure; Ahn Hyun, the chief development officer; and Chief Technology Officer Cha Seon-yong, will also join the event to meet with invitees and lead discussions on memory innovation and future product roadmaps, the company said.

In a first for the forum, the chipmaker has set up a separate area for participants showcasing its core technologies for AI data centers and on-device AI solutions, including high bandwidth memory, enterprise SSDs and the LPCAMM2 module.

"We have been strengthening our technological competitive edge by recruiting excellent talent through the global forum," said Shin Sang-kyu, head of Corporate Culture at SK hynix. "We will continue our 'Renaissance journey' to secure talent to lead the AI era."

The annual event has become a key part of the company's global talent strategy since its launch in 2012.


Korea Herald | Business | April 24, 2025
SK hynix Q1 net profit shoots up on HBM, beats market expectations
SK hynix Inc. said Thursday its first-quarter net profit more than quadrupled on rising demand for artificial intelligence chips, including high bandwidth memory, exceeding market expectations.

The world's second-largest memory chipmaker said in a regulatory filing that its net income reached 8.1 trillion won ($5.7 billion) for the January-March period, up from 1.92 trillion won a year earlier. Its operating income soared 157.8 percent on-year to 7.44 trillion won for the quarter, compared with 2.88 trillion won a year ago. Revenue rose 41.9 percent to 17.63 trillion won. Both operating profit and sales marked the company's second-best quarterly results, following the record highs in the fourth quarter of last year.

The earnings exceeded market expectations: the average estimate of net profit by analysts stood at 5.48 trillion won, according to a survey by Yonhap Infomax, the financial data firm of Yonhap News Agency.

SK hynix attributed the better-than-expected bottom line to strong demand for AI memory, which has solidified its status as a world leader in HBM technology. "The memory market ramped up faster than expected due to competition to develop AI systems and inventory accumulation demand," SK hynix said.

SK hynix said its annual HBM sales for 2025 are expected to double from those of 2024, with sales orders for 2025 already secured. Chief Executive Officer Kwak Noh-jung previously said the company has sold out this year's HBM production, supplying its 12-layer HBM3E product to major customers, including Nvidia Corp. Sales of 12-layer HBM3E, currently the most advanced HBM in mass production, are expected to account for over half of its total HBM3E sales in the second quarter. For the sixth-generation HBM4 chips, SK hynix plans to complete preparations for mass production by the end of this year.

"SK hynix will focus on products with demand feasibility and profitability to enhance investment efficiency," said Kim Woo-hyun, chief financial officer at SK hynix. "As an AI memory leader, we will strengthen collaboration with partners and carry out technological innovation in efforts to continue profit growth with industry-leading competitiveness." (Yonhap)


Korea Herald | Business | March 27, 2025
SK hynix: HBM chips for 2026 to sell out in first half
SK hynix plans to finalize next year's supply agreements for AI-critical high-bandwidth memory chips in the first half of this year, according to CEO Kwak Noh-jung.

"This year's HBM supply volume has already sold out, and we plan to finalize consultations with customers for the 2026 volume in the first half of this year," Kwak said at the company's general shareholders' meeting held at its headquarters in Icheon, Gyeonggi Province, on Thursday. Kwak added that due to the long production cycle and high investment costs associated with HBM, the company opts for advance volume agreements with clients to enhance sales predictability.

SK hynix, the world's second-largest memory chipmaker after Samsung, leads the HBM market, supplying fifth-generation 12-layer HBM3E chips to tech giant Nvidia. HBM is a critical component of the graphics processing units that power generative AI systems like ChatGPT. The chipmaker said last week that it has shipped samples of the next-generation 12-layer HBM4 to its major customers, with plans for mass production in the second half of this year.

At the meeting, Kwak expressed confidence in the continued growth of the AI chip market despite a prolonged economic slump. "Uncertainty is high with the continued downturn in the global economic growth outlook, but big tech companies are ramping up investment to secure leadership in AI," he said. "With the increase in graphics processing units and application-specific integrated circuits, we expect an explosive rise in HBM demand." He added that the industry expects the HBM market to expand 8.8-fold and the enterprise solid-state drive market to grow approximately 3.5-fold by 2025, compared to 2023.

Kwak played down concerns that low-cost AI models like DeepSeek could reduce demand for high-performance AI memory chips such as HBM4. "I don't see DeepSeek reducing demand for HBM," he said. "With the emergence of AI models like DeepSeek, the entry of new startups into the market will accelerate, and as high-quality AI services increase, demand for AI chips will increase more quickly."

Kwak also emphasized the company's flexibility in managing HBM production, particularly between HBM3E and HBM4. "Since both products use the same DRAM platform, we can respond flexibly based on demand," he said. "We will continue close consultations with our customers leading up to the mass production of HBM4 in the second half of the year."

He also unveiled plans to begin mass production of the small outline compression attached memory module, or SOCAMM, a DRAM-based memory module tailored to AI servers and data centers. "We are working with key customers to proactively respond to expected demand growth in the SOCAMM market for AI servers," said Kwak. "Development is underway with the goal of mass production this year."

At the meeting on Thursday, Kwak was reappointed as an internal director, and SK Square CEO Han Myung-jin was appointed as a new external director.


Korea Herald | Business | March 19, 2025
SK hynix leads AI chip race with early HBM4 shipments
SK hynix, the world's second-largest memory chipmaker, said Wednesday it has unveiled a sample of its next-generation high-bandwidth memory chips during GTC 2025, an annual tech conference hosted by US chip giant Nvidia.

At its booth, titled "Memory, Powering AI and Tomorrow," the company is presenting HBM, memory products for AI data centers, on-device memory and memory solutions for automotive businesses through Friday in San Jose, California. Among its featured products is the 12-layer HBM3E, currently the most advanced HBM in mass production. The company is also introducing a prototype of the next-generation 12-layer HBM4, which is still under development, as well as a small outline compression attached memory module, a low-power DRAM-based memory module optimized for AI servers.

SK hynix aims to expand mass production of 12-layer HBM3E chips this year while preparing for the production of 16-layer HBM3E chips in the first half of this year. The company is also gearing up for mass production of 12-layer HBM4 chips in the second half of this year, with supply expected to begin in alignment with customer demand.

Key SK hynix executives, including CEO Kwak Noh-jung and Kim Ju-seon, head of AI infrastructure and chief marketing officer, are set to meet with leaders of the global AI industry during GTC 2025 to enhance collaboration. "We are proud to present our lineup of industry-leading products at GTC 2025," Kim said. "With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the full-stack AI memory provider forward."

Meanwhile, the memory chipmaker said Wednesday that it has shipped samples of the world's first 12-layer HBM4 to its major customers. "We have shipped 12-layer HBM4 samples earlier than originally planned and initiated the certification process with our customers. We'll also complete mass production preparations within the second half of the year, solidifying our position in the next-generation AI memory market," an SK hynix official said. While the company did not disclose its customers, major US tech firms, including Nvidia and Broadcom, are believed to be among them.

Currently, SK hynix leads the HBM market with its industry-leading HBM3E products. HBM plays a critical role in graphics processing units, a market largely dominated by Nvidia. Its crosstown rival Samsung Electronics aims to begin mass production of HBM4 in the second half of this year, while US-based Micron Technology has set a goal of mass-producing HBM4 within two years. Micron, in particular, is accelerating efforts to strengthen its HBM4 capabilities ahead of its planned release by enhancing collaboration with TSMC, the world's leading foundry company, including appointing former TSMC Chairman Mark Liu to its board of directors.


Korea Herald | Automotive | March 19, 2025
SK hynix unveils next-generation HBM at GTC 2025
SK hynix Inc. said Wednesday it has unveiled a sample of its next-generation high bandwidth memory for the first time during GTC 2025, an annual tech conference hosted by Nvidia Corp.

The South Korean chip giant opened a booth titled "Memory, Powering AI and Tomorrow" at GTC 2025, which kicked off Monday for a five-day run in California, according to SK hynix. The showcase highlights SK hynix's HBM and other memory products for artificial intelligence data centers, on-device memory and memory solutions for the automotive business that are essential for the AI era.

Among its featured products is the 12-layer HBM3E, currently the most advanced HBM in mass production. The company is also showcasing, for the first time, a prototype of the next-generation 12-layer HBM4, which is still under development, as well as a small outline compression attached memory module, a low-power DRAM-based memory module optimized for AI servers.

Following the industry's first mass production and supply of the 12-layer HBM3E, SK hynix now plans to complete preparatory work for large-scale production of the HBM4 within the second half of the year.

Meanwhile, SK hynix's top executives, led by Chief Executive Officer Kwak Noh-jung, are set to meet with global AI industry leaders during GTC 2025 to strengthen collaboration and explore future opportunities in AI-driven memory technology. (Yonhap)