SK hynix to Unveil ‘Full Stack AI Memory Provider’ Vision at CES 2025

SK hynix will showcase its technological capabilities at CES 2025, the world's largest consumer electronics show, from January 7-10, featuring a wide range of products driving the AI era, from HBM, the core of AI infrastructure, to next-generation memories such as PIM. The company will present new possibilities in the AI era through technological innovation and provide irreplaceable value.

SK hynix Inc. (or “the company”, www.skhynix.com) announced today that it will showcase its innovative AI memory technologies at CES 2025, to be held in Las Vegas from January 7 to 10 (local time).

A large number of C-level executives, including CEO Kwak Noh-Jung, Chief Marketing Officer (CMO) Justin Kim and Chief Development Officer (CDO) Ahn Hyun, will attend the event. “We will broadly introduce solutions optimized for on-device AI and next-generation AI memories, as well as representative AI memory products such as HBM and eSSD at this CES,” said Justin Kim. “Through this, we will publicize our technological competitiveness to prepare for the future as a ‘Full Stack AI Memory Provider1’.”


1Full Stack AI Memory Provider: Refers to an all-round AI memory provider, which provides comprehensive AI-related memory products and technologies

SK hynix will also run a joint exhibition booth with SK Telecom, SKC and SK Enmove, under the theme “Innovative AI, Sustainable Tomorrow.” The booth will showcase how SK Group’s AI infrastructure and services are transforming the world, represented in waves of light.

SK hynix, the first in the world to produce 12-layer products of HBM3E, the 5th generation of HBM, and supply them to customers, will showcase samples of its 16-layer HBM3E product, whose development was officially announced in November last year. This product uses the advanced MR-MUF process to achieve the industry’s highest 16-layer configuration while controlling chip warpage and maximizing heat dissipation performance.

In addition, the company will display high-capacity, high-performance enterprise SSD products, including the ‘D5-P5336’ 122TB model developed by its subsidiary Solidigm in November last year. This product, which offers the largest capacity currently available along with high power and space efficiency, has been attracting considerable interest from AI data center customers.

“As SK hynix succeeded in developing QLC2 (Quadruple Level Cell)-based 61TB products in December, we expect to maximize synergy based on a balanced portfolio between the two companies in the high-capacity eSSD market,” said Ahn Hyun, CDO at SK hynix.

The company will also showcase on-device AI products such as ‘LPCAMM23‘ and ‘ZUFS 4.04,’ which improve data processing speed and power efficiency to implement AI in edge devices like PCs and smartphones. The company will also present CXL and PIM (Processing in Memory) technologies, along with their modularized versions, CMM(CXL Memory Module)-Ax and AiMX5, designed to be core infrastructures for next-generation data centers.

2QLC: NAND flash is divided into SLC (Single Level Cell), MLC (Multi Level Cell), TLC (Triple Level Cell), QLC (Quadruple Level Cell), and PLC (Penta Level Cell) depending on how much information is stored in one cell. As the amount of information stored increases, more data can be stored in the same area.
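The footnote's point, that each step up in cell type stores one more bit per cell and so packs more data into the same number of cells, can be illustrated with a minimal sketch (the function name and trillion-cell unit are illustrative, not from the release):

```python
# Bits stored per cell for each NAND flash cell type listed in the footnote.
BITS_PER_CELL = {"SLC": 1, "MLC": 2, "TLC": 3, "QLC": 4, "PLC": 5}

def cells_required(capacity_tb: float, cell_type: str) -> float:
    """Cells (in trillions) needed to store capacity_tb terabytes of data."""
    terabits = capacity_tb * 8  # 1 terabyte = 8 terabits
    return terabits / BITS_PER_CELL[cell_type]

# A 122TB drive needs half as many QLC cells as it would MLC cells.
print(cells_required(122, "QLC"))  # 244.0 trillion cells
print(cells_required(122, "MLC"))  # 488.0 trillion cells
```

The same area therefore holds twice the data when moving from MLC to QLC, which is why QLC underpins the high-capacity eSSD products mentioned above.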

3Low Power Compression Attached Memory Module 2 (LPCAMM2): An LPDDR5X-based module solution that provides power efficiency, high performance and space savings. A single LPCAMM2 delivers performance equivalent to two conventional DDR5 SODIMMs.

4Zoned Universal Flash Storage (ZUFS): A NAND flash product that improves the efficiency of data management. The product optimizes data transfer between an operating system and storage devices by storing data with similar characteristics in the same zone of the UFS, a flash memory product for various electronic devices such as digital cameras and mobile phones.

5Accelerator-in-Memory based Accelerator (AiMX): SK hynix’s accelerator card product that specializes in large language models using GDDR6-AiM chips

In particular, CMM-Ax is a groundbreaking product that adds computational functionality to CXL’s advantage of expanding high-capacity memory, contributing to improved performance and energy efficiency of next-generation server platforms6.

6Platform: Refers to a computing system that integrates both hardware and software technologies. It includes all key components necessary for computing, such as the CPU and memory.

“The changes in the world triggered by AI are expected to accelerate further this year, and SK hynix will produce 6th generation HBM (HBM4) in the second half of this year to lead the customized HBM market to meet the diverse needs of customers,” said Kwak Noh-Jung, CEO at SK hynix. “We will continue to do our best to present new possibilities in the AI era through technological innovation and provide irreplaceable value to our customers.”

About SK hynix Inc.

SK hynix Inc., headquartered in Korea, is the world’s top-tier semiconductor supplier offering Dynamic Random Access Memory chips (“DRAM”), flash memory chips (“NAND flash”), and CMOS Image Sensors (“CIS”) for a wide range of distinguished customers globally. The Company’s shares are traded on the Korea Exchange, and the Global Depository shares are listed on the Luxembourg Stock Exchange. Further information about SK hynix is available at www.skhynix.com and news.skhynix.com.

SOURCE SK hynix Inc.: https://www.prnewswire.com/news-releases/sk-hynix-to-unveil-full-stack-ai-memory-provider-vision-at-ces-2025-302341613.html

