SK hynix to Unveil ‘Full Stack AI Memory Provider’ Vision at CES 2025

SK hynix to showcase its technological capabilities at CES 2025, the world's largest consumer electronics show, from January 7-10, featuring a wide range of products driving the AI era, from HBM, the core of AI infrastructure, to next-generation memories such as PIM. Company to present new possibilities in the AI era through technological innovation and provide irreplaceable value to customers.

SK hynix Inc. (or “the company”, www.skhynix.com) announced today that it will showcase its innovative AI memory technologies at CES 2025, to be held in Las Vegas from January 7 to 10 (local time).

A number of C-level executives, including CEO Kwak Noh-Jung, CMO (Chief Marketing Officer) Justin Kim, and CDO (Chief Development Officer) Ahn Hyun, will attend the event. “We will broadly introduce solutions optimized for on-device AI and next-generation AI memories, as well as representative AI memory products such as HBM and eSSD, at this CES,” said Justin Kim. “Through this, we will publicize our technological competitiveness to prepare for the future as a ‘Full Stack AI Memory Provider[1]’.”


[1] Full Stack AI Memory Provider: refers to an all-round AI memory provider that offers comprehensive AI-related memory products and technologies.

SK hynix will also run a joint exhibition booth with SK Telecom, SKC and SK Enmove, under the theme “Innovative AI, Sustainable Tomorrow.” The booth will showcase how SK Group’s AI infrastructure and services are transforming the world, represented in waves of light.

SK hynix, the first in the world to produce 12-layer products of the 5th generation HBM and supply them to customers, will showcase samples of its HBM3E 16-layer product, which it officially developed in November last year. This product uses the advanced MR-MUF process to achieve the industry’s highest 16-layer configuration while controlling chip warpage and maximizing heat dissipation performance.

In addition, the company will display high-capacity, high-performance enterprise SSD products, including the ‘D5-P5336’ 122TB model developed by its subsidiary Solidigm in November last year. This product, which offers the largest capacity currently available along with high power and space efficiency, has been attracting considerable interest from AI data center customers.

“As SK hynix succeeded in developing QLC[2] (Quadruple Level Cell)-based 61TB products in December, we expect to maximize synergy based on a balanced portfolio between the two companies in the high-capacity eSSD market,” said Ahn Hyun, CDO at SK hynix.

The company will also showcase on-device AI products such as ‘LPCAMM2’[3] and ‘ZUFS 4.0’[4], which improve data processing speed and power efficiency to implement AI in edge devices such as PCs and smartphones. It will also present CXL and PIM (Processing in Memory) technologies, along with their modularized versions, CMM (CXL Memory Module)-Ax and AiMX[5], designed to be core infrastructure for next-generation data centers.

[2] QLC: NAND flash is divided into SLC (Single Level Cell), MLC (Multi Level Cell), TLC (Triple Level Cell), QLC (Quadruple Level Cell), and PLC (Penta Level Cell) depending on how much information is stored in one cell (1, 2, 3, 4, and 5 bits, respectively). As the amount of information stored per cell increases, more data can be stored in the same area.

[3] Low Power Compression Attached Memory Module 2 (LPCAMM2): an LPDDR5X-based module solution that provides power efficiency and high performance as well as space savings. A single LPCAMM2 delivers the performance of two conventional DDR5 SODIMMs.

[4] Zoned Universal Flash Storage (ZUFS): a NAND flash product that improves the efficiency of data management. It optimizes data transfer between the operating system and the storage device by storing data with similar characteristics in the same zone of the UFS, a flash memory product used in electronic devices such as digital cameras and mobile phones.

[5] Accelerator-in-Memory based Accelerator (AiMX): SK hynix’s accelerator card product that specializes in large language models, using GDDR6-AiM chips.

In particular, CMM-Ax is a groundbreaking product that adds computational functionality to CXL’s strength of expanding high-capacity memory, contributing to improved performance and energy efficiency in next-generation server platforms[6].

[6] Platform: refers to a computing system that integrates both hardware and software technologies. It includes all key components necessary for computing, such as the CPU and memory.

“The changes in the world triggered by AI are expected to accelerate further this year, and SK hynix will produce 6th generation HBM (HBM4) in the second half of this year to lead the customized HBM market and meet the diverse needs of customers,” said Kwak Noh-Jung, CEO at SK hynix. “We will continue to do our best to present new possibilities in the AI era through technological innovation and provide irreplaceable value to our customers.”

About SK hynix Inc.

SK hynix Inc., headquartered in Korea, is the world’s top-tier semiconductor supplier offering Dynamic Random Access Memory chips (“DRAM”), flash memory chips (“NAND flash”), and CMOS Image Sensors (“CIS”) for a wide range of distinguished customers globally. The Company’s shares are traded on the Korea Exchange, and its Global Depository shares are listed on the Luxembourg Stock Exchange. Further information about SK hynix is available at www.skhynix.com and news.skhynix.com.

SOURCE SK hynix Inc.: https://www.prnewswire.com/news-releases/sk-hynix-to-unveil-full-stack-ai-memory-provider-vision-at-ces-2025-302341613.html

