SK hynix to Supply HBM4E Samples in Second Half Using 1c DRAM
Why It Matters
The move narrows SK hynix’s technology gap with Samsung and secures its position in the fast‑growing AI and high‑performance computing memory market.
Key Takeaways
- HBM4E samples from SK hynix arriving H2 2024 using 1c DRAM.
- Mass production of HBM4E slated for 2025, narrowing gap with Samsung.
- LPDDR6 to ship H2 2024, boosting smartphone speed 33% and efficiency 20%.
- 192 GB SOCAMM2 launched for Nvidia AI platform, doubling bandwidth, cutting energy 75%.
- SK hynix advancing CXL 3.0 modules and high‑bandwidth flash for AI workloads.
Pulse Analysis
The introduction of HBM4E marks a pivotal upgrade in the high‑bandwidth memory landscape. By leveraging its sixth‑generation 1c DRAM process, SK hynix can deliver higher data rates and lower power consumption than the older 1b‑based HBM4. This technical edge is crucial as AI accelerators and data‑center GPUs demand ever‑greater memory bandwidth. While Samsung has already been shipping 1c‑based HBM4, SK hynix’s sample rollout and planned 2025 mass production signal a competitive catch‑up that could diversify supply chains for major OEMs.
Beyond HBM, SK hynix is expanding its portfolio with LPDDR6 and a 192 GB SOCAMM2 module. The LPDDR6 chip promises a 33% speed boost and over 20% power savings versus LPDDR5X, positioning it for flagship smartphones launching later this year. Meanwhile, the SOCAMM2, optimized for Nvidia’s Vera Rubin AI platform, doubles bandwidth and improves energy efficiency by 75% compared with traditional server DIMMs. These products illustrate SK hynix’s strategy to capture both mobile and AI‑centric server markets, reinforcing its relevance across the computing stack.
Looking ahead, the company’s focus on Compute Express Link (CXL) 3.0 and high‑bandwidth flash reflects a broader shift toward memory‑centric architectures. CXL 3.0 promises greater capacity and lower latency, essential for heterogeneous workloads in cloud and edge AI. High‑bandwidth flash, built on a NAND‑in‑HBM‑style stack, aims to complement HBM’s limited capacity while delivering comparable speeds. Together, these initiatives position SK hynix as a one‑stop supplier for next‑generation memory solutions, a critical advantage as enterprises accelerate AI adoption and demand more flexible, high‑performance memory ecosystems.