SK hynix starts mass production of 192GB SOCAMM2 for NVIDIA AI servers

If you think AI progress is all about GPUs, you are missing half the story. Memory is quickly becoming the real choke point, and SK hynix seems eager to cash in on that. The company says it has kicked off mass production of a 192GB SOCAMM2 module built on its latest 1c nm LPDDR5X DRAM.

SK hynix and Sandisk want HBF to become the missing memory layer for AI inference

SK hynix and Sandisk are working to standardize HBF (High Bandwidth Flash) under the Open Compute Project, positioning it as a new memory layer between HBM and SSDs for AI inference workloads. The companies say HBF can improve scalability, power efficiency, and total cost of ownership as AI shifts from training to large-scale deployment.