High Bandwidth Memory (HBM) is the type of DRAM commonly used in data center GPUs such as NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips with an HBM interface. What ...
High-bandwidth memory becomes the choke point for AI accelerators as AMD plans Seoul meetings with Samsung and Naver, while supply forecasts, HBM4 roadmaps and packaging capacity reshape pricing power ...
As SK hynix leads and Samsung lags, Micron is positioning itself as a strong contender in the high-bandwidth memory market for generative AI. Micron Technology (Nasdaq: MU) has started shipping samples of ...
During the event, SK hynix showed off some of its AI memory products, including its new HBM3E 12-Hi stack memory which it started mass-producing in September, marking a significant milestone in the ...
JEDEC is still finalizing the HBM4 memory specifications, with Rambus teasing its next-gen HBM4 memory controller that will be prepared for next-gen AI and data center markets, continuing to expand ...
SK Hynix and Taiwan’s TSMC have established an ‘AI Semiconductor Alliance’. SK Hynix has emerged as a strong player in the high-bandwidth memory (HBM) market due to the generative artificial ...
High bandwidth memory (HBM) chips have become a game changer in artificial intelligence (AI) applications by efficiently handling complex algorithms with high memory requirements. They became a major ...
The Fourth GMIF2025 Innovation Summit (Global Memory Innovation Forum) recently wrapped up in Shenzhen. Themed "AI Applications, Innovation Empowered," GMIF2025 served as a gathering of leading ...
At the SK AI Summit 2025 in Seoul on November 3, 2025, SK Hynix CEO Kwak Noh-jung announced a major strategic overhaul, revealing plans to transform the South Korean memory maker from a traditional ...