2024-09-06
IT Home reported on September 6 that Micron stated in a technical blog published this month that its "production-capable" 12-high stacked HBM3E 36GB memory is now being shipped to key industry partners for qualification across the AI ecosystem.
Micron said that its 12-high stacked HBM3E offers 50% more capacity than its existing 8-high HBM3E products, allowing large AI models such as Llama-70B to run on a single processor and avoiding the latency incurred when a model must be split across multiple processors.
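The stated figures imply a simple per-die capacity. A minimal sketch of the arithmetic, assuming 3 GB (24 Gbit) DRAM dies, which is an inference consistent with the article's numbers rather than something Micron states here:

```python
# Back-of-the-envelope capacity math implied by the article.
# The 3 GB-per-die figure is an assumption; only the 36 GB total
# and the "50% more" claim come from the article.
GB_PER_DIE = 3  # assumed 24 Gbit DRAM dies

cap_8_high = 8 * GB_PER_DIE    # 24 GB, the existing 8-high HBM3E
cap_12_high = 12 * GB_PER_DIE  # 36 GB, as stated by Micron

uplift = (cap_12_high - cap_8_high) / cap_8_high
print(cap_12_high, f"{uplift:.0%}")  # 36 50%
```

Under that assumption, the 36 GB total and the 50% capacity uplift over a 24 GB 8-high stack fall out directly.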
Micron's 12-high HBM3E memory delivers an I/O pin rate of more than 9.2 Gb/s and more than 1.2 TB/s of memory bandwidth. Micron also claims the product consumes significantly less power than competing 8-high HBM3E parts.
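The bandwidth figure follows from the pin rate. A minimal sketch of the calculation, assuming the standard 1024-bit HBM stack interface width (the article does not state the pin count):

```python
# How >1.2 TB/s follows from a >9.2 Gb/s per-pin rate.
# The 1024-bit interface width is an assumption based on the
# standard HBM stack interface, not stated in the article.
PIN_RATE_GBPS = 9.2     # per-pin data rate, gigabits per second
INTERFACE_BITS = 1024   # assumed HBM stack interface width

bandwidth_gb_per_s = PIN_RATE_GBPS * INTERFACE_BITS / 8  # gigabytes/s
print(f"{bandwidth_gb_per_s / 1000:.2f} TB/s")  # 1.18 TB/s
```

At exactly 9.2 Gb/s per pin this gives about 1.18 TB/s, so any pin rate above that clears the "1.2+ TB/s" figure Micron quotes.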
HBM3E is not a standalone product; it is integrated into AI chip systems, which depends on cooperation among memory suppliers, chip customers, and OSAT (outsourced semiconductor assembly and test) companies. Micron is a partner member of TSMC's 3DFabric advanced packaging alliance.
Dan Kochpatcharin, head of TSMC's Ecosystem and Alliance Management Division, recently stated:
TSMC and Micron have maintained a long-term strategic partnership. As part of the OIP ecosystem, we work closely together to enable Micron's HBM3E-based system and CoWoS packaging designs to support our customers' AI innovations.