AI will generate increased demand for memory. This will drive demand for HBM as well as for new, emerging memory technologies.
At a hearing of the House Energy & Commerce Committee's Subcommittee on Communications and Technology, Ronnie Vasishta, senior VP of telecom at Nvidia, said that mobile networks will be called upon to support a new kind of traffic: AI traffic.
This AI traffic includes the delivery of AI services to the edge, as well as inferencing at the edge. Such growth in AI data could reverse the general trend toward slower traffic growth on mobile networks. Many AI-enabled applications will require mobile connectivity, including autonomous vehicles, smart glasses, generative AI services and many others. Vasishta said that the transmission of this massive increase in data needs to be resilient, fit for purpose, and secure.

Supporting this AI-driven creation of data will require large amounts of memory, particularly very high bandwidth memory such as HBM. Micron announced that it is now shipping HBM4 memory to key customers for early qualification efforts. Micron's HBM4 provides up to 2.0TB/s of bandwidth and 24GB of capacity per 12-high die stack. The company says its HBM4 uses its 1-beta DRAM node and advanced through-silicon-via technologies, and has a highly capable built-in self-test. See image below.

HBM memory, consisting of stacks of DRAM die with massively parallel interconnects that provide high bandwidth, is combined with GPUs such as those from Nvidia. Placing this memory close to the processor enables training and inference of various AI models. The current generation of GPUs uses HBM3e memory. At the March 2025 GTC in San Jose, Jensen Huang said that Micron HBM memory was being used in some of Nvidia's GPU platforms.

The manufacturers of HBM memory are SK Hynix, Samsung and Micron, with SK Hynix and Samsung providing the majority of supply and Micron coming in third. SK Hynix was the first to announce HBM memory, in 2013, and it was adopted as an industry standard by JEDEC that same year. Samsung followed in 2016, and in 2020 Micron said that it would create its own HBM memory.
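As a back-of-the-envelope check on the quoted HBM4 figures, the sketch below works through the arithmetic. The 2048-bit interface width and ~8 Gb/s per-pin rate are assumptions consistent with the 2.0TB/s stack bandwidth cited above, not figures from Micron's announcement:

```python
# Sanity check of quoted HBM4 stack figures (assumed parameters).
interface_width_bits = 2048   # assumed HBM4 interface width per stack
pin_rate_gbps = 8.0           # assumed per-pin data rate in Gb/s

# Bandwidth per stack: pins * rate, converted from Gb/s to GB/s.
stack_bandwidth_gbs = interface_width_bits * pin_rate_gbps / 8
print(f"Per-stack bandwidth: {stack_bandwidth_gbs / 1000:.1f} TB/s")

# Capacity: 24GB per 12-high stack implies 2GB (16Gb) per DRAM die.
dies_per_stack = 12
stack_capacity_gb = 24
per_die_gb = stack_capacity_gb / dies_per_stack
print(f"Per-die capacity: {per_die_gb:.0f} GB ({per_die_gb * 8:.0f} Gb)")
```

Under these assumptions the numbers line up: 2048 pins at 8 Gb/s yields roughly the 2.0TB/s per stack that Micron quotes, and a 24GB 12-high stack implies 16Gb DRAM die.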
All of these companies expect to be shipping HBM4 memories in volume sometime in 2026.

Numem, a company involved in magnetic random access memory (MRAM) applications, recently talked about how the traditional memories used in AI applications, such as DRAM and SRAM, have limitations in power, bandwidth and storage density. The company notes that processing performance has skyrocketed by 60,000X over the past 20 years while DRAM bandwidth has improved only 100X, creating a "memory wall." Numem says that its AI Memory Engine is a highly configurable memory subsystem IP that enables significant improvements in power efficiency, performance, intelligence, and endurance, not only for Numem's MRAM-based architecture but also for third-party MRAM, RRAM, PCRAM, and flash memory. Numem said that it has developed next-generation MRAM supporting die densities up to 1GB, which can deliver SRAM-class performance with up to 2.5X higher memory density in embedded applications and 100X lower standby power consumption. The company says that its solutions are foundry-ready and production-capable today.

Our report predicts that AI and other memory-intensive applications, including AI inference in embedded devices such as smart watches and hearing aids that already use MRAM, RRAM and other emerging memory technologies, will decrease the costs and increase the production of these memories. These memory technologies are already available from major semiconductor foundries. They scale to smaller lithographic dimensions than DRAM and SRAM, and because they are non-volatile, no refreshes are needed, so they consume less power. As a result, these memories allow more memory capacity and lower power consumption in space- and power-constrained environments. MRAM and RRAM are also being built into industrial, enterprise and data center applications. The figure below shows our projections for the replacement of traditional memories (SRAM, DRAM, NOR and NAND flash) by these emerging memories.
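The "memory wall" gap Numem cites can be made concrete with a short calculation. The sketch below takes only the two growth figures quoted above (60,000X compute vs. 100X DRAM bandwidth over 20 years) and derives the resulting gap and the implied annual growth rates; the annualized rates are an illustration, not figures from Numem:

```python
# The "memory wall": compute performance has grown far faster than
# DRAM bandwidth, per the 20-year figures quoted in the article.
compute_gain = 60_000   # processing performance gain over 20 years
dram_bw_gain = 100      # DRAM bandwidth gain over the same period

gap = compute_gain / dram_bw_gain
print(f"Compute has outpaced DRAM bandwidth by {gap:.0f}X")

# Equivalent compound annual growth rates over the 20-year span.
years = 20
compute_cagr = compute_gain ** (1 / years) - 1
dram_cagr = dram_bw_gain ** (1 / years) - 1
print(f"~{compute_cagr:.0%}/yr compute vs. ~{dram_cagr:.0%}/yr DRAM bandwidth")
```

The point of the exercise: even though both curves grow every year, a roughly 70%-per-year compute curve against a roughly 26%-per-year bandwidth curve compounds into a 600X gap over two decades, which is why memory, not compute, increasingly limits AI systems.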
NOR and SRAM, in particular, are projected to be replaced as embedded memories by these new memories within the next decade, as part of a future $100B memory market.

AI will generate increased demand for memory to support training and inference. It will also increase the demand for data over mobile networks. This will drive demand for HBM memory but also increase demand for new emerging memory technologies.