Live Science on MSN
MIT's chip stacking breakthrough could cut energy use in power-hungry AI processes
Data doesn’t have to travel as far or waste as much energy when the memory and logic components are closer together.
Nvidia has reportedly begun testing the limits of the global AI memory supply chain by signaling interest in 16-layer high-bandwidth memory for delivery as early ...
SK hynix said Tuesday it is unveiling its next generation of artificial intelligence memory chips — including a 16-layer HBM4 ...
Nvidia has been the biggest beneficiary of the artificial intelligence revolution so far, but another chip stock may be about ...
Samsung last month unveiled a SOCAMM2 LPDDR5-based memory module designed specifically for AI data center platforms.
Explosive demand from Nvidia and other AI chipmakers has soaked up global memory supply, pushing DRAM and HBM prices to ...
Additionally, SK hynix showcased SOCAMM2, a low-power memory module for AI servers, and a 321-layer 2-terabit QLC NAND flash product designed for ultra-high-capacity enterprise SSDs. (ANI) ...
This could be the turning point where we all stop going 'Eww' when we see a QLC SSD. For SSD ...
NAND flash technology is on a roll with advancements in cell structure and the subsequent boost in storage density. That allows this non-volatile-memory (NVM) chip to deliver faster throughput and ...