Samsung is developing a memory chip with an embedded AI processor

Samsung Electronics today announced that it has developed a high-bandwidth memory (HBM) chip with an embedded artificial intelligence (AI) processor, offering lower power consumption and improved performance. The new processing-in-memory (PIM) technology embeds AI computing capabilities directly into high-performance memory.

The chip, dubbed HBM-PIM, doubles the performance of AI systems and cuts power consumption by more than 70% compared with conventional HBM2, Samsung said in a statement. This should speed up large-scale processing in data centers, high-performance computing (HPC) systems, and AI-powered mobile applications, Samsung added.

HBM-PIM uses the same HBM interface as previous generations, so customers will not have to change their hardware or software to deploy the chip in existing systems.

New chip maximizes parallel processing

In a standard computing architecture, Samsung explained, the processor and memory are separate, and data is exchanged between the two. In such a setup, latency arises mainly when large volumes of data are moved. To overcome this, Samsung installs an AI engine inside each memory bank, maximizing parallel processing and improving performance.

"HBM-PIM brings processing power directly to where data is stored by placing a DRAM-optimized AI engine within each memory bank, a storage sub-unit, enabling parallel processing and minimizing data movement," the company said.

The chip is currently being tested in customer AI accelerators, with validation expected to be completed in the first half of the year. An AI accelerator is computer hardware specially designed to handle the demands of artificial intelligence workloads.

Samsung's paper on the chip will be presented at the virtual International Solid-State Circuits Conference (ISSCC) on February 22.

Via: Samsung
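To make the architectural idea concrete, here is a minimal sketch, not Samsung's actual design: a toy Python model contrasting a conventional setup, where every element crosses the memory bus to the processor, with a PIM-style setup, where each bank computes a partial result locally and only that result moves. The `Bank` layout, sizes, and the multiply-accumulate workload are all illustrative assumptions.

```python
# Toy model of processing-in-memory (illustrative only, not HBM-PIM's real design).
NUM_BANKS = 16
ELEMS_PER_BANK = 1024  # toy size; real memory banks are far larger

# Hypothetical data: each "bank" holds a slice of a vector; weights are shared.
banks = [[i % 7 for i in range(ELEMS_PER_BANK)] for _ in range(NUM_BANKS)]
weights = [3] * ELEMS_PER_BANK

def conventional_dot(banks, weights):
    """Move every element across the bus to the processor, then compute."""
    moved = 0
    total = 0
    for bank in banks:
        moved += len(bank)  # every element crosses the memory bus
        total += sum(x * w for x, w in zip(bank, weights))
    return total, moved

def pim_dot(banks, weights):
    """Each bank's local engine computes; only partial sums cross the bus."""
    moved = 0
    total = 0
    for bank in banks:
        partial = sum(x * w for x, w in zip(bank, weights))  # in-bank compute
        moved += 1  # only one partial result per bank crosses the bus
        total += partial
    return total, moved

result_a, moved_a = conventional_dot(banks, weights)
result_b, moved_b = pim_dot(banks, weights)
assert result_a == result_b  # same answer either way
print(f"elements moved: conventional={moved_a}, pim={moved_b}")
```

Both paths produce the same result, but the PIM-style path moves one value per bank instead of every element, which is the intuition behind the latency and power savings Samsung describes.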