Samsung and SK Also Eye AI… a ‘Game Changer’ to Reverse the Semiconductor Slump


By Han Ji-yeon, Money Today | 2023.02.12 08:20

[MT Report: The Era of Generative AI, Where Is Korea Headed?] 2-②


HBM-PIM, an AI memory developed by Samsung Electronics / Photo courtesy of Samsung Electronics

As generative AI (artificial intelligence) led by ChatGPT creates a sensation, the semiconductor industry is also taking notice. Semiconductors are widely seen as one of the biggest beneficiaries of the expanding AI industry: advancing artificial intelligence requires high-performance, high-capacity memory chips.

Currently, the chips most widely used in the global AI market are GP-GPUs (general-purpose GPUs), which strip out the graphics functions of a GPU (graphics processing unit) and repurpose its computational capability for deep learning. The GPU acts as the ‘head’ of an AI system, handling learning and inference, while the memory mounted alongside it stores and feeds the data being processed. Expanding the AI ecosystem therefore requires cooperation among software, server, and semiconductor companies.

High-performance memory is required to store and process enormous amounts of training data

According to the semiconductor industry on the 10th, leading memory semiconductor companies such as Samsung Electronics and SK Hynix are developing next-generation memory chips for AI as future growth engines. “We expect AI services to have a positive impact on future memory demand,” said Kim Jae-joon, vice president of Samsung Electronics’ memory division. Park Myung-soo, head of DRAM marketing at SK Hynix, added, “Commercialization of artificial intelligence can be a mid- to long-term growth engine for the memory semiconductor market.”

Processing huge amounts of data demands high-efficiency memory. Two representative examples are HBM (high-bandwidth memory), built by stacking multiple DRAM dies vertically in a 3D structure, and PIM (processing-in-memory), which builds computation capability into the memory itself for superior speed and performance.

In 2021, Samsung Electronics developed ‘HBM-PIM’, a high-performance memory with built-in computation. By taking on part of the learning and inference work that the GPU used to handle alone, the memory responsible for storing and processing data makes it easier to handle enormous volumes. AMD, which splits the GPU market with Nvidia, uses Samsung Electronics’ HBM-PIM. CXL (Compute Express Link), an interconnect standard that expands DRAM capacity, is another next-generation memory solution championed by Samsung Electronics.

In December of last year, Samsung Electronics also signed a memorandum of understanding (MOU) with Naver to develop AI semiconductor solutions. Naver has announced that it will release a Korean-language AI to compete with ChatGPT in the first half of this year.

SK Hynix supplies HBM3, an ultra-high-performance memory, to Nvidia, the number-one player in the GPU market. HBM3 is fast enough to transmit 163 full HD movies in a single second. SK Hynix has also developed GDDR6-AiM, which applies PIM to the GDDR6 next-generation DRAM standard: it computes up to 16 times faster than conventional DRAM while consuming 80% less energy.
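The “163 full HD movies in one second” figure lines up with the 819 GB/s per-stack bandwidth SK Hynix has published for HBM3, if one full HD movie is taken as roughly 5 GB. Both figures are assumptions not stated in the article, so this is only a back-of-the-envelope check:

```python
# Back-of-the-envelope check on the "163 full HD movies per second" claim.
# Assumptions (not in the article): HBM3 bandwidth of 819 GB/s per stack
# (SK Hynix's published figure) and roughly 5 GB per full HD movie.
hbm3_bandwidth_gb_s = 819
movie_size_gb = 5

movies_per_second = hbm3_bandwidth_gb_s / movie_size_gb
print(f"~{movies_per_second:.0f} full HD movies per second")
```

The result comes out near 164, matching the article's round figure of 163 under these assumptions.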

HBM3 developed by SK Hynix / Photo courtesy of SK Hynix

Data center expansion boosts memory demand

The growing need for servers to support AI systems, that is, data centers, is also expected to benefit the memory semiconductor industry. As AI learns not only text but also multi-modal information such as images and video, server and storage capacity must expand. Microsoft (MS) recently pledged a further investment in OpenAI, the developer of ChatGPT, and announced plans to integrate large-scale AI into its Azure cloud service.

Server DRAM, which pairs with the CPU (central processing unit) in data centers, is a high value-added product, accounting for roughly 40% of total sales at each of Samsung Electronics and SK Hynix. This year in particular, with the growth of the AI industry coinciding with the data center CPU replacement cycle, server DRAM demand could rise further. Data centers typically adopt new CPUs as they are released and swap in DRAM optimized for them. With Intel’s release of its new Sapphire Rapids CPU earlier this year, adoption of DDR5, the next-generation server DRAM standard, is expected to grow. DDR5 offers twice the data transfer speed of existing DDR4 with higher power efficiency. It is also priced 30-50% higher, which should boost the memory industry’s profitability.
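To see what the cited 30-50% premium means per module, here is a toy calculation against a hypothetical DDR4 price; the $100 base price is an illustrative assumption, not a market figure:

```python
# Illustrative DDR5 pricing using the article's 30-50% premium range.
# The $100 DDR4 base price is a hypothetical assumption for illustration.
ddr4_price_usd = 100.0
premium_low, premium_high = 0.30, 0.50

ddr5_low = ddr4_price_usd * (1 + premium_low)    # about $130
ddr5_high = ddr4_price_usd * (1 + premium_high)  # about $150
print(f"DDR5 module price range: ${ddr5_low:.0f}-${ddr5_high:.0f}")
```

Whatever the base price, the premium translates directly into higher revenue per module sold, which is why the article links DDR5 adoption to improved profitability.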

Gartner, a market research firm, forecasts that the AI semiconductor market will reach $55.3 billion this year and $86.1 billion by 2026.
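Treating “this year” as 2023 (an assumption based on the article’s dateline), Gartner’s two data points imply a compound annual growth rate of roughly 16%:

```python
# Implied CAGR from Gartner's AI semiconductor forecast.
# Assumption: "this year" = 2023, so the forecast spans three years.
market_2023_bn = 55.3  # USD billions, per the article
market_2026_bn = 86.1  # USD billions, per the article
years = 2026 - 2023

cagr = (market_2026_bn / market_2023_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # → Implied CAGR: 15.9%
```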

An official from the domestic semiconductor industry said, “The key to an AI system is processing large amounts of data very quickly. High-performance memory semiconductors that expand system capacity are essential, and the AI ecosystem can only be completed when not just software companies but also server and semiconductor companies cooperate.”

[Copyright @Money Today. Unauthorized reproduction and redistribution prohibited.]
