H20 chip uses Samsung HBM3: Is Nvidia laying the foundation for a domestic "price war"?

2024-07-27

Over the past few months, Samsung's effort to supply HBM3 memory for Nvidia's AI chips has been a hot topic in the industry. Recently, it was reported that Samsung's HBM3 has been approved by Nvidia and may be supplied for Nvidia's H20 chip as early as August. This would be the first time Samsung has supplied HBM memory to Nvidia, breaking SK Hynix's position as the sole HBM supplier for Nvidia's AI chips.


Although neither side has officially announced that the tests have been passed, an H20 chip using Samsung HBM3 should pose no product risk in the Chinese market, and may further enhance Nvidia's competitiveness through cost optimization, a potential "price war," and other measures, thereby stirring up the domestic AI chip market. The cooperation will also help offset some of the losses Nvidia has suffered from sanctions and competition, while promoting supply chain diversification.

As the AI industry landscape and development trends have shifted, sales of Nvidia's H20 chip in the domestic market have fluctuated: from being shunned to being snapped up by corporate customers, with the chip now expected to generate $12 billion in revenue in China this year. Behind this are changes in the H20's cost structure and Nvidia's alignment with industry trends, though challenges such as international restrictions remain.

Laying the foundation for a price war

Samsung's HBM3 spent several months "stuck" in Nvidia's qualification tests before gaining entry to its AI chip supply chain.

As early as February this year, Samsung announced that it had developed the industry's first 36GB 12-layer HBM3E chip and planned to mass-produce it in the second half of the year. However, industry sources pointed out that, due to heat and power consumption issues, Samsung's HBM3 had repeatedly failed Nvidia's tests. Only recently was Samsung's HBM3 reportedly approved for use in Nvidia's H20 chip, which is sold exclusively in the Chinese market.

In this regard, Dr. Zhang Jun, chairman of China Europe Capital and chairman of the Intelligent Hardware Association, who has long focused on the AI chip field, said the report suggests, to some extent, that Samsung HBM3 has passed Nvidia's relevant tests: "Although neither party has officially announced the result, the performance of an H20 using Samsung HBM3 should be assured, and there will be no product risk in the huge Chinese market. Strategically, Nvidia cares more about cost-effectiveness and supply chain diversification than about tying itself to a single supplier."

SK Hynix has been the exclusive supplier of HBM3 for Nvidia's AI chips since 2022. In HBM packaging, SK Hynix uses mass reflow molded underfill (MR-MUF) technology, a packaging process that injects liquid protective material into the spaces between stacked semiconductor dies and then solidifies it.

Samsung uses thermal compression non-conductive film (TC-NCF) technology on HBM, placing a non-conductive adhesive film between each layer each time the chips are stacked. The film is a polymer material used to isolate the chips from each other and protect the connection points from impact. Samsung insists that NCF technology is the "best solution" for HBM, but it has been repeatedly criticized by the industry for issues such as yield.

With Samsung vigorously pushing to improve HBM3 yields, a research report estimates that yield should now be able to reach about 55%; at the same time, to forestall customer complaints, every step in the middle of the process is now tested. This still trails SK Hynix's 70%-80% yield.


As for whether Nvidia will apply Samsung HBM3 to other AI chips, a representative of a leading domestic computing company said: "Since the H20 chip has made compromises in many areas, including computing power, HBM capacity and bandwidth, and inter-card interconnect capability, whether Nvidia will use Samsung HBM3 in other AI chips mainly depends on whether similar demand exists, but it will not be used in their latest flagship products."

In the future, this cooperation between Nvidia and Samsung will help mitigate some of the losses caused by sanctions and competition. Industry analysts said that in order to maintain a foothold in key markets, Nvidia's move to approve Samsung's HBM technology for the Chinese market is a key step in navigating a complex and competitive landscape. This is not only a regulatory compliance measure, but also a strategic response to fierce competition from Chinese companies.

Regarding whether adopting Samsung HBM3 will enhance the H20 chip's competitiveness in the Chinese market, the above-mentioned computing company representative believes that the H20 was designed exclusively for the domestic market and that its many performance compromises will not change because of HBM3. Dr. Zhang Jun, however, said that given current trends, Nvidia needs better results in the Chinese market: adopting Samsung HBM3 in the H20 can further optimize costs and provide a better basis for a price war, which will intensify competition in the domestic AI chip market.

In addition, after being adopted by the H20 chip, Samsung HBM3 may further challenge SK Hynix's dominance.

Bank of America pointed out that Samsung's HBM3 sales may not increase significantly in the short term, and sales are expected to reach US$500 million and US$2.4 billion in 2024 and 2025, respectively, accounting for 10% and 34% of HBM's total sales, with little impact on SK Hynix's earnings in the same period. If Samsung's HBM4 certification is successful, it will have a significant negative impact on Hynix's earnings in 2026, but the current forecast success rate is low.
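The Bank of America figures above can be sanity-checked with simple arithmetic. The sketch below uses only the sales and share numbers cited in the article; the implied total-market values are derived here for illustration, not BofA's own published estimates.

```python
# Samsung HBM sales estimates and share of total HBM sales, as cited
# from Bank of America in the article above.
samsung_hbm_sales = {2024: 0.5e9, 2025: 2.4e9}   # USD
samsung_hbm_share = {2024: 0.10, 2025: 0.34}     # fraction of total HBM sales

for year in (2024, 2025):
    # Implied size of the total HBM market: sales divided by market share.
    implied_total = samsung_hbm_sales[year] / samsung_hbm_share[year]
    print(f"{year}: implied total HBM market ~ ${implied_total / 1e9:.1f}B")
```

The implied totals (roughly $5.0B for 2024 and $7.1B for 2025) are consistent with the HBM market growing sharply year over year.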

Hitting the trend of industrial development

As the AI industry landscape and development trends change, Nvidia's H20 chip sales in the domestic market have fluctuated.

In January 2024, due to the significant reduction in computing power, Nvidia's H20 chip was exposed to be unpopular in China, and some large Internet companies and cloud vendors reduced their orders for Nvidia's AI chips. However, after the price of the H20 chip was reduced in May, the market gradually changed.

In early July, industry research firm SemiAnalysis predicted that Nvidia would deliver more than 1 million H20 chips in 2024. At a per-chip price of $12,000 to $13,000, that would contribute over $12 billion in revenue. Nvidia's China revenue last fiscal year was $10.3 billion, meaning H20 chips alone are expected to exceed last year's total.
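The projection above is straightforward to verify. A minimal sketch, using only the unit, price, and prior-year revenue figures given in the article:

```python
# SemiAnalysis projection cited above: >1 million H20 units in 2024
# at $12,000-$13,000 per chip, versus Nvidia's ~$10.3B China revenue
# last fiscal year.
units = 1_000_000
price_low, price_high = 12_000, 13_000
last_year_china_revenue = 10.3e9

rev_low = units * price_low
rev_high = units * price_high
print(f"Projected H20 revenue: ${rev_low / 1e9:.1f}B - ${rev_high / 1e9:.1f}B")
print("Exceeds last year's China revenue:", rev_low > last_year_china_revenue)
```

Even at the low end of the price range, the projected H20 revenue ($12B) exceeds last fiscal year's $10.3B China total, which is the basis for the article's claim.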

Dylan Patel, chief analyst at SemiAnalysis, believes that although the H20 chip is weaker than Chinese manufacturers' domestic chips on paper, it still has certain advantages in actual use due to its advantages in HBM memory.


Data shows that the NVIDIA H20 chip has 96GB of memory, which is not only higher than the previous generation A800 (80GB) and H800 (40GB) which are exclusively for the Chinese market, but also significantly higher than the 64GB memory of major domestic AI chip competitors.

Since Samsung's HBM profit margin is lower than SK Hynix's, switching the H20 chip to Samsung HBM3 memory will help reduce costs. Dr. Zhang Jun said that amid US-China technological decoupling, Nvidia's domestic customers are price sensitive, and China remains one of its largest revenue sources. Nvidia may further use the cut-down H20 chip in China to test the waters for a price war, supplemented by CUDA's ecosystem advantages, to build strong competitiveness, squeezing competitors' room to maneuver and forcing domestic companies to respond.

On the other hand, when applied to large models, the H20's configuration is better suited to inference than to model training, which also matches the current development trend of the domestic large-model industry.

An industry insider pointed out: "Over the past few months, the domestic large-model market has undergone many changes. Demand for computing power is shifting from large-scale model training to inference, enterprises' demand for inference in private deployments is growing, and as the capabilities of open-source models continue to improve, companies are increasingly inclined to build small models for specific tasks." Weakening demand for large-scale training compute and growing demand for small clusters have warmed up the inference market, which should help boost sales of Nvidia's H20 chip.

However, not everyone in the industry is optimistic about the H20's sales prospects. The above-mentioned computing company representative believes that while the H20 is one of the few Nvidia chips that can legally be sold in China since the US chip export restrictions were introduced, performance and other limitations make $12 billion in China sales this year unrealistic; the figure is best confirmed against Nvidia's financial reports.

"Although the H20 chip still has certain advantages in versatility and stability, the reduction in computing power has left part of the market to domestic chips. This is a good time to develop domestic AI chips," he said.

In addition, the H20 chip still faces some international geopolitical uncertainties. Recently, a report provided to clients by Jefferies Securities showed that the US government is considering new trade restrictions, which may prevent Nvidia from selling its H20 AI chip specifically launched for the Chinese market. If the restriction is officially implemented, Nvidia may lose about $12 billion in revenue.