As the high bandwidth memory (HBM) market continues to grow, projected to reach $33 billion by 2027, the competition between Samsung and SK Hynix is stepping up.
Tesla is fanning the flames, as it is said to have reached out to both Samsung and SK Hynix, two of South Korea’s biggest memory chip makers, seeking samples of their next-generation HBM4 chips.
Now, a report from the Korean Economic Daily says Tesla plans to evaluate these samples for potential integration into its custom Dojo supercomputer, a critical system designed to power the company’s AI ambitions, including its self-driving vehicle technology.
The Dojo supercomputer, powered by Tesla’s patented D1 AI chip, helps train the neural networks needed for its Full Self-Driving (FSD) function. This latest request suggests that Tesla is preparing to replace the older HBM2e chips with the more advanced HBM4, which offers significant improvements in speed, energy efficiency and overall performance. The company is also expected to incorporate HBM4 chips into its AI data centers and future self-driving cars.
Samsung and SK Hynix, longtime rivals in the memory chip market, are both preparing HBM4 chip prototypes for Tesla. The two companies are also aggressively developing custom HBM4 solutions for major US technology companies such as Microsoft and Google.
According to industry sources, SK Hynix remains the current leader in the HBM market, supplying HBM3e chips to NVIDIA and holding a significant share of the market. However, Samsung is quickly closing the gap, forming partnerships with companies such as Taiwan Semiconductor Manufacturing Company (TSMC) to produce key components for its HBM4 chips.
SK Hynix seems to have made progress with its HBM4 chip. The company claims that its solution provides 1.4 times the bandwidth of HBM3e while consuming 30% less power. With a bandwidth expected to exceed 1.65 terabytes per second (TB/s) and reduced energy consumption, the HBM4 chips offer the performance and efficiency needed to train massive AI models with Tesla’s Dojo supercomputer.
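As a rough sanity check on those figures (assuming the 1.4x multiplier and the 1.65 TB/s target both refer to per-stack bandwidth, which the report does not state explicitly), the implied HBM3e baseline and the combined efficiency gain can be worked out in a few lines:

```python
# Back-of-the-envelope check of the reported HBM4 figures.
# Assumption: the 1.4x bandwidth gain and the 1.65 TB/s figure both
# describe per-stack bandwidth; the report does not say so explicitly.

hbm4_bandwidth_tbps = 1.65   # reported HBM4 target, TB/s
bandwidth_gain = 1.4         # claimed improvement over HBM3e
power_reduction = 0.30       # claimed power saving vs. HBM3e

implied_hbm3e_tbps = hbm4_bandwidth_tbps / bandwidth_gain
print(f"Implied HBM3e baseline: {implied_hbm3e_tbps:.2f} TB/s")
# ~1.18 TB/s, consistent with current HBM3e stacks (~1.2 TB/s)

# If both claims hold, bandwidth per watt improves by 1.4 / (1 - 0.30) = 2x
efficiency_gain = bandwidth_gain / (1 - power_reduction)
print(f"Implied bandwidth-per-watt gain: {efficiency_gain:.1f}x")
```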
The new HBM4 chips are also expected to feature a logic die at the base of the chip stack, which functions as the control unit for the memory stack. This logic die design enables faster data processing and better energy efficiency, making HBM4 an ideal fit for Tesla’s AI-driven applications.
Both companies are expected to accelerate their HBM4 development timelines, with SK Hynix aiming to deliver the chips to customers by the end of 2025. Samsung, on the other hand, is pushing ahead with production plans built on its advanced 4-nanometer (nm) foundry process, which could help it secure a competitive advantage in the global HBM market.
Via TrendForce