
Apple (AAPL.US) uses Google (GOOG.US, GOOGL.US) TPUs to train AI models as big tech looks for alternatives to Nvidia (NVDA.US)

2024-07-30


Zhitong Finance APP learned that on Monday, Apple (AAPL.US) said the artificial intelligence models behind its Apple Intelligence system were pre-trained on processors designed by Google (GOOG.US, GOOGL.US), a sign that large technology companies are looking for alternatives to Nvidia (NVDA.US) when training cutting-edge AI.

Apple detailed its choice of Google's in-house Tensor Processing Units (TPUs) for training in a newly released technical report, and on Monday it also rolled out a preview of Apple Intelligence to some devices.

Nvidia's expensive graphics processing units (GPUs) dominate the market for high-end AI training chips, and demand has been so strong over the past few years that the chips have been hard to procure in the quantities needed. OpenAI, Microsoft (MSFT.US) and Anthropic all use Nvidia's GPUs to train their models, while other technology companies, including Google, Meta (META.US), Oracle (ORCL.US) and Tesla (TSLA.US), are also scrambling to buy these GPUs to build their AI systems and products.

Meta CEO Mark Zuckerberg and Alphabet CEO Sundar Pichai both said last week that their companies, and others in the industry, may be overinvesting in AI infrastructure, but also acknowledged that the business risk of not doing so is too high.

Apple did not mention Google or Nvidia by name in its 47-page report, but noted that its Apple Foundation Model (AFM) and AFM server were trained on "cloud TPU clusters," meaning Apple rented capacity from a cloud provider to perform the computation. "This system enables us to train AFM models efficiently and scalably, including AFM-on-device, AFM-server, and larger models," Apple said in the report.

Compared with peers that quickly embraced generative AI after OpenAI launched ChatGPT in late 2022, Apple was late to reveal its AI plans. On Monday, Apple launched Apple Intelligence, a system that includes several new features such as a refreshed look for Siri, better natural language processing, and AI-generated summaries in text fields.

In the coming year, Apple plans to launch features based on generative AI, including image generation, emoji generation, and an enhanced version of Siri that can access users' personal information and perform actions in apps.

In Monday's report, Apple said the on-device AFM was trained on a single "slice" of 2,048 TPU v5p chips, Google's most advanced TPU, first introduced in December. The server AFM was trained on 8,192 TPU v4 chips, configured as eight slices working together over a data center network.
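Apple's report does not describe its training code, and the specifics of its setup are not public. For readers unfamiliar with how a TPU slice is programmed, the minimal sketch below shows how data-parallel training across all chips in a slice is typically expressed in JAX, the framework commonly used on Google's TPUs; the model, loss, batch sizes, and sharding layout here are assumptions chosen purely for illustration, not Apple's method.

```python
# Illustrative sketch: data-parallel training over every chip in a TPU slice with JAX.
# The toy model, loss, and sharding layout are assumptions for demonstration only.
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# All chips visible to the process; on a Cloud TPU slice this spans the whole slice.
devices = mesh_utils.create_device_mesh((jax.device_count(),))
mesh = Mesh(devices, axis_names=("data",))

# Toy linear model and squared-error loss, standing in for a real foundation model.
def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit
def train_step(params, x, y, lr=1e-3):
    grads = jax.grad(loss_fn)(params, x, y)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

# Shard the batch along the mesh's "data" axis; replicate the parameters on every chip.
batch_sharding = NamedSharding(mesh, P("data", None))
replicated = NamedSharding(mesh, P())

key = jax.random.PRNGKey(0)
n = jax.device_count() * 8          # a few examples per chip
x = jax.device_put(jax.random.normal(key, (n, 128)), batch_sharding)
y = jax.device_put(jax.random.normal(key, (n, 1)), batch_sharding)
params = jax.device_put({"w": jnp.zeros((128, 1)), "b": jnp.zeros((1,))}, replicated)

params = train_step(params, x, y)   # XLA partitions the step across all chips in the mesh
```

On a real slice, the XLA compiler inserts the inter-chip communication over the slice's interconnect, which is what allows thousands of TPU chips to be driven as a single logical device mesh.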

The latest TPUs cost less than $2 per hour to use when booked for three years in advance, according to Google's website. Google first introduced TPUs for internal workloads in 2015 and made them available to the public in 2017. They are now among the most mature custom chips designed for artificial intelligence.
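For a rough sense of scale, at that list rate a 2,048-chip v5p slice like the one described in Apple's report would come to no more than about 2,048 × $2 ≈ $4,100 per hour of training time; this back-of-the-envelope figure is illustrative only and ignores networking, storage, and any further discounts.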

Still, Google remains one of Nvidia's major customers: it uses Nvidia's GPUs as well as its own TPUs to train AI systems, and it also sells access to Nvidia's technology on its cloud.

Apple has previously said that the inference process - using pre-trained AI models to generate content or make predictions - will be performed in part on its own chips in its data centers.

This is Apple's second technical paper on its AI systems, following a more general version released in June, when Apple said it used TPUs in developing its AI models.

Apple is scheduled to report quarterly results after trading closes on Thursday.