Sundar Pichai and Tim Cook
Source: Reuters, Apple
On Monday, Apple released details about the artificial intelligence models that underpin Apple Intelligence. The paper reveals that the AI system was pre-trained on Google-designed processors, a sign that big tech companies are looking for alternatives to Nvidia when it comes to training cutting-edge AI.
Apple’s choice of Google’s homegrown TPU (tensor processing unit) for training was detailed in a technical paper the company just published. Separately, Apple released a preview version of Apple Intelligence for select devices on Monday.
Nvidia’s expensive graphics processing units (GPUs) dominate the market for high-end AI training chips, and demand over the past few years has been so high that it has been difficult to procure them in the quantities needed. Microsoft and Anthropic use Nvidia’s GPUs for their models, and other tech companies such as Google, Meta, Oracle and Tesla are also acquiring them to build out their AI systems and services.
Meta CEO Mark Zuckerberg and Alphabet CEO Sundar Pichai both made comments last week suggesting that their companies, and others in the industry, may be overinvesting in AI infrastructure, while acknowledging that the business risk of not investing would be too high.
“The downside of being late is that you’ll be left out of position on the most important technology for the next 10 to 15 years,” Zuckerberg said on a podcast with Bloomberg’s Emily Chang.
Apple doesn’t name Google or Nvidia in the 47-page paper, but it does say that its Apple Foundation Model (AFM) and AFM server were trained on “Cloud TPU clusters,” meaning Apple rented servers from a cloud provider to run the calculations.
“This system enables efficient and scalable training of AFM models, including on-device AFM, AFM servers, and larger scale models,” Apple said in the paper.
Representatives for Apple and Google did not respond to requests for comment.
Apple has revealed its AI plans later than many other companies, which became big proponents of generative AI shortly after OpenAI launched ChatGPT in late 2022. Apple Intelligence includes several new features, including a revamped look for Siri, improved natural language processing, and AI-generated summaries in text fields.
Over the next year, Apple plans to roll out features based on generative AI, including image generation, emoji generation and an improved Siri that can access a user’s personal information and take actions within apps.
In Monday’s paper, Apple said the on-device AFM was trained on a single “slice” of 2,048 TPU v5p chips working together. The v5p is Google’s most advanced TPU, first released in December. The AFM server was trained on 8,192 TPU v4 chips configured to work together as eight slices across a data center network.
Google’s latest TPUs cost under $2 per chip-hour when reserved for three years in advance, according to Google’s website. Google first introduced TPUs for its internal workloads in 2015 and made them available to the public in 2017. They are now among the most mature custom chips designed for artificial intelligence.
Still, Google remains one of Nvidia’s major customers: It uses Nvidia’s GPUs and its own TPUs to train its AI systems, and also sells access to Nvidia’s technology on its cloud.
Apple has previously said that inference — which means taking pre-trained AI models and running them to generate content or make predictions — would be performed in part on Apple’s own chips in its data centers.
This is Apple’s second technical paper on its AI systems, following a more general version published in June, in which the company said it was using TPUs to develop its AI models.
Apple is scheduled to report its quarterly earnings after the close of trading on Thursday.