Apple has unveiled Apple Intelligence, a suite of generative AI features for iPhone, iPad, and Mac that can rewrite email drafts, summarize notifications, and create custom emoji. During the WWDC presentation, company representatives detailed the benefits of these tools and devoted significant attention to promises of complete confidentiality when using them.
According to Apple, the high level of privacy rests on a dual approach. Apple Intelligence runs locally on the device, which lets it handle basic AI tasks quickly, while more complex queries, which may require transmitting personal data, are passed to cloud servers.
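To make the split concrete, here is a minimal Swift sketch of how such routing could look from an application's point of view. Everything in it, including the `AIRoute` and `AIRequest` types and the single-flag heuristic, is invented for illustration and is not Apple's actual API, which has not been described at this level of detail.

```swift
import Foundation

// Hypothetical illustration of the split described above; these types and
// the routing rule are invented for this sketch and are not Apple's API.

enum AIRoute {
    case onDevice      // handled by the compact local model
    case privateCloud  // forwarded to Apple's cloud tier for heavier work
}

struct AIRequest {
    let prompt: String
    let exceedsLocalCapability: Bool  // stand-in for the real heuristics
}

func route(_ request: AIRequest) -> AIRoute {
    request.exceedsLocalCapability ? .privateCloud : .onDevice
}

let rewrite = AIRequest(prompt: "Rewrite this email more politely.",
                        exceedsLocalCapability: false)
print(route(rewrite))  // onDevice: a simple rewrite stays on the phone
```

In practice the decision would rest on richer signals than a single flag, but the principle is the same: keep what the compact model can handle on the device and escalate the rest.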
A key feature is the use of Apple’s own AI models. Unlike competitors, the company does not train them on private user data or interactions; instead, it uses licensed materials and public information collected by the Applebot crawler. Authors can opt out of having their content indexed, much as they can with Google and OpenAI. The training data also excludes credit card numbers, Social Security numbers, and profanity.
One of the main advantages of Apple Intelligence is its deep integration into Apple’s operating systems and applications, along with models optimized for energy efficiency and size so that they can run on an iPhone. Processing requests locally alleviates many privacy concerns, although it means relying on more compact, more limited models.
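As an example of what “optimized for size” can mean in practice, the sketch below shows plain uniform 4-bit weight quantization, a generic member of the family of compression techniques used to fit language models into a phone’s memory. The 4-bit choice, the function names, and the scheme itself are illustrative assumptions, not Apple’s published recipe.

```swift
import Foundation

// Illustrative 4-bit uniform quantization of a weight array. Shrinking
// weights from 32-bit floats to a few bits per value is the general idea
// behind fitting a model into limited memory; Apple's exact scheme differs.

struct QuantizedWeights {
    let indices: [UInt8]   // 4-bit codes, stored one per byte for simplicity
    let scale: Float
    let minValue: Float
}

func quantize4bit(_ weights: [Float]) -> QuantizedWeights {
    let lo = weights.min() ?? 0
    let hi = weights.max() ?? 0
    let scale = (hi - lo) / 15  // 16 levels -> 4 bits per weight
    let codes = weights.map { w -> UInt8 in
        scale > 0 ? UInt8(((w - lo) / scale).rounded()) : 0
    }
    return QuantizedWeights(indices: codes, scale: scale, minValue: lo)
}

func dequantize(_ q: QuantizedWeights) -> [Float] {
    q.indices.map { Float($0) * q.scale + q.minValue }
}

let original: [Float] = [-0.12, 0.03, 0.27, -0.31, 0.08]
let packed = quantize4bit(original)
print(dequantize(packed))  // approximate reconstruction of the originals
```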
To get more out of the local models, Apple fine-tunes them for specific tasks such as spell checking or text summarization. These specialized skills are packaged as “adapters” that can be flexibly attached to the base model when a given task calls for them, extending its capabilities.
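Apple has not published its adapter implementation, but the description matches the general pattern of low-rank adapters (LoRA): a small task-specific update is added on top of frozen base weights. The toy sketch below illustrates that pattern; the `Matrix` and `Adapter` types and their shapes are assumptions made for this example, not Apple’s code.

```swift
import Foundation

// Toy LoRA-style adapter: the base weight matrix stays frozen and a small
// low-rank update (A · B) specialized for one task is added at load time.

struct Matrix {
    var rows: Int, cols: Int, data: [Float]
    subscript(r: Int, c: Int) -> Float {
        get { data[r * cols + c] }
        set { data[r * cols + c] = newValue }
    }
}

func matmul(_ a: Matrix, _ b: Matrix) -> Matrix {
    var out = Matrix(rows: a.rows, cols: b.cols,
                     data: [Float](repeating: 0, count: a.rows * b.cols))
    for i in 0..<a.rows {
        for k in 0..<a.cols {
            for j in 0..<b.cols {
                out[i, j] += a[i, k] * b[k, j]
            }
        }
    }
    return out
}

struct Adapter {
    let a: Matrix   // (rows × rank), with a small rank such as 8 or 16
    let b: Matrix   // (rank × cols)
}

// "Attaching" an adapter: effective weights = base + A·B.
// Assumes `base` has shape (a.rows × b.cols).
func attach(_ adapter: Adapter, to base: Matrix) -> Matrix {
    let delta = matmul(adapter.a, adapter.b)
    var merged = base
    for i in 0..<merged.data.count {
        merged.data[i] += delta.data[i]
    }
    return merged
}
```

Because only the small A and B matrices differ per task, many specializations can share a single base model, which is what makes the approach attractive on a memory-constrained device.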
To speed up AI operations, Apple applies techniques such as speculative decoding, selective context processing, and grouping of similar queries, and it leans on the Neural Engine built into Apple Silicon. Chipmakers have recently begun embedding such specialized neural processors in new system-on-chip designs, offloading machine learning and AI workloads from CPUs and GPUs. This is why Apple Intelligence runs only on devices with M-series chips and on the iPhone 15 Pro and Pro Max, whose A17 Pro chip carries a comparable Neural Engine.
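Of the listed techniques, speculative decoding is the easiest to illustrate: a cheap draft model proposes several tokens ahead, and the more expensive target model then checks the whole run, so most steps avoid a full generation pass by the big model. The greedy, exact-match version below is a simplification of the published technique and is not taken from Apple’s implementation.

```swift
import Foundation

// Simplified greedy speculative decoding: a cheap "draft" model guesses a few
// tokens ahead and the expensive "target" model checks the guesses, keeping
// the agreed prefix. Real implementations compare probability distributions
// and batch the verification; this toy uses exact greedy matches.

typealias Token = Int
typealias Model = (_ context: [Token]) -> Token   // predicts the next token

func speculativeDecode(prompt: [Token],
                       draft: Model,
                       target: Model,
                       draftLength: Int,   // assumed >= 1
                       maxNewTokens: Int) -> [Token] {
    var output = prompt
    while output.count - prompt.count < maxNewTokens {
        // 1. The small draft model speculates `draftLength` tokens cheaply.
        var guesses: [Token] = []
        for _ in 0..<draftLength {
            guesses.append(draft(output + guesses))
        }
        // 2. The large target model verifies the guesses in order; the first
        //    disagreement is replaced with the target's own token.
        var accepted: [Token] = []
        var correction: Token? = nil
        for (i, guess) in guesses.enumerated() {
            let verified = target(output + guesses.prefix(i))
            if verified == guess {
                accepted.append(guess)
            } else {
                correction = verified
                break
            }
        }
        output.append(contentsOf: accepted)
        if let fix = correction { output.append(fix) }
        // (Output may overshoot by a few tokens; trimming is omitted here.)
    }
    return output
}
```

Every accepted guess is a token the large model did not have to generate one step at a time, which is where the speed-up comes from.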
According to internal Apple research based on 750 evaluated text-summarization responses, users found the output of the company’s local model (with the appropriate adapter) more appealing than that of Microsoft’s Phi-3-mini. While this sounds like a significant achievement, modern chatbots typically rely on far more powerful cloud models to get better results. Apple is aiming for a balance between quality and privacy, handing complex queries off seamlessly to cloud servers that process the data confidentially.
Queries that need a more powerful AI model are forwarded to Apple’s Private Cloud Compute (PCC) servers. PCC runs its own operating system (based on iOS) and its own software stack for Apple Intelligence. According to the company, PCC servers include their own Secure Enclave for storing encryption keys that work only with the requesting device, and a dedicated monitor ensures that only verified code runs on them.
Before sending a request, the user’s device establishes an encrypted connection with the PCC cluster. Apple asserts that it cannot access data on PCC servers, as they lack remote management tools and command shells. PCC also has no persistent storage, so queries and any personal data from the Apple Intelligence semantic index are deleted after cloud processing.
Each PCC build will also be published as a virtual image that researchers can audit; only signed and registered builds that pass verification will be deployed to the production environment.
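Taken together, the last three paragraphs describe a client that refuses to talk to an unverifiable server, encrypts each request to one specific node, and assumes nothing is retained afterwards. The sketch below models only that description; every type, the build-verification check, and the callback signatures are invented for illustration and do not reflect Apple’s actual protocol.

```swift
import Foundation

// Hypothetical client-side view of a Private Cloud Compute request, modeled
// on the behavior described above: verify the node is running a signed,
// publicly auditable build, encrypt the request to that node, and treat the
// exchange as stateless. All types here are invented for illustration.

struct NodeAttestation {
    let buildMeasurement: String     // hash of the software image the node runs
    let publicKey: Data              // key the device encrypts the request to
}

struct VerifiedBuildList {
    let knownGoodBuilds: Set<String> // measurements of published, auditable builds
    func contains(_ measurement: String) -> Bool {
        knownGoodBuilds.contains(measurement)
    }
}

enum PCCError: Error { case unverifiedBuild }

func sendToPrivateCloud(request: Data,
                        attestation: NodeAttestation,
                        verified: VerifiedBuildList,
                        encrypt: (Data, Data) -> Data,
                        transmit: (Data) -> Data) throws -> Data {
    // 1. Refuse to talk to a node whose build is not publicly verifiable.
    guard verified.contains(attestation.buildMeasurement) else {
        throw PCCError.unverifiedBuild
    }
    // 2. Encrypt the request so only this specific node can read it.
    let sealed = encrypt(request, attestation.publicKey)
    // 3. Send and return the response; the device assumes the node keeps
    //    no copy of the query or the personal data it carried.
    return transmit(sealed)
}
```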
Another way Apple addresses privacy concerns is by shifting part of the responsibility to third parties. The updated Siri can redirect some queries to OpenAI’s ChatGPT cloud, but only with the user’s permission and only when a genuinely challenging question comes up. According to Apple CEO Tim Cook in an interview with Marques Brownlee, ChatGPT will be brought in to answer general-knowledge queries that go beyond the user’s personal context.
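In effect this is a consent gate in front of a third-party service. The sketch below shows that control flow in the abstract; the function names and the per-request prompt are hypothetical and are not Apple’s or OpenAI’s API.

```swift
import Foundation

// Hypothetical consent gate: a query is forwarded to an external model only
// after the user explicitly agrees; otherwise it stays with the first-party
// assistant. All names here are illustrative.

enum AssistantAnswer {
    case local(String)       // answered without leaving Apple's stack
    case thirdParty(String)  // answered by an external service such as ChatGPT
    case declined            // user refused to share the query
}

func answer(query: String,
            isBeyondLocalModels: Bool,
            askUserPermission: (String) -> Bool,
            askChatGPT: (String) -> String,
            askSiri: (String) -> String) -> AssistantAnswer {
    guard isBeyondLocalModels else {
        return .local(askSiri(query))
    }
    // The handoff happens per request and only with explicit user approval.
    guard askUserPermission("Send this question to ChatGPT?") else {
        return .declined
    }
    return .thirdParty(askChatGPT(query))
}
```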
Apple isn’t the first company to combine local and cloud processing in its AI tools. Google pairs the local Gemini Nano model on Android devices with its cloud-based Gemini Pro and Flash models. Microsoft likewise uses local processing on Copilot Plus PCs while leveraging OpenAI’s resources and developing its own MAI-1 model. However, none of Apple’s competitors have made such a strong commitment to keeping user data confidential.
Of course, all of this looks impressive in prepared demonstrations and official documents. But for researchers, the most crucial task now is to verify the effectiveness of Apple Intelligence in practice when it becomes available later this year.