After making the world wait, Apple has finally uttered the word ‘AI’, though not without giving it its own spin, in the form of ‘Apple Intelligence’.
Apple Intelligence is deeply integrated with the new Siri, allowing users to speak more naturally to Siri, thanks to its enhanced language understanding capabilities. Siri is also getting a new look, featuring an elegant glowing light that wraps around the edge of the screen when it is active.
Apple claimed that Siri users make 1.5 billion voice requests every single day. The Cupertino-based tech giant showed the world that it doesn’t need OpenAI’s help to improve Siri, and it’s pretty clear that the new Siri is not just a wrapper around ChatGPT.
Apple has developed a 3 billion parameter on-device language model and a larger server-based model accessible via Private Cloud Compute on Apple silicon servers. These models are trained using Apple’s AXLearn framework, an open-source project released in 2023, built upon JAX and XLA.
Apple Doesn’t Really Need OpenAI
Apart from building the in-house Apple Intelligence, the iPhone maker has partnered with OpenAI to integrate ChatGPT, powered by GPT-4o, into iOS.
Apple says Siri will now be able to tap into ChatGPT’s ‘expertise’ when needed. For example, if you need meal ideas using ingredients from your garden, you can ask Siri. Upon receiving your permission, Siri will send the prompt to ChatGPT and provide you with suggestions.
It’s worth noting that ChatGPT isn’t integrated directly into Siri; rather, Siri gains access to it when needed. Apple has not revealed the specifics of the deal or the amount it is paying OpenAI.
Meanwhile, Tesla chief Elon Musk is unhappy with the Apple-OpenAI partnership. He said, “It’s patently absurd that Apple isn’t smart enough to make their own AI, yet is somehow capable of ensuring that OpenAI will protect your security & privacy!”
However, Musk’s claim does not hold up.
“Why is Elon Musk lying about how Apple’s ChatGPT integration works? He’s threatening to put iPhones in Faraday cages and claiming Apple isn’t smart enough to make its own LLMs,” said AI expert Vin Vashishta.
“He knows Apple uses its own on-device LLMs. He heard Apple say that ChatGPT is only recommended when on-device LLMs can’t handle the request, and data is only sent if users choose that option,” he added.
Moreover, there is no clarity on whether OpenAI will use users’ data to train its models. OpenAI said that users can choose to connect their ChatGPT account, which means their data preferences will apply under ChatGPT’s policies.
OpenAI’s voice feature for GPT-4o has not yet been released. It would be interesting to see whether Apple integrates that as well in the future. However, the company did drop a hint about multiple possible future partnerships.
There is a possibility that Apple might also partner with Google, especially after Google recently introduced ‘Project Astra’ at Google I/O 2024.
Astra is a universal AI agent built on Google’s Gemini models, designed to process multimodal information, such as text, images, video, and audio, to understand context and respond in real time. Word has it that Apple may also partner with Anthropic.
Siri Can Take Actions
The highlight of the new Siri is its ability to take actions. With Apple Intelligence, Siri gains on-screen awareness, enabling it to understand and act on what’s displayed. For instance, if a friend texts you their new address, you can simply say, ‘Add this address to their contact card’, and Siri will do it right from the Messages thread.
Moreover, Siri can now take actions within apps on your behalf. It can perform hundreds of new tasks in and across apps, including utilising Apple’s new writing and image generation capabilities. For example, you could say, ‘Show me my photos of Stacey in New York wearing her pink coat,’ and Siri will display them instantly.
The feature is quite similar to Rabbit’s LAM and what Humane’s Ai Pin attempted to offer. However, it’s now clear that one doesn’t need a new device to access LLMs, and smartphones aren’t going anywhere anytime soon.
Recent reports suggest that Humane is actively seeking a potential buyer for its AI Pin business after it faced widespread criticism for failing to meet expectations.
Moreover, thanks to Apple Intelligence, Siri now understands your personal context. With a semantic index of your photos, calendar events, and files, as well as information from messages and emails—such as hotel bookings, concert tickets, and shared links—Siri can now find and comprehend things it couldn’t before.
For example, when you’re filling out a form, you can simply ask Siri to locate a personal document, such as your driver’s license, from your photos and auto-fill the details into the form.
Apple Intelligence enables Siri to perform hundreds of new actions within Apple and third-party apps. For instance, a user might say, “Retrieve the article about cicadas from my Reading List”, or “Share the photos from Saturday’s barbecue with Malia”, and Siri will handle the tasks effortlessly.
This is made possible through significant enhancements Apple is making to App Intents, a framework that lets apps define a set of actions for Siri, Shortcuts, and other system experiences.
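To give a sense of how this works on the developer side, here is a minimal Swift sketch, using the publicly documented App Intents framework, of how an app might expose a photo-sharing action like the barbecue example above. The intent, parameter, and dialog text are hypothetical illustrations, not Apple’s own code.

```swift
import AppIntents

// Hypothetical intent: type and parameter names are illustrative,
// chosen to mirror the "Share the photos from Saturday's barbecue
// with Malia" example in the article.
struct SharePhotosIntent: AppIntent {
    static var title: LocalizedStringResource = "Share Event Photos"
    static var description = IntentDescription("Shares photos from a named event with a contact.")

    @Parameter(title: "Event")
    var event: String

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its own photo store here and hand the
        // results off for sharing; this sketch only returns the dialog
        // Siri would speak on completion.
        return .result(dialog: "Shared photos from \(event) with \(recipient).")
    }
}
```

Once an app declares actions this way, they become visible to Siri and Shortcuts, which is presumably the mechanism that lets Apple Intelligence invoke and chain tasks in and across apps on the user’s behalf.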