I’ve known Siri, Apple’s voice assistant, for almost a dozen years now, and I still can’t remember a single meaningful conversation we’ve had. Conversely, ChatGPT and I have known each other for six months, and yet we’ve talked about everything from the meaning of life to planning a romantic dinner for two and even collaborated on programming and film projects. I mean, we have a relationship.
Siri’s limits mean that it still can’t hold a conversation or engage in a long, project-oriented back-and-forth. For better or worse, the Siri we use today on our iPhones, iPads, MacBooks, Apple Watches, and Apple TVs isn’t all that different from the one we first encountered in 2011 on an iPhone 4s.
Six years ago, I wrote about Siri’s first brain transplant, the moment Apple started using machine learning to train Siri and improve her ability to respond to conversational queries. The introduction of machine learning and, soon after, an embedded neural network in the form of Apple’s A11 Bionic chip in the iPhone 8 marked what I saw as a turning point for what was, without a doubt, the first consumer-grade digital assistant.
This pairing of software and silicon helped Siri understand a question and its context, allowing her to move beyond canned responses to intelligent answers to more naturally phrased questions.
Early Siri was no Her
Not being able to fully converse with Siri didn’t seem like a big deal, even though we’d already seen the movie Her and understood what we could expect from our chatbots.
It wasn’t, however, until OpenAI’s GPT-3 and ChatGPT brought the distant future into the present that Siri’s deficits came into stark relief.
Despite Apple’s best efforts, Siri has been stuck in learning mode, perhaps because it is still built primarily on machine learning and not generative AI. It’s the difference between learning and creating.
All of the generative AI chatbots and image tools we use today create something new from our prompts, whether text, art, or images. They are not answer bots; they are builder bots.
I doubt any of this is lost on Apple. The question is, what will and can Apple do about it? I suspect we won’t have to wait longer than its upcoming Worldwide Developers Conference (WWDC 2023) to find out. We’re all fixated on the potential $3,000 mixed-reality headset Apple could show off in June, but the company’s biggest announcements will surely revolve around AI.
“Apple must be under incredible pressure now that Google and Microsoft have released their natural language solutions,” Patrick Moorhead, managing director and analyst at Moor Insights, told me via Twitter DM.
A chattier Siri
As reported by 9to5Mac, Apple may finally be working on its own language generation update for Siri (Bobcat). Note that this is not the same as generative AI; I take it to mean that Siri will get a little better at casual banter. I don’t expect much more than that, either.
Unfortunately, Apple’s own ethos may prevent it from ever catching up to GPT-3, let alone GPT-4. Industry watchers aren’t exactly expecting a breakthrough moment.
“I think what they do with AI won’t necessarily be a leap as much as a calculated, more ethical approach to AI in Siri. Apple lives and dies by its privacy commitments, and I expect no less in how it delivers a more AI-driven Siri,” Tim Bajarin, CEO and principal analyst at Creative Strategies, wrote to me in an email.
Privacy above all else
Apple’s staunch adherence to user privacy may hold it back when it comes to true generative AI. Unlike Google and Microsoft Bing, it doesn’t have a massive search engine-based data store to tap into. Nor is it training its AI on the Internet’s vast ocean of data. Apple does its machine learning on the device. An iPhone and Siri know what they know about you based on what’s on your phone, not on what Apple can learn from you and its 1.5 billion global iPhone users. Sure, developers can use Apple’s ML tools to build and integrate new AI models into their apps, but they can’t simply collect your data to help Apple deliver a better Siri AI.
As I wrote in 2016: “It’s also interesting to consider how Apple intentionally limits its own AI efforts. Data about your iTunes shopping habits, for example, isn’t shared with any of Apple’s other systems and services.”
Apple’s on-device focus could hinder its potential generative AI efforts. As Moorhead told me, “I see most of the action on the device and in the cloud. Apple is strong in the device but weak in the cloud, and that’s where I think the company will struggle.”
As I see it, Apple has a choice: give up some user privacy to finally transform Siri into the voice assistant we’ve always wanted, or stay the course with incremental AI updates that improve Siri but never rival ChatGPT.