- Apple wowed the world with speech-driven AI when it introduced Siri 12 years ago.
- It's now dangerously lagging behind its competitors, the columnist Michael Gartenberg argues.
Apple is the tech giant known for its sleek designs, cutting-edge technology, and innovative consumer products. But when it comes to artificial intelligence, particularly Siri, Apple has gone from leader to laggard.
Companies such as Google and Microsoft — through its investment in OpenAI — have been showing off mind-blowing advances. Meanwhile, Apple's efforts in this area have been relatively lackluster.
Siri was introduced to the world in 2011, 12 years ago, and was lauded at the time as a breakthrough in AI technology. Because it was. Over the years, however, Siri has failed to live up to its potential. While the technology has improved since its inception, the AI efforts of Google and OpenAI have greatly eclipsed it.
One of the biggest limitations of Siri is its weak natural-language processing. Siri struggles to understand the context of a conversation and can reliably perform only simple tasks such as setting reminders or timers. Even after all these years, asking Siri to answer a question correctly, or to take dictation of a text message and send it to the right person, is chancy.
In contrast, Google Assistant and OpenAI's ChatGPT (which is integrated into Microsoft Bing and other Microsoft apps) have advanced natural-language-processing capabilities. This allows them to understand the nuances of human language and respond accordingly.
For instance, when I asked Bing Chat to name things it could do that Siri couldn't, it pointed to abilities like summarizing complex political situations and working with other search engines such as DuckDuckGo.
When I asked Siri what it could do that Bing Chat couldn't, it responded with instructions on how to launch Bing by saying, "Open Bing." While I suppose that's true — Bing can't launch apps on the iPhone — it missed the point of the question.
Another area where Siri falls short is third-party-app integration. Siri can perform tasks only within the confines of Apple's ecosystem, while Google Assistant and ChatGPT integrate with a wide variety of apps, allowing them to handle a far broader range of tasks.
Apple's closed ecosystem also limits the amount of data that Siri has access to. This makes it difficult for Siri to learn and improve over time, as machine-learning algorithms require large amounts of data to operate effectively. And while people can argue (and have argued) about the ethics of how Google and OpenAI use people's data to train their AI models, no one can deny that their access to vast amounts of data has allowed them to continuously improve their AI capabilities.
Moreover, Apple has been slow to embrace open-source technologies, which are essential for AI research and development. Open-source technology allows developers to collaborate and contribute to a project, leading to faster and more efficient development cycles.
But Apple has a long history of secrecy, including with its AI projects, which kept it out of the loop on cutting-edge research for years. That's been changing. As of 2015, Apple had published no AI research papers. Today, it has a website that openly shares the roughly 370 papers it's published since 2017. Still, Google, which has a long history of open-source involvement, publishes hundreds of AI research papers a year.
And while Apple has also been participating in communities like Hugging Face, where AI researchers share the models they use to train AI apps, its participation there has been relatively paltry. It has shared 11 models, compared with Microsoft's 245 and Google's 587. And many of Apple's contributions to other big open-source AI projects, including TensorFlow (which originated at Google) and PyTorch (which originated at Facebook), have been tweaks that let developers run those technologies on Macs. While that's helpful (especially for selling Macs to AI developers), it's not the kind of kumbaya sharing that the open-source community relies on.
In a sign of just how stagnant, and isolated, Siri development has become, some Apple engineers have left the company to work on the type of large language models that power OpenAI's products, The Information reported last month.
That said, as a technology analyst who has covered the company for nearly three decades and, at one point, worked there, I offer three things I would advise Apple to do:
- Expand Siri's capabilities beyond basic commands: Apple should invest in building up Siri's abilities to handle more complex tasks, such as booking appointments, making reservations, and ordering food.
- Improve Siri's natural-language processing: Siri's current NLP is less advanced than that of Google Assistant or ChatGPT. Apple could invest in improving Siri's language-understanding capabilities, making it easier for users to interact with the voice assistant.
- Open up Siri's platform: Apple should have done this years ago. Allowing non-Apple software to integrate with Siri would make it far more useful, which would encourage more use, which would help Siri improve. By opening up Siri's platform, Apple could also encourage developers to create more innovative and sophisticated applications that use Siri's voice recognition and natural-language-processing capabilities.
Some insiders say Apple is working on all of the above, according to The Information's story, and is planning to release a new and improved Siri in a future version of iOS. We'll be watching for word of this in June at Apple's next Worldwide Developers Conference.
In the meantime, the Siri we have is the Siri we have. And if Apple users want more expansive AI in their lives, they'll need to get it elsewhere.
Apple declined to comment.
Michael Gartenberg is a former senior marketing executive at Apple and has covered the company for more than two decades as a market-research analyst at Gartner, Jupiter Research, and Altimeter Group. He is also an Apple shareholder. He can be reached on Twitter at @Gartenberg.