
Apple's AI chief gave this warning to the Siri team when he started working with them

Dominick Reuter   

  • Apple's head of AI strategy and machine learning, John Giannandrea, has run Siri since 2018.
  • He said his first instruction to the Siri team was that "failure is not an option."

When John Giannandrea, Apple's head of AI strategy and machine learning, joined the company in 2018, its voice assistant, Siri, was struggling to live up to its promise.

Looking for a turnaround, Giannandrea said, his first instruction to the Siri team was inspired by the legendary NASA chief flight director Gene Kranz: "Failure is not an option."

"A lot of people use Siri a lot of the time," Giannandrea said in an interview with John Gruber after Apple's Worldwide Developers Conference last week. "And as it's gotten better over the years, we see in our data that people just use it more."

Now, 14 years after Apple launched Siri, the voice assistant's growing popularity has made it an exceptional proving ground for AI, raising the stakes even further.

The recent introduction of Apple Intelligence, as well as a partnership with OpenAI, may bring a flood of new users to Siri, each with more-complicated requests than ever before.

And given the central role of Apple's iPhone, it's fitting that Giannandrea is leading this effort. He's long argued that on-device AI models are critical to the technology being practical for everyday users.

"The inference of large language models is incredibly computationally expensive," he said, explaining why Apple's newest A17 chips are so well suited for the task. "It's the oomph in the device to actually do these models fast enough to be useful. You could, in theory, run these models on a very old device, but it will be so slow it will not be useful."

In addition to getting the technical parameters right, Giannandrea said his team was also wrestling with philosophical and ethical questions about the role AI would play in people's lives.

"We've been very careful about applying this technology in a very thoughtful way," he said. "We've tried to corral this technology to do what it's really good at doing."

At the same time, even though researchers are concerned about "the safety problem," Giannandrea said Apple wasn't interested in limiting the creativity of its users with an overly restrictive set of rules.

"It's a very fine balance that we spend a lot of time and a lot of years trying to figure out, because this is new for us," he said. "This is new for the whole industry."
