
The long wait for a smarter Siri is to get even longer, with some indications that the new features we were originally expecting in iOS 18.4 may now be pushed back to iOS 19.
Apple hasnât provided any real explanation, but two theories have so far been put forward, and now a developer and data analyst has suggested that security concerns may be a third reason â and by far the biggest problem âŚ
Smarter Siri delay
Apple first promised a much smarter Siri at WWDC in June of last year. While the company has launched some Apple Intelligence features, and Siri can now at least cope with verbal stumbles and some compound commands, we havenât yet seen any sign of the three key improvements promised by Apple:
- Contextual awareness, where it can access your personal information
- The ability to see whatâs on your screen, and respond to that
- In-app actions, where you tell Siri what you want to do and it will use your apps to achieve it
We were originally expecting these features to be included in iOS 18.4, while a more conversational Siri would wait until next year, but Apple has now said that it has hit problems with this timeline.
Weâve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps. Itâs going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year.
Itâs not completely clear what the company means by âthe coming yearâ â whether that is later this year, or at some point next year, but it could easily mean we wonât get a smarter Siri until iOS 19.
Two reasons have so far been suggested
Bloomberg has consistently reported that Apple has been struggling to prepare these features for launch, indicating that there are simply too many bugs at present.
Inside Apple, many employees testing the new Siri have found that these features donât yet work consistently.Â
At the time, however, Mark Gurman expected the launch to be delayed from iOS 18.4 to 18.5.
Itâs also been suggested that the underlying reason for these bugs is that Apple currently has two completely separate versions of Siri, with one layered on top of the other. The original version still handles tasks Siri has always been able to execute, while the second layer intercepts more complex commands. Apple is reportedly struggling to integrate everything into a single version of Siri.
But security may be the biggest concern
Developer Simon Willison, creator of the open-source data analysis tool Datasette, suggests that Apple may also be struggling to keep a smarter Siri secure. Specifically, he thinks it may be vulnerable to prompt injection attacks.
These are a well-known problem with all generative AI systems. Essentially an attacker attempts to override the built-in safety measures in a large language model (LLM) by fooling it into replacing the safeguards with new instructions. Willison says this poses a potentially massive risk with a new Siri.
These new Apple Intelligence features involve Siri responding to requests to access information in applications and then performing actions on the userâs behalf.
This is the worst possible combination for prompt injection attacks! Any time an LLM-based system has access to private data, tools it can call, and exposure to potentially malicious instructions (like emails and text messages from untrusted strangers) thereâs a significant risk that an attacker might subvert those tools and use them to damage or exfiltrating a userâs data.
In other words, fooling ChatGPT into giving you instructions for building a bomb is one level of risk, but tricking Siri into handing over your personal data is something else altogether.
Apple commenter John Gruber finds this theory credible, saying that nobody has yet succeeded in stopping prompt injection attacks.
A pessimistic way to look at this personalized Siri imbroglio is that Apple cannot afford to get this wrong, but the nature of LLMsâ susceptibility to prompt injection might mean itâs impossible to ever get right. And if it is possible, it will require groundbreaking achievements. Itâs not enough for Apple to âcatch upâ. They have to solve a vexing problemâââas yet unsolved by OpenAI, Google, or any other leading AI labâââto deliver what theyâve already promised.
He wonders aloud how Apple allowed itself to promise â and even advertise â features which may prove too dangerous to ever launch.
9to5Macâs Take
While Iâm sure the earlier reports are accurate, it does seem entirely plausible that security is also a key factor.
Apple has turned privacy into a key differentiator and selling point for its products, so any vulnerability that allows a rogue app to access and export your personal data would be a complete disaster for the iPhone maker.
While I wouldnât expect the smarter Siri project to turn into another AirPower, it is looking increasingly like Apple should have just kept quiet about timings until it was closer to addressing all of the challenges.
Image: Apple and Michael Bower/9to5Mac
FTC: We use income earning auto affiliate links. More.
<