According to a Bloomberg report, Apple will let users choose which AI chatbot integrates with Siri in an upcoming update of its operating systems. The feature, reportedly called Extensions, will work on iPhone, iPad, and Mac, allowing users to connect assistants such as Google Gemini or Anthropic Claude so they can respond through Apple's assistant, following the model of the current ChatGPT integration. The move is expected to be unveiled at WWDC and marks a significant shift in Apple's approach to artificial intelligence.
Technical Mechanics and Scope of AI Extensions 🤖
The Extensions functionality will act as a connector, or API, that lets Siri route complex queries to the AI engine the user has selected; the services can be enabled or disabled in Settings. The integration will not be limited to the voice assistant: it will also work with a future standalone, AI-enhanced Siri app capable of executing actions in other applications. This suggests an architecture in which Siri keeps control of the interface and basic system tasks while delegating complex content generation and reasoning to external engines, creating a hybrid experience.
Implications for the Ecosystem and User Experience ⚖️
This openness transforms Siri from a closed product into a platform, which could commoditize AI assistants and give users more choice within Apple's traditionally closed ecosystem. However, it also raises the risk of a fragmented experience, potential conflicts between Apple's privacy policies and those of third parties, and added complexity for the average user. For Apple, it is a pragmatic way to compete in AI without having developed a leading model of its own, shifting its role from sole provider to manager of a marketplace inside its own devices.
How will Siri's openness to third-party chatbots change the balance of power and user privacy in the digital ecosystem dominated by Apple? 🍎
(P.S.: at Foro3D we know that the only AI that doesn't generate controversy is the one that's turned off)