Apple is counting on Artificial Intelligence (AI) features to boost sales of the iPhone 16.
The company’s collection of AI tools is referred to as Apple Intelligence, but it will not be available at launch. New capabilities are set to be introduced through iOS updates over the course of the next year.
Another feature expected to improve over time is the dedicated Camera Control button. Today, Bloomberg’s Mark Gurman shared a timeline for the rollout of several highly anticipated Apple Intelligence features: the initial batch will enter beta next month, with more arriving in the months that follow.
The December update is expected to introduce three features: AI-generated emoji, Image Playground for creating images from text descriptions, and ChatGPT integration. Apple has said Apple Intelligence will make Siri more capable, but users will have to wait until March for the upgraded version of the digital assistant.
The AI-powered Siri will be better at understanding queries and maintaining context across multiple requests. It will also be able to draw on personal context and onscreen awareness to deliver a more tailored experience.

In addition to Apple Intelligence, Apple plans to release updates that enhance the functionality of the Camera Control button. This button not only offers a quick way to access the camera but also includes controls for framing shots and adjusting zoom levels.
When it launched the iPhone 16, Apple announced that Camera Control would gain a two-stage shutter function, set to arrive this fall, letting users lock focus and exposure with a light press. The company also said users will be able to point the Camera Control button at objects to get more information about them.
This capability, referred to as Visual Intelligence, is said to operate similarly to Google’s Lens image recognition technology, according to Gurman. It’s somewhat amusing that it will take Apple over a year to fully deliver the Apple Intelligence experience on the iPhone 16, despite its claim that the devices are built for AI. While the wait may frustrate some customers, a staggered rollout is arguably preferable to shipping incomplete, glitchy features.