Apple launched its highly anticipated AI-powered visual intelligence feature at the iPhone 16 launch event, taking the iPhone's camera capabilities to a new level. It gives iPhones a visual search capability similar to Google Lens: users can get relevant information simply by taking a photo, such as capturing a restaurant sign to find the restaurant's location and reviews. This feature greatly simplifies information retrieval and brings users a more convenient, smarter mobile experience.
At the iPhone 16 launch event, Apple not only showed off the new iPhone, AirPods, and Apple Watch Series 10, but also introduced the AI-powered visual intelligence feature for the first time. The new feature gives the iPhone camera capabilities similar to Google Lens, enabling a smarter photo experience.
Apple's visual intelligence feature lets users photograph objects around them, such as restaurant signs, and then uses the iPhone's AI technology to search for and surface relevant information. The user simply opens the camera and photographs the item they want to identify, and the iPhone quickly processes the image and returns the results.
In addition, Apple placed particular emphasis on data privacy. Users' captured data is processed in Apple's Private Cloud Compute to ensure security. However, users who wish to can also choose to integrate with third-party services. For example, they can send captured content directly to Google for search, much like opening Google Lens inside the camera app.
Even more interesting, Apple has also introduced ChatGPT integration. Users can allow ChatGPT to process images captured by the camera to obtain more detailed information and suggestions. All of this is opt-in: users choose whether to enable the corresponding permissions according to their own needs.
It is worth noting that although the iPhone 16 and iPhone 16 Plus are already available for pre-order, priced at $799 and $899 respectively, the visual intelligence feature will not be available immediately. Apple said these AI features will roll out gradually in beta next month, with more capabilities added over the following months.
Key points:
Apple's visual intelligence feature lets the iPhone camera perform visual searches like Google Lens.
User data is processed in Apple's Private Cloud Compute to ensure privacy and security.
It can also integrate with third-party services such as Google and ChatGPT to obtain more information.
All in all, Apple's visual intelligence feature is a major upgrade to the iPhone camera and will change how users interact with the world around them. Although it is still in testing, its potential is considerable, and its future development and additional capabilities are worth looking forward to.