Times Catalog
© 2025 Times Catalog
AI · Apple

Apple partners with third parties, like Google, on iPhone 16’s visual search

Usama
Last updated: September 11, 2024 12:11 pm

Apple’s relationship with Google has just entered a new chapter with the launch of the iPhone 16’s powerful new visual search feature, called Visual Intelligence. Unveiled at Monday’s “It’s Glowtime” event, Visual Intelligence promises to change the way users interact with the world around them. This new feature leverages the capabilities of Google’s visual search engine, offering a seamless blend of Apple’s innovative hardware and Google’s robust search technology.

Contents
  • A New Era of Camera Control and Search
  • A New Standard for Visual Search: Meet Apple’s Visual Intelligence
  • Apple’s AI Play: ChatGPT Joins the Fold
  • A Future Beyond Apps?
  • What’s Next for Visual Search?

Alphabet already pays Apple a reported $20 billion per year to remain the default search engine in Safari, but this latest collaboration goes deeper. With the iPhone 16, users not only get Google as the default search engine in the browser but can also tap into its visual search features directly from the camera. Apple’s new Camera Control button provides instant access to these capabilities, allowing users to find information about the world around them with a simple tap.

A New Era of Camera Control and Search

The iPhone 16’s Camera Control button might, at first glance, appear to be an upgraded shutter button, but Apple had much more in store. Beyond capturing photos and video, the Camera Control button introduces an entirely new dimension to the iPhone experience. In a matter of seconds, users can activate Visual Intelligence to search for information, identify objects, or even interact with services — all without launching an app.

Craig Federighi, Apple’s Senior Vice President of Software Engineering, showcased the versatility of Camera Control, explaining how the new feature goes beyond mere photography. With a simple swipe, users can frame shots and adjust settings like zoom, exposure, and depth of field. But the button’s true power lies in its ability to access third-party services — particularly Google’s visual search engine.

“The Camera Control is also your gateway to third-party tools, making it super fast and easy to tap into their specific domain expertise. So, if you come across a bike that looks exactly like the kind you’re in the market for, just tap to search Google for where you can buy something similar,” Federighi explained during the event.

Image Credits: Apple

In the live demo, an iPhone user pointed their camera at a sleek bike, tapped the Camera Control button, and instantly received a grid of purchase options for similar models. A “More results from Google” button popped up, offering to extend the search even further — all without ever leaving the camera view.

A New Standard for Visual Search: Meet Apple’s Visual Intelligence

Apple is positioning Visual Intelligence as the ultimate visual search tool, rivaling services like Google Lens or Pinterest Lens. With just one tap, users can instantly learn more about the objects in their camera’s view. Whether it’s identifying a restaurant while walking through a new neighborhood, finding out the breed of a dog on your morning walk, or turning a poster into a calendar event, the possibilities feel limitless.

For example, Apple demonstrated how you could aim the iPhone’s camera at an event poster and, with a quick press of the Camera Control button, instantly convert it into a calendar entry with all event details automatically included. The power of Visual Intelligence lies in how seamlessly it bridges the gap between the physical and digital worlds — all through the iPhone’s camera.

Apple sees this feature as not just another utility, but as a paradigm shift. It allows iPhone users to access information and services from third-party providers, such as Google, in a way that feels native to the iPhone ecosystem.

Apple’s AI Play: ChatGPT Joins the Fold

In another exciting announcement, Apple revealed that OpenAI’s ChatGPT will be accessible via Siri as a third-party partner. In one demonstration, an iPhone user aimed the camera at a page of handwritten class notes, pressed the Camera Control button, and used ChatGPT to explain a concept or solve a problem in real time. The combination of Apple’s visual prowess and OpenAI’s language model could open the door to new ways of learning and working, where AI assists users in daily tasks.

However, Apple kept certain details under wraps. While it was clear that Visual Intelligence would pull data from both Apple’s in-house services and Google’s search engine, the company did not explain exactly how or when the system would know which service to use. Federighi did offer some reassurance, stating, “Of course, you’re always in control of when third-party tools are used,” but questions remain about the degree of customization and control users will have.

A Future Beyond Apps?

The introduction of Visual Intelligence also raises questions about the future of apps. The iPhone’s camera and Siri are increasingly becoming the go-to interfaces for interacting with AI and third-party services. Traditionally, users had to download specific apps to perform tasks, but with features like Visual Intelligence, Apple is moving towards a model where apps might not even be necessary.

Instead of creating a competing AI to challenge Google or OpenAI, Apple is positioning itself as the platform that seamlessly connects users with the best third-party tools. Visual Intelligence is more than just a search tool — it’s a gateway to an ecosystem of services. This model could transform how iPhone users interact with the digital world and how developers think about creating applications.

By partnering with major players like Google and OpenAI, Apple maintains its reputation as a leader in user experience, while avoiding the pitfalls that often come with AI. If Google’s search results are less than helpful, or if ChatGPT provides an inaccurate answer, the user won’t blame Apple. This approach lets Apple keep its hands clean while offering users cutting-edge technology.

What’s Next for Visual Search?

The partnership between Apple and Google for the iPhone 16’s Visual Intelligence feature signals a future where interacting with technology becomes more intuitive and integrated. Instead of navigating through a maze of apps, users can now access a wealth of information simply by pointing their camera. This move suggests a broader shift in how we will interact with AI, search engines, and third-party services.

As AI and machine learning continue to evolve, Apple is setting the stage for a future where users interact with their devices in ways that feel natural and seamless. Whether it’s through Siri, the camera, or third-party AI models like ChatGPT, the iPhone 16 represents a significant leap forward in making technology work for us — not the other way around.
