Google Search Live uses AI to answer questions about what your camera sees


As part of its many announcements at I/O 2025, Google unveiled significant changes to Search, including the ability for AI to use what a smartphone camera sees in real time.

This new “Search Live” technology is built on Google’s Project Astra, which started rolling out a few months ago after being unveiled early last year. Essentially, Google has developed an AI that can “see” the world through the user’s camera.

Project Astra also powers another popular Google tool, Google Lens, which more than 1.5 billion people use every month to search what they see. While Google Lens has traditionally worked with a single image, Project Astra can work live and continuously through a camera feed. By combining it with Google’s broader search improvements, including AI Mode, users can learn about whatever their camera is pointed at in real time.

In a blog post, Google provides an example using a common school-age physics exercise: a person building a popsicle-stick bridge. Pointing their camera at the bridge, the user asks, “What else should I do to strengthen it?” Google’s real-time AI responds with advice, which includes using triangular structures to improve the bridge’s strength.

“If you’re feeling stumped on a project and need some help, just tap the ‘Live’ icon in AI Mode or in Lens, point your camera, and ask your question,” Google explains. “Just like that, Search becomes a learning partner that can see what you see, explaining tricky concepts and offering suggestions along the way, as well as links to different resources that you can explore, like websites, videos, forums, and more.”

As PetaPixel wrote in March, something like Project Astra could be useful for photographers in the field. The AI could evaluate a scene and recommend which colors would work best against a particular background, suggest lenses, and even provide in-field advice on operating a specific camera and dialing in the appropriate settings.

The Gemini app on Android and iOS will also soon allow users to share their screen with Google so it can analyze that material in real time, in addition to what the user’s camera sees.

These new features fall under the umbrella of Google’s “Search Live,” which will arrive “later this summer” following beta testing available to Google Labs members.


Image Credit: Google


