Google on Wednesday announced a new feature that surfaces quick facts from the Knowledge Graph about what people see in Google Images.
Starting this week, first in the US, when you search for an image on mobile, you might see information from the Knowledge Graph related to the result. That information can include people, places or things related to the image, drawn from the Knowledge Graph’s database of billions of facts, helping you explore the topic further.
For example, suppose you’re searching for beautiful state parks to visit nearby. You want to swim during your visit, so you tap on a picture of a park with a river. Beneath the photo you might see related topics, such as the name of the river or which city the park is in. If you tap a specific topic, it will expand to show a short description of the person, place or thing it references, along with a link to learn more and other related topics to explore. With this information, you can better understand the image you’re viewing and whether the web page is relevant to your search.
Or perhaps you’re researching an architect’s work: tapping an image of a building in that style will surface information about the architect, when the building was constructed and other relevant details.
“To generate these links to relevant Knowledge Graph entities, Google develops an understanding of the image through deep learning, which evaluates an image’s visual and text signals, and combines that with Google’s understanding of the text on the image’s web page,” the company said in a statement.
This information helps Google determine the most likely people, places or things relevant to a specific image. Those candidates are matched against existing topics in the Knowledge Graph, which are then surfaced in Google Images.
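To make the matching step concrete, here is a minimal, entirely hypothetical sketch of the idea: combine labels derived from an image with the text of the image’s web page, then score candidate entities from a toy knowledge graph. This is not Google’s actual system; the `KNOWLEDGE_GRAPH` data, the scoring, and the `match_entities` function are illustrative assumptions only.

```python
# Hypothetical sketch only -- not Google's implementation.
# Toy "Knowledge Graph": entity name -> short description.
KNOWLEDGE_GRAPH = {
    "Colorado River": "A major river in the southwestern United States.",
    "Moab": "A city in eastern Utah, United States.",
    "Arches National Park": "A national park in Utah known for sandstone arches.",
}

def match_entities(image_labels, page_text, graph=KNOWLEDGE_GRAPH):
    """Score each entity by how strongly the image labels (visual signal)
    and the page text (text signal) support it; return scored matches,
    highest score first."""
    text = page_text.lower()
    results = []
    for entity, description in graph.items():
        # Visual signal: an image-derived label appears in the entity name.
        visual = sum(1 for label in image_labels
                     if label.lower() in entity.lower())
        # Text signal: the entity name is mentioned on the image's web page.
        textual = 1 if entity.lower() in text else 0
        score = visual + textual
        if score > 0:
            results.append((entity, description, score))
    return sorted(results, key=lambda r: -r[2])
```

Given labels like `["river"]` and a page mentioning “Moab along the Colorado River”, the sketch would rank “Colorado River” first (supported by both signals) and include “Moab” (text signal only), while omitting unrelated entities. A production system would of course use learned embeddings rather than string matching, but the combination of visual and textual evidence is the point being described.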
This feature will start to appear on some images of people, places and things in Google Images and will expand to more images, languages and surfaces over time.