Today I learned that you can recognize plants and flowers using only your iPhone camera

Sometimes, even as a tech reporter, the pace of technological improvement catches you off guard. Case in point: I only found out today that my iPhone offers a feature I've wanted for a long time – the ability to identify plants and flowers from just a photo.

Various third-party apps have offered this functionality for years, but the last time I tried them I was disappointed by their speed and accuracy. And yes, there's Google Lens and Snapchat Scan, but it's always less convenient to open an app I wouldn't otherwise use.

But since the introduction of iOS 15 last September, Apple has offered its own version of this visual search feature. It’s called Visual Look Up, and it’s pretty damn good.

It works very simply. Just open a photo or screenshot in the Photos app and look for the blue "i" info icon underneath it. If it has a small sparkling ring around it, then iOS has found something in the photo it can identify using machine learning. Tap the icon, then tap "Look Up," and it will try to dig up some useful information.

Tapping the "i" icon normally brings up details like when you took the photo and the camera settings used. If the ring is sparkling, though, there's Visual Look Up data to view as well.

After tapping the "i" icon, you'll get the option to look up more information based on a handful of select categories.
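If you're curious what this kind of classification looks like from a developer's side: as far as I know, Apple doesn't expose Visual Look Up itself as an API in iOS 15, but the Vision framework's VNClassifyImageRequest gives a rough sketch of the same sort of on-device labeling. A minimal, hypothetical Swift example (the file name is just a placeholder):

import Vision
import CoreImage

// Load a local photo; "houseplant.jpg" is a placeholder path.
guard let image = CIImage(contentsOf: URL(fileURLWithPath: "houseplant.jpg")) else {
    fatalError("Couldn't load the image")
}

// VNClassifyImageRequest runs Apple's built-in, on-device image classifier.
// This is not Visual Look Up itself, just a taste of the kind of labels
// the system can produce ("plant", "flower", "dog", and so on).
let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(ciImage: image, options: [:])

do {
    try handler.perform([request])
    // Each observation pairs a label with a confidence score.
    let observations = request.results ?? []
    for observation in observations.prefix(5) where observation.confidence > 0.1 {
        print(observation.identifier, observation.confidence)
    }
} catch {
    print("Classification failed: \(error)")
}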

It works not only for plants and flowers but also for landmarks, art, pets, and "other objects." It's not perfect, of course, but it has surprised me more often than it has let me down. Here are a few more examples from my own photo library:

Image: Visual Look Up works for landmarks, animals, and art, as well as plants and flowers. (The Verge)

Although Apple announced this feature at WWDC last year, it hasn't made much noise about its availability. (I only noticed it via a link in one of my favorite tech newsletters, The Overspill.) Even the official support page for Visual Look Up sends mixed messages, telling you in one place that it's "US only" and then listing other compatible regions on another page.

Visual Look Up is still limited in its availability, but access has expanded since launch. It is now available in English in the US, Australia, Canada, the UK, Singapore, and Indonesia; in French in France; in German in Germany; in Italian in Italy; and in Spanish in Spain, Mexico, and the United States.

It's a great feature, but it also makes me wonder what else visual search could do. Imagine taking a picture of your new houseplant, for example, only to have Siri ask, "Do you want me to set reminders for a watering schedule?"

I learned a long time ago that it's foolish to expect Siri to do anything too advanced. But these are the kinds of features we could plausibly get with future AR or VR headsets. Hopefully, if Apple does introduce that sort of functionality, it will be a bit louder about it.
