Have you ever needed to copy text from an image? Apple’s latest feature, Live Text, is the solution you were looking for. Alongside it, Visual Lookup recognizes various objects in a photo to help you learn more about your surroundings.
So how well does this feature work on different images? We decided to compare the feature with Google Lens, and the results are here!
Live Text is only available on Apple devices. It is compatible with:
- iPhones with an A12 Bionic chip or newer, running iOS 15
- iPad mini (5th generation, 2019) and newer, running iPadOS 15
- iPad Air (3rd generation, 2019) and newer, running iPadOS 15
- iPad (8th generation, 2020) and newer, running iPadOS 15
- iPad Pro (2020) and later, running iPadOS 15
- Macs with an M1 chip, running macOS Monterey
On the other hand, Google Lens is available on both iOS and Android. This post reviews Google Lens and Live Text side by side, both tested on the same iPhone 11.
After a lot of testing with various images containing text, we came to the conclusion that Live Text is currently hit or miss. It works perfectly sometimes, and other times it does not detect anything.
Handwriting recognition did not pick up anything at all. This may be because iOS 15 was still in beta testing, so the feature might improve once the update rolls out. To use Live Text, just tap the Live Text icon on an image and it will detect the text. To pick up text through the camera, point the frame at the text and tap the yellow icon at the bottom of the screen. Once the feature detects text, you can copy it, select all, look it up, or translate it.
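For developers curious about what this kind of on-device text recognition looks like in code, Apple’s Vision framework exposes a comparable OCR capability through `VNRecognizeTextRequest`. The sketch below is our own illustration, not Live Text’s actual implementation; the function name `recognizeText` is hypothetical:

```swift
import Vision

// A minimal sketch: run on-device text recognition over a CGImage
// using the Vision framework, similar in spirit to what Live Text
// surfaces in the Camera and Photos apps.
func recognizeText(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            return
        }
        // Each observation is one detected line; take its best candidate.
        for observation in observations {
            if let candidate = observation.topCandidates(1).first {
                print(candidate.string)
            }
        }
    }
    // .accurate trades speed for quality; .fast is the alternative.
    request.recognitionLevel = .accurate
    request.recognitionLanguages = ["en-US"]

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

This request runs entirely on-device, which lines up with the privacy point we return to at the end of this comparison.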
Google Lens does a far better job of recognizing any text, including handwriting, from photos. It also gives relevant suggestions for dealing with phone numbers and addresses, so Google Lens is clearly the winner here.
Apple’s Live Text currently covers seven languages, whereas Google Lens is available in 108 languages. Google Lens will show the translation right on the image and Live Text shows the translation below the image. In this regard both features work well, but Google Lens can translate handwriting which is a plus.
Visual Lookup is a pretty cool feature that identifies objects like plants, buildings and animals when you tap the i icon on the frame.
Visual Lookup did not work at all on the iOS 15 beta version. No matter which objects we tried, there was no response. Google Lens had no issues identifying the same objects and worked smoothly.
Visual Lookup also failed to return results when we tried to find locations in photos. Google Lens was adept at it and identified landmarks straight from the photo.
Once again, Visual Lookup could not pick up plants or flowers, but it was easy work for Google Lens.
Google Lens picked up animals from photos of our pets, though it could not provide details about their breed. As with the other categories, Visual Lookup could not identify animals on the iOS 15 beta version.
Even though most of Live Text’s features are currently lagging, one area where it excels is ease of use. It is well integrated into the iPhone’s Camera and Photos apps and is right there when you need it. We are confident it will be a hit once the official update releases on all Apple devices. In comparison, Google Lens is a little less convenient because you have to open the Google app to access the feature.
Google Lens is more accurate than Live Text at this time, but that may be because Google has amassed a lot of data to feed the feature. Handwriting, locations, animals, plants, and landmarks are all detected easily by Google Lens. However, we believe Live Text will become more intelligent in the near future.
As you can see, Google Lens is more advanced than Live Text at translating text, searching places online, and more. However, Live Text is easier to use and much better when it comes to safeguarding your privacy. So it comes down to whether you want more advanced features or better anonymity from your image recognition tool.