We are still getting used to the new design of Google Translate, though its functionality remains the same. Now comes another innovation that brings Google's services closer together: iOS users can already see how Google Translate and Google Lens combine to create a better experience with the foreign-language texts that surround us.
Not that these services didn't interact before. On Android, for example, it has been possible for a while to translate text recognized by Lens offline and to send text from screenshots to Google Translate. The integration the apps feature now, though, is much deeper, and it's impressive to see it work in real time.
The key element is the camera button in the Translate app for iOS. With it, the app can translate captured text as well as text from imported or newly taken pictures. Within the app, users can also select, copy, and share text fragments. As you point the camera at a text, the app overlays the translation in the same place. Here you can also correct the source language if it was recognized incorrectly and choose the target one. To save mobile data when, say, abroad, you can install offline translation packs and use this integrated mode without an Internet connection.
Curiously, this Google update arrives on iOS devices first (where it's already available) and only later on Android. That seems strange for Google's own services until you remember how fragmented Android is as a platform. It will likely take a few months for the same integration to appear on Android devices, and chances are Pixel phones will receive it well ahead of the others.
Have you tried this union of Lens and Translate? Were you satisfied, impressed, or disappointed? Were there situations where this feature could have been a lifesaver? Share your stories in the comments if you please!