(Note: This story originally appeared in my Release Notes newsletter. Get the good stuff first by signing up. Release Notes drops each Tuesday morning.)
As revealed earlier this month at Apple’s Worldwide Developers Conference, the next versions of iOS, iPadOS and macOS due this fall include a feature called Live Text, which looks remarkably useful. It lets you search for text in an image via the Photos app, copy that text and then paste it into another document.
For example, you can search for text in a photo of a restaurant menu that you snapped a while back, find the photo, then copy the words. You can then paste the copied text into an email, a text message or a word processing document. Live Text also works in the Camera app: if you go to take a picture that includes text, an icon appears that lets you work with the text in the viewfinder.
But not all iPhones, iPads or Macs will be able to use it. It relies on technology Apple calls the Neural Engine, present in the company’s A12 Bionic mobile processors – introduced in 2018 – and newer. On Macs, it’s limited to the latest models with Apple’s own M1 processor.
While the idea of searching for, copying and pasting text from images is new to Apple, it’s not a new technology. There are ways to get a form of it now, even if your device won’t support Apple’s upcoming version. The technical name for it is optical character recognition – it’s what lets you scan or photograph a printed document and turn it into, say, a Microsoft Word file. It’s also what lets you photograph and deposit a check into your account with a banking app.
A version of this feature has been around for a while in Google Photos, and it came to the iOS version in 2019. You can download the Google Photos app for iPhone and iPad to start using it, and the capability is also available on the Google Photos website.
Once the app is on your iPhone or…