When Apple unveiled macOS 12 Monterey last month, it drew a line in the sand between its new leading-edge M1-powered Macs and those models still using Intel chips. Now, however, it appears that this line has been blurred, at least a bit.

There are at least six big new features in macOS Monterey that were expected to require an M1 Mac, but the latest macOS 12 beta is bringing at least one of them to Intel-flavoured Macs too: Live Text in Photos.

The change comes in the fourth macOS 12 beta, released to developers earlier this week, which was quickly followed by a third public beta of the same build. There wasn't much else interesting about the latest macOS beta; Apple briefly suggested that Universal Control had finally been enabled, but sadly, that turned out to be premature. However, buried in the release notes was an oblique confirmation that Live Text was quietly being added to Intel Macs as well:

"Live Text now works across all Mac computers that support macOS Monterey."

Apple didn't mention the word "Intel" in the release notes at all, but "all Mac computers" naturally includes the many Intel variants that Apple has sold in the past, and still sells today.

As Rene Ritchie theorizes, Apple seems to have changed course and added Live Text "based on demand," but it also helps that Live Text on the Mac doesn't need to run in real time like it does on the iPhone and iPad.

"macOS Monterey Beta 4 includes Live Text for Intel Macs 🎉 Sounds like Apple prioritized it based on demand, but it was made much easier by the lack of real-time requirements for a camera system (So instead of kicking it to the ANE, it'll just process opportunistically)" - Rene Ritchie, July 27, 2021

Even though most of Apple's Macs feature a FaceTime camera, even the M1-powered models won't let you view Live Text through the camera. This seems reasonable considering that the MacBook and iMac cameras are designed for things like video conferencing, and most users aren't likely to point them at signs and receipts the way they would with an iPhone. Of course, you'll still be able to extract Live Text from pictures captured with the Mac's FaceTime camera; you just won't be able to do it while looking at a live preview of the image. Instead, you'll have to load the picture up in Photos or Preview.

As Ritchie explains, this means that Live Text on the Mac doesn't really need the M1's Neural Engine. While M1-powered models will almost certainly still leverage it for even faster processing, Intel versions can simply handle the recognition "opportunistically." It might run a bit slower on older Intel Macs, but at least it will be possible.

The real benefit of the new Live Text feature will be found in iOS 15 on the iPhone, where users will be able to extract text right through the iPhone's Camera app. You'll be able to copy text from anything you're looking at simply by dragging over the text in the photo preview to select it and copy it to your clipboard. You can even use iOS's data detectors to place a call directly to a recognized phone number, or navigate to a recognized address.

It's a magical feature, and it doesn't just work in the Camera and Photos apps, either. It's a system-level feature that should work for any photo you're viewing, anywhere on your device. There are already a few exceptions with specific apps that don't use the standard iOS photo APIs, like Facebook and Instagram, but those may simply be a matter of waiting for the apps to be updated.

Most significantly, Apple does all the processing directly on the A-series or M-series chip in your iPhone, iPad, or Mac. Your photo data never leaves your device for this purpose, and Apple's servers don't analyze your photos for text at all.
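Live Text itself isn't exposed as a standalone API in these betas, but the same kind of on-device text recognition has been available to developers through Apple's Vision framework since macOS Catalina. A minimal sketch of pulling text out of a saved photo (the image path is a placeholder, and the settings shown are just illustrative choices, not anything Apple has confirmed Live Text uses):

```swift
import Foundation
import Vision

// Recognize text in a photo entirely on-device, in the spirit of Live Text.
// "photo.jpg" is a placeholder - substitute any local image path.
let imageURL = URL(fileURLWithPath: "photo.jpg")

let request = VNRecognizeTextRequest { request, error in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        // Each observation offers ranked candidate readings; take the best one.
        if let best = observation.topCandidates(1).first {
            print(best.string)
        }
    }
}
// .accurate trades speed for quality - a reasonable choice when, as on the Mac,
// there's no live camera preview forcing real-time processing.
request.recognitionLevel = .accurate
request.usesLanguageCorrection = true

let handler = VNImageRequestHandler(url: imageURL)
try handler.perform([request])
```

Nothing here touches the network, which lines up with the article's point: the recognition work happens on the local chip, and slower hardware simply takes longer.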