Apple’s Visual Intelligence could be a step toward Apple glasses
Apple’s new “Visual Intelligence” feature was one of the most impressive things shown at Monday’s iPhone 16 event. The tool lets users point the iPhone’s camera at the world around them to identify a dog breed, copy event details off a poster, or look up just about anything in view.
It’s a handy-looking feature that fits right in with the iPhone’s new camera button. But it may also be setting the stage for bigger products down the road: it’s exactly the kind of capability Apple will need for future tech like AR glasses.
It’s not hard to imagine how Visual Intelligence could help on a device that sees everything you see. Take the idea of learning more about a restaurant, as Apple demonstrated with Visual Intelligence on an iPhone:...