Tech & Telecoms

Your iPhone May Have a Hidden AI Camera Tool Built Into the Magnifier App — How to Find Out and Use It

Your iPhone features an AI subject detection tool built into the Magnifier app. Don’t worry if that’s news to you, because it was news to me, too. With all the buzz around big-name AI tools like ChatGPT and Google Bard, the humble Magnifier app has flown under the radar a little.

Using AI, the iOS Magnifier app can detect and identify subjects through the camera, then describe them to you via on-screen text and spoken descriptions. The same detection system can also identify doors and people. It’s an awesome accessibility feature that can describe surroundings to people who may not be able to see them for themselves.

While the Magnifier app is available on most iPhones, subject detection depends on your iPhone having a LiDAR scanner, which is limited to Pro and Pro Max models from the iPhone 12 Pro onward. That means if you have a standard iPhone model, you won’t be able to use the AI subject detection.

If you do have an iPhone 12 Pro or later, the AI-powered Magnifier tool is easy to use, and this guide is here to show you how.

How to use the AI-powered iOS Magnifier app to detect your surroundings

The screenshots below were taken on an iPhone 15 Pro. Subject detection using the Magnifier app is available on iPhone Pro models only, from the iPhone 12 Pro onward. You need to be running at least iOS 16.

1. Open the Magnifier app, swipe up and tap the box icon

First up, open the Magnifier app, then swipe up on the bottom panel and tap the box icon, on the right-hand side.

2. Choose a Detection Mode and point your camera at a subject

Detection Modes will appear on the left. Tap one, then point the camera at a subject, and your iPhone will identify that subject and tell you what it sees. The middle speech bubble icon provides image descriptions of your surroundings and acts as a general-purpose identification tool — it will simply try to identify whatever is on screen (here, a black camera in the screenshot on the left). By default, your iPhone only displays a text description (we’ll show you how to change that later). The door icon detects doors and tells you how far away they are and what type of handle they have (as visible in the screenshot on the right).

Available Detection Modes are:

  • People Detection (tells you the distance of nearby people)
  • Door Detection (tells you the distance and type of nearby doors)
  • Image Descriptions (describes your surroundings)
  • Text Detection (relays any text detected on screen)
  • Point and Speak (describes whatever your finger points at)

3. Point and Speak

All of the Detection Modes work similarly, except Point and Speak. Tap the bottom icon on the left to enter Point and Speak. This mode detects your finger in the frame and speaks whatever it identifies. Point at something in the frame until a yellow box appears and your phone describes what it sees. By default, this mode speaks aloud and does not provide text descriptions (we’ll show you how to tweak that next).

4. (Optional) Tap the settings cog to change settings

If you’d like to change the settings for one or more Detection Modes, tap the settings cog, top left.

5. Select a Detection Mode to customize

Select a Detection Mode.

6. Apply custom settings

Now make any changes you want using the toggles. Note that the available settings differ between Detection Modes.

Source: Tom's Guide
