Apple finally has an answer to Magic Eraser, and we’ve tried it
- The third betas of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 introduce Apple Intelligence’s Clean Up feature.
- The AI-powered tool can remove intrusive elements from photos, heal blemishes, and pixelate people’s faces.
- Once set up, Clean Up works offline using the on-device Neural Engine on iPhones, iPads, and Macs.
Back in June, Apple previewed iOS 18 and the AI features it plans to introduce over the next year. One notable Apple Intelligence perk is a Clean Up tool built into the upgraded Photos app. While this functionality won’t debut to the public until later this year, those running the third developer betas of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 can already test it. To spare you from jumping on the unstable beta OS bandwagon, I have tried the Clean Up tool and will document my experience using it below.
Clean Up prerequisites
To use the Clean Up feature in the Photos app, you must run the latest X.1 OS version (currently in beta) on a device that supports Apple Intelligence. This includes the iPhone 15 Pro models, along with M-series iPads and Macs. If you meet the criteria, you’ll find a new Clean Up option when editing a photo in the Apple Photos app. Tapping it for the first time downloads the required models, which takes just a few seconds. Once ready, the feature works offline using your device’s Neural Engine.
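Apple doesn’t expose a public “Apple Intelligence supported” check in these betas, but if you’re curious about the OS-version side of the requirement, a minimal Swift sketch using the standard ProcessInfo version check might look like the following. The 18.1 floor here applies to iOS/iPadOS; macOS Sequoia uses 15.1, and this doesn’t account for the hardware requirement at all.

```swift
import Foundation

// Rough illustration only: Clean Up ships with the X.1 releases, so compare
// the running OS against an 18.1 floor (iOS/iPadOS numbering assumed here).
let requiredVersion = OperatingSystemVersion(majorVersion: 18, minorVersion: 1, patchVersion: 0)

if ProcessInfo.processInfo.isOperatingSystemAtLeast(requiredVersion) {
    print("OS build is new enough for the Clean Up beta.")
} else {
    print("Install the 18.1 beta (or 15.1 on macOS) to try Clean Up.")
}
```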
First impressions
Based on my early tests, Clean Up appears to have three main purposes: removing intrusive elements (like random strangers in the background), healing blemishes or dust particles, and pixelating people’s faces. In all three cases, the photo’s EXIF data is updated to indicate that it has been manipulated.
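Apple doesn’t document exactly which metadata fields it touches, but if you want to inspect an exported photo yourself, a small ImageIO sketch that dumps all of its properties (so you can eyeball any edit-related entries) could look like this. The file path is a placeholder, and the specific key Clean Up writes is not something I can confirm, which is why the sketch simply prints everything.

```swift
import Foundation
import ImageIO

// Minimal sketch: dump an image's metadata dictionary for manual inspection.
// "edited.jpg" is a placeholder path to a photo exported after a Clean Up edit.
let url = URL(fileURLWithPath: "edited.jpg") as CFURL

if let source = CGImageSourceCreateWithURL(url, nil),
   let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    for (key, value) in properties {
        print("\(key): \(value)")
    }
}
```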
The first functionality, previewed in the video above, primarily works automatically. The feature highlights random subjects in your photo and lets you remove them by simply tapping their detected frames. If it fails to detect a particular element, you can still manually doodle over or circle it to remove it. However, auto-detected elements tend to produce better results, as the AI can pinpoint their exact borders.
The second use case is ideal for repairing shots that feature dust particles, blemishes, or other minor imperfections. You just zoom in on the area and tap on the flaw to heal it.
Lastly, if you doodle on a person’s entire face, the Clean Up tool will pixelate/censor it. It’s an excellent feature for concealing someone’s identity without completely removing them from the shot.
As expected, Clean Up works best when a photo’s background is mostly uncluttered and well-lit; using it on complex or dim shots can produce odd results. Still, the tool performs respectably for a first beta. It does, however, fall short of Google’s Magic Editor, as it is limited to removing elements from photos and nothing more.