The secrets behind Google's Motion Photos technology
- A recent Google blog post has delved into the technology behind its impressive Motion Photos feature.
- Motion Photos fuses software-based visual tracking with motion data from hardware sensors to achieve its winning anti-shake formula.
- Google has implemented several other innovative ideas to deliver a seamless Motion Photos user experience on the Pixel 2 phones.
Google has blogged about the technology behind Motion Photos on the Pixel 2 devices. The post, published earlier this week on the company’s research blog, dives into the more technical aspects of the feature.
When enabled, the feature captures a three-second video clip alongside each full-size photo, and those who've used it might have noticed that the results look impressively smooth. It's an extension of the Motion Stills for Android feature Google introduced last year, but the new version has undergone some significant improvements.
“By combining software-based visual tracking with the motion metadata from the hardware sensors, we built a new hybrid motion estimation for Motion Photos on the Pixel 2,” Google wrote in the post. The approach combines an algorithmic software layer with “motion metadata” from the physical gyroscope and the optical image stabilization (OIS) components to more accurately distinguish between foreground and background layers in an image.
When this metadata is put to use, the result is less distortion from rolling shutter and a better placement of the “stable camera path”, which determines the framing of the video (see below).
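To make the idea more concrete, here is a minimal, hypothetical sketch of how sensor metadata and visual tracking could be fused and smoothed into a stable camera path. This is not Google's implementation; the function names, the blend weight, and the smoothing window are illustrative assumptions only.

```python
import numpy as np

def moving_average(path, window=9):
    """Smooth an accumulated camera trajectory with a simple box filter.
    The smoothed curve plays the role of the "stable camera path" that
    frames the output video."""
    kernel = np.ones(window) / window
    return np.array([np.convolve(col, kernel, mode="same") for col in path.T]).T

def stabilize(visual_motion, gyro_rotation, blend=0.6, window=9):
    """Toy hybrid motion estimation.

    visual_motion : (N, 3) per-frame [dx, dy, rotation] from software tracking
    gyro_rotation : (N,) per-frame rotation derived from gyro/OIS metadata
    blend         : weight given to the sensor rotation over the visual one
    Returns per-frame [dx, dy, rotation] corrections to apply to each frame."""
    motion = visual_motion.copy()
    # Fuse the two rotation estimates (the "hybrid" part, in toy form).
    motion[:, 2] = blend * gyro_rotation + (1.0 - blend) * visual_motion[:, 2]
    # Accumulate per-frame motion into an absolute camera trajectory.
    raw_path = np.cumsum(motion, axis=0)
    # Smooth the trajectory to get the stable camera path.
    smooth_path = moving_average(raw_path, window)
    # The correction is the gap between the two paths; applying it
    # (plus a crop) yields the stabilized clip.
    return smooth_path - raw_path

# Example: 90 frames (~3 seconds at 30 fps) of jittery motion.
rng = np.random.default_rng(0)
visual = rng.normal(0.0, 2.0, size=(90, 3))
gyro = rng.normal(0.0, 0.02, size=90)
print(stabilize(visual, gyro).shape)  # (90, 3)
```

The real pipeline is far more involved, but the blend-then-smooth structure mirrors the basic idea described in the post: combine sensor metadata with visual tracking, then place a stable camera path through the result.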
This anti-shake technology isn't perfect, however. To reduce the shaking, the software has to crop each frame, so the video always shows a smaller area than the still image. Further, as you can see in the GIF below, there can be some weird artefacts. Take a look at how the building appears to bend in this one:
Or how the beginning of the clip is janky here:
When it works well, however, Motion Photos is capable of producing some silky smooth videos.
Beyond the techniques above, Google has also developed a way to “automatically trim the video to remove any accidental motion caused by putting the phone away,” and the company says it starts each clip “at the exact timestamp as the HDR+ photo,” so you get a seamless transition when you “play” a still image.
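For illustration only, here is one way those two behaviours could be sketched: pick the playback start frame closest to the photo's timestamp, and walk back from the end of the clip to drop the trailing frames where motion spikes (the phone being lowered). The function name, the threshold, and the backwards-scan heuristic are assumptions; Google doesn't describe its trimming logic in this detail.

```python
import numpy as np

def trim_and_align(frame_timestamps_us, frame_motion, photo_timestamp_us,
                   motion_threshold=5.0):
    """Return (start, end) frame indices for Motion Photo playback.

    frame_timestamps_us : capture time of each frame, in microseconds
    frame_motion        : per-frame motion magnitude (e.g. pixels of shift)
    photo_timestamp_us  : timestamp of the HDR+ still photo
    motion_threshold    : above this, treat motion as "putting the phone away"."""
    timestamps = np.asarray(frame_timestamps_us, dtype=np.int64)
    motion = np.asarray(frame_motion, dtype=float)

    # Start playback at the frame closest to the HDR+ photo's timestamp,
    # so the still image transitions seamlessly into the clip.
    start = int(np.argmin(np.abs(timestamps - photo_timestamp_us)))

    # Drop the trailing run of frames whose motion exceeds the threshold.
    end = len(motion)
    while end > start + 1 and motion[end - 1] > motion_threshold:
        end -= 1
    return start, end

# Example: 90 frames at ~30 fps, with the phone lowered over the last 10 frames.
ts = np.arange(90) * 33_333  # microsecond timestamps
mot = np.concatenate([np.full(80, 1.0), np.full(10, 20.0)])
print(trim_and_align(ts, mot, photo_timestamp_us=1_500_000))  # (45, 80)
```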
Google was recently accused of failing to innovate by one of its former employees, and it's a feeling shared by many fans and critics. It's true that the idea of Motion Photos isn't new (it goes back at least five years to HTC Zoe), but the fresh part isn't the ability to capture video alongside still snaps; it's the way the feature integrates Google's video stabilization technology.
The way the Pixel 2 phones handle video goes well beyond what most other manufacturers achieve with their smartphone photography, both in usefulness (reducing the ever-present shake) and in how well it works. It's well-realized features like this that, for me, show Google can still deliver smart new concepts.
For more Motion Photos examples from Google, visit the gallery here.