Ever since ‘the dress’ broke the internet and our brains in 2015 by appearing white-and-gold to some viewers, and black-and-blue to others, similar optical illusions have followed in its viral wake. The latest is the ‘glitch in the Matrix’ wedding dress iPhone photo (below), which has baffled Instagram and stirred up fresh debate about computational photography.
In the iPhone 12 photo, Tessa Coates – who shared the snap on Instagram – is shown standing in front of two mirrors wearing a wedding dress. It all looks perfectly normal at first glance, until you look at the mirrors – each reflection shows her in a completely different pose from the one she's actually striking.
So what’s going on here? Tessa Coates says in the description that it’s a “real photo, not Photoshopped, not a pano, not a Live Photo”. This led many to speculate that the photo exposes one of the reality-twisting weak spots in the iPhone’s computational photography processing – chiefly, the theory that it doesn’t recognize mirrors.
The lack of mirror detection, so the speculation claimed, meant that the iPhone’s multi-frame processing – a technique used by every modern smartphone – saw three separate subjects in the scene, rather than one, and so combined three different frames (with different poses) in the final shot.
But there’s a flaw in this argument. A typical indoor, daylight photo taken on an iPhone will have a shutter speed of at least 1/100 sec (which is the case in this photo), and Apple's Deep Fusion processing (which launched on the iPhone 11) combines its nine frames almost instantaneously. So unless the photo’s subject was Dash from The Incredibles, she wouldn’t have had time to change poses so dramatically during the photo’s capture.
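As a back-of-the-envelope check on that timing argument (the frame count and shutter speed come from the article; in practice most Deep Fusion frames are buffered before the shutter press, making the real window even shorter):

```python
# Worst case: assume Deep Fusion's nine frames are captured strictly
# back-to-back at the photo's 1/100 sec shutter speed.
frames = 9
shutter_ms = 10  # 1/100 sec per frame, expressed in milliseconds

worst_case_window_ms = frames * shutter_ms
print(worst_case_window_ms)  # 90 — under a tenth of a second
```

Even this pessimistic estimate gives a capture window of about 90 milliseconds – nowhere near enough time to strike three different poses.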
So what is actually going on in the photo? Despite Tessa Coates stating that the photo isn’t a pano (or panorama), this was always the most likely explanation – and some further digging from respected YouTuber iPhonedo has revealed that this is almost certainly the case.
As iPhonedo notes in the video above, the photo’s metadata shows that its resolution is 3028 x 3948, which isn’t the native resolution of the iPhone 12 it was shot on (that would be 3024 x 4032). Nor does the photo have the iPhone’s native 4:3 aspect ratio.
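You can confirm the aspect-ratio mismatch with a little exact arithmetic (the two resolutions are the ones quoted above; everything else here is just illustrative):

```python
from fractions import Fraction

# iPhone 12 native still resolution (portrait) and the photo's metadata dimensions
native = Fraction(3024, 4032)  # reduces to 3/4, i.e. the standard 4:3 ratio
photo = Fraction(3028, 3948)   # reduces to 757/987 - not 3/4

print(native)           # 3/4
print(photo)            # 757/987
print(photo == native)  # False: the image wasn't a single native still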
But there is an understandable reason why even the staff at Apple’s own Genius Bar (whom Tessa Coates consulted for an explanation) didn’t conclude that the photo is a panorama – it doesn’t have the panorama symbol in its ‘info’ section.
The handy lesson from this whole affair is that if you don’t complete a full sweep across the guideline that appears when you tap ‘pano’ on the iPhone, it will still take a smaller pano photo – but it won’t be labelled as a panorama. After a few pano tests in landscape and portrait, we can confirm that this is the case.
Because a panoramic photo stitches several different shots together over a longer period than multi-frame processing, this explains the different poses in the wedding dress photo – which was likely taken by someone who accidentally switched the phone into ‘pano’ mode.
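A toy sketch of why a pano sweep can do this – each vertical strip of the stitched image is captured at a different moment, so mirrors swept at different times can freeze different poses (the timings and poses below are purely illustrative, not taken from the actual photo):

```python
# Each region of the final panorama is stitched from a frame captured at a
# different point in the sweep; a person who moves mid-sweep appears in a
# different pose in each region. All values here are made up for illustration.
sweep = [
    ("left mirror", 0.0, "arms down"),
    ("centre (real subject)", 0.8, "arms crossed"),
    ("right mirror", 1.6, "hand on hip"),
]

for region, t, pose in sweep:
    print(f"{region:22s} stitched from frame at t={t:.1f}s -> {pose}")
```

Multi-frame processing like Deep Fusion merges frames captured milliseconds apart; a pano sweep spans seconds, which is plenty of time to move.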
It’s certainly a simpler explanation than the neuroscience investigations that followed the viral blue-and-black dress in 2015 – and you now have a new iPhone camera trick to try during the holidays or next Halloween.
Mark Wilson