The iPhone SE is the first of Apple's smartphones to produce Portrait Mode photos entirely through software techniques rather than hardware, which prompted the developers behind the popular iOS camera app Halide to take a deep dive into how it works.
The iPhone SE has the same camera sensor as the iPhone 8, according to a recent teardown by iFixit, but its camera can do more because it uses "Single Image Monocular Depth Estimation," i.e., producing Portrait Mode effects from a single 2D image.
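Once a depth map exists, however it was estimated, rendering a portrait effect is conceptually simple: pixels near the subject's depth plane stay sharp, and pixels farther from it get progressively blurrier. The following is a minimal Python sketch of that idea; the `portrait_blur` function and its naive box blur are illustrative assumptions, not Apple's or Halide's actual rendering pipeline.

```python
import numpy as np

def portrait_blur(image, depth, focus_depth, max_radius=4):
    """Toy depth-of-field effect: blur each pixel more the farther its
    estimated depth is from the focus plane. `depth` is assumed to be
    normalized to [0, 1]. This box blur is a sketch, not Apple's renderer."""
    h, w = depth.shape
    out = np.empty_like(image, dtype=float)
    for y in range(h):
        for x in range(w):
            # Blur radius grows with distance from the subject's depth plane.
            r = int(round(max_radius * abs(depth[y, x] - focus_depth)))
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out
```

With a depth map that matches the scene, the subject stays crisp while the background softens; with an inaccurate depth map, the blur lands in the wrong places, which is exactly the failure mode Halide describes.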
As Halide developer Ben Sandofsky points out, the iPhone XR is also a single-lens camera with Portrait Mode support, but the iPhone XR gets depth information through hardware. That's not possible on the iPhone SE because its older camera sensor doesn't support the feature.
Halide discovered that, unlike other iPhones, the iPhone SE can take a picture of another picture and still attempt to create a depth map. The app was even able to photograph an old slide film, adding depth effects to a 50-year-old photo.
The iPhone SE's Portrait Mode is somewhat limited in that it only works with people, a restriction that comes from the neural network powering the feature. When a Portrait Mode image without a person is captured, the effect fails in a variety of ways because the network can't produce an accurate estimated depth map.
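The person-only restriction can be pictured as a gate in front of the depth estimator: if no person is found in the frame, the network has nothing it was trained to reason about, so the effect is better refused than rendered badly. The sketch below is hypothetical; the function name, the gating logic, and the near/far placeholder values are assumptions for illustration, not the behavior of Apple's actual network.

```python
import numpy as np

def estimate_depth_with_person_gate(image, person_mask):
    """Hypothetical person-gated depth pipeline: a depth network trained
    only on people is unreliable without one, so decline in that case.
    `person_mask` is a boolean array marking pixels belonging to a person."""
    if not person_mask.any():
        return None  # no person detected: decline to render Portrait Mode
    # Placeholder "network": treat the person as near, background as far.
    depth = np.where(person_mask, 0.2, 0.9)
    return depth
```

A real pipeline would run a segmentation model to produce `person_mask` and a learned depth model instead of the near/far placeholder, but the gate itself mirrors the limitation Halide observed.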
The iPhone XR also limits Portrait Mode to people alone, and using Portrait Mode with other objects requires upgrading to one of Apple's more expensive phones.
According to Halide, depth maps on the iPhone SE (or any iPhone with Portrait Mode) can be viewed by opening the Halide app and shooting in Depth mode. Halide's full breakdown of the iPhone SE's Portrait Mode can be read on the Halide website.