Unless you’ve been living under a social media rock, you’ve probably taken your fair share of selfies by now. Have you ever noticed that your face in selfies isn’t exactly how your face looks in real life? That’s because most selfies are taken at about arm’s length from the face, causing the facial features to become very distorted. Whichever part of the face is closest to the camera, usually the nose or forehead, becomes disproportionately large, while features further away become relatively tiny.
Facial Distortion: To Minimize or Not?
Although some people may desire the distortion effect to create the illusion of a slimmer face, most professional photographers go to great lengths to minimize it. The laws of physics dictate that the only way to avoid these distortions is to move the camera further from the face. In the real world, this would require either a long selfie stick or getting someone else to take the picture. It would also mean cropping the image significantly, as the wide-angle lenses found in phones would then leave the face occupying only a small part of the overall frame.
The Future is Here
…Until now, that is. A research team from Princeton University and Adobe has come up with a technology that can digitally adjust the perspective of a portrait after it has been taken. The result is the ability to simulate a variety of lenses, from wide angle to telephoto, used at a range of shooting distances from the subject. This means an app could present the user with a simple slider to adjust the level of facial distortion. Pretty remarkable!
The system works by mapping the 2D source image onto a 3D head model that is rendered at variable distances from a virtual camera. By estimating the camera distance in the original shot, the team can then calculate how to warp the 2D image to match how the features change as the virtual camera moves closer or further away. The future is here.
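To get a feel for why distance matters so much, here is a minimal sketch of the underlying geometry. It assumes a simple pinhole camera model, where a feature at depth Z is projected with scale proportional to 1/Z; the 5 cm nose offset and the two shooting distances are illustrative assumptions, not figures from the research.

```python
def magnification_ratio(camera_dist_m, nose_offset_m=0.05):
    """How much larger the nose appears relative to the ears.

    Under pinhole projection, a feature at depth Z is imaged with
    scale proportional to 1/Z. A nose sitting nose_offset_m closer
    to the camera than the ears is therefore magnified by the
    factor Z / (Z - nose_offset_m). (The 0.05 m offset is an
    illustrative guess, not a measured value.)
    """
    return camera_dist_m / (camera_dist_m - nose_offset_m)

selfie = magnification_ratio(0.5)    # arm's length, roughly 0.5 m
portrait = magnification_ratio(2.0)  # typical portrait distance, ~2 m

print(f"0.5 m: nose appears {100 * (selfie - 1):.1f}% larger than the ears")
print(f"2.0 m: nose appears {100 * (portrait - 1):.1f}% larger than the ears")
```

At arm's length the nose is exaggerated by roughly 11%, while at two metres the effect drops below 3%, which is why the same face can look so different in a selfie and a studio portrait. The researchers' warp effectively re-renders the image as if each feature's 1/Z scale factor came from a different camera distance.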