Researchers create tool to combine the perspectives of telephoto and wide-angle lenses


Scientists at the University of California, Santa Barbara and researchers at Nvidia have developed a new tool that allows photographers to recompose a photo after capture, combining a wide-angle perspective with that of a telephoto lens.

The tool, called Computational Zoom, lets photographers compose images in which the sense of depth, the relative size of the objects in the scene, and the perspective from which each object appears to be photographed can all be adjusted.

Computational Zoom addresses a familiar trade-off: in a single wide-angle photo it is difficult to fit both the entire background and the foreground subjects into the frame without the distortion that short focal lengths introduce. The photographer can move closer to render a foreground subject large enough, but perspective and possible lens distortion then change the subject's apparent size relative to the background. A longer focal length used from a greater distance would avoid this, but part of the background would then fall outside the frame.
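The trade-off can be made concrete with the pinhole-camera approximation, under which an object's projected size is roughly focal length times object height divided by distance. The numbers below (focal lengths, distances, object sizes) are illustrative assumptions, not figures from the research:

```python
# Pinhole-camera approximation (an illustrative model, not the researchers' method):
# the projected image-plane size of an object of height H at distance d,
# shot with focal length f, is roughly f * H / d.

def projected_size_mm(focal_mm, height_m, distance_m):
    """Approximate image-plane size of an object under the pinhole model."""
    return focal_mm * height_m / distance_m

# A 1.8 m subject framed to the same size in two ways:
wide = projected_size_mm(24, 1.8, 1.5)    # 24 mm lens from 1.5 m
tele = projected_size_mm(120, 1.8, 7.5)   # 120 mm lens from 7.5 m
# Both give 28.8 mm: the subject fills the frame identically.

# A 10 m-tall backdrop 30 m behind the subject renders very differently:
bg_wide = projected_size_mm(24, 10, 31.5)   # ~7.6 mm: the whole backdrop fits
bg_tele = projected_size_mm(120, 10, 37.5)  # 32.0 mm: the backdrop overflows the frame
```

The subject is identical in both shots, but the background is more than four times larger in the telephoto version; Computational Zoom lets the photographer pick a point anywhere along that continuum after the fact.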

Computational Zoom handles this through a three-step process. First, the photographer takes several shots of the same scene, moving the camera progressively closer without changing the focal length. The tool then runs an algorithm over the captured image stack to estimate the camera's position and orientation for each shot, and a 3D reconstruction estimates depth information for the scene. Finally, all of this information feeds an interface that lets users combine elements of the scene from different perspectives.
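The three stages above can be sketched structurally as a small pipeline. Everything here is an illustrative stub with invented names and placeholder values, not the researchers' code:

```python
# Structural sketch of the three-stage pipeline described above.
# All function names and values are hypothetical stubs for illustration.

def estimate_camera_poses(image_stack):
    """Stage 1: estimate position and orientation for each shot in the stack."""
    # Stub: pretend the camera advanced 0.5 m between shots along one axis.
    return [{"position": (0.0, 0.0, 0.5 * i), "orientation": "forward"}
            for i, _ in enumerate(image_stack)]

def reconstruct_depth(image_stack, poses):
    """Stage 2: multi-view 3D reconstruction yielding depth info per image."""
    return [{"image": img, "pose": pose, "depth": "per-pixel depth map"}
            for img, pose in zip(image_stack, poses)]

def compose(depth_maps, foreground_from, background_from):
    """Stage 3: interactively mix perspectives from different shots."""
    return {"foreground": depth_maps[foreground_from],
            "background": depth_maps[background_from]}

# Usage: close-up perspective for the subject, wide framing for the backdrop.
shots = ["shot_0.jpg", "shot_1.jpg", "shot_2.jpg"]
poses = estimate_camera_poses(shots)
maps = reconstruct_depth(shots, poses)
result = compose(maps, foreground_from=2, background_from=0)
```

The key design point the sketch captures is that composition (stage 3) is decoupled from capture: once poses and depth exist, any mix of perspectives can be rendered without reshooting.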

The researchers hope that their system will eventually be integrated as a plugin into existing photo editing programs. They have published their research under the title Computational Zoom: A Framework for Post-Capture Image Composition; the paper was presented at SIGGRAPH 2017, the conference for computer graphics and interactive techniques.
