
Researchers Develop 4D Camera with Extra-Wide Field of View That Could Improve Robotic Vision and Virtual Reality

Presented at CVPR this week, the camera, designed by Gordon Wetzstein, Donald Dansereau and colleagues at Stanford University together with collaborators at the University of California San Diego, is the first single-lens, wide-field-of-view light field camera intended to improve the vision of robots.


Assistant Prof. Gordon Wetzstein and postdoctoral scholar Donald Dansereau with a prototype of the monocentric camera that captured the first single-lens panoramic light fields. Image credit: L.A. Cicero

Currently, the cameras used by robots are not especially effective. They gather information in strictly two dimensions, so a robot must view an environment from multiple perspectives before it can understand the materials of objects and the movement around it, which is far from ideal for driverless cars or drones. The newly designed camera can capture the same information in a single, clear 4D image.

Dansereau compares the old technology and the new to a peephole and a window. “A 2D photo is like a peephole because you can’t move your head around to gain more information about depth, translucency or light scattering. Looking through a window, you can move and, as a result, identify features like shape, transparency, and shininess,” he said.


The camera technology is based on research done 20 years ago by Pat Hanrahan and Marc Levoy, both professors at Stanford, into light field photography, a type of photography that captures additional light information. Where a typical 2D camera takes an image focused on a single plane, light field photography allows a camera to capture a 4D image, one that also records information such as the direction and distance of the light reaching the lens. With this additional information, users can refocus the picture anywhere in the camera’s field of view, up to 140 degrees with the new camera, after it has been taken.
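The refocus-after-capture ability comes from having both spatial and angular samples of the incoming light. The sketch below illustrates the standard “shift-and-sum” refocusing idea on a 4D light field; it is a minimal, illustrative version only, and the array layout, function name and use of SciPy are assumptions rather than the researchers’ actual implementation.

```python
import numpy as np
from scipy.ndimage import shift

def refocus(light_field, alpha):
    """Synthetic refocusing of a 4D light field by shift-and-sum.

    light_field: array of shape (U, V, S, T) -- angular (U, V) by spatial (S, T) samples.
    alpha: refocus parameter; 0 keeps the captured focal plane, while positive or
           negative values move the synthetic focal plane nearer or farther.
    """
    U, V, S, T = light_field.shape
    u0, v0 = (U - 1) / 2.0, (V - 1) / 2.0   # centre of the synthetic aperture
    out = np.zeros((S, T), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Shift each sub-aperture view in proportion to its offset from the
            # aperture centre, then accumulate; averaging focuses the chosen plane.
            dy, dx = alpha * (u - u0), alpha * (v - v0)
            out += shift(light_field[u, v], (dy, dx), order=1, mode='nearest')
    return out / (U * V)
```

Sweeping alpha over a range of values produces a focal stack from a single exposure, which is exactly the kind of after-the-fact control a conventional 2D photo cannot offer.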

Dansereau and Wetzstein hope that robots equipped with their new camera will be able to navigate through rain and other vision obstacles. “We want to consider what would be the right camera for a robot that drives or delivers packages by air,” said Dansereau.

Figure: 138-degree light field (LF) panorama and a depth estimate based on a standard local 4D gradient method, shown as 2D slices of larger 72-megapixel (15 × 15 × 1600 × 200) 4D light fields. Source: Stanford Computational Imaging.
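The “local 4D gradient method” mentioned in the caption exploits the fact that a scene point traces a line through the light field whose slope encodes its depth. The toy version below estimates that slope as the ratio of angular to spatial derivatives at the central view; variable names, the single-axis simplification and the sign convention (which depends on how the light field is parameterised) are assumptions, not the authors’ code.

```python
import numpy as np

def gradient_depth(light_field, eps=1e-6):
    """Local disparity estimate from 4D light-field gradients.

    light_field: array of shape (U, V, S, T) -- angular (U, V) by spatial (S, T) samples.
    Returns a per-pixel disparity map of shape (S, T) for the central view.
    """
    U, V, S, T = light_field.shape
    u0, v0 = U // 2, V // 2
    # Derivatives along one angular axis (u) and the matching spatial axis (s),
    # sampled at the central sub-aperture view.
    L_u = np.gradient(light_field, axis=0)[u0, v0]
    L_s = np.gradient(light_field, axis=2)[u0, v0]
    # Slope of the epipolar line ~ disparity; guard against near-zero spatial gradients.
    safe_L_s = np.where(np.abs(L_s) < eps, eps, L_s)
    return -L_u / safe_L_s
```

In practice such per-pixel estimates are noisy in textureless regions and are usually smoothed or fused across both angular axes, but the slope-from-gradients idea is the core of this class of depth estimators.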

