DARPA Develops Virtual Eye That Captures a Real Time Virtual Reality View Using Two Cameras


During a disaster, first responders benefit from one thing above all else: accurate information about the environment they are about to enter. Foreknowledge of a building's layout and the locations of impassable obstacles, fires, or chemical spills can be the difference between life and death for anyone trapped inside. Currently, first responders must rely on their own experience and observations, or on a drone sent in ahead of them relaying an unreliable 2D video feed. Neither option is ideal, and sadly many victims of a disaster may perish before they are discovered or the area is deemed safe to enter.

But a team at the Defense Advanced Research Projects Agency (DARPA) has developed technology that offers first responders the option of exploring a disaster area without putting themselves at risk. Virtual Eye is a software system that captures and transmits video feeds and converts them into a real-time 3D virtual reality experience. It is made possible by combining cutting-edge 3D imaging software, powerful mobile graphics processing units (GPUs), and the video feed from two cameras, any two cameras. This gives first responders, soldiers, firefighters, or anyone else the option of virtually walking through a real environment, such as a room, bunker, or any enclosed area, without needing to physically enter it.

A real-time 3D virtual reality feed generated by combining two 2D video feeds.

“The question for us is, can we do more with the information we have? Can we extract more information from the cameras we’re using today? Understanding what we see is critical to making the right decisions in the battlefield. We can create a 3D image by looking at the differences between two images, understanding the differences and fusing them together,” explained Trung Tran, the program manager leading Virtual Eye’s development at DARPA’s Microsystems technology office.

Users of Virtual Eye would be able to take note of the layout, visualize any hazards, identify optimal paths of entry, or potentially locate survivors, all completely risk free. Two drones or robots would be inserted into a questionable environment, each outfitted with a camera. The cameras would be strategically positioned at different points in the room with opposing viewpoints. The Virtual Eye software would then fuse both video feeds into a 3D view, extrapolating any missing data with its 3D imaging software so that the real-time virtual reality feed is complete.
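DARPA has not published Virtual Eye's algorithm, but the idea Tran describes, recovering 3D structure from the differences between two camera views, is classical stereo correspondence: for each pixel in one image, find its horizontal shift (disparity) in the other image, then convert disparity to depth by triangulation. The sketch below is a minimal, illustrative block-matching implementation in NumPy; the function name and parameters are hypothetical, and a production system would use GPU-accelerated matchers rather than this simple sum-of-absolute-differences search.

```python
import numpy as np

def disparity_map(left, right, max_disp=16, block=5):
    """Estimate per-pixel disparity between two rectified grayscale
    images. For each pixel in the left image, slide a small block
    horizontally across the right image and keep the shift with the
    lowest sum of absolute differences (SAD)."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1,
                         x - half:x + half + 1].astype(np.float32)
            best_cost, best_d = np.inf, 0
            # Only search shifts that keep the block inside the image.
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.float32)
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Depth then follows from triangulation with the camera geometry:
# depth = focal_length * baseline / disparity  (for disparity > 0),
# which is what turns the fused two-camera view into a 3D scene.
```

Nearer objects shift more between the two views, so large disparities map to small depths; holes where no good match exists are what a system like Virtual Eye would fill in with its 3D imaging software.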

Here is some video of the Virtual Eye system in action:

The Virtual Eye system works thanks to NVIDIA mobile Quadro and GeForce GTX GPUs that are small enough to be portable but powerful enough to generate the virtual reality view. The NVIDIA GPUs were chosen specifically because they have the muscle to accurately stitch the two video feeds together and extrapolate the 3D data in real time while still fitting inside a laptop. Currently the Virtual Eye system can only combine data from two cameras; however, Tran expects that to change soon. The DARPA team hopes to have a new demo version capable of combining up to five different camera feeds by next year.

NVIDIA mobile Quadro and GeForce GTX GPUs

While the system was created specifically for military, emergency, and battlefield applications, as with most technology developed by DARPA it has plenty of potential real-world applications as well. The technology could be used to broadcast sporting events or live performances in streaming 3D virtual reality with only a handful of cameras. It would also allow users to visit locations anywhere in the world, from museums to Mount Everest, without leaving their homes. Discuss this software over in the DARPA 3D Imaging Virtual Eye forum at 3DPB.com.

[Source: DARPA]
