Within the next few years, virtual reality is going to be a huge part of our lives. Not only will it affect the way we consume media like movies, television and video games, but it will be integrated into our social media experiences, shopping and business meetings, and it will even let us take virtual tours of homes, museums and landmarks. But one of the primary limitations of virtual reality technology is that it entirely cuts users off from the real world, limiting their ability to interact with their environment while wearing a VR headset.
Integrating the virtual world with the real world is the obvious next step, and even before virtual reality has fully settled in, new technologies like augmented reality and mixed reality are already being explored. Augmented reality lets users wearing a device, similar to the much-maligned Google Glass, view the real world while virtual objects, visible only to the wearer, are overlaid on their field of vision. These virtual objects can be anchored to specific points in the world using GPS and viewed in 3D space in real time. Mixed reality, or hybrid reality, merges these virtual elements with physical ones so that the two can coexist in real space and interact in real time.
A group of researchers at the Tokyo Institute of Technology, often called Tokyo Tech, led by French postdoctoral researcher Pierre-Antoine Arrighi, is exploring tools that allow users to manipulate real-world objects within a virtual space. Arrighi is also the co-founder of Aniwaa.com, the IMDb of 3D printing, and his extensive 3D printing experience was key in designing the Mixed Reality Tool prototype. Developed in Tokyo Tech’s cm.Design.Lab with professor Céline Mougenot, the tool includes a virtual room with virtual furnishings that are viewable using an Oculus Rift headset. Meanwhile, the virtual furnishings have corresponding real-world analogs called Tangible User Interfaces (TUIs) that, when manipulated by the user, move the corresponding virtual object in real time within the virtual environment.
The virtual environment created by the researchers is a home interior simulation design tool that allows the user to design a room layout by rearranging furnishings. In a commercial setting, the virtual furnishings could represent an entire catalog of products available from a furniture seller, which a potential customer could cycle through and view in 3D space. The environment can be created by simply inputting the user’s real-world room measurements. The small TUIs, 3D printed on a Zortrax M200, are tracked using an augmented reality marker system that lets the user move the virtual furnishings throughout the digital environment, trying out new room layouts and testing combinations of different products.
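The core of such a system is a coordinate mapping: the tracked position of a physical marker is scaled into the virtual room built from the user's measurements. The paper does not publish its implementation, so the sketch below is a hypothetical minimal version of that mapping (all names, the tabletop tracking area, and the linear scaling are assumptions, not the researchers' actual code):

```python
from dataclasses import dataclass

@dataclass
class Room:
    """Virtual room dimensions, entered by the user from real measurements."""
    width_m: float
    depth_m: float

def tui_to_virtual(marker_x: float, marker_y: float,
                   table_w: float, table_d: float,
                   room: Room) -> tuple[float, float]:
    """Linearly scale a TUI marker position on the tracking table (metres)
    into the corresponding position in the virtual room."""
    return (marker_x / table_w * room.width_m,
            marker_y / table_d * room.depth_m)

# A marker at the centre of a 0.5 m square tracking table maps to the
# centre of a 4 m x 3 m virtual room:
print(tui_to_virtual(0.25, 0.25, 0.5, 0.5, Room(4.0, 3.0)))  # (2.0, 1.5)
```

In a real implementation the marker pose would come from a camera-based AR tracking library each frame, and the result would drive the furniture model's transform in the rendered scene.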
The virtual world was generated using the Oculus Rift Development Kit 2, a vast improvement over the first development kit. It includes low-persistence OLED displays that help eliminate motion blur and rendering judder, two of the largest contributors to simulator sickness in VR devices. The second-generation kit also includes more precise, low-latency positional tracking that allows small TUI movements to register more accurately within the VR space. While this is just a very early prototype, it is some of the earliest research being conducted on technology that allows interaction across both the real world and the virtual world.
You can see some video of the prototype being tested here:
It is easy to imagine the technology being used in environments like a gym for a more relaxing workout, so wearers can exercise in a location of their choosing without having their view tainted by fellow gym-goers. It also lends itself very well to collaborative design projects, where two users, even on opposite sides of the planet, can each use their own TUIs to work together on a design. And of course the same technology can be paired with immersive touch technology that simulates human touch, making physical virtual interactions between two or more people possible. Discuss this new tool in the 3D Printed Tangible User Interface forum over at 3DPB.com.