If you’ve ever worked with tiny parts during a home improvement project or on the job, you know the feeling of wishing you could magnify the whole thing so it would be easier to handle and understand. Putting a replacement screw into a pair of glasses would be as easy as connecting some Tinkertoys. Some scientists are now doing this in virtual reality labs, and new software is making the experience more accurate than ever before.

Scientists at the National Institute of Standards and Technology (NIST) have developed software that improves the accuracy of the tracking devices in its immersive (virtual) research environment by at least 700%. What’s more, scientists in other immersive environments can use the software with only slight modifications to suit their unique laboratories. This advance is a step forward in transforming immersive technology, which has traditionally been a qualitative tool, into a scientific instrument with which precision measurements can be made.

Immersive environments such as NIST’s typically consist of two or more 8 x 8-foot walls and a floor onto which 3-D images, ranging from larger-than-life bodies to actual-size buildings, are displayed. Researchers wear 3-D glasses and hold a wand, each of which is tracked by location. Using these devices, a researcher can walk around and interact with the virtual world through the underlying graphics system.

While these small virtual reality laboratories have been around for more than a decade, they’ve mainly been used to let a scientist get inside a project and develop a feel for the object of study, says NIST mathematician John Hagedorn. Researchers can, for example, walk the hallways of a newly designed building before it is constructed to ensure the proportions are correct, or inspect microscopic structures.

The visuals in immersive environments are sometimes inaccurate because of an inherent problem with the electromagnetic transmitters and receivers used to track where the user is in the space. Ferrous metals, such as rebar in the walls, other metal in the room, or metal walls, throw off the communication between the stationary transmitters and the small receivers attached to the tracked devices. These distortions are especially obvious where an image with straight lines or edges meets the 90° angles at which the environment’s walls and floor join. The distortions interfere with the “reality” aspect and limit the immersive environment’s value as a measurement tool.

To improve the image’s accuracy, Hagedorn and his colleagues concentrated on the inaccuracy of the tracking devices. They knew there was a difference between where a tracking device said it was and where it really was. The researchers mapped two sets of data points: where they knew the sensors actually were and where the computer said they were. Using this data, they developed software that transforms the reported positions of the sensors into their actual positions. Average location errors were reduced by a factor of 22, and average orientation errors by a factor of 7.5.
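The article does not describe the team's actual algorithm, but the core idea of correcting a reported position using a table of measured (reported, actual) pairs can be sketched with a simple interpolation scheme. The function name, the inverse-distance weighting, and the sample calibration data below are all illustrative assumptions, not NIST's method:

```python
# Hypothetical sketch: correct a tracker's reported position using
# calibration pairs of (reported_point, actual_point), gathered by
# placing the sensor at known locations. The correction interpolates
# the measured error vectors with inverse-distance weighting.

def inverse_distance_correction(reported, calibration, power=2):
    """Estimate the true 3-D position for a reported tracker position.

    reported: (x, y, z) tuple from the tracking hardware.
    calibration: list of (reported_point, actual_point) pairs,
    each a 3-tuple of floats.
    """
    weights = []
    errors = []
    for rep, act in calibration:
        d2 = sum((r - q) ** 2 for r, q in zip(rep, reported))
        if d2 == 0.0:
            # Exactly at a calibration point: return the known position.
            return act
        weights.append(1.0 / d2 ** (power / 2))
        errors.append(tuple(a - r for a, r in zip(act, rep)))
    total = sum(weights)
    # Weighted average of the error vectors, applied as a correction.
    correction = tuple(
        sum(w * e[i] for w, e in zip(weights, errors)) / total
        for i in range(3)
    )
    return tuple(p + c for p, c in zip(reported, correction))
```

In practice a dense calibration grid and a smoother fit (e.g. tricubic interpolation) would be used, but the structure is the same: measure the distortion field once, then invert it at run time for every tracked device.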

“This improvement in motion tracking has furthered our goal of turning the immersive environment from a qualitative tool into a quantitative one — a sort of virtual laboratory,” Hagedorn explains. The first test with the new software was measuring a lattice structure with elements of about 2 to 3 mm in size, designed to grow artificial skin replacements or bone. A 3-D image of the structure was constructed using data obtained from a high-resolution microscope. NIST scientists interactively measured the diameters of the fibers and the spacing between the layers of fiber using the virtual lab. These precision measurements enabled the researchers to determine that the manufactured material substantially deviated from the design specification. On the other hand, additional measurements in the immersive environment showed that the angles between fibers in the manufactured material closely matched the design. For more information, visit www.nist.gov.