Soldier wearing virtual reality glasses. (Image credit: Anolkil/Dreamstime)

Q&A: Design Engineers Can Change the Game with Extended Reality

July 1, 2022
Extended reality (XR) platforms lighten the load with “what-if” analyses at the design stage, particularly for the aerospace and defense industries.

If Silicon Valley’s metaverse overlords get it right, the day will surely come when industries scramble to put together their “virtual world” strategies.

Engineers are already at the forefront of harnessing nearly perfected versions of extended reality (XR) platforms to visualize their conceptual designs and to apply new technologies to test manufacturing processes. Yet the mix of augmented reality (AR), virtual reality (VR), artificial intelligence (AI) and video game graphics, perpetually merging to create augmented worlds that intersect with the physical one, makes it hard to keep pace.

“As with any new industry, terms are coined,” said Dijam Panigrahi, COO and co-founder of GridRaster Inc., a provider of cloud-based platforms that works with manufacturers in industrial environments to scale AR/VR solutions.

“Virtual reality is when you are fully immersed in the virtual world and your real world is blocked out,” Panigrahi explained. “Augmented reality is about augmenting some of the data points, like the work instructions. Pokémon GO is a pretty easy example of AR. Mixed reality is when there is an interaction between the virtual and physical world…Mixed reality and extended reality are nothing but umbrella terms for it all.”


As elusive as the terms may seem, and as costly as the technology is to develop and implement, companies like GridRaster are poised to solve these issues and bring more of the technology online in the years ahead.

Panigrahi can attest to the benefits, functionality and effective use in operational applications in the aerospace and defense industries. GridRaster is developing an AR toolset prototype for the U.S. Air Force to improve aircraft wiring maintenance on the USAF’s fleet of CV-22 Osprey aircraft. The USAF’s CV-22 nacelle wiring is responsible for about 60% of the overall maintenance effort. The AR tool will enable maintainers to troubleshoot, repair and train in the operational environment.

In the Q&A that follows, Machine Design asked Panigrahi about trends in immersive technology and the role XR plays in the design and production of industrial operations. The conversation has been condensed and edited for clarity.

Machine Design: The relevance of extended reality in design and engineering, especially when we talk about photorealism and mixed reality simulations, is gaining momentum. There seems to be an abundance of makers and adopters for both creation and work. What do you see happening in this market space right now?

Dijam Panigrahi: One of the key things for mixed reality—the new term for everything put together is “metaverse”—is the capability it brings. We already had digital twins and CAD models from the visualization point of view, and we always had the PLM systems, which were part of the manufacturing setup. Now, with the advent of the cloud and the virtualization of the GPUs (graphics processing units) and advances in headsets and sensor technology, it places us in a position where we have a device with which we can interact with the real world very seamlessly.

Digital twins are being created—it’s like creating a soft copy of your physical world. You’re taking that soft copy of the physical world and applying all the software techniques to try out variations or “what-if” analysis. You’re analyzing the different data and doing it iteratively to learn from the different scenarios, then taking those learnings and applying them in the real world to address issues you may face down the road.

You can now take care of issues and what-if scenarios at the design stage itself. You don’t have to wait for the operational and aftersales environment; you can simulate all of that during design. For companies, particularly in aerospace, we have seen that, if you trace it back, seven out of 10 issues could have been captured and addressed at the design phase. This has been huge for the overall ecosystem.

MD: I want to dig a little bit deeper into defining how you actually do that in the design phase. But first, who does GridRaster work with and how does that relationship work?

DP: We are working with two top contractors in the aerospace and defense industry. We have been working with the Department of Defense, the U.S. Air Force and multiple entities within the Air Force. On the simulation side, we’ve worked on the USAF CV-22 aircraft maintenance. On the telecom side, we are working with some of the large telcos and cable operators. What we’re building is going to be infrastructurally important, so we have the end-user plus the enablers, such as the cloud providers and the telecom players.

MD: Let’s get into the technology. Can you talk about how a physical mock-up, let’s say a CAD model of a component of a car, is brought to life through mixed reality in the head-mounted display?

DP: The CAD models are a visual representation, but when you talk about the digital twin, you’re also mapping out all the physics behavior. Let’s suppose that there is a nut. If I rotate it, it will rotate in a certain way. All those behavioral aspects are also part of the digital twin, which means that your object or your environment, based on your interaction, will behave the way it would if you did those things in a physical world.

All of the digital twin content is complex and heavy. Today, if you’re trying to put those things on a standalone headset such as a HoloLens or Oculus Quest, you go through this painful cycle of optimizing things. On a headset there is only so much compute power that is available to run all of the data.

But you can take the data needed for digital twins—which is integral to making these realistic, immersive experiences possible—and put it in the cloud, run it and stream it to different devices. Then, based on your interaction with the sensors that capture all the interaction and input from the devices, you can simulate the environment and all of the interaction. You can visually see that on a HoloLens or in a VR headset like Oculus Quest, depending on what experience you’re trying to enable. That’s broadly how it is brought to life today.

MD: How do you achieve that precise overlay over the 3D model or the digital twin? And how does this help industrial design?

DP: This overlay is done both ways. You can put it on the headset. Microsoft HoloLens, for example, can track the environment and detect surfaces, using computer vision to identify objects. Based on where you want to align that, it can facilitate it.

The challenge with the standard, standalone headset is it can only do it to a certain accuracy, or track certain kinds of shapes. It needs an ideal condition to perform. By bringing it into the cloud for high-precision alignment, you can create a very fine mesh of the whole 3D world you’re seeing. Then you can identify all the individual objects and figure out which object is of interest to you.

For example, if I have the entire car, and I’m trying to overlay only the door on top of that car, I’m able to isolate that door and create that fine mesh from the point cloud containing all the information, and I’ll be able to identify each structure in the physical world. I’m able to align the corresponding digital twin or the CAD model that I have, because I know all the anchor points, and now I can align it perfectly. That’s what we do at our end.
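GridRaster’s actual registration pipeline is proprietary, but the anchor-point alignment Panigrahi describes can be illustrated with a standard technique: the Kabsch algorithm, which finds the rigid rotation and translation that best map known anchor points on the CAD model onto the same features detected in the scanned point cloud. The anchor coordinates below are made-up values for illustration only.

```python
import numpy as np

def align_rigid(source, target):
    """Kabsch algorithm: find rotation R and translation t that best
    map `source` anchor points onto `target` in the least-squares sense.
    Both arrays are (N, 3), with corresponding rows."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Hypothetical anchor points on a CAD door model (arbitrary units).
cad_anchors = np.array([[0., 0., 0.], [1., 0., 0.],
                        [0., 1., 0.], [0., 0., 1.]])

# Simulated scan: the same anchors rotated 90 deg about z and shifted,
# standing in for features detected in the physical car's point cloud.
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
scan_anchors = cad_anchors @ Rz.T + np.array([2., 1., 0.])

R, t = align_rigid(cad_anchors, scan_anchors)
aligned = cad_anchors @ R.T + t
print(np.allclose(aligned, scan_anchors))  # True: CAD model now overlays the scan
```

Once R and t are known, the entire CAD model, not just the anchors, can be transformed into the scan’s coordinate frame for the overlay.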

Here’s why it is critical. If, for example, you want to enable a use case such as auto detection of defects or identifying anomalies, you already have the digital twin. It captures the ideal state of how things should be in the physical world. Let’s suppose that in an aircraft there’s a dent or some deviation. If you are able to precisely overlay that digital CAD model or the digital twin in the physical world, you can do a diff between both those scenes.

That’s where the software technique of analyzing all this comes into the picture. It is only possible because we are now able to create that soft copy of the physical world and do a diff between the ideal state and what we are currently seeing.
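The interview doesn’t detail how the “diff” is computed; one common approach, sketched below under that assumption, is to measure each scanned point’s distance to its nearest neighbor in the aligned digital twin and flag points that deviate beyond a tolerance. The flat patch and the 5-mm dent are synthetic data for illustration.

```python
import numpy as np

def flag_deviations(twin_points, scan_points, tol=0.001):
    """For each scanned point, compute the distance to the nearest
    digital-twin point; points farther than `tol` (same units as the
    clouds, here meters, so tol is 1 mm) are flagged as deviations.
    Brute-force nearest neighbor: fine for small clouds."""
    diffs = scan_points[:, None, :] - twin_points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1)).min(axis=1)
    return dists > tol, dists

# Ideal surface from the twin: a flat 1 cm x 1 cm patch of 400 points.
xs, ys = np.meshgrid(np.linspace(0, 0.01, 20), np.linspace(0, 0.01, 20))
twin = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)

# Simulated scan: same patch, but one point pushed 5 mm inward (a dent).
scan = twin.copy()
scan[210, 2] -= 0.005

flags, dists = flag_deviations(twin, scan, tol=0.001)
print(flags.sum())  # 1 — only the dented point exceeds the 1 mm tolerance
```

A production system would use a spatial index (k-d tree or voxel grid) instead of the brute-force distance matrix, but the principle—compare the scan against the twin’s ideal state point by point—is the same.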

One of the use cases that we are also working on is how to do the installation of the nacelle wiring harness of the USAF’s CV-22 aircraft. Wiring harnesses occupy a very congested space. If you don’t have millimeter accuracy to overlay those harnesses right on top of the physical harnesses, then you won’t be able to put the right instruction in place for somebody to follow. So, the precision for a lot of these use cases is extremely important, and that’s what we try to solve with our platform.

MD: What are some of the shortcomings and challenges that you have now, and what are you working on going forward?

DP: From a technology point of view, there are a lot of dependencies. One of the things that we run into is that you may not have the CAD model, so we have to scan that whole environment and that consumes time. Sometimes it affects the accuracy. But we know that going forward, invariably everything will be designed in the 3D world, so you will have the CAD data for everything. But today, these are some of the challenges.

Another challenge relates to doing the rendering on the headset, in terms of getting the depth information and the color information. We are often dependent on what the camera sees. As the precision of the camera and the realism from that camera improve, our performance improves.

Those are the primary things—the dependency on the headset and the available data or content. We have to bridge that, and we are doing it through our technology.

MD: The development and access to the XR technology has been fairly expensive, both recreationally as well as for commercial use. Do you see the price and the cost coming down?

DP: Absolutely, I think that’s going to happen. Invariably, any technology that has picked up makes sense in terms of value per price. There is a reason why we went after the aerospace, defense and automotive industries. In aerospace and defense, think of getting an aircraft repaired: if I can improve that process by just 30%, the value is clear. Every hour an aircraft is grounded, you are looking at losing hundreds of thousands of dollars.

The headset price may be $5,000. Trying to set up this whole system may cost another $10,000. But the return that you’re getting for that $10K to $15K is tenfold. For those kinds of use cases today, the price points don’t matter.

As you go into medical and education applications, where you’re looking at people adopting en masse, the value per price would begin to matter. But the good thing is the prices are coming down right now. When we started, we had the Oculus DK1. To get this whole setup up and running, you needed $3,000 worth of equipment. To get it running now, you can get an Oculus Quest for $299 and you’ll be up and running.

You’re already seeing one-tenth the cost and it will keep on decreasing. Most likely, it will go the mobile way, where you have telecom players subsidize this and you get the headset and pay over a time period, or as part of your monthly rental. It’s just a matter of time.

About the Author

Rehana Begg | Editor-in-Chief, Machine Design

As Machine Design’s content lead, Rehana Begg is tasked with elevating the voice of the design and multi-disciplinary engineer in the face of digital transformation and engineering innovation. Begg has more than 24 years of editorial experience and has spent the past decade in the trenches of industrial manufacturing, focusing on new technologies, manufacturing innovation and business. Her B2B career has taken her from corporate boardrooms to plant floors and underground mining stopes, covering everything from automation & IIoT, robotics, mechanical design and additive manufacturing to plant operations, maintenance, reliability and continuous improvement. Begg holds an MBA, a Master of Journalism degree, and a BA (Hons.) in Political Science. She is committed to lifelong learning and feeds her passion for innovation in publishing, transparent science and clear communication by attending relevant conferences and seminars/workshops. 


