
Making CAD Immersive

Here’s technology that helps move CAD designs off the flat screen and into the third dimension.


No matter how a 3D model is sliced in computer-aided design (CAD), it appears flat, whether on a computer screen, mobile device display, humongous flat-screen TV, or projection on a wall. Immersive virtual reality (VR) software changes that by making those models “pop out” of the flat world so they appear as they would in the physical world.

CAD packages often include tools for generating animated simulations of the 3D geometry being designed. These tools generally have simple functionality, best suited for simple presentations. VR development software creates more complex simulations and has more-advanced features for modifying animations. The software also optimizes the running of those animations, resulting in more fluid simulations of reality. Last, explains Mark Cheben, marketing manager for EON Reality (eonreality.com), VR development software typically works in real time. True, CAx and many visualization products “can render animation, but only if it’s pre-rendered and the product includes a simulation engine rather than a game engine shoehorned into simulation.”

In creating VR environments, solid geometry files created in CAD are first imported into VR software. This software doesn’t interpret the geometry data directly. “CAD data is mathematical in nature and needs to be scaled for real-time rendering,” explains Cheben. “The data needs to be interpreted into bite-size polygonal data so it can be [displayed] without chop or lag.” Once that’s done, “developers animate the geometry, add different behaviors, and surround it with a scene.” (Backgrounds and scenes can come from CAx or visualization software.)
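To get a feel for what that conversion step does, here is a minimal sketch in Python (ours, not EON’s import pipeline): a surface defined mathematically, as CAD data is, gets sampled on a grid and turned into the kind of polygonal data a real-time renderer can draw. The quarter-cylinder surface and the grid resolution are illustrative assumptions.

```python
# Minimal sketch (not EON's importer): converting a mathematical surface
# definition into "bite-size" polygonal data by sampling it on a u,v grid.
import numpy as np

def tessellate(surface, nu=32, nv=32):
    """Sample a parametric surface(u, v) -> (x, y, z) into a triangle mesh."""
    u = np.linspace(0.0, 1.0, nu)
    v = np.linspace(0.0, 1.0, nv)
    # Vertex positions: one 3D point per (u, v) sample.
    verts = np.array([surface(ui, vi) for vi in v for ui in u])
    tris = []
    for j in range(nv - 1):
        for i in range(nu - 1):
            a = j * nu + i                     # indices of one grid quad...
            b, c, d = a + 1, a + nu, a + nu + 1
            tris += [(a, b, d), (a, d, c)]     # ...split into two triangles
    return verts, np.array(tris)

# Example: a quarter cylinder, defined mathematically, becomes ~2,000 triangles.
cyl = lambda u, v: (np.cos(u * np.pi / 2), np.sin(u * np.pi / 2), v)
verts, tris = tessellate(cyl)
print(len(verts), "vertices,", len(tris), "triangles")
```

The finer the sampling grid, the more closely the polygons follow the mathematics, but the more work the renderer does per frame; that trade-off is what Cheben’s “bite-size” remark is getting at.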

EON’s software creates left/right images of the geometry, along with the proper parallax and front/back (hidden) object relationships for 3D or immersive views, or both. (The 3D and immersive effects require two images—one for a person’s right eye and one for the person’s left eye.) The “knowledge” of relationships between individual parts in an assembly depends on how the source geometry data was compiled and imported. “If the CAD or other modeling program has the proper relationship, they will stay inside EON. If those relationships are not properly formatted, additional work will be needed to create them,” says Cheben.
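The left/right-image idea can be sketched with a toy pinhole projection (again ours, not EON’s renderer; the eye separation and camera setup are assumptions): each eye sees the scene from a slightly different position, and the resulting horizontal offset, the parallax, is what makes near objects read as near.

```python
# Minimal sketch of the stereo-pair idea: render the scene twice from two
# eye positions separated by the interpupillary distance (IPD). The numbers
# are illustrative; a head-tracked system derives the cameras from tracking data.
import numpy as np

IPD = 0.064          # typical eye separation, metres (assumption)
FOCAL = 1.0          # pinhole focal length used for this toy projection

def project(point, eye_x):
    """Project a 3D point for one eye placed at (eye_x, 0, 0), looking down +z."""
    x, y, z = point
    return np.array([FOCAL * (x - eye_x) / z, FOCAL * y / z])

def stereo_pair(point):
    """Return the left- and right-eye image coordinates of one point."""
    left = project(point, -IPD / 2)
    right = project(point, +IPD / 2)
    return left, right

# Nearby objects show large parallax (disparity); distant ones almost none.
for depth in (0.5, 2.0, 10.0):
    l, r = stereo_pair((0.0, 0.0, depth))
    print(f"depth {depth:>4} m -> disparity {l[0] - r[0]:+.4f}")
```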

Developers create interactive simulations in EON’s authoring software by putting nodes together. Nodes are snippets of visual code; they act as basic building blocks. The nodes interact with each other to create the simulation. Each node has a specific function and behavior. For instance, there are sensor nodes (e.g., mouse and keyboard), visual nodes (e.g., shape, material, and textures), motion models (e.g., walk and orbit), animation nodes (e.g., key frame, rotate, and skeleton), and base nodes (e.g., anti-aliasing, shadow quality, and frame size). The Walk node, for example, lets users navigate through a scene in the VR environment by dragging a mouse, moving a wand, or moving a head-mounted display (HMD) in the desired direction. The Walk node translates these movements into coordinates, which are then applied to the VR scene. Incidentally, the Walk node “doesn’t necessarily have to move the camera frame; it can move any other frame as well,” explains Cheben. Another common node is ClickSensor, which triggers events in the simulation when a user clicks with the mouse on a virtual object (defined by two or three coordinates).
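As a rough illustration of what a node like Walk does (the class below is a hypothetical stand-in, not EON’s API), drag input is translated into coordinates and applied to whatever frame the node is attached to:

```python
# Illustrative sketch (not EON's Walk node): turning drag input into a
# translation applied to whichever frame the node is attached to.
import math

class WalkLikeNode:
    """Converts 2D drag deltas into forward/strafe motion of a target frame."""
    def __init__(self, frame, speed=0.01):
        self.frame = frame            # any frame, not necessarily the camera
        self.speed = speed

    def on_drag(self, dx, dy):
        heading = self.frame["heading"]           # radians, rotation about 'up'
        # Vertical drag walks along the heading; horizontal drag strafes.
        self.frame["x"] += self.speed * (dy * math.sin(heading) + dx * math.cos(heading))
        self.frame["z"] += self.speed * (dy * math.cos(heading) - dx * math.sin(heading))

camera = {"x": 0.0, "z": 0.0, "heading": 0.0}
walk = WalkLikeNode(camera)
walk.on_drag(dx=0, dy=120)     # a vertical drag walks the frame forward
print(camera)                  # {'x': 0.0, 'z': 1.2, 'heading': 0.0}
```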

Each node contains fields, which receive, store, and send values. When a value is updated, the node triggers one of its behaviors. When a node transmits a value to another node, an event happens. “The communication between the nodes in a simulation creates a cause-and-effect mechanism,” explains Cheben. “These events drive the EON simulation” (i.e., the visual result).
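In code terms, that mechanism might look something like the following sketch (the class is hypothetical, not EON’s implementation): fields store values, updating a field triggers the node’s behavior, and a route forwards the new value to another node, which is the event.

```python
# Hypothetical sketch of the field/event mechanism described above:
# a node stores values in fields, reacts when a field is updated, and
# can pass the value along a route to another node (an event).
class Node:
    def __init__(self, name):
        self.name = name
        self.fields = {}          # fields receive, store, and send values
        self.routes = []          # (my_field, target_node, target_field)

    def set_field(self, field, value):
        self.fields[field] = value
        self.on_field_changed(field, value)       # updating a value triggers behavior
        for src, node, dst in self.routes:        # ...and may send events onward
            if src == field:
                node.set_field(dst, value)

    def on_field_changed(self, field, value):
        print(f"{self.name}: {field} <- {value!r}")
```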

People “program” a VR simulation by choosing combinations of nodes and connecting them (“making routes”). For simple simulations, no programming or modeling experience is required; programming is just a matter of dragging and dropping and clicking through popup lists of nodes.
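Continuing the hypothetical sketch above, “making a route” amounts to wiring one node’s output field to another node’s input field; the ClickSensor and Rotate names below are illustrative:

```python
# Continuing the sketch above: a route wires one node's output field
# to another node's input field. Node names are illustrative.
click = Node("ClickSensor")
rotate = Node("Rotate")

# Route: when ClickSensor's OnClick field fires, drive Rotate's Active field.
click.routes.append(("OnClick", rotate, "Active"))

click.set_field("OnClick", True)
# ClickSensor: OnClick <- True
# Rotate: Active <- True
```

Running this prints one line per node as the value propagates, the same cause-and-effect chain Cheben describes, only expressed in code rather than by dragging and dropping.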

Some product details

Both EON Studio and EON Professional are authoring packages for creating, rendering, and displaying interactive 3D simulations and VR environments. The software can generate VR simulations from “phone to dome”; that is, the same software can be used to create everything from augmented reality on mobile devices and stereoscopic simulations on workstations to VR displays in HMDs and, at the top end, VR in immersive theaters.

Studio is the entry-level product (about $5,000 per seat). It supports 18 import formats, including ACIS, Autodesk 3D Studio and AutoCAD, IGES, Pro/Engineer, SolidWorks, STL, the United States Geological Survey format (many 3D landscape files are available in it), and VRML 2.0. Studio’s real-time rendering features include proprietary algorithms for anti-aliasing, transparency, environment mapping, and shading. Studio can be integrated with popular web browsers and software programs such as Microsoft PowerPoint and Visual Basic.

While Studio lets users see 3D images, Professional provides the immersive element by adding such capabilities as head tracking and wand interaction. EON Professional (about $30,000 per seat) includes the features of EON Studio and more. Professional supports more than 100 2D/3D file formats, including native support for Alias, CATIA, NX, Maya, and MicroStation.

It also has several features to create more-realistic simulations. For example, EON Visual Effects creates high-quality, flexible shading by using the Cg shader technologies on the latest generation of graphics processing units, and with EON Physics, people can incorporate kinematics, mechanisms, collisions, gravity, friction, and dynamics within the VR simulations.

With Professional, says Cheben, “When you put on head-tracked glasses in an immersive environment, the scene is rendered specifically for your eyes. By moving your head in an immersive, but natural scene, you’re able to look under a car or, inside a car, you’re able to see the different knobs on the dash and all sorts of other parts.”

It’s like the real car, virtually speaking.
