WO2018156087A1 - Finite-element analysis augmented reality system and method - Google Patents

Finite-element analysis augmented reality system and method

Info

Publication number
WO2018156087A1
Authority
WO
WIPO (PCT)
Prior art keywords
model
fea
input device
virtual
user
Prior art date
Application number
PCT/SG2018/050091
Other languages
English (en)
Inventor
Jiming HUANG
Soh Khim Ong
Yeh Ching Andrew Nee
Original Assignee
National University Of Singapore
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University Of Singapore filed Critical National University Of Singapore
Publication of WO2018156087A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • G06F30/23Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00Details relating to CAD techniques
    • G06F2111/18Details relating to CAD techniques using virtual or augmented reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]

Definitions

  • the present invention relates broadly to a finite-element analysis augmented reality system and method.
  • FEA finite element analysis
  • VR virtual reality
  • the response time of interactive VR simulation systems for FEA visualization is important since it largely affects the user experience and perception.
  • the systems are expected to generate results of acceptable accuracy with minimum lag time.
  • Liverani et al. [1] proposed a VR system for FEA of shell structures. Using a stylus and gloves as input devices, the user can create and adjust the mesh, and specify boundary conditions. Real-time results can be obtained by changing loads on a small-scale model. For complex models, classical solvers are not able to achieve real-time solutions.
  • a number of studies used artificial neural networks (ANN) or approximation methods to achieve real-time interaction for the specific tasks.
  • ANN artificial neural networks
  • [2] used an ANN to generate real-time deformation of a tennis ball and racket during impact.
  • the users can play tennis in a VR environment, and the impact is felt via a haptic glove.
  • Connell and Tullberg [3] presented a framework to simulate a bridge in VR. The user can move the loads acting on the bridge, and FEA results are updated immediately using an approximate module.
  • Cheng and Tu [4] reported an ANN-based approach for real-time deformation of mechanical parts under forces. The user can make geometric adjustment by changing the feature parameters among the trained values. Although achieving fast speeds, these simulation approaches do not compute the exact results.
  • ANNs require training and may generate unreliable results for untrained inputs.
  • Rose et al. [7] developed an approach to edit the mesh by manipulating nodes directly. To avoid poor quality of elements after editing the mesh, relaxation and local restructuring procedures were performed.
  • the VR-based environment was constructed using a haptic device and an autostereoscopic display.
  • Graf and Stork [8] presented a simulation approach in VR. The user can drag geometric features directly to change the positions, and create cross-sections to access the interior FEA results. To obtain real-time results for moving loads, the inverse of the stiffness matrix was pre-computed and the results were calculated only for the visible elements.
  • AR augmented reality
  • Scientific visualization of measured or simulated datasets in the real scene is promising for various applications.
  • Superimposing MRI data on a patient will enable the surgeon to plan a surgery and provide navigation guidance during surgical procedures [9].
  • Visualizing the real-time data captured by sensors located on a bridge is helpful for monitoring the structural health of the bridge [10].
  • Superimposing simulated electromagnetic fields on the corresponding devices facilitates the teaching of electrodynamics [11].
  • Comparison of simulated results and actual measurements onsite becomes possible using AR, which is helpful for model validation and evaluation.
  • Ham and Golparvar-Fard [12] developed a system which visualizes measured and simulated spatio-thermal data onsite.
  • AR can be utilized to facilitate numerical simulations in both visualization and interaction.
  • AR interfaces enhance data exploration and user collaboration for simulation [15,16].
  • In ADRON, technical drawings are augmented with digital data, such as CAD models, FEA results and multimedia annotations.
  • Each client in the workspace network can navigate the model and create annotations on the augmented drawings.
  • the annotations are shared among all the users and finally lead to a design modification.
  • Issartel et al. [18] presented a portable interface for exploration of volumetric datasets. By using a tablet and a stylus, the user can explore datasets via natural operations, such as slicing, iso-surface picking, particle tracing, etc.
  • Valentini and Pezzuti [20] proposed a method for simulating elastic beams in AR.
  • Elastic beams are modeled as dynamic splines.
  • the user can control virtual beams to perform simulations.
  • the method allows dynamic simulation of beams in real time, but has limitations for models with complex geometries.
  • the deformation of practical structures is usually small, which cannot be measured using regular trackers.
  • CFD computational fluid dynamics
  • Haouchine et al. [22] presented a method to visualize deformation of human tissue during minimally invasive liver surgery. A real-time model is built based on the co-rotational FE method. To deform the model, external stretching forces are induced by tracking the surface motion of the organ using a stereo camera. The simulated deformations are superimposed on real human organs to assist surgery. This study estimates the deformation of soft objects with the purpose of achieving visually plausible results. The method is not suitable for engineering analysis. Bernasconi et al.
  • a finite-element analysis augmented reality, FEA-AR, system comprising a camera configured for capturing an image of a physical structure; a finite-element, FE, model unit configured for maintaining an FE model of the physical structure and for processing the FE model for stress analysis under one or more loads; an interface configured for displaying the captured image of the physical structure and for rendering the FE model overlaying the image of the physical structure and for rendering results of the stress analysis; an input device configured for user-input relating to a virtual structure to be added to the FE model; and wherein the interface is further configured for rendering a modified FE model overlaying the image of the physical structure based on the user-input relating to the virtual structure to be added to the FE model.
  • a finite-element analysis augmented reality, FEA-AR method comprising the steps of capturing an image of a physical structure; maintaining an FE model of the physical structure; processing the FE model for stress analysis under one or more loads; displaying the captured image of the physical structure; rendering the FE model overlaying the image of the physical structure; rendering results of the stress analysis; receiving user-input relating to a virtual structure to be added to the FE model using an input device; and rendering a modified FE model overlaying the image of the physical structure based on the user-input relating to the virtual structure to be added to the FE model.
  • a computer readable media having embodied therein data and/or instructions for instructing a computing device to implement the system as defined in the first aspect and/or to execute the method as claimed in the second aspect.
  • Fig. 1(a) illustrates adding a virtual beam to an existing structure 102 according to an example embodiment.
  • Fig. 1(b) illustrates visualizing the FEA data and associated results of the modified structure based on Fig. 1(a) under a load according to an example embodiment.
  • Fig. 2(a) illustrates "slicing" a structure using a virtual plane attached to a handheld 3D input device to view a cross-section of the FEA data, according to an example embodiment.
  • Fig. 2(b) illustrates selecting a region of a structure within a cube attached to a handheld 3D input device, which is used to view the FEA data within the cube only, according to an example embodiment.
  • Fig. 3(a) to (c) illustrate components of a system according to an example embodiment.
  • Figs. 4(a) and (b) illustrate details of components of a system according to an example embodiment.
  • Fig. 5 illustrates a system architecture according to an example embodiment.
  • Fig. 6 illustrates a system setup according to an example embodiment.
  • Figs. 7(a) to (f) illustrate 3D selection using a virtual cube according to an example embodiment.
  • Fig. 8 illustrates a data structure for load acquisition according to an example embodiment.
  • Fig. 9 illustrates graphic representation of a point load according to an example embodiment.
  • Fig. 10 illustrates schematics for data visualization and exploration according to an example embodiment.
  • Figs. 11(a) and (b) illustrate a stepladder and its finite-element model, respectively, according to an example embodiment.
  • Figs. 12(a)-(d) illustrate application of virtual loads to a stepladder according to an example embodiment.
  • Figs. 13(a) to (d) illustrate real-time simulations for a stepladder according to an example embodiment.
  • Figs. 14(a) to (c) illustrate different visualization styles according to an example embodiment.
  • Figs. 15(a) to (c) illustrate direct manipulation of FEA results according to an example embodiment.
  • Figs. 16(a) to (f) illustrate slicing and clipping FEA results in hand-operated mode according to an example embodiment.
  • Figs. 17(a) and (b) illustrate slicing with unbounded and bounded cutters, respectively, according to an example embodiment.
  • Figs. 18(a) to (c) illustrate slicing and clipping FEA results in view-based mode according to an example embodiment.
  • Figs. 19(a) to (f) illustrate adding beams to stiffen a structure according to an example embodiment.
  • Figs. 20(a) to (d) illustrate adding a geometric model for design modification according to an example embodiment.
  • Figs. 21(a) to (d) illustrate interactive mesh refinement according to an example embodiment.
  • Fig. 22 shows a flowchart illustrating a finite-element analysis augmented reality, FEA-AR, method, according to an example embodiment.
  • the Finite-Element Analysis Augmented Reality (FEA-AR) System enables FEA of structures to be performed through an AR interface.
  • the AR scene is viewed through a computer display such as a monitor or head-mounted display.
  • the AR scene includes a view of the physical environment containing the structure to be analyzed, with graphical visualizations of FEA data, such as structural stresses and strains, overlaying the physical structure.
  • the FEA data is updated in real-time in response to changes in loading conditions received through force sensors attached to the structure, and/or by simulated loads added by the user.
  • An example embodiment of the present invention allows existing structures to be analyzed under actual loading conditions, and simulated loads to be added to investigate hypothetical scenarios.
  • Hypothetical changes to the structure can advantageously be analyzed under actual or simulated loads according to a preferred embodiment of the present invention, for example design modifications to structures, such as bridges and supports.
  • An example embodiment of the present invention uses efficient computation methods to carry out real-time finite element analysis, making it particularly suitable for engineering analysis.
  • an example embodiment of the present invention provides user-friendly interactive tools for deeper analysis than the above works, by allowing for the application of loads, exploration of results using natural interaction methods, the addition and/or removal of geometric elements to existing structures, and slicing of structural members to examine internal stress distribution.
  • An example embodiment of the present invention provides a more intuitive experience in carrying out FEA while also enabling the real-time simulation of more complex structures than the above works.
  • the commercial applications of an example embodiment of the present invention include the finite-element analysis for engineering design in near real time and the training of students taking structural mechanics in a real environment.
  • the FEA-AR according to an example embodiment enables FEA of structures to be performed through an interactive Augmented Reality (AR) interface that displays graphical visualization of FEA data, such as structural stresses and strains, overlaying the physical structure.
  • the FEA data can either be based on loading conditions picked up by force sensors attached to the structure, or by simulated loads added by the user.
  • a preferred embodiment of the present invention allows the system to be applied to the study of physical structures under actual as well as hypothetical loading conditions.
  • Fig. 1(a) illustrates adding a virtual beam 100 to an existing structure 102, and visualizing the FEA data, i.e. the mesh 104 and associated results of the modified structure under a load, as illustrated in Fig. 1(b).
  • Interactions with the visual data are performed by the use of a handheld pointer to add and/or remove virtual structures, add loads to specific points on the structure or define regions of interest.
  • "slicing" the structure 200 using a virtual plane attached to a handheld 3D input device 202 to view a cross-section of the FEA data can be performed in an example embodiment, as illustrated in Fig. 2(a).
  • a region 204 of the structure 200 within a cube 206 attached to the handheld 3D input device 202 is used to view the FEA data within the cube 206 only, as illustrated in Fig. 2(b).
  • intuitive understanding of the data is enhanced; this also enhances the efficiency of the process of analyzing and designing structures.
  • the system comprises a hardware and software component.
  • the hardware component includes a computer 300 with a PC camera 302 as illustrated in Fig. 3(a), a handheld 3D input device 304 as illustrated in Fig. 3(b) for user interaction and a force sensor network 306 attached to the physical structure 308 to be analyzed using the system, as illustrated in Fig. 3(c).
  • the software component is configured to perform the FEA calculations and display of the AR scene.
  • the 3D input device 304 includes a marker-cube 310 attached to a wireless mouse 312, as illustrated in Fig. 3(b).
  • the marker-cube 310 enables the 3D pose or orientation of the 3D input device 304 to be tracked by the camera 302, thus allowing for interactions to be carried out in 3D with respect to the structure 308.
  • the force sensor network 306 in an example embodiment includes remote sensor nodes in the form of wireless Radio Frequency (RF) transmitters 400 attached to the force sensors 402, as illustrated in Fig. 4(a).
  • the RF transmitters 400 establish a self-healing mesh network that enables a large number of sensor nodes to be added to the network, according to an example embodiment.
  • a gateway node 404 is provided in an example embodiment, in the form of an RF transmitter 406 connected to the computer 300 via an interface 408, which enables the data transmitted by the remote sensor nodes to be received by the software component.
  • the RF transmitters 400, 406 used are Synapse RF Engines.
  • a 3D selection method has been developed to support multiple object selection in the FEA- AR environment for FEA analysis and result visualization.
  • a real-time interaction method for investigating the behavior of structures under different loading conditions: the user can apply virtual loads using a natural interface and measure loads using force sensors.
  • FEA simulation begins with a mesh model.
  • the matrix equation is formed and solved after loads and constraints have been applied, and the results can be visualized.
  • various FEA results can be achieved for simulation and visualization by changing the variables.
  • sensors can be employed to measure varying boundary conditions, e.g., displacements and loads.
  • Engineers usually conduct FEA to investigate structures under different loading conditions. While real-time FEA methods are available to simulate structures under varying loads, as was described in the background section above, the existing FEA methods have limited capabilities.
  • the system 500 has seven modules, namely, a sensor module 502, a load acquisition module 504, an FE model module 506, a solution module 508, a post-processing module 510, a rendering module 512 and a user interaction module 514.
  • the sensor module 502 acquires information from the real world using different sensors, such as using force sensors to measure loads, using trackers to obtain the position and orientation data for AR rendering and user interaction, etc.
  • wireless sensor networks can be established to monitor spatially distributed loads in various embodiments.
  • the load acquisition module 504 manages the load data acquired from sensor measurement or user input, and converts them into nodal forces for FEA computation. With the nodal forces and the FE model module 506, the solution module 508 is established to solve the equations. Besides having a classical pre-conditioned conjugate gradient (PCG) solver, the solution module 508 according to this embodiment incorporates a real-time solver to accelerate the simulation for varying loads.
  • the real-time solver is built based on the concept of pre-computing the inverse stiffness matrix [8,27], which is implemented in two phases, i.e., offline pre-computation and online solution.
  • the offline pre-computation generates the inverse matrix and load vector so as to compute the FEA results online at a fast speed. When there are modifications of the model, updating of the inverse matrix is performed.
  • Data filters are established by implementing scientific visualization techniques, such as data slicing and clipping.
  • By tracking the user's view, the rendering module 512 renders virtual objects in the AR environment, such as the FEA results, loads, slicing planes, added geometric elements, etc.
  • FEA simulation and AR rendering are performed in different computation threads in this embodiment. During real-time simulation, a synchronization process is performed, preferably to ensure that results are updated in every frame.
  • the user interaction module 514 provides the methods to interact with the FE model module 506, which include load application, adding of geometrical elements, model modification, result exploration, etc.
  • Linear elastic models are built in this example embodiment, which are computationally viable and widely applicable for engineering structures, to achieve real-time simulation for varying loads and added geometrical elements. Quasi-static simulations are performed, which is feasible in situations where the dynamic effects of varying loads can be neglected.
  • An elastic FE model can be expressed as a sparse linear equation system, Ku = f, in Eq. (1), where K is the stiffness matrix, and u and f are the vectors representing nodal displacements and forces, respectively.
  • the inverse stiffness matrix K⁻¹ is computed by assigning a unit load to each degree of freedom (DOF), and solving the linear systems using a PCG solver. Each column of the inverse matrix is the displacement solution for the unit load that is applied to the corresponding DOF. With nodal displacements, the nodal stresses can be derived using conventional methods [28], such as direct evaluation at the nodes, extrapolation from the Gauss points, etc.
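  • For reference, a compact restatement of these relations is given below; the exact form of Eq. (2) is not spelled out in the text, so the online-evaluation expression shown is an assumption consistent with the description above.

```latex
% Eq. (1): elastic FE model. Eq. (2) (assumed form): online evaluation from K^{-1}.
% The j-th column of K^{-1} is the displacement response to a unit load on DOF j,
% so only the columns for loadable DOFs are needed.
\[
K\mathbf{u} = \mathbf{f} \quad (1), \qquad
\mathbf{u} = K^{-1}\mathbf{f} \;=\; \sum_{j:\, f_j \neq 0} f_j \left(K^{-1}\mathbf{e}_j\right) \quad (2)
\]
```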
  • the FE modeling and pre-computation can be performed using external FEA systems in various embodiments, and the computation of the inverse matrix can be programmed using the scripting tools provided.
  • Rather than computing the entire inverse matrix, the solver preferably only needs to compute the columns for the nodes in the regions that may be subject to loads, e.g., certain faces of a structure, in the example embodiment, such that the pre-computation time is largely reduced.
  • the error tolerance of the PCG solver can be increased to reduce the precomputation time at the expense of solution accuracy in various embodiments.
  • the computation in Eq. (2) is preferably sped up in the example embodiment by skipping the zero entries in the load vector.
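  • A minimal C++ sketch of this online evaluation is given below; it is illustrative only (the class and member names are hypothetical), assuming the pre-computed columns of the inverse stiffness matrix are stored for the loadable DOFs and the summation skips zero entries of the load vector.

```cpp
#include <cstddef>
#include <unordered_map>
#include <utility>
#include <vector>

// Minimal sketch of the real-time solution step: nodal displacements are
// obtained by summing pre-computed columns of the inverse stiffness matrix,
// and zero entries of the load vector are skipped entirely.
class RealTimeSolver {
public:
    // Offline phase: store the pre-computed column K^{-1} e_dof for a loadable DOF.
    void addPrecomputedColumn(int dof, std::vector<double> column) {
        columns_[dof] = std::move(column);
    }

    // Online phase: u = sum over loaded DOFs of f_dof * (K^{-1} e_dof).
    // 'loads' holds only the non-zero entries of the load vector (DOF -> force).
    std::vector<double> solve(const std::unordered_map<int, double>& loads,
                              std::size_t numDofs) const {
        std::vector<double> u(numDofs, 0.0);
        for (const auto& [dof, force] : loads) {
            const auto it = columns_.find(dof);
            if (it == columns_.end()) continue;          // column was not pre-computed
            const std::vector<double>& column = it->second;
            for (std::size_t i = 0; i < numDofs; ++i)
                u[i] += force * column[i];
        }
        return u;
    }

private:
    std::unordered_map<int, std::vector<double>> columns_;
};
```

  • Each stored column would be produced offline by the PCG solver with a unit load on the corresponding DOF, and recomputed when the model is modified, as described below.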
  • the inverse matrix is preferably updated accordingly.
  • the solution module will exit the real-time solution mode, and use the PCG solver to update the FEA results asynchronously in response to model modifications, according to the example embodiment.
  • the inverse matrix will preferably be updated in a background process.
  • the real-time solution mode can be reactivated after the update is completed.
  • a local or structure coordinate system (CS) 600 is defined for the physical structure.
  • the FEA meshes and results can be augmented to be aligned with the physical structure, here in the form of a stepladder 606.
  • Interaction with virtual objects is achieved using a 3D input device 608 and a virtual panel 610 [29].
  • This virtual panel 610 is displayed on the screen 612 and consists of virtual buttons e.g. 614 and text displays (see numeral 2008 in Fig. 20(d), discussed below).
  • the virtual buttons 614 are designed for triggering commands by the user using the 3D input device 608, i.e. moving onto the desired button and clicking the mouse, and the text displays provide information to the user.
  • the 3D input device 608 is created in the example embodiment by combining a marker cube 616 with a wireless mouse 618.
  • the marker cube 616 is used for tracking the pose of the device 608, i.e. device CS 620, with respect to the camera CS 604, which is represented by the device matrix T_CD.
  • the buttons and wheel of the mouse 618 are used for triggering and parameter adjustment.
  • the coordinate transformation between the input device CS and the CS of the physical structure can be computed using Eq. (3).
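  • The exact form of Eq. (3) is garbled in the extracted text. A plausible reconstruction, assuming the structure pose is also tracked relative to the camera as a matrix T_CS and the device pose as T_CD (per the description above), is:

```latex
% Assumed reconstruction of Eq. (3): the device pose expressed in the structure CS,
% obtained by composing the two camera-relative poses.
\[
T_{SD} \;=\; T_{CS}^{-1}\, T_{CD} \quad (3)
\]
```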
  • the user can use the input device 608 to perform various interactions, such as applying virtual loads at locations by pointing at these locations, exploring volumetric FEA results using a handheld slicing plane, selecting virtual models as additional geometrical objects and placing them at desired locations, etc.
  • a 3D selection approach is developed based on the OpenGL selection mode. Using this approach, a viewing volume is used to select the objects that are drawn in this volume.
  • the pose and size of the viewing volume are controlled by setting the relevant camera matrix.
  • the viewing volume is represented by a virtual cube 700a, 700b attached to the 3D input device, which selects the drawn objects within it, as illustrated in Figs. 7(a) and (b).
  • the size of the cube can be adjusted by rolling the mouse wheel, resulting in an adjusted cube 700c with correspondingly adjusted selected drawn objects 702c, as illustrated in Fig. 7(c).
  • This selection method in the example embodiment advantageously enables the user to choose multiple objects efficiently in the AR environment. It is easy to select element faces 702d as shown in Fig. 7(d). All the faces that intersect with the virtual cube 700d will be selected, which may not be practical when only the faces of specific orientations are to be selected, e.g., the horizontal faces. To address this, the selection method according to the example embodiment can be optionally refined by using one of the faces 704e of the virtual cube 700e as a reference face, as illustrated in Fig. 7(e). Element faces 702e are selected when the angles between the faces and the reference face 704e are smaller than a threshold. Therefore, the user can select specific faces by rotating the virtual cube 700e via rotating of the marker cube 705, or by changing to another reference face 704f specified via a virtual panel (not shown), as illustrated in Fig. 7(f).
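  • The orientation filter for the reference-face refinement can be sketched as follows; this C++ fragment is illustrative only (the names are hypothetical) and assumes the candidate element faces inside the virtual cube have already been gathered.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

inline double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
inline double norm(const Vec3& a) { return std::sqrt(dot(a, a)); }

// From the candidate element faces already intersecting the virtual cube,
// keep only those whose normals deviate from the reference-face normal by
// less than 'maxAngleRad'.
std::vector<int> filterByReferenceFace(const std::vector<Vec3>& faceNormals,
                                       const Vec3& referenceNormal,
                                       double maxAngleRad) {
    std::vector<int> selected;
    const double refLen = norm(referenceNormal);
    for (int i = 0; i < static_cast<int>(faceNormals.size()); ++i) {
        double cosAngle = dot(faceNormals[i], referenceNormal) / (norm(faceNormals[i]) * refLen);
        cosAngle = std::max(-1.0, std::min(1.0, cosAngle));   // clamp for numerical safety
        if (std::acos(cosAngle) < maxAngleRad)
            selected.push_back(i);
    }
    return selected;
}
```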
  • the user can use the 3D input device to exert virtual loads, or use sensors to acquire loads automatically in the actual loading environment.
  • the application of point loads is elaborated in the example embodiment.
  • the adaptation of this method for distributed loads in various embodiments is also demonstrated.
  • All the load data is preferably converted to nodal forces for FEA computation in the example embodiment.
  • Graphic representations are created to visualize the loads.
  • a data structure is established in the example embodiment to manage the loads as shown in Fig. 8.
  • Each load sensor 800 or tracker 802 has a unique ID number for indexing, and a communication address for accessing the output.
  • the outputs of the load sensors 800 are interpreted in a load interpreter 804 into load values 806 after being multiplied by a calibration matrix.
  • the interpreter 804 is established for calibrating common force sensors and for adapting to situations where loads are identified through measuring local strains or deflections [33]. For a virtual load, the value, location and orientation are input using the 3D input device.
  • sensors are preferably used and installed according to the measurement requirements in the example embodiment, e.g., position trackers are used to obtain the locations of moving constant loads. For loads acting on fixed locations on physical structures, these load locations can be manually input by the user. For moving loads tracked using sensors, collision detection is used to obtain the different locations during real-time simulation in the example embodiment.
  • a load transformation matrix is computed. Using this matrix, the loads acquired from user input or sensors are transformed into the structure CS, and allocated to the nodes subject to these loads, i.e., converted to nodal forces. The loaded nodes are determined according to the load locations; the allocation is achieved by assigning weights to the loaded nodes, which are derived by computing the equivalent nodal forces with applied unit loads in the example embodiment. In each rendering loop during the real-time simulation, the load vector f is updated by accumulating the nodal forces, so as to compute the FEA results for every frame. Using the data structure and computation approach described above with reference to Fig. 8 for the example embodiment, loads can easily be added, removed and adjusted; a sketch of this load management is given below.
  • Real-time simulation with virtual loads according to an example embodiment
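  • The C++ sketch below illustrates the load management and per-frame nodal-force accumulation described above; the struct fields and function names are illustrative rather than the actual implementation.

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <vector>

using Vec3 = std::array<double, 3>;

// One managed load: either measured by a sensor/tracker (indexed by an ID and
// a communication address) or entered virtually via the 3D input device.
struct Load {
    int    id;                      // unique ID for indexing
    double value;                   // load magnitude (after calibration, for sensors)
    Vec3   directionStructCS;       // unit load direction, transformed into the structure CS
    std::vector<int>    nodes;      // nodes subject to this load
    std::vector<double> weights;    // allocation weights (equivalent nodal forces for a unit load)
};

// Accumulate all active loads into the global load vector f (3 DOFs per node),
// as done once per rendering loop during real-time simulation.
void assembleLoadVector(const std::vector<Load>& loads, std::vector<double>& f) {
    std::fill(f.begin(), f.end(), 0.0);
    for (const Load& load : loads) {
        for (std::size_t k = 0; k < load.nodes.size(); ++k) {
            const int    n = load.nodes[k];
            const double w = load.value * load.weights[k];
            f[3 * n + 0] += w * load.directionStructCS[0];
            f[3 * n + 1] += w * load.directionStructCS[1];
            f[3 * n + 2] += w * load.directionStructCS[2];
        }
    }
}
```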
  • Virtual loads can be attached to the input device to allow the user to manipulate the application and location of these virtual loads.
  • a virtual point load is graphically represented by a solid cone 900 in the example embodiment, as shown in Fig. 9.
  • the pointing direction of the cone 900 is the load direction, and a number indicates the load value which can be adjusted with the mouse wheel of the 3D input device 901.
  • the polygonal surface mesh is extracted from the 3D solid mesh.
  • when the cone 900 tip is close to a face of the model, a ray from the cone tip will be cast towards the face.
  • the vertex of the intersection will be the node subject to the load, and the intersection point will be taken as the load location in the example embodiment.
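  • A simplified, hypothetical C++ sketch of this point-load placement step is given below: the ray from the cone tip is intersected with the plane of the nearby face, and the face vertex closest to the intersection point is taken as the loaded node (no handling of rays that miss the face).

```cpp
#include <array>
#include <cmath>
#include <limits>
#include <vector>

using Vec3 = std::array<double, 3>;

static double dot3(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Intersect the ray origin + t*dir (t >= 0) with the plane of a face, given a
// point on the face and the face normal. Returns false if there is no hit.
bool rayPlaneIntersect(const Vec3& origin, const Vec3& dir,
                       const Vec3& facePoint, const Vec3& faceNormal, Vec3& hit) {
    const double denom = dot3(dir, faceNormal);
    if (std::fabs(denom) < 1e-12) return false;      // ray parallel to the face
    const Vec3 diff = {facePoint[0] - origin[0], facePoint[1] - origin[1], facePoint[2] - origin[2]};
    const double t = dot3(diff, faceNormal) / denom;
    if (t < 0.0) return false;                        // face is behind the cone tip
    hit = Vec3{origin[0] + t * dir[0], origin[1] + t * dir[1], origin[2] + t * dir[2]};
    return true;
}

// The face vertex nearest to the intersection point becomes the loaded node.
int nearestVertex(const Vec3& hit, const std::vector<Vec3>& faceVertices) {
    int best = -1;
    double bestSqDist = std::numeric_limits<double>::max();
    for (int i = 0; i < static_cast<int>(faceVertices.size()); ++i) {
        const Vec3& v = faceVertices[i];
        const double d = (v[0] - hit[0]) * (v[0] - hit[0]) +
                         (v[1] - hit[1]) * (v[1] - hit[1]) +
                         (v[2] - hit[2]) * (v[2] - hit[2]);
        if (d < bestSqDist) { bestSqDist = d; best = i; }
    }
    return best;
}
```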
  • the user can intuitively control the orientation, location and value of a load through manipulating the 3D input device 901.
  • a virtual load can be finalized at a specific location on the structure accordingly.
  • the relevant load parameters such as the location, orientation and value, do not change. If adjustments are required, the user can select the load and change the parameters.
  • the load variations caused by the manipulation or adjustments are computed, and used to compute and update the variations of FEA results.
  • the loading approach according to the example embodiment can be adapted for applying distributed loads in various embodiments.
  • a typical example is applying pressure to the faces of a physical structure.
  • the user can manipulate a virtual cube to select a model face or part of it by selecting the element faces composing it. With the pressure and areas of the selected faces, the force on each element face can be computed and allocated to the relevant nodes in various embodiments.
  • VTK Visualization Toolkit
  • VTK is an open-source library that provides various visualization algorithms and supports interactive applications. Visualization is achieved using a data pipeline from the source data to images that are rendered.
  • Fig. 10 illustrates the approach for data visualization and exploration in the FEA-AR environment.
  • vtkUnstructuredGrid dataset is created to store the FE mesh and solutions. This dataset can be modified and/or transformed with vtkFilters for specific requirements.
  • vtkMappers are used to map the resulting datasets to visualization objects, i.e., vtkActors, and the visual properties of these vtkActors are adjustable, such as scale, color, transparency, etc.
  • the vtkActors are rendered using vtkRenderers.
  • the viewing transformations for rendering vtkActors are controlled by the vtkCameras.
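  • A minimal sketch of such a VTK pipeline in C++ follows; it is illustrative only and assumes the FE mesh and nodal solution have already been assembled into a vtkUnstructuredGrid with an active scalar field.

```cpp
#include <vtkActor.h>
#include <vtkDataSetMapper.h>
#include <vtkProperty.h>
#include <vtkRenderer.h>
#include <vtkSmartPointer.h>
#include <vtkUnstructuredGrid.h>

// Rendering side of the pipeline: dataset -> mapper -> actor -> renderer.
vtkSmartPointer<vtkActor> makeFeaActor(vtkUnstructuredGrid* feaDataset,
                                       vtkRenderer* renderer) {
    // Map the FE mesh and nodal solution (stored as point data) to graphics primitives.
    auto mapper = vtkSmartPointer<vtkDataSetMapper>::New();
    mapper->SetInputData(feaDataset);
    mapper->ScalarVisibilityOn();            // color by the active scalar field, e.g. von Mises stress

    auto actor = vtkSmartPointer<vtkActor>::New();
    actor->SetMapper(mapper);
    actor->GetProperty()->SetOpacity(0.6);   // e.g. the semi-transparent overlay style

    renderer->AddActor(actor);
    return actor;
}

// For AR registration, the renderer's vtkCamera would be driven every frame from
// the tracked camera pose, e.g. via renderer->GetActiveCamera()->SetPosition(...),
// SetFocalPoint(...) and SetViewUp(...).
```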
  • vtkActors can be superimposed on the physical structure. Therefore, the user can observe the FEA results from different perspectives intuitively through moving the camera viewpoint, e.g., a user can wear a head-mounted display (HMD) and walk around the structure.
  • HMD head-mounted display
  • this approach may encounter difficulties due to restrictions from the physical environment and the limitations of the tracking method. For instance, the user has to position the viewpoint at an awkward angle to access the bottom views of unmovable structures.
  • the trackers would fail or have poor performance when the markers are outside the field of view or at inappropriate distances from the camera.
  • vtkActors can be attached to the 3D input device, such that the user can manipulate the data manually.
  • the user can adjust the scales of the vtkActors by rolling the mouse wheel.
  • Data slicing and clipping are fundamental scientific visualization techniques for exploring volumetric datasets. AR interfaces can contribute to intuitive and efficient exploration of these volumetric datasets.
  • the filters vtkCutter and vtkClipDataset are utilized to build the interfaces.
  • the data slicing method allows the user to access the interior of a volumetric dataset by manipulating a slicing plane.
  • the data clipping method allows the user to clip a dataset with a cube or plane, so as to isolate the portions of data that are of interest.
  • the planes and cubes used for slicing and clipping are created using vtkPlane and vtkBox, respectively, and their sizes are adjustable.
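  • The slicing and clipping filters could be wired up roughly as in the C++ sketch below (illustrative only); the plane origin/normal and box bounds would be updated every frame from the tracked device or camera CS.

```cpp
#include <vtkBox.h>
#include <vtkClipDataSet.h>
#include <vtkCutter.h>
#include <vtkPlane.h>
#include <vtkSmartPointer.h>
#include <vtkUnstructuredGrid.h>

// Slicing: extract the cross-section of the FEA dataset on a plane.
vtkSmartPointer<vtkCutter> makeSlicer(vtkUnstructuredGrid* feaDataset) {
    auto plane = vtkSmartPointer<vtkPlane>::New();
    plane->SetOrigin(0.0, 0.0, 0.0);    // updated each frame from the tracked plane pose
    plane->SetNormal(0.0, 0.0, 1.0);

    auto cutter = vtkSmartPointer<vtkCutter>::New();
    cutter->SetCutFunction(plane);
    cutter->SetInputData(feaDataset);
    return cutter;
}

// Clipping: keep only the portion of the dataset inside an adjustable box.
vtkSmartPointer<vtkClipDataSet> makeClipper(vtkUnstructuredGrid* feaDataset) {
    auto box = vtkSmartPointer<vtkBox>::New();
    box->SetBounds(-0.1, 0.1, -0.1, 0.1, -0.1, 0.1);   // adjustable cube size

    auto clipper = vtkSmartPointer<vtkClipDataSet>::New();
    clipper->SetClipFunction(box);
    clipper->InsideOutOn();             // keep the data inside the box
    clipper->SetInputData(feaDataset);
    return clipper;
}
```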
  • When a virtual plane or cube is attached to the device CS, the user can manipulate this plane or cube. When attached to the camera CS, the user can manipulate the plane or cube by moving his viewpoint. However, it may not be very intuitive to control a cube using one's viewpoint. Hence, only planes are manipulated in this manner in the example embodiment.
  • Multiple data slices or clips can be created at different locations on the physical structure for observation and comparison. The user can manipulate a data slice or clip after attaching it to the 3D input device. In addition, the data slice or clip can be updated during real-time simulation in the example embodiment.
  • model modifications are usually performed for different purposes, e.g., refining the mesh locally to improve the accuracy of the results in critical regions, modifying the geometric model to investigate the design variables, etc.
  • the modified model is usually re-analyzed with unchanged boundary conditions.
  • the modification and re-analysis may be performed repeatedly until satisfactory results are obtained.
  • the user needs to conduct laborious data entry and repeat the operations of re-analysis.
  • customized AR interfaces can be established in the example embodiment to allow intuitive and efficient control of the variables for modification.
  • the FEA results can be updated using an automated procedure for model modification and reanalysis. As a result, the modification is conducted interactively with direct update of the FEA results.
  • the method of adding geometric models advantageously allows the user to join new geometric models to the current FE model, for example, adding structural members, such as ribs and braces, to stiffen a structure.
  • the user can manipulate the new geometric models, and place them at specific locations.
  • one method is merging all the geometric elements and regenerating the global mesh, which may not be the most efficient. Therefore, the new geometric models are preferably trimmed and meshed individually according to the example embodiment.
  • the trimming approach is achieved in the example embodiment by dividing the geometric model with the intersected faces of the FE model, and removing the portions that intrude into the FE model.
  • the next step is to connect the dissimilar meshes, i.e., the newly-generated mesh and the original mesh.
  • a smooth connection according to the example embodiment uses mesh adaptation to generate common nodes that are shared by the contacting meshes.
  • the example embodiment adopts an existing method of connecting dissimilar meshes, i.e., applying linear constraints to the nodes located in the contacting areas.
  • the FE model is re-analyzed to update the results.
  • the user places the new geometric models and specifies the intersected faces of the FE model. The subsequent tasks can be performed automatically.
  • Local mesh refinement, as another example of interactive model modification according to an example embodiment, is usually performed semi-automatically with standard FEA software.
  • the user selects the nodes or elements in the regions to be refined, specifies the level of refinement and activates the automatic mesh refinement algorithms.
  • the model is re-analyzed.
  • the ease of data exploration facilitates the determination of the regions to be refined.
  • the system can perform the computation with an automated procedure for mesh refinement and re-analysis.
  • the refined model and results are finally rendered to the user.
  • the mesh topology is changed after refinement.
  • the loads and constraints which are affected by the changes are preferably reapplied.
  • the inverse stiffness matrix can be updated automatically if real-time simulation is required. Automatic mesh generation or refinement is involved in the modification processes. The feasibility depends on the element type used and meshing tools available.
  • a prototype system according to an example embodiment has been developed using C++ language and runs on a Microsoft Windows operating system.
  • a PC that has a 3.2 GHz Intel processor and 8 GB RAM is utilized. Marker-based tracking is implemented using ARToolkit library, and a webcam with a capture rate of 30 fps is used.
  • a virtual panel that has a customized menu is created.
  • ANSYS software is employed to support certain FEA tasks, such as mesh generation, solution, model modification, etc.
  • the PCG solver provided by ANSYS is adopted with an error tolerance of 1.0E-5.
  • the communication between ANSYS and the AR-based system is achieved using the ANSYS Parametric Design Language (APDL) programs.
  • the AR-based system generates APDL codes to control the specific FEA tasks.
  • APDL ANSYS Parametric Design Language
  • ANSYS performs FEA tasks and outputs the data.
  • the data files generated by ANSYS are read by the AR-based system.
  • the prototype system is applied to the structural analysis of an off-the-shelf stepladder 1100 as shown in Fig. 11(a).
  • the stepladder 1100 is selected because it is considered as having moderate failure risk in usage, and the model is typical and not difficult to understand.
  • the prototype system is focused on demonstrating the proposed interaction methods according to example embodiments of the present invention, rather than conducting a detailed analysis of the structure.
  • Linear hexahedral and truss elements are selected to model 1101 the wooden components and metal linkages, respectively, as shown in Fig. 11(b).
  • Fig. 12 shows a scenario in which virtual loads are applied to the stepladder 1201.
  • the resulting von Mises stresses and deformations can be visualized on the stepladder 1201, i.e. by the gray scale coding and deformation in the drawn stepladder mesh 1202 overlaying the real physical structure 1201 image, as illustrated in Fig. 12(a).
  • the load position and value are displayed on the real physical structure 1201 instead of the deformed mesh 1202, such that the user can locate and move the loads easily by referencing the real structure 1201.
  • more loads can be applied in the same way, as illustrated in Fig. 12(b).
  • the user can select a load, adjust the value and orientation as illustrated in Fig. 12(c), or delete the load.
  • Fig. 12(d) shows a real-time simulation in which pressure is exerted on the selected areas. The selected element faces are marked with crosses to indicate the load locations.
  • the system allows load acquisition from the actual loading environment in an example embodiment.
  • in Fig. 13(a), four wireless force sensors 1301-1304 are attached to the stepladder 1300 to measure the loading caused by users stepping on the ladder.
  • Figs. 13(b)-(d) show the FEA results when a user steps on the ladder 1300.
  • the locations of the sensors can be changed or more sensors can be added in various embodiments.
  • the real-time performance depends largely on the solution time, i.e. the time for computing K⁻¹f, as will be appreciated by a person skilled in the art.
  • the FE model has 944 nodes.
  • the pre-computation takes around 88 s.
  • the AR rendering has a frame rate of 28 fps when virtual point loads are applied. When virtual pressure is applied to a single element face, the frame rate is around 26 fps. This frame rate decreases when more element faces are loaded, because more non-zero entries are involved in the load vector.
  • a frame rate of 23 fps can be achieved for the simulation using force sensors, which is lower than the simulation using virtual point loads.
  • directly overlaying the deformed model 1400 on the real structure 1402 can provide intact views of the results, as illustrated in Fig. 14(a), but may cause misinterpretation of the deformation when occlusion is not taken into consideration. However, when the occlusion is taken into account, part of the results becomes invisible, as illustrated in Fig. 14(b).
  • Semi-transparent rendered objects allow the user to see both the FEA results and the structure, as illustrated in Fig. 14(c), but the colors of different depths at a region are mixed.
  • Each visualization style has its pros and cons. According to their requirements, the user can switch among the different styles and adjust the relevant parameters according to an example embodiment, such as the scale factor of exaggeration and the opacity of the graphic objects.
  • the user can also walk around the structure 1500 to observe the FEA results, i.e. drawn models 1502a-c, from different perspectives.
  • the user can move the 3D input device 1504 to any position in the 3D space, and attach the model to the input device by clicking a mouse button, in order to achieve translation (as illustrated in Fig. 15(a)), rotation and/or zooming of the model (as illustrated in Fig. 15(b)).
  • While the model 1502a-c is manipulated, the user preferably keeps the marker cube in the field of view of the camera for tracking. Some regions of the model which are at a distance from the marker cube may not be observable. To address this issue, the user can place the model at a location for further observation, as illustrated in Fig. 15(c), or release it back to overlay the real structure.
  • Fig. 16 illustrates the hand-operated mode.
  • the FEA results i.e. rendered models e.g. 1600, can be examined using a handheld cutter plane 1602 or clipped using a virtual cube 1604, as illustrated in Figs. 16(a) and (d), respectively.
  • the user can create data slices or clips at different locations, as illustrated in Figs. 16(b) and (e), respectively, and manipulate each slice 1606 or clip 1608 for observation, as illustrated in Figs. 16(c) and (f), respectively.
  • the slicing plane, e.g. 1602, or clipping cube, e.g. 1604, can be implemented as a bounded or an unbounded cutter.
  • An unbounded slicing plane e.g. 1602 allows users to slice a model e.g. 1600 that is far away from the user, but may generate redundant slices when there are multiple intersection areas.
  • a bounded plane of adjustable size is additionally or alternatively implemented in an example embodiment to overcome this problem, and the model is preferably near the user.
  • a comparison of bounded and unbounded cutters is shown in Fig. 17.
  • an unbounded slicing plane or clipping plane is placed in parallel with the view plane at an adjustable distance.
  • the user can manipulate the slicing or clipping plane by moving his viewpoint, or rolling the mouse wheel to adjust the position of the slicing or clipping plane relative to the view plane to explore the data (Fig. 18(a) and (b)).
  • the slicing or clipping plane will be fixed on the structure to allow data observation from different perspectives (Fig. 18(c)).
  • Fig. 18(a) shows the data in a sliced plane 1800a
  • Fig. 18(b) shows the data exploration from the same view as in Fig. 18(a).
  • Fig. 18(c) shows that the sliced plane 1800c is fixed at a position and the user can view it from another angle.
  • This operating mode is suitable for HMDs and handheld displays in example embodiments, which allow easy manipulation of the user's viewpoint.
  • the mouse wheel and button can be integrated with the display devices in such embodiments, such that the user does not need to hold the 3D input device for data exploration.
  • the initial model 1900 and associated FE results are shown in Fig. 19(a).
  • the user can switch to the mesh display or line-model 1901, i.e., rendering the boundary elements but without the results displayed, and select a cross-section using the 3D input device 1902 to create a beam 1904 of an adjustable length.
  • the beam 1904 can be manipulated and placed at a specific location, as illustrated in Fig. 19(b). Next, the user can specify the faces of the stepladder where the beam intersects.
  • An intersected face is specified by selecting an arbitrary cell 1908 on this face, as illustrated in Fig. 19(c). More beams can be added by repeating these processes.
  • All the data is transferred to ANSYS via APDL codes in an example embodiment.
  • the relevant tasks are performed automatically using ANSYS, which include creating beam models, coordinate transformations, trimming beams using the intersected faces, meshing the beams, connecting the meshes to the original mesh, and computing the FEA results with unchanged boundary conditions.
  • the mesh connection is achieved using the CEINTF command in ANSYS.
  • the new mesh 1912 and associated results are imported to the FEA-AR environment and rendered, as illustrated in Fig. 19(d).
  • the inverse stiffness matrix is re-calculated in a background process, which can take several minutes. Real-time simulations are allowed after the computation is completed, as illustrated in Fig. 19(e).
  • the FE model of the modified stepladder model is advantageously enhanced after adding the beams 1913, 1915, which can be observed from the reduction in the deformation and stresses of the step board 1914a and 1914d between the initial and updated models respectively, as illustrated in Figs. 19(a) and (d).
  • the method of connecting dissimilar meshes leads to discontinuous stress fields in the connecting areas (Fig. 19(f)). It is understood from published literature that whenever the cross-section of a structural member changes abruptly, a structural discontinuity would arise. Discontinuous stress fields can be reduced by increasing the mesh density around the region in different embodiments, but this could lead to longer computation time.
  • the model 2000a of the virtual structure to be added relative to the original model 2002 is trimmed using the intersected face of the model, resulting in model 2000b as shown in Fig. 20(b), and then the model 2000b is meshed individually, as shown in Fig. 20(c), resulting in the meshed model 2000c.
  • the next step is to connect the newly-generated mesh 2000c and the original mesh 2002.
  • the connection is simplified by applying linear constraints. For example, the connection at node E (Fig. 20(d)) is achieved by imposing constraints on node E and the four nodes A-D around it.
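  • The exact constraint equations (generated, e.g., by the CEINTF command mentioned above) are not reproduced in the text; a typical multipoint-constraint form consistent with this description is shown below.

```latex
% Assumed form: the displacement of node E is tied to the displacements of the
% four face nodes A-D via the face shape functions N_i evaluated at the
% projection (xi_E, eta_E) of E onto the contacted face.
\[
\mathbf{u}_E \;=\; \sum_{i \in \{A,B,C,D\}} N_i(\xi_E, \eta_E)\, \mathbf{u}_i
\]
```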
  • FEA solutions are computed with the new, modified FE model and unchanged boundary conditions by using the PCG (pre-conditioned conjugate gradient) solver.
  • PCG pre-conditioned conjugate gradient
  • the system response time for model modification in a preferred embodiment has been tested for different mesh resolutions as shown in Table 1.
  • meshes of different resolutions are created for both the stepladder and added beams.
  • While the treatment of beams has been described above in relation to one preferred embodiment, the method is adaptable to other types of structures depending on the application in different embodiments.
  • other geometric models can be built e.g. online through parametric modeling for use in embodiments of the present invention.
  • Geometric models can also be built offline using CAD, in particular for more complex geometric models, and can be imported to the system according to various embodiments, and meshed automatically using e.g. tetrahedral elements.
  • the method for mesh refinement is implemented in an example embodiment on a tetrahedral mesh of 1022 nodes.
  • the user determines the region to refine, and selects the elements 2102 in the region using the 3D input device 2104, as illustrated in Fig. 21(b).
  • the level of refinement is set by rolling the mouse wheel on the 3D input device, as illustrated in Fig. 21(c).
  • a higher number at numeral 2106 results in a denser mesh.
  • the mesh refinement is carried out automatically, followed by re-analysis of the refined model.
  • the equivalent nodal forces of the point loads are re-calculated for the refined mesh.
  • a text message 2108 is displayed to inform the user of the updated status.
  • the updated FE model 2110 is rendered, as illustrated in Fig. 21(d), showing more detailed results in the refined region 2112.
  • the inverse stiffness matrix can be updated in a background process for real-time simulation.
  • the advantages of the interactive AR simulation environment include that, unlike VR-based systems, the AR-based system can render FEA models and results in the exact physical context, thereby benefitting interpretation and validation of the FEA data.
  • the users can examine the actual loading conditions, and locate the regions of large deformation or stresses on the real structure directly.
  • the actual deformation and stresses can be acquired in situ through measurement or observation to validate the FEA model.
  • user manipulation is implicitly mapped to the virtual space for interaction with virtual objects.
  • users of example embodiments of the present invention are able to manipulate virtual objects directly by using the tangible AR interfaces, such as moving virtual loads, slicing planes, and/or advantageously adding virtual structures such as beams, etc., and the virtual space is spatially aligned with the real world.
  • the AR interfaces according to example embodiments are particularly valuable for users without formal FEA training. While the stability and precision of user manipulation may be limited by hand motions and tracking performance, this can be improved by adding constraints to the manipulation and using more accurate trackers in different embodiments.
  • FEA simulators for use in example embodiments can acquire various parameters directly from the physical world, such as geometry, loads and boundary conditions, such that simulation can be performed in situ.
  • example embodiments can perform the simulation approach for varying loads. Different approaches can be developed depending on the purpose of simulation and measurement requirements in different embodiments.
  • the geometry of structures may be captured using 3D scanning in an example embodiment to construct mesh models in situ.
  • Example embodiments of the present invention can have many applications including for educational and commercial applications, and can be utilized to enhance practical analysis tasks, e.g., evaluation of structures in actual operating environments, investigation of structural failure, determining stiffening strategies onsite, etc.
  • An AR-based system integrates sensor measurement, FEA simulation and scientific visualization techniques, and/or interfaces to enhance the visualization and interaction of structural analysis.
  • the investigation of structures can be facilitated by combining real-time FEA simulation and automatic load acquisition according to an example embodiment.
  • the inverse stiffness matrix can be stored for reuse according to an example embodiment, and can be updated when the FE model is modified. Exploration of FEA results is enhanced according to an example embodiment by enabling natural interfaces for manipulating, slicing and clipping the data.
  • the user can modify the model and perform re-analysis in efficient and intuitive manners.
  • structural enhancement through mesh modification by addition and/or removal of virtual structures, e.g. beams, can be performed according to an example embodiment.
  • more interactive methods can be provided, such as material reduction while sustaining strength, and the system response time can be largely reduced by integrating FEA tools fully into the system according to an example embodiment.
  • the system can be implemented on mobile AR platforms in example embodiments for outdoor and onsite applications.
  • the input device may be further configured for user-input relating to a location and magnitude of one or more simulated loads, and the FE model unit is configured for stress analysis under the one or more simulated loads.
  • the interface may be configured for attaching a virtual pointer to an image of the input device captured by the camera for applying the one or more simulated loads based on a rendered location and orientation of the virtual pointer.
  • the FEA-AR system may further comprise one or more sensor elements configured to be coupled to the physical structure and to measure one or more actual loads on the physical structure, and the FE model unit may be configured for stress analysis under the one or more actual loads.
  • the input device may be further configured for user-input relating to viewing a cross-section of the rendered results of the stress analysis of the FE model, and the interface may be configured for rendering the cross-section of the rendered results of the stress analysis of the FE model.
  • the input device may be further configured for user-input relating to creating a slice of the rendered results of the stress analysis of the FE model.
  • the interface may be configured for attaching the created slice to an image of the input device captured by the camera for manipulation of the created slice based on a location and orientation of the input device.
  • the interface may be configured for attaching a virtual plane to an image of the input device captured by the camera for selecting the cross-section based on a rendered location and orientation of the virtual plane and/or for creating the slice based on the rendered location and orientation of the virtual plane.
  • the input device may be further configured for user-input relating to viewing a region of the rendered results of the stress analysis of the FE model, and the interface may be configured for rendering the region of the rendered results of the stress analysis of the FE model.
  • the interface may be configured for attaching a virtual volume to an image of the input device captured by the camera for selecting the region based on a rendered location and orientation of the virtual volume.
  • Fig. 22 shows a flowchart 2200 illustrating a finite-element analysis augmented reality, FEA- AR, method according to an example embodiment.
  • an image of a physical structure is captured.
  • an FE model of the physical structure is maintained.
  • the FE model is processed for stress analysis under one or more loads.
  • the captured image of the physical structure is displayed.
  • the FE model is rendered overlaying the image of the physical structure.
  • results of the stress analysis are rendered.
  • user-input relating to a virtual structure to be added to the FE model using an input device is received.
  • a modified FE model overlaying the image of the physical structure is rendered based on the user-input relating to the virtual structure to be added to the FE model.
  • the FEA-AR method may further comprise receiving user-input relating to a location and magnitude of one or more simulated loads using the input device, and performing stress analysis under the one or more simulated loads.
  • the FEA-AR method may further comprise attaching a virtual pointer to a captured image of the input device for applying the one or more simulated loads based on a rendered location and orientation of the virtual pointer.
  • the FEA-AR method may further comprise coupling one or more sensor elements to the physical structure and measuring one or more actual loads on the physical structure using the one or more sensor elements, and performing stress analysis under the one or more actual loads.
  • the FEA-AR method may further comprise receiving user-input relating to viewing a cross-section of the rendered results of the stress analysis of the FE model using the input device, and rendering the cross-section of the rendered results of the stress analysis of the FE model.
  • the FEA-AR method may further comprise receiving user-input relating to creating a slice of the rendered results of the stress analysis of the FE model using the input device.
  • the FEA-AR method may comprise attaching the created slice to a captured image of the input device for manipulation of the created slice based on a location and orientation of the input device.
  • the FEA-AR method may comprise attaching a virtual plane to a captured image of the input device for selecting the cross-section based on a rendered location and orientation of the virtual plane and/or for creating the slice based on the rendered location and orientation of the virtual plane.
  • the FEA-AR method may further comprise receiving user-input relating to viewing a region of the rendered results of the stress analysis of the FE model using the input device, and rendering the region of the rendered results of the stress analysis of the FE model.
  • the FEA-AR method may comprise attaching a virtual volume to a captured image of the input device for selecting the region based on a rendered location and orientation of the virtual volume.
  • Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof.
  • Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, etc.).
  • Such formatted data and/or instructions may be processed by a processing entity (e.g., one or more processors), and aspects of the system may be implemented as functionality programmed into a variety of circuitry, including programmable logic devices (PLDs), field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, application specific integrated circuits (ASICs), microcontrollers with memory (such as electronically erasable programmable read only memory (EEPROM)), embedded microprocessors, firmware, software, etc.
  • aspects of the system may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types.
  • the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, etc.
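As foreshadowed in the description of flowchart 2200 above, the listed steps can be read as a capture, solve and render loop. The following Python sketch is a minimal, purely illustrative rendition of that loop; the names FEModel, fea_ar_frame, camera, tracker, renderer and input_device, as well as the placeholder solver, are assumptions made for exposition and do not reproduce the disclosed implementation.

```python
# Minimal, hypothetical sketch of the FEA-AR loop summarised in flowchart 2200.
# All names here are illustrative assumptions, not identifiers from the disclosure.

import numpy as np


class FEModel:
    """Holds the mesh and loads of the tracked physical structure."""

    def __init__(self, nodes, elements):
        self.nodes = np.asarray(nodes, dtype=float)   # (n, 3) node coordinates
        self.elements = list(elements)                # element connectivity tuples
        self.loads = {}                               # node id -> force vector

    def apply_load(self, node_id, force):
        # Record a simulated (user-applied) or sensor-measured load.
        self.loads[node_id] = np.asarray(force, dtype=float)

    def add_virtual_structure(self, new_nodes, new_elements):
        # Merge user-defined virtual geometry into the existing FE mesh.
        offset = len(self.nodes)
        self.nodes = np.vstack([self.nodes, np.asarray(new_nodes, dtype=float)])
        self.elements += [tuple(i + offset for i in e) for e in new_elements]

    def solve_stress(self):
        # Placeholder: a real solver would assemble K, solve K u = f under the
        # recorded loads and derive element stresses from the displacements u.
        return np.zeros(len(self.elements))


def fea_ar_frame(camera, tracker, model, renderer, input_device):
    """One iteration of the capture, solve and render loop."""
    image = camera.capture()                  # capture the physical structure
    pose = tracker.estimate_pose(image)       # register the FE model to the image

    event = input_device.poll()               # user input, if any
    if event is not None and event.kind == "add_structure":
        model.add_virtual_structure(event.nodes, event.elements)
    elif event is not None and event.kind == "apply_load":
        model.apply_load(event.node_id, event.force)

    stresses = model.solve_stress()           # stress analysis under current loads
    renderer.draw(image)                      # display the captured image
    renderer.overlay_model(model, pose)       # render the (modified) FE model
    renderer.overlay_results(stresses, pose)  # render colour-coded stress results
```

In this sketch, adding a virtual structure simply appends nodes and elements to the mesh before the next solve, so the subsequent render shows the modified FE model overlaying the captured image, mirroring the behaviour described in the items above.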

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a finite element analysis augmented reality (FEA-AR) system and method and a computer-readable medium. The FEA-AR system comprises a camera configured to capture an image of a physical structure; a finite element (FE) model unit configured to maintain an FE model of the physical structure and to process the FE model for stress analysis under one or more loads; an interface configured to display the captured image of the physical structure, to render the FE model overlaying the image of the physical structure, and to render the results of the stress analysis; and an input device configured for user input relating to a virtual structure to be added to the FE model; the interface being further configured to render a modified FE model overlaying the image of the physical structure based on the user input relating to the virtual structure to be added to the FE model.
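As a purely illustrative companion to the abstract, the interaction in which a virtual plane attached to the tracked input device selects a cross-section of the rendered stress results can be sketched as follows. The function names, the choice of the device z-axis as the plane normal, and the tolerance-based clip test are assumptions for exposition only and are not taken from the disclosure.

```python
# Hypothetical illustration of attaching a virtual cutting plane to the tracked
# input device and using it to select a cross-section of the stress results.

import numpy as np


def plane_from_device_pose(device_pose):
    """Derive a cutting plane from the 4x4 pose of the tracked input device.

    The plane passes through the device origin and uses the device z-axis as
    its normal, so moving the device moves the rendered cross-section.
    """
    origin = device_pose[:3, 3]
    normal = device_pose[:3, 2] / np.linalg.norm(device_pose[:3, 2])
    return origin, normal


def select_cross_section(node_coords, node_stresses, origin, normal, tol=1e-3):
    """Return the nodes (and their stress values) lying on the cutting plane.

    node_coords: (n, 3) array of node positions; node_stresses: (n,) array.
    """
    signed_dist = (node_coords - origin) @ normal
    mask = np.abs(signed_dist) < tol
    return node_coords[mask], node_stresses[mask]


if __name__ == "__main__":
    pose = np.eye(4)                    # device held at the origin, z-axis up
    nodes = np.random.rand(1000, 3)     # stand-in mesh nodes
    stresses = np.random.rand(1000)     # stand-in scalar stress values
    origin, normal = plane_from_device_pose(pose)
    cut_nodes, cut_stresses = select_cross_section(nodes, stresses, origin, normal, tol=0.05)
    print(f"{len(cut_nodes)} nodes selected on the cutting plane")
```

Because the plane is recomputed from the device pose each frame, moving the physical input device moves the selected cross-section in the augmented view.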
PCT/SG2018/050091 2017-02-27 2018-02-27 Finite element analysis augmented reality system and method WO2018156087A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201701539V 2017-02-27
SG10201701539V 2017-02-27

Publications (1)

Publication Number Publication Date
WO2018156087A1 true WO2018156087A1 (fr) 2018-08-30

Family

ID=63252282

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2018/050091 WO2018156087A1 (fr) 2017-02-27 2018-02-27 Finite element analysis augmented reality system and method

Country Status (1)

Country Link
WO (1) WO2018156087A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110047145A (zh) * 2019-04-15 2019-07-23 山东师范大学 基于深度学习和有限元建模的组织变形模拟系统及方法
CN112084663A (zh) * 2020-09-14 2020-12-15 江苏海洋大学 一种基于复杂海洋环境的柔性管道三维数值模拟方法
CN113538687A (zh) * 2021-06-08 2021-10-22 广州颖力土木科技有限公司 基于vtk的有限元可视化方法、系统、装置及存储介质
CN113536604A (zh) * 2021-09-01 2021-10-22 武汉大学 约束无梯度通用解算的结构健康监测传感器布设优化方法
US11557078B1 (en) 2022-02-08 2023-01-17 Honeywell Federal Manufacturing & Technologies, Llc Machine learning simulation of finite element analysis in augmented reality

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7356449B2 (en) * 2002-04-16 2008-04-08 The Board Of Trustees Of The University Of Illinois Method and program product for solid mechanics modelling workbench and dynamic display
US7634394B2 (en) * 2004-03-05 2009-12-15 The Procter & Gamble Company Method of analysis of comfort for virtual prototyping system
US8599194B2 (en) * 2007-01-22 2013-12-03 Textron Innovations Inc. System and method for the interactive display of data in a motion capture environment
US20150262426A1 (en) * 2012-08-28 2015-09-17 University Of South Australia Spatial Augmented Reality (SAR) Application Development System
CN104091033A (zh) * 2014-07-25 2014-10-08 哈尔滨工业大学 基于超单元结合虚拟变形法的桥梁静力有限元模型修正方法

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HUANG J. M. ET AL.: "Real-time finite element structural analysis in augmented reality", ADVANCES IN ENGINEERING SOFTWARE, vol. 87, 18 May 2015 (2015-05-18), pages 43 - 56, XP055541033, [retrieved on 20180406] *
RYKEN M. J. ET AL.: "Applying virtual reality techniques to the interactive stress analysis of a tractor lift arm", FINITE ELEMENTS IN ANALYSIS AND DESIGN, vol. 35, no. 2, 31 May 2000 (2000-05-31), pages 141 - 155, XP055541023, [retrieved on 20180406] *
VALENTINI P. P. ET AL.: "Dynamic Splines for interactive simulation of elastic beams in Augmented Reality", IMPROVE'11, 17 June 2011 (2011-06-17), pages 89 - 96, XP055541031, [retrieved on 20180406] *
YEH T. P. ET AL.: "Applying Virtual Reality Techniques to Sensitivity-Based Structural Shape Design", JOURNAL OF MECHANICAL DESIGN, vol. 120, no. 4, 1 December 1998 (1998-12-01), pages 612 - 619, [retrieved on 20180406] *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110047145A (zh) * 2019-04-15 2019-07-23 山东师范大学 基于深度学习和有限元建模的组织变形模拟系统及方法
CN110047145B (zh) * 2019-04-15 2023-05-16 山东师范大学 基于深度学习和有限元建模的组织变形模拟系统及方法
CN112084663A (zh) * 2020-09-14 2020-12-15 江苏海洋大学 一种基于复杂海洋环境的柔性管道三维数值模拟方法
CN112084663B (zh) * 2020-09-14 2021-03-30 江苏海洋大学 一种基于复杂海洋环境的柔性管道的试验系统
CN113538687A (zh) * 2021-06-08 2021-10-22 广州颖力土木科技有限公司 基于vtk的有限元可视化方法、系统、装置及存储介质
CN113536604A (zh) * 2021-09-01 2021-10-22 武汉大学 约束无梯度通用解算的结构健康监测传感器布设优化方法
US11557078B1 (en) 2022-02-08 2023-01-17 Honeywell Federal Manufacturing & Technologies, Llc Machine learning simulation of finite element analysis in augmented reality

Similar Documents

Publication Publication Date Title
Huang et al. Visualization and interaction of finite element analysis in augmented reality
Huang et al. Real-time finite element structural analysis in augmented reality
WO2018156087A1 (fr) Système et procédé de réalité augmentée d'analyse d'élément fini
Lin et al. Visualization of indoor thermal environment on mobile devices based on augmented reality and computational fluid dynamics
Ng et al. GARDE: a gesture-based augmented reality design evaluation system
Huang et al. An approach for augmented learning of finite element analysis
Ong et al. Structure design and analysis with integrated AR-FEA
WO2020097343A1 (fr) Système et procédé interactifs fournissant une visualisation en réalité virtuelle en temps réel de données de simulation
US20150088474A1 (en) Virtual simulation
Chen et al. A naked eye 3D display and interaction system for medical education and training
Chang et al. Development scheme of haptic-based system for interactive deformable simulation
US10474763B2 (en) Computer-implemented method for defining initial conditions for dynamic simulation of an assembly of objects in a three-dimensional scene of a system of computer-aided design
Bruno et al. Visualization of industrial engineering data visualization of industrial engineering data in augmented reality
CN112560308A (zh) 一种基于有限元的汽车碰撞试验平台构建方法及装置
Li et al. Mobile augmented reality visualization and collaboration techniques for on-site finite element structural analysis
Bowman et al. Virtual-SAP: an immersive tool for visualizing the response of building structures to environmental conditions
US7155673B2 (en) System and method of interactive evaluation of a geometric model
Huang et al. An augmented reality platform for interactive finite element analysis
Liu et al. COMTIS: Customizable touchless interaction system for large screen visualization
KR102519114B1 (ko) 가상 현실 기반의 수술 환경을 제공하는 장치 및 방법
Wenhua Architecture and key techniques of augmented reality maintenance guiding system for civil aircrafts
Tadeja et al. PhotoTwinVR: An Immersive System for Manipulation, Inspection and Dimension Measurements of the 3D Photogrammetric Models of Real-Life Structures in Virtual Reality
Merckel et al. Multi-interfaces approach to situated knowledge management for complex instruments: First step toward industrial deployment
Scalas et al. A first step towards cage-based deformation in virtual reality
Malkawi Immersive building simulation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18758101

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18758101

Country of ref document: EP

Kind code of ref document: A1