CN111462344A - Real-time sectioning interaction method for field data visualization in virtual reality simulation - Google Patents


Info

Publication number
CN111462344A
CN111462344A CN202010251736.1A
Authority
CN
China
Prior art keywords
sectioning
virtual reality
plane
model
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010251736.1A
Other languages
Chinese (zh)
Other versions
CN111462344B (en)
Inventor
刘振宇
侯宇
裘辿
谭建荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202010251736.1A priority Critical patent/CN111462344B/en
Publication of CN111462344A publication Critical patent/CN111462344A/en
Application granted granted Critical
Publication of CN111462344B publication Critical patent/CN111462344B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/21Collision detection, intersection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/28Force feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2008Assembling, disassembling

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a real-time sectioning interaction method for field data visualization in virtual reality simulation. In the virtual reality visualization scene, a force-feedback glove is used to pick up the model to be sectioned; a sectioning plane is drawn with the handle of the virtual reality device and transformed into the coordinate space of the model to be sectioned; the model to be sectioned and the transformed sectioning-plane parameters are passed to a sectioning processing module, which sections the model with a data set clipping filter to obtain the individual model parts; further sectioning planes can be added in succession, repeating the processing steps on each model part so that the source data are sectioned in series in real time; finally, the outer-surface geometry and surface data of each sectioned model part are extracted and passed back to the virtual reality scene to update the model to be sectioned. The method improves the intuitiveness and naturalness of equipment virtual reality simulation and the display of finite element and fluid field data within it.

Description

Real-time sectioning interaction method for field data visualization in virtual reality simulation
Technical Field
The invention relates to a real-time interaction method, in particular to a real-time sectioning interaction method for field data in virtual reality simulation, and belongs to the field of graphics and image software.
Background
Modern equipment is a complex system governed by mutually coupled laws from many disciplines, such as mechanics, electronics, civil engineering, chemical engineering, materials and control. Owing to this complexity, the physical fields involved are equally diverse: displacement, strain, velocity, temperature and electromagnetic fields, all coupled to one another. Evaluating such a complex system quickly and correctly is therefore a key problem in the long and complex decision-development cycle.
To meet this demand, digital prototyping and digital twin simulation have gradually become important tools for accelerating analysis and decision-making, raising research and development efficiency, and shortening the development cycle. Aided by virtual reality technology, the application threshold of digital prototypes and digital twins has been greatly lowered and their effectiveness greatly improved.
However, current virtual-reality-based equipment simulation still has many shortcomings. On the one hand, human interaction with complex physical fields in a virtual reality scene is not intuitive or natural enough; on the other hand, the disciplines and physical fields that simulations can cover are limited.
Disclosure of Invention
To solve these problems of existing virtual-reality-based equipment simulation, the invention provides a real-time sectioning interaction method for field data visualization in equipment virtual reality simulation.
As shown in fig. 1, the technical solution adopted by the present invention is:
S1, in the virtual reality visualization scene, one hand picks up the model to be sectioned with a force-feedback glove;
S2, in the virtual reality visualization scene, the other hand draws a sectioning plane with the handle of the virtual reality device, and the plane is transformed into the coordinate space of the model to be sectioned;
S3, the model to be sectioned and the transformed sectioning-plane parameters are passed to the sectioning processing module, which sections the model's source data with a data set clipping (vtkClipDataSet) filter;
S4, the outer-surface geometry and surface data of each sectioned model part are extracted, passed back to the virtual reality visualization scene, and used to update the model to be sectioned;
S5, further sectioning planes are added in sequence, repeating steps S1-S4 on each model part so that the source data are sectioned in series in real time.
In step S1, the model to be sectioned carries or references time-varying field data, meaning that every position of the model holds data, not only on the outer surface but also in the interior.
The model to be sectioned in step S2 has a collision volume and can interact with the force-feedback glove to convey a sense of touch and volume.
The handle in step S2 has absolute spatial positioning capability and an operable button.
Step S2 specifically comprises:
S21, when the handle's operating button is pressed, the sectioning-plane drawing state is entered; the current frame is taken as the initial frame, and the position and attitude of the handle at that moment are recorded;
S22, while in the drawing state, the front end of the handle keeps projecting a straight indicator ray of fixed length in the virtual reality scene; the ray's position is updated every frame so that it remains fixed relative to the handle;
S23, while in the drawing state, the global-coordinate-system plane Q of the current frame is computed from the handle's position and attitude in the initial frame and in the current frame, and is visualized semi-transparently;
S24, when the handle's operating button is released, the last global-coordinate-system plane is accepted as the sectioning plane and the drawing state is exited.
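The press/draw/release interaction of steps S21-S24 can be sketched as a small state machine. This is an illustrative sketch, not the patent's implementation: the pose representation, the `RAY_LENGTH` value, and all names are assumptions, and no specific VR SDK is used.

```python
RAY_LENGTH = 0.5  # fixed length of the indicator ray (assumed value, in metres)

def ray_endpoints(pose):
    """Start/end point of the fixed-length ray projected from the handle front.

    pose is ((x, y, z), (fx, fy, fz)): position and unit forward direction.
    """
    (px, py, pz), (fx, fy, fz) = pose
    return (px, py, pz), (px + fx * RAY_LENGTH,
                          py + fy * RAY_LENGTH,
                          pz + fz * RAY_LENGTH)

class SectionPlaneDrawer:
    """Tracks the S21-S24 drawing state for one sectioning plane."""

    def __init__(self):
        self.initial_pose = None  # set while in the drawing state

    def press(self, pose):
        # S21: enter the drawing state, record the initial-frame pose.
        self.initial_pose = pose

    def current_points(self, pose):
        # S23: the four points that determine plane Q for the current frame.
        p1, p2 = ray_endpoints(self.initial_pose)
        p3, p4 = ray_endpoints(pose)
        return p1, p2, p3, p4

    def release(self, pose):
        # S24: accept the last frame's points and exit the drawing state.
        points = self.current_points(pose)
        self.initial_pose = None
        return points
```

Each frame while the button is held, the application would call `current_points` and render the semi-transparent plane from them.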
The plane computation in step S23 is specifically as follows:
S231. Let P1 be the start point of the projected ray at the handle's initial pose, P2 the end point of the projected ray at the initial pose, P3 the start point of the projected ray at the current-frame pose, and P4 the end point of the projected ray at the current-frame pose.
S232. The three points P1, P3 and P4 determine the plane Q; the projection of P2 onto Q is denoted P2′.
S233. The triangle formed by P1, P2′ and P4 and the triangle formed by P2′, P4 and P3 share a common edge and both lie in Q; these two triangles are used to render the visualization of Q.
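The plane construction of steps S231-S233 reduces to a cross product and an orthogonal projection. Below is a minimal sketch in plain Python; the helper names are assumptions, since the patent does not prescribe an implementation.

```python
import math

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def plane_from_points(p1, p3, p4):
    """S232: plane Q through P1, P3, P4 as (unit normal, point on plane)."""
    n = cross(sub(p3, p1), sub(p4, p1))
    length = math.sqrt(dot(n, n))
    n = (n[0] / length, n[1] / length, n[2] / length)
    return n, p1

def project_onto_plane(p, normal, origin):
    """S232: orthogonal projection of P2 onto plane Q, giving P2'."""
    d = dot(sub(p, origin), normal)  # signed distance from the plane
    return (p[0] - d * normal[0],
            p[1] - d * normal[1],
            p[2] - d * normal[2])
```

The two triangles of S233, (P1, P2′, P4) and (P2′, P4, P3), can then be rendered directly as the semi-transparent quad visualizing Q.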
The transformation in step S2 transforms the plane Q into the local coordinate system of the model to be sectioned.
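This world-to-local transform of the plane can be sketched as follows, assuming a rigid model pose x_world = R·x_local + t (an assumption for brevity; a general affine pose would additionally need the inverse-transpose for the normal). The function names are illustrative only.

```python
def mat3_transpose(R):
    """Transpose of a 3x3 matrix given as a tuple of row tuples."""
    return tuple(tuple(R[j][i] for j in range(3)) for i in range(3))

def mat3_apply(R, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

def plane_world_to_local(point_w, normal_w, R, t):
    """Express a world-space plane (point, normal) in the local frame of a
    model whose pose is x_world = R @ x_local + t (rigid transform assumed).

    For a pure rotation R, normals transform like points, so both the plane
    point and the normal are rotated by R^T after removing the translation.
    """
    Rt = mat3_transpose(R)
    point_l = mat3_apply(Rt, tuple(point_w[i] - t[i] for i in range(3)))
    normal_l = mat3_apply(Rt, normal_w)
    return point_l, normal_l
```

With this, the (point, normal) pair handed to the sectioning module is already expressed in the coordinates of the model's source data.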
In step S3, the pre-prepared model data containing three-dimensional field data and the sectioning plane described in the local coordinate system are passed to the sectioning processing module, which uses VTK as its main data-processing tool. The step comprises:
S31, the model to be sectioned is read in, producing unstructured grid (vtkUnstructuredGrid) data for a number of time steps;
S32, according to the required number of keyframes, the unstructured grid data of the time steps are uniformly resampled to serve as the data of each keyframe;
S33, every keyframe is sectioned with the data set clipping (vtkClipDataSet) filter, yielding the sectioned unstructured grid (vtkUnstructuredGrid) data of all keyframes.
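The uniform resampling of step S32 can be read as nearest-time-step sampling at evenly spaced instants; the one-liner below is an assumed interpretation, not the patent's prescribed scheme.

```python
def keyframe_indices(n_steps, n_keys):
    """Pick n_keys time-step indices spread uniformly over n_steps steps.

    The first and last time steps are always included; intermediate
    keyframes snap to the nearest available time step.
    """
    if n_keys == 1:
        return [0]
    return [round(i * (n_steps - 1) / (n_keys - 1)) for i in range(n_keys)]
```

For example, reducing 10 time steps to 4 keyframes selects time steps 0, 3, 6 and 9; each selected step's vtkUnstructuredGrid then becomes one keyframe's data.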
In step S4, a geometry filter (vtkGeometryFilter) and a triangulation filter (vtkTriangleFilter) extract the geometry and field data of each sectioned part's unstructured grid (vtkUnstructuredGrid) data, producing polygon (vtkPolyData) data; this is then converted into the triangle-mesh and vertex data supported by the visualization scene, which updates the outer surface of the original model in the scene.
The multiple sectioning planes of step S5 are created by repeating steps S2-S3; the sectioning operations are chained end to end, and repeating this step performs multiple successive sectionings.
The method thus comprises picking up the model with the force-feedback glove, drawing the sectioning plane with the VR handle, transforming the plane, sectioning the VTK field data, extracting the sectioned surface, and updating the model.
The implementation of the invention is divided into an upper layer and a bottom layer. The interaction module in the upper layer is not tied to a specific virtual-environment framework and can be used with Unity, Unreal Engine or other graphics frameworks supporting virtual reality; this implementation targets Unity. The bottom-layer sectioning processing module performs the main geometric computation with the VTK framework and is compiled into a dynamic link library with a standard C interface that exchanges the necessary data with the upper layer.
The invention has the beneficial effects that:
the method can execute the operation of sectioning the field data in a virtual reality scene in a very intuitive mode on one hand, and can perform seamless replacement on the field data models before and after sectioning on the other hand. Meanwhile, the use method can easily realize multiple times of sectioning of multiple sectioning planes.
The method improves the intuitiveness and naturalness of equipment virtual reality simulation and the display of finite element and fluid field data within it.
Drawings
FIG. 1 is a flow chart of the present method;
FIG. 2 shows the visual effect of adjusting the model attitude and the sectioning plane in step S23;
FIG. 3 is a schematic plane calculation diagram of step S23;
FIG. 4 shows the visual effect after a sectioning operation is performed.
Detailed Description
The invention is further illustrated by the following figures and examples.
As shown in fig. 1, an embodiment of the present invention is as follows:
the specific implementation is an office virtual reality scene, and a notebook model exists.
Or, for example, in a virtual reality simulation scenario of an industrial steam turbine, there is one turbine blade with a time-varying stress field. The blade needs to be sectioned to see the internal stress distribution thereof.
S1, in a visual scene of virtual reality, picking up the blade by a right hand by using a force feedback glove;
s2, in a visual scene of virtual reality, drawing a sectioning plane by using a handle of virtual reality equipment with a left hand, and transforming the sectioning plane to a coordinate space where a blade model is located;
S21, when the handle's operating button is pressed, the sectioning-plane drawing state is entered; the current frame is taken as the initial frame, and the position and attitude of the handle at that moment are recorded;
S22, while in the drawing state, the front end of the handle keeps projecting a straight indicator ray of fixed length in the virtual reality scene; the ray's position is updated every frame so that it remains fixed relative to the handle;
S23, as shown in FIGS. 2 and 3, while in the drawing state, the global-coordinate-system plane Q of the current frame is computed in the virtual reality scene from the handle's position and attitude in the initial frame and in the current frame, and visualized semi-transparently. By adjusting the attitudes of the blade and the handle, the sectioning plane can be placed at the desired position;
S231. Let P1 be the start point of the projected ray at the handle's initial pose, P2 the end point of the projected ray at the initial pose, P3 the start point of the projected ray at the current-frame pose, and P4 the end point of the projected ray at the current-frame pose.
S232. The three points P1, P3 and P4 determine the plane Q; the projection of P2 onto Q is denoted P2′.
S233. The triangle formed by P1, P2′ and P4 and the triangle formed by P2′, P4 and P3 share a common edge and both lie in Q; these two triangles are used to render the visualization of Q.
S24, when the handle's operating button is released, the last global-coordinate-system plane is accepted as the sectioning plane and the drawing state is exited.
S3, the blade model with its time-varying field data and the transformed sectioning-plane parameters are passed to the sectioning processing module, which sections the blade model's source data with the VTK data set clipping (vtkClipDataSet) filter.
The method comprises the following steps:
S31, the blade model with time-varying field data is read in, producing unstructured grid (vtkUnstructuredGrid) data for a number of time steps;
S32, according to the required number of keyframes, the unstructured grid data of the time steps are uniformly resampled to serve as the data of each keyframe;
S33, as shown in FIG. 4, every keyframe is sectioned with the data set clipping (vtkClipDataSet) filter, yielding the sectioned unstructured grid (vtkUnstructuredGrid) data of all keyframes.
S4, the outer-surface geometry and surface data of the sectioned blade model are extracted, passed back to the virtual reality visualization scene, and used to update the model.
Specifically, a geometry filter (vtkGeometryFilter) and a triangulation filter (vtkTriangleFilter) extract the geometry and field data of each sectioned part's unstructured grid (vtkUnstructuredGrid) data, producing polygon (vtkPolyData) data; this is converted into the triangle-mesh and vertex data supported by the visualization scene, and the outer surface of the original blade model in the scene is updated;
S5, further sectioning planes can be added in sequence, repeating steps S2-S3 on the previous sectioning result of the blade model so that the sectioned data are processed in series.

Claims (9)

1. A real-time sectioning interaction method for field data visualization in equipment virtual reality simulation, characterized by comprising the following steps:
S1, in the virtual reality visualization scene, one hand picks up the model to be sectioned with a force-feedback glove;
S2, in the virtual reality visualization scene, the other hand draws a sectioning plane with the handle of the virtual reality device, and the plane is transformed into the coordinate space of the model to be sectioned;
S3, the model to be sectioned and the transformed sectioning-plane parameters are passed to the sectioning processing module, which sections the model with a data set clipping filter;
S4, the outer-surface geometry and surface data of each sectioned model part are extracted, passed back to the virtual reality visualization scene, and used to update the model to be sectioned;
S5, further sectioning planes are added in sequence, repeating steps S1-S4 on each model part so that the source data are sectioned in series in real time.
2. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 1, characterized in that: in step S1, the model to be sectioned carries or references time-varying field data, meaning that every position of the model holds data, not only on the outer surface but also in the interior.
3. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 1, characterized in that: the model to be sectioned in step S2 has a collision volume and can interact with the force-feedback glove to convey a sense of touch and volume.
4. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 1, characterized in that: the handle in step S2 has absolute spatial positioning capability and an operable button.
5. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 1, characterized in that step S2 specifically comprises:
S21, when the handle's operating button is pressed, the sectioning-plane drawing state is entered; the current frame is taken as the initial frame, and the position and attitude of the handle at that moment are recorded;
S22, while in the drawing state, the front end of the handle keeps projecting a straight indicator ray of fixed length in the virtual reality scene; the ray's position is updated every frame so that it remains fixed relative to the handle;
S23, while in the drawing state, the global-coordinate-system plane Q of the current frame is computed from the handle's position and attitude in the initial frame and in the current frame, and is visualized semi-transparently;
S24, when the handle's operating button is released, the last global-coordinate-system plane is accepted as the sectioning plane and the drawing state is exited.
6. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 5, characterized in that the plane computation in step S23 is as follows:
S231, let P1 be the start point of the projected ray at the handle's initial pose, P2 the end point of the projected ray at the initial pose, P3 the start point of the projected ray at the current-frame pose, and P4 the end point of the projected ray at the current-frame pose;
S232, the three points P1, P3 and P4 determine the plane Q, and the projection of P2 onto Q is denoted P2′;
S233, the triangle formed by P1, P2′ and P4 and the triangle formed by P2′, P4 and P3 share a common edge and both lie in Q; these two triangles are used to render the visualization of Q.
7. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 1, characterized in that step S3 comprises:
S31, reading in the model to be sectioned and generating unstructured grid data for a number of time steps;
S32, according to the required number of keyframes, uniformly resampling the unstructured grid data of the time steps to serve as the data of each keyframe;
S33, sectioning every keyframe with the data set clipping filter to obtain the sectioned unstructured grid data of all keyframes.
8. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 1, characterized in that: in step S4, a geometry filter and a triangulation filter extract the geometry and field data of each sectioned part's unstructured grid data to produce polygon data, which is then converted into the triangle-mesh and vertex data supported by the visualization scene to update the outer surface of the original model in the scene.
9. The real-time sectioning interaction method for field data visualization in equipment virtual reality simulation according to claim 1, characterized in that: the multiple sectioning planes of step S5 are created by repeating steps S2-S3; the sectioning operations are chained end to end, and repeating this step performs multiple successive sectionings.
CN202010251736.1A 2020-04-01 2020-04-01 Real-time sectioning interaction method for field data visualization in virtual reality simulation Active CN111462344B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010251736.1A CN111462344B (en) 2020-04-01 2020-04-01 Real-time sectioning interaction method for field data visualization in virtual reality simulation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010251736.1A CN111462344B (en) 2020-04-01 2020-04-01 Real-time sectioning interaction method for field data visualization in virtual reality simulation

Publications (2)

Publication Number Publication Date
CN111462344A 2020-07-28
CN111462344B 2022-09-20

Family

ID=71680554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010251736.1A Active CN111462344B (en) 2020-04-01 2020-04-01 Real-time sectioning interaction method for field data visualization in virtual reality simulation

Country Status (1)

Country Link
CN (1) CN111462344B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112131626A (en) * 2020-09-16 2020-12-25 中国工程物理研究院计算机应用研究所 CAD model geometric feature interaction method and system for non-regional Engine
CN113189947A (en) * 2021-04-27 2021-07-30 浙江大学 Assembly production line digital twin real-time action simulation method based on PLC data

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101908228A (en) * 2010-07-16 2010-12-08 浙江大学 Digital building measuring and drawing method for acquiring building vertical face map
CN102013113A (en) * 2010-12-07 2011-04-13 中国地质大学(武汉) Method for dynamically sectioning multiple-object model based on template buffering
CN106456250A (en) * 2013-08-13 2017-02-22 波士顿科学国际有限公司 Computer visualization of anatomical items
US20180098813A1 (en) * 2016-10-07 2018-04-12 Simbionix Ltd. Method and system for rendering a medical simulation in an operating room in virtual reality or augmented reality environment
WO2019044111A1 (en) * 2017-08-31 2019-03-07 ソニー株式会社 Tactile presentation apparatus
CN109685914A (en) * 2018-11-06 2019-04-26 南方电网调峰调频发电有限公司 Cutting profile based on triangle grid model mends face algorithm automatically

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101908228A (en) * 2010-07-16 2010-12-08 浙江大学 Digital building measuring and drawing method for acquiring building vertical face map
CN102013113A (en) * 2010-12-07 2011-04-13 中国地质大学(武汉) Method for dynamically sectioning multiple-object model based on template buffering
CN106456250A (en) * 2013-08-13 2017-02-22 波士顿科学国际有限公司 Computer visualization of anatomical items
US20180098813A1 (en) * 2016-10-07 2018-04-12 Simbionix Ltd. Method and system for rendering a medical simulation in an operating room in virtual reality or augmented reality environment
WO2019044111A1 (en) * 2017-08-31 2019-03-07 ソニー株式会社 Tactile presentation apparatus
CN109685914A (en) * 2018-11-06 2019-04-26 南方电网调峰调频发电有限公司 Cutting profile based on triangle grid model mends face algorithm automatically

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张树有 et al.: "A Research Review on the Key Technologies of Intelligent Design for Customized Products", ENGINEERING *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112131626A (en) * 2020-09-16 2020-12-25 中国工程物理研究院计算机应用研究所 CAD model geometric feature interaction method and system for non-regional Engine
CN113189947A (en) * 2021-04-27 2021-07-30 浙江大学 Assembly production line digital twin real-time action simulation method based on PLC data

Also Published As

Publication number Publication date
CN111462344B (en) 2022-09-20

Similar Documents

Publication Publication Date Title
EP2363819A1 (en) Method for simulation of welding distortion
CN104484522B (en) A kind of construction method of robot simulation's drilling system based on reality scene
JP6787661B2 (en) Simulation of machining of workpieces
CN111462344B (en) Real-time sectioning interaction method for field data visualization in virtual reality simulation
CN104573230A (en) Virtual human work task simulation analyzing system and method for spacecraft repair
Kang et al. Instant 3D design concept generation and visualization by real-time hand gesture recognition
US20150088474A1 (en) Virtual simulation
Dave et al. Gesture interface for 3d cad modeling using kinect
Craig et al. William R. Sherman
Keller et al. Use of virtual reality for optimizing the life cycle of a fusion component
CN107515587B (en) System, method, device, equipment and storage medium for man-machine interaction operation simulation
Guo et al. Exploration of human-computer interaction system for product design in virtual reality environment based on computer-aided technology
Briggs et al. Integrated, synchronous multi-user design and analysis
Wang et al. A simulation system based on ogre and physx for flexible aircraft assembly
JP2017004143A (en) Analytical mesh generation device and method
US7782322B2 (en) Plane shape creation system, plane shape creation method and program recording medium
Gandotra et al. TOOLS AND TECHNIQUES FOR CONCEPTUAL DESIGN IN VIRTUAL REALITY ENVIRONMENT.
Wesche Three-dimensional visualization of fluid dynamics on the Responsive Workbench
Huang et al. An Augmented Reality Platform for Interactive Finite Element Analysis
Perles et al. Interactive virtual tools for manipulating NURBS surfaces in a virtual environment
Lian et al. Real-time finite element analysis with virtual hands: An introduction
Qi et al. 3D Shape Deformation Simulation Algorithm Based on Haptics
Lang et al. A survey of virtual assembly technology
Kim et al. Visualization of Simulation Results Based on Mobile Augmented Reality
Damyanova et al. Pre-and post-processing of data for simulation of gyrotrons by the GYREOSS software package

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant