CN107967054B - Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled - Google Patents

Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled

Info

Publication number
CN107967054B
CN107967054B (application CN201711136750.1A)
Authority
CN
China
Prior art keywords
enhanced
virtual
sand table
electronic sand
dimensional
Prior art date
Legal status
Active
Application number
CN201711136750.1A
Other languages
Chinese (zh)
Other versions
CN107967054A (en)
Inventor
徐丙立
荆涛
崔巅博
张飞
蔺敏
饶毅
邵小耀
李霖
Current Assignee
Academy of Armored Forces of PLA
Original Assignee
Academy of Armored Forces of PLA
Priority date
Filing date
Publication date
Application filed by Academy of Armored Forces of PLA
Priority to CN201711136750.1A
Publication of CN107967054A
Application granted
Publication of CN107967054B
Active (current legal status)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an immersive three-dimensional electronic sand table coupling virtual reality and augmented reality, which comprises: a peripheral/inner periphery positioning sensor group, a peripheral/inner periphery enhanced orientation identification tag group, a virtual/enhanced menu area, a virtual environment software control and drive module, an enhanced environment software control and drive module, a virtual/enhanced electronic sand table coupling three-dimensional visual expression area, a virtual reality helmet and an augmented reality helmet. Based on the real physical space, the virtual electronic sand table is effectively aligned in space with the enhanced electronic sand table; coupled visual expression and interaction of the virtual and enhanced electronic sand tables are provided; the construction requirements of immersive electronic sand tables of different shapes and sizes can be accommodated; and the electronic sand table can be operated by one person or by multiple persons in a coordinated manner. The invention can be used for situation expression, spatial visualization, virtual-real coupling and the like, and provides a more vivid, intuitive and immersive sand table platform for various application scenarios.

Description

Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled
Technical Field
The invention relates to the technical field of electronic sand tables, and in particular to an immersive three-dimensional electronic sand table in which a virtual electronic sand table and an enhanced electronic sand table are coupled with each other.
Background
Unlike a traditional physical sand table, an electronic sand table is a computer-generated digital sand table that presents, expresses and simulates the entities, phenomena and events in a space in the form of an electronic map or a virtualized scene. With the advantages of flexibility, variability, dynamic controllability, economy and efficiency, electronic sand tables are favored in many fields and have become a key support means for operational command, emergency response, disaster rescue, city planning and the like. After decades of development, electronic sand tables have matured and have been widely applied in fields such as national defense, economy, cities, traffic and commerce. With the development of computer technology, the electronic sand table is continuously evolving from two dimensions to three or even higher dimensions, and its visualization means is evolving from the computer screen toward stereoscopic projection, CAVE systems, virtual reality helmets and augmented reality helmets.
The two-dimensional electronic sand table, which appeared in the 1980s, is a digital map based on a geographic information system and realized through computer visualization. Based on a symbol system of points, lines and polygons, it compresses three-dimensional real-world information such as geographic elements, weaponry, force deployment and evolution trends onto a two-dimensional plane at a certain scale, and expresses the two-dimensional effect by superimposing map layers. Compared with the traditional physical sand table, the two-dimensional electronic sand table features multi-scale fusion, changeable data, simple operation and accurate positioning, and supports dynamic visual expression well. However, because the information is reduced in dimension, the topological relations between entities in three-dimensional space are easily misjudged, which impairs effective understanding of the information. In terms of visual expression, the two-dimensional electronic sand table is mainly displayed on a computer screen or by curtain projection. Such visual expression is convenient, efficient and clear, but it hardly gives users a sense of immersion.
The three-dimensional electronic sand table is gradually developed under the promotion of a virtual reality technology, and a space entity, an event, a relation, evolution and the like are displayed in a three-dimensional or even higher-dimensional visual expression mode by utilizing a computer graphic processing and displaying technology. Compared with a two-dimensional electronic sand table, the three-dimensional electronic sand table has obvious advantages. First, the three-dimensional electronic sand table is constructed in a three-dimensional mode, so that the integrity of information in the three-dimensional space expression is guaranteed, and the understandability of the information is greatly improved. Secondly, in a visual expression mode, the application of a stereoscopic projection display technology, a CAVE system and the like greatly improves the immersion feeling of users. And thirdly, the three-dimensional electronic sand table can be used for multi-view observation, and the spatial all-dimensional display and understanding of the sand table are facilitated.
The breakthrough of helmet-based visualization and interaction in virtual reality and augmented reality provides a good opportunity for constructing three-dimensional electronic sand tables and improving their immersive effect. At present, a variety of helmet devices oriented to virtual reality or augmented reality have appeared in succession, such as the HTC Vive, Oculus, 3Glasses, HoloLens, storm goggles and the like. However, the technology for constructing helmet-based virtual electronic sand tables and enhanced electronic sand tables is still at the research stage; some key technologies, methods and application modes are not yet clear, and innovative exploration and practice are needed. Construction technologies for the virtual electronic sand table or the enhanced electronic sand table have appeared separately, and interactive linkage between the two has also been explored, but the prior art has an obvious problem: the existing linkage between the virtual electronic sand table and the enhanced electronic sand table is realized by data transmission, that is, three-dimensional electronic sand tables are first constructed in their respective spaces, and data synchronization between the two spaces is then achieved through parameter transmission. As a result, the two cannot be coupled in a unified physical space and cannot support collaborative interaction of the virtual/enhanced sand tables in a unified space-time environment. To solve this problem, an immersive electronic sand table based on the coupling of virtual reality and augmented reality in a unified physical space needs to be constructed.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides an immersive three-dimensional electronic sand table, which is a platform on which a virtual electronic sand table and an enhanced electronic sand table are coupled and interact, enabling the two sand tables to be visualized and to be effectively coupled and interacted with in a unified real physical space.
The technical scheme provided by the invention is as follows:
An immersive three-dimensional electronic sand table with coupled virtual reality and augmented reality comprises: a peripheral positioning sensor group, an inner periphery positioning sensor group, a peripheral enhanced orientation identification tag group, an inner periphery enhanced orientation identification tag group, a virtual/enhanced menu area, a virtual environment software control and drive module, an enhanced environment software control and drive module, a virtual/enhanced electronic sand table coupling three-dimensional visual expression area, a virtual reality helmet, an augmented reality helmet and the like. The peripheral positioning sensor group and the inner periphery positioning sensor group each comprise a plurality of positioning sensors; each positioning sensor communicates wirelessly with the drive module under the control of the virtual environment software and provides its position coordinates and equipment identification code. The peripheral enhanced orientation identification tag group and the inner periphery enhanced orientation identification tag group each comprise a plurality of enhanced orientation identification tags; each enhanced orientation identification tag carries the center point and orientation information of the augmented reality tag. The virtual environment software control and drive module is mainly used for acquiring sensor information, performing rapid calculation, rendering and generating the virtual environment, and realizing human-machine interaction control; it mainly completes the functions of reading the inner and outer periphery positioning sensor information, converting real and virtual position coordinates, drawing the virtual console, drawing the virtual menu, visual expression and interactive control of the virtual electronic sand table, sending its own interaction information, and receiving the interaction information of other operators. The enhanced environment software control and drive module is mainly used for acquiring and calculating enhanced orientation tag information, generating the enhanced electronic sand table, and realizing human-machine interaction control; it mainly completes the functions of reading the inner periphery enhanced orientation tag information, converting real and enhanced position coordinates, drawing the enhanced menu, interactive control of the enhanced environment, drawing and visual expression of the enhanced sand table, sending the enhanced environment's own interaction information, and receiving the interaction information of other operators. The virtual/enhanced electronic sand table coupling three-dimensional visual expression area is the display area of the immersive three-dimensional electronic sand table; the edges of this area are determined by the inner periphery positioning sensor group and the inner periphery enhanced orientation identification tag group respectively, the visual expression within the area is highly coupled, and the space-time scale of the represented three-dimensional virtual space is consistent. The virtual reality helmet realizes visualization of the virtual electronic sand table through virtual reality technology, and a virtual electronic sand table operator wearing the virtual reality helmet operates and uses the virtual electronic sand table under the action of the virtual environment software control and drive module; the augmented reality helmet visualizes the enhanced electronic sand table, and an operator wearing the augmented reality helmet operates and uses the enhanced electronic sand table under the action of the enhanced environment software control and drive module.
In specific implementation, the immersive three-dimensional electronic sand table can be a platform of various shapes; the inner/outer periphery enhanced orientation identification tag groups correspond to the inner/outer periphery positioning sensor groups. The peripheral positioning sensor group comprises a plurality of positioning sensors placed at the corner vertices of the outer edge of the platform; they mainly provide the outer-edge control point positions for three-dimensional drawing of the virtual electronic sand table platform, thereby delimiting the peripheral boundary of the virtual electronic sand table platform. The inner periphery positioning sensor group comprises a plurality of positioning sensors placed at corner vertices a certain distance inward from the outer edge of the platform; they mainly provide drawing-edge control information for the multi-dimensional visual expression of the virtual electronic sand table, thereby determining its three-dimensional visual expression area. Each positioning sensor communicates wirelessly with the drive module under the control of the virtual reality electronic sand table software, and provides its position coordinates and equipment identification code.
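For concreteness, the following minimal Python sketch (illustrative only; the reading format, device identifiers and helper names are assumptions, not details disclosed in the patent) shows how the corner readings of the two sensor groups could be collected into the outer platform boundary and the inner visual expression boundary.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    device_id: str   # equipment identification code reported by the sensor
    position: tuple  # (x, y, z) coordinates in the real physical space

def boundary_polygon(readings, group_ids):
    """Collect the corner positions of one sensor group (outer or inner periphery),
    ordered by the configured device IDs, to form a closed boundary polygon."""
    by_id = {r.device_id: r.position for r in readings}
    return [by_id[i] for i in group_ids if i in by_id]

# Assumed device IDs for the four outer-edge and four inner-edge sensors (cf. 11-18 in Fig. 1).
OUTER_IDS = ["P11", "P12", "P13", "P14"]
INNER_IDS = ["P15", "P16", "P17", "P18"]

readings = [
    SensorReading("P11", (0.0, 0.0, 0.8)), SensorReading("P12", (2.0, 0.0, 0.8)),
    SensorReading("P13", (2.0, 1.2, 0.8)), SensorReading("P14", (0.0, 1.2, 0.8)),
    SensorReading("P15", (0.2, 0.2, 0.8)), SensorReading("P16", (1.8, 0.2, 0.8)),
    SensorReading("P17", (1.8, 1.0, 0.8)), SensorReading("P18", (0.2, 1.0, 0.8)),
]

platform_edge = boundary_polygon(readings, OUTER_IDS)  # outer boundary of the sand table platform
display_edge = boundary_polygon(readings, INNER_IDS)   # edge of the three-dimensional visual expression area
```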
The peripheral enhanced orientation identification tag group comprises a plurality of enhanced orientation identification tags placed at the corner vertices of the outer edge of the platform, each with its center point coinciding with the center point of the positioning sensor at that position; the group provides the outer-edge control point positions for three-dimensional drawing of the enhanced electronic sand table platform, thereby delimiting the peripheral boundary of the enhanced electronic sand table platform. The inner periphery enhanced orientation identification tag group comprises a plurality of enhanced orientation identification tags placed at vertices a certain distance inward from the outer edge of the platform, each with its center point coinciding with the center point of the positioning sensor at that vertex; the group mainly provides drawing-edge control information for the multi-dimensional visual expression of the enhanced electronic sand table, thereby determining its three-dimensional visual expression area. Each enhanced orientation identification tag is constructed as a two-dimensional code containing the center point and orientation information of the tag, and this information can be identified automatically by the tag recognition function of the augmented reality helmet.
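As an illustration of how the center point and in-plane orientation carried by a tag might be recovered once its corner points have been detected (the detection itself is assumed to be done by the helmet's tag recognition function), the sketch below derives both from the four corner points; the corner ordering convention and function name are assumptions for the example.

```python
import numpy as np

def tag_center_and_yaw(corners):
    """Given the four detected corner points of an orientation tag (assumed ordered
    clockwise from the top-left of its printed pattern), return the tag's center
    point and its in-plane orientation angle."""
    c = np.asarray(corners, dtype=float)
    center = c.mean(axis=0)
    top_edge = c[1] - c[0]                      # vector along the tag's top edge
    yaw = np.arctan2(top_edge[1], top_edge[0])  # orientation within the table plane
    return center, yaw

# Example: a 4 cm tag centered at (0.20, 0.20) m with zero rotation.
center, yaw = tag_center_and_yaw([(0.18, 0.22), (0.22, 0.22), (0.22, 0.18), (0.18, 0.18)])
```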
The virtual/enhanced menu area lies between the outer edge of the virtual/enhanced sand table and the edge of its three-dimensional visual expression area. The start and end positions of the virtual/enhanced menu area are calculated from the width between the two edges and the vertex positions of the outer edge. The content and presentation form of the virtual/enhanced menu can be determined according to actual needs.
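For a rectangular platform, this calculation can be illustrated by the following sketch, which derives the four menu strips lying between the outer platform edge and the inner visual expression edge; the axis-aligned rectangle representation and the dimensions used are assumptions for the example.

```python
def menu_strips(outer, inner):
    """Return the four menu strips between the outer platform edge and the inner
    visual expression edge of an axis-aligned rectangular platform.
    `outer` and `inner` are (xmin, ymin, xmax, ymax) rectangles in platform coordinates."""
    ox0, oy0, ox1, oy1 = outer
    ix0, iy0, ix1, iy1 = inner
    return {
        "bottom": (ox0, oy0, ox1, iy0),  # strip along the lower outer edge
        "top":    (ox0, iy1, ox1, oy1),
        "left":   (ox0, iy0, ix0, iy1),
        "right":  (ix1, iy0, ox1, iy1),
    }

# Example: a 2.0 m x 1.2 m platform with a 0.2 m wide menu band on each side.
strips = menu_strips((0.0, 0.0, 2.0, 1.2), (0.2, 0.2, 1.8, 1.0))
```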
The virtual environment software control and drive module is mainly used for acquiring sensor information, performing rapid calculation, rendering and generating the virtual environment, and realizing human-machine interaction control. It mainly comprises the functions of reading the inner and outer periphery positioning sensor information, converting real and virtual position coordinates, drawing the virtual console, drawing the virtual menu, visual expression and interactive control of the virtual electronic sand table, sending its own interaction information, and receiving the interaction information of other operators. The inner and outer periphery positioning sensor information reading mainly receives the raw positioning information of the inner and outer periphery sensors. The real and virtual position coordinate conversion converts the raw positioning information of the sensors into coordinate positions in the unified virtual environment. The virtual console drawing constructs the virtual electronic sand table platform on the basis of the sensor positions and identification codes. The virtual menu drawing constructs, at an appropriate position, a virtual operation menu suitable for situation interaction and control. The visual expression of the virtual electronic sand table completes, within the inner positioning area, the three-dimensional dynamic visual expression and simulation of the situation. The interactive control mainly completes the interactive control function between the operator and the situation, driving the situation with the operator's operation instructions and driving the dynamic feedback visual expression of situation information. The sending of its own interaction information collects and packages information such as the operator's own position, operations and actions and then sends it to other operators over the network. The receiving of other operators' interaction information is responsible for receiving the information generated by the operation and interaction of other operators in the same network environment and passing it to the visual expression function module to complete the virtual expression of those operators.
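By way of illustration of the real-to-virtual position coordinate conversion named above, the following minimal Python sketch applies a rotation about the vertical axis plus a translation; the axis choice, parameter values and function name are assumptions for the example, not details from the patent, which only names coordinate rotation and translation.

```python
import numpy as np

def real_to_virtual(p_real, rotation_deg, translation):
    """Convert a point from real physical coordinates into the unified virtual
    environment frame using a rotation about the vertical axis and a translation."""
    a = np.radians(rotation_deg)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    return R @ np.asarray(p_real, dtype=float) + np.asarray(translation, dtype=float)

# Example: a sensor reading at (1.5, 0.7, 0.8) m, with the virtual frame rotated 90 degrees
# and shifted down to table height.
p_virtual = real_to_virtual((1.5, 0.7, 0.8), rotation_deg=90.0, translation=(0.0, 0.0, -0.8))
```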
The enhanced environment software control and drive module is mainly used for acquiring and calculating enhanced orientation tag information, generating the enhanced electronic sand table, and realizing human-machine interaction control. It mainly comprises the functions of reading the inner and outer periphery enhanced orientation tag information, converting real and enhanced position coordinates, drawing the enhanced menu, interactive control of the enhanced environment, drawing and visual expression of the enhanced sand table, sending the enhanced environment's own interaction information, and receiving the interaction information of other operators. The enhanced orientation tag information acquisition and calculation completes the scanning and recognition of the enhanced orientation tags and obtains the position and orientation information they carry. The real and enhanced position coordinate conversion converts the acquired position and orientation information into coordinate positions in the unified enhanced environment. The enhanced menu drawing constructs, at an appropriate position, an enhanced operation menu suitable for situation interaction and control. The enhanced sand table drawing and visual expression adopts an augmented reality expression method to complete the three-dimensional dynamic visual expression and simulation of the situation within the area determined by the inner periphery tags. The interactive control module mainly completes the interactive control function between the operator and the situation, driving the situation with the operator's operation instructions and driving the dynamic feedback visual expression of situation information. The sending of its own interaction information collects and packages information such as the operator's own position, operations and actions and then sends it to other operators over the network. The receiving of other operators' interaction information is responsible for receiving the information generated by the operation and interaction of other operators in the same network environment and passing it to the visual expression function module to complete the virtual expression of those operators.
The virtual/enhanced electronic sand table coupling three-dimensional visual expression area is the common display area of the electronic sand tables in the virtual and enhanced modes. The edges of this area are determined by the inner periphery positioning sensor group and the inner periphery enhanced orientation identification tag group respectively. In the real environment, this area may contain no device or object at all, or it may be aided by certain position markers or a physical plane. Because both are based on the same unified real physical space, the virtual electronic sand table and the enhanced electronic sand table are completely coupled in space.
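As a small illustration of this spatial coupling requirement, the sketch below checks that the virtual and enhanced display edges coincide within a tolerance when expressed in the shared real-world frame; the tolerance value and function name are assumed for the example.

```python
import numpy as np

def regions_coupled(virtual_edge, enhanced_edge, tol=0.02):
    """Check that the virtual and enhanced display edges (lists of corresponding corner
    points in the shared real-world frame) coincide within `tol` meters, i.e. the two
    sand tables are spatially coupled. The 2 cm tolerance is an assumed value."""
    v = np.asarray(virtual_edge, dtype=float)
    e = np.asarray(enhanced_edge, dtype=float)
    return v.shape == e.shape and bool(np.all(np.linalg.norm(v - e, axis=1) <= tol))

# Example: two nearly identical rectangular display edges.
ok = regions_coupled([(0.2, 0.2, 0.8), (1.8, 0.2, 0.8), (1.8, 1.0, 0.8), (0.2, 1.0, 0.8)],
                     [(0.21, 0.2, 0.8), (1.8, 0.21, 0.8), (1.79, 1.0, 0.8), (0.2, 1.0, 0.8)])
```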
The virtual reality helmet mainly adopts virtual reality technology to realize visualization of the virtual electronic sand table. The spatial positioning of the virtual reality helmet is matched with the positioning sensors, so that the virtual sand table platform seen in the virtual reality helmet is consistent with the positioning sensors in the real environment. The augmented reality helmet has similar functions to the virtual reality helmet, but it visualizes only the enhanced electronic sand table.
A virtual electronic sand table operator wears a virtual reality helmet and operates and uses the virtual electronic sand table under the action of the virtual environment software control and drive module. Similarly, an enhanced electronic sand table operator wears an augmented reality helmet and operates and uses the enhanced electronic sand table under the action of the enhanced environment software control and drive module.
When the platform on which the virtual electronic sand table and the enhanced electronic sand table are coupled and interact is in operation, at least the following working modes are available:
(1) Operation by a single virtual electronic sand table operator: only one operating user wears a virtual reality helmet to operate and use the virtual electronic sand table.
(2) Operation by multiple virtual electronic sand table operators: several operating users participate in the platform at the same time, each wearing a virtual reality helmet; they cooperate on the same virtual electronic sand table, and the operation actions and results of every operator can be perceived and visually expressed to the others.
(3) Operation by a single enhanced electronic sand table operator: only one operator wears an augmented reality helmet to operate and use the enhanced electronic sand table.
(4) Operation by multiple enhanced electronic sand table operators: several operators participate in the platform at the same time, each wearing an augmented reality helmet; they cooperate on the same enhanced electronic sand table, and the operation results of every operator can be perceived and visually expressed to the others.
(5) Simultaneous operation by one virtual electronic sand table operator and multiple enhanced electronic sand table operators: several operating users participate in the platform at the same time, one participant wearing a virtual reality helmet and the others (more than one person) wearing augmented reality helmets. The operator wearing the virtual reality helmet sees the virtual electronic sand table, while the operators wearing augmented reality helmets see the enhanced electronic sand table. The visualization regions of the virtual reality sand table and the augmented reality sand table are coupled within the virtual/enhanced electronic sand table visual expression area; the boundaries of the electronic sand tables in the two modes are consistent, zooming in and out is synchronous, and operations are synchronous. The operation results in the two modes can be mutually perceived and visually expressed.
(6) Simultaneous operation by multiple virtual electronic sand table operators and one enhanced electronic sand table operator: several operating users participate in the platform at the same time, one participant wearing an augmented reality helmet and the others (more than one person) wearing virtual reality helmets. The operators wearing virtual reality helmets see the virtual electronic sand table, and the operator wearing the augmented reality helmet sees the enhanced electronic sand table. The visualization regions of the two sand tables are coupled within the virtual/enhanced electronic sand table visual expression area; the boundaries in the two modes are consistent, zooming in and out is synchronous, and operations are synchronous. The operation results in the two modes can be mutually perceived and visually expressed.
(7) Simultaneous operation by multiple virtual electronic sand table operators and multiple enhanced electronic sand table operators: several operating users participate in the platform at the same time, some participants (more than one person) wearing augmented reality helmets and the others (more than one person) wearing virtual reality helmets. The operators wearing virtual reality helmets see the virtual electronic sand table, while the operators wearing augmented reality helmets see the enhanced electronic sand table. The visualization regions of the two sand tables are coupled within the virtual/enhanced electronic sand table visual expression area; the boundaries in the two modes are consistent, zooming in and out is synchronous, and operations are synchronous. The operation results in the two modes can be mutually perceived and visually expressed.
Compared with the prior art, the invention has the beneficial effects that:
The invention provides an immersive three-dimensional electronic sand table platform on which a virtual electronic sand table and an enhanced electronic sand table are coupled and interact, enabling the two to be visualized and to be effectively fused and interacted with in a unified real physical space. The invention has at least the following new features and technical advantages:
firstly, based on the real physical space, positioning sensors and enhanced orientation identification tags are used to effectively align the two spaces of the virtual electronic sand table and the enhanced electronic sand table, so that spatial consistency is achieved between them;
secondly, under this spatial consistency, the virtual electronic sand table and the enhanced electronic sand table acquire coupled visual expression and interaction capability;
thirdly, the control mode in which positioning sensors and enhanced orientation identification tags serve as boundaries can adapt to the construction requirements of electronic sand tables of different shapes and sizes;
fourthly, the sand table can be used collaboratively and interactively by multiple persons in the virtual and enhanced modes respectively, and can also be used interactively by a single person in either the virtual or the enhanced mode.
The electronic sand table can be used for situation expression, spatial visualization, virtual-real coupling and the like, can provide intuitive visualization and interaction modes for battle command, city planning, emergency response, disaster rescue and the like, and improves users' cognition and control of the objects under study.
Drawings
FIG. 1 is a general structure diagram of an electronic sand table interactive platform with virtual and enhanced coupling;
FIG. 2 is a schematic diagram of the relationship between a positioning sensor and an enhanced orientation identification tag;
FIG. 3 is a schematic diagram of a virtual electronic sand table and an enhanced electronic sand table coupling and interaction platform control information link;
in fig. 1 to 3:
11-a first group of peripheral positioning sensors and enhanced orientation identification tags;
12-a second group of peripheral positioning sensors and enhanced orientation identification tags;
13-a third group of peripheral positioning sensors and enhanced orientation identification tags;
14-a fourth group of peripheral positioning sensors and enhanced orientation identification tags;
15-a first group of inner periphery positioning sensors and enhanced orientation identification tags;
16-a second group of inner periphery positioning sensors and enhanced orientation identification tags;
17-a third group of inner periphery positioning sensors and enhanced orientation identification tags;
18-a fourth group of inner periphery positioning sensors and enhanced orientation identification tags;
11-1-protection and installation fittings;
11-2-enhanced orientation identification tag;
11-3-a positioning sensor;
21, 22, 23, 24-virtual/enhanced menu areas;
30-virtual environment software control and drive module;
31-inner and outer periphery positioning sensor information reading;
32-real/virtual position coordinate conversion;
33-drawing a virtual console;
34-drawing virtual menu;
35-visual expression of the virtual electronic sand table;
36-virtual environment interactive control;
37-virtual environment self-interaction information sending;
38-other operator interaction information receiving;
40-enhancing an environment software control and drive module;
41-inner and outer periphery enhanced orientation tag information reading;
42-real/augmented position coordinate transformation;
43-enhanced menu drawing;
44-enhancing sand table drawing and visual expression;
45-enhanced environmental interactive control;
46-enhanced environment self-interaction information sending;
47-other operator interaction information reception;
50-virtual reality helmets;
60-augmented reality helmets;
70-virtual/enhanced electronic sand table coupling three-dimensional visual expression area.
Detailed Description
The invention will be further described below by way of implementation examples with reference to the accompanying drawings, without in any way limiting the scope of the invention. Furthermore, the rectangular electronic sand table presented below is for illustration only and does not limit the claims of the present invention, which cover platforms of various shapes.
The invention provides an immersive three-dimensional electronic sand table platform on which a virtual electronic sand table and an enhanced electronic sand table are coupled and interact, enabling the two to be visualized and effectively fused and interacted with within a unified control platform and display range (interaction space).
Fig. 1 is a schematic diagram of the general structure of the virtual and enhanced coupled immersive three-dimensional electronic sand table interaction platform provided by the present invention. In this embodiment, the platform comprises: a peripheral positioning sensor group and peripheral enhanced orientation identification tag group 11-14, an inner periphery positioning sensor group and inner periphery enhanced orientation identification tag group 15-18, virtual/enhanced menu areas 21-24, a virtual environment software control and drive module 30, an enhanced environment software control and drive module 40, a virtual reality helmet 50, an augmented reality helmet 60, a virtual/enhanced electronic sand table coupling three-dimensional visual expression area 70, a virtual electronic sand table operator, an enhanced electronic sand table operator and the like.
Each inner/outer periphery positioning sensor is coupled with an inner/outer periphery enhanced orientation identification tag; each group includes one positioning sensor and one enhanced orientation identification tag. For ease of installation and to protect the sensor and tag, a protection and installation fitting is arranged beneath each group. Fig. 2 is a schematic diagram of the relationship between a positioning sensor and an enhanced orientation identification tag. The structural relationship of the positioning sensor, the enhanced orientation identification tag and the protection and installation fitting may be any of those shown in Fig. 2: the positioning sensor may be placed above or below the enhanced orientation identification tag. In either case, however, the front pattern of the enhanced orientation identification tag must not be entirely obscured. The positioning sensor can be an active sensor such as a Bluetooth or WiFi emission source, or a passive, computationally located sensor such as an optically identified marker ball. The enhanced orientation identification tag can be a two-dimensional code or any pattern defined in advance.
In Fig. 1, the area enclosed by the peripheral positioning sensor group and peripheral enhanced orientation identification tag group, and the area enclosed by the inner periphery positioning sensor group and inner periphery enhanced orientation identification tag group, can be constructed in several ways, for example the following three schemes: (1) fully enclose the area with wood, steel or the like to form an actual physical table, i.e. a physical table top; (2) enclose part of the area with wood, steel or the like to form a partially physical table top; (3) place the sensor groups and enhanced orientation identification tag groups only at the eight vertices, with no physical object elsewhere, forming a virtual table top.
Fig. 3 is a schematic diagram of the control information links of the platform on which the virtual electronic sand table and the enhanced electronic sand table are coupled and interact. As shown in Fig. 3, the virtual environment software control and drive module 30 includes functional sub-modules such as inner and outer periphery positioning sensor information reading 31, real/virtual position coordinate conversion 32, virtual console drawing 33, virtual menu drawing 34, virtual electronic sand table visual expression 35, virtual environment interaction control 36, virtual environment self-interaction information sending 37, and other operator interaction information receiving 38. The functions of these sub-modules can be realized by a computer, a microcontroller or the like. The inner and outer periphery positioning sensor information reading 31 interprets the positions through data receiving, identification and similar procedures with the support of the information receiving equipment. The real/virtual position coordinate conversion 32 is realized by background software, mainly through algorithms such as coordinate rotation and translation. The virtual console drawing 33 is built with a three-dimensional graphics engine such as OpenGL, DirectX, Unity3D or Unreal Engine; the position of the virtual console is kept consistent with the positions determined by the inner and outer periphery positioning sensors. The virtual menu drawing 34 is constructed with UI interface technology, and the interface style may be either two-dimensional or three-dimensional. The virtual electronic sand table visual expression 35 is built with the same three-dimensional engine as the virtual console drawing 33; the visual expression may be planar or three-dimensional and dynamic. The virtual environment interaction control 36 may be implemented in a variety of ways, such as a virtual mouse, capture of interactive actions, or UI triggers. The virtual environment self-interaction information sending 37 is a background communication program and can be realized with sockets; the transmission link may be a wired or wireless network. The other operator interaction information receiving 38 is likewise a background communication program, also realizable with sockets; the receiving link may also be a wired or wireless network.
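As an illustration of the socket-based exchange of self-interaction information described above, the following Python sketch packages and sends operator messages over UDP and receives messages from other operators; the peer addresses, port, message fields and function names are all assumptions for the example, not details disclosed in the patent.

```python
import json
import socket

# Hypothetical peer addresses and port; the patent only states that interaction
# information is packaged and exchanged over a wired or wireless network via sockets.
PEERS = [("192.168.1.12", 9050), ("192.168.1.13", 9050)]
PORT = 9050

def send_self_interaction(operator_id, position, action):
    """Package this operator's position, operation and action, then send it to the other operators."""
    payload = json.dumps({"operator": operator_id, "position": position, "action": action}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for peer in PEERS:
            sock.sendto(payload, peer)

def receive_other_interactions(handle, timeout=1.0):
    """Receive pending interaction messages from other operators and pass each one to the
    visual expression module via the `handle` callback."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", PORT))
        sock.settimeout(timeout)
        try:
            while True:
                data, _addr = sock.recvfrom(4096)
                handle(json.loads(data.decode("utf-8")))
        except socket.timeout:
            pass  # no more pending messages within the timeout window

# Example: broadcast a zoom operation, then drain any messages from the other operators.
send_self_interaction("vr_operator_1", position=(1.2, 0.4, 1.6), action="zoom_in")
receive_other_interactions(lambda msg: print("received:", msg))
```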
The enhanced environment software control and drive module 40 comprises functional sub-modules of inner and outer periphery enhanced orientation tag information reading 41, real/augmented position coordinate conversion 42, enhanced menu drawing 43, enhanced sand table drawing and visual expression 44, enhanced environment interaction control 45, enhanced environment self-interaction information sending 46, other operator interaction information receiving 47, and the like. These sub-module functions may be implemented by a computer, by the augmented reality helmet itself, or the like. The inner and outer periphery enhanced orientation tag information reading 41 uses the tag scanning device and recognition software attached to the augmented reality helmet to scan the tags and extract information such as their position, orientation and identification code. The real/augmented position coordinate conversion 42 is mainly implemented by algorithms such as coordinate rotation and translation, and may also use the simultaneous localization and mapping (SLAM) function attached to the augmented reality helmet. The enhanced menu drawing 43 is constructed with UI interface technology, and the interface style may be either two-dimensional or three-dimensional. The enhanced sand table drawing and visual expression 44 is constructed with a three-dimensional graphics engine oriented to augmented reality, such as Unity3D. The enhanced environment interaction control 45 may be implemented in a variety of ways, such as gaze, voice, gestures or UI triggers. The enhanced environment self-interaction information sending 46 is a background communication program that can be realized with the communication program of the augmented reality helmet, and the sending link can be a wireless network. The other operator interaction information receiving 47 is likewise a background communication program, also realizable with the communication program of the augmented reality helmet, and the receiving link can also be a wireless network.
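The rotation-and-translation variant of the real/augmented coordinate conversion can be sketched as follows: from the detected pose of a single orientation tag whose position in sand-table coordinates is known, a rigid transform anchoring the enhanced sand table in the helmet frame is computed. The single-tag, yaw-only model, the variable names and the numeric values are illustrative assumptions; tag detection itself is assumed to be handled by the helmet.

```python
import numpy as np

def anchor_transform(tag_center_cam, tag_yaw_cam, tag_center_table, tag_yaw_table):
    """Compute the rotation (about the vertical axis) and translation that map
    sand-table coordinates into the augmented (helmet/world) frame from one detected tag."""
    dyaw = tag_yaw_cam - tag_yaw_table
    c, s = np.cos(dyaw), np.sin(dyaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    t = np.asarray(tag_center_cam, dtype=float) - R @ np.asarray(tag_center_table, dtype=float)
    return R, t

def table_to_augmented(p_table, R, t):
    """Map a point expressed in sand-table coordinates into the augmented frame."""
    return R @ np.asarray(p_table, dtype=float) + t

# Example: inner-periphery tag 15 sits at (0.2, 0.2, 0.0) in table coordinates and is
# detected at (1.1, -0.4, 0.0) with a 30-degree yaw offset in the helmet frame.
R, t = anchor_transform((1.1, -0.4, 0.0), np.radians(30), (0.2, 0.2, 0.0), 0.0)
corner_aug = table_to_augmented((1.8, 1.0, 0.0), R, t)
```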
In order to make the technical problems solved, the technical solutions and the advantageous effects of the present invention clearer, the use of the invention is explained below, taking as an example multiple persons cooperating while wearing virtual reality helmets and augmented reality helmets respectively. It should be understood that the specific use described here is merely illustrative of the invention and is not intended to be limiting.
The first step: the modules integrating the positioning sensor group and the enhanced orientation identification tag group are arranged at the 8 corner points of the table, or placed directly in an empty area, to form the two rectangular boundaries shown in Fig. 1.
The second step: some participants wear virtual reality helmets and adjust them for comfort; each such helmet is linked with a computer on which the virtual environment software control and drive module is installed.
The third step: the other participants wear augmented reality helmets and adjust them for comfort.
The fourth step: turn on the computer on which the virtual environment software control and drive module is installed, start the module program, read the position information of the positioning sensors, draw the virtual electronic sand table platform, draw the virtual menu in the virtual menu area, and draw the virtual electronic sand table in the visual expression area.
The fifth step: turn on the augmented reality helmet, scan the enhanced orientation tags, draw the enhanced electronic sand table platform, draw the enhanced menu in the virtual/enhanced menu area, and draw the enhanced electronic sand table in the visual expression area.
The sixth step: a virtual electronic sand table operator can enlarge, reduce, rotate, annotate and edit the virtual electronic sand table through the virtual menu, a virtual mouse, gestures, voice and the like, and can see the virtual avatars of other operators and the visual expression of their operation results. Meanwhile, an enhanced electronic sand table operator can enlarge, reduce, rotate, annotate and edit the enhanced electronic sand table through the enhanced menu, gestures, gaze, voice and the like, and can see the visual expression of the operation results of the other participants.
The seventh step: after the operation is finished, the virtual electronic sand table operator exits the virtual environment, takes off the helmet and shuts down the computer; at the same time, the enhanced electronic sand table operator exits the enhanced environment and powers off the augmented reality helmet.
It is noted that the disclosed embodiments are intended to aid in further understanding of the invention, but those skilled in the art will appreciate that: various substitutions and modifications are possible without departing from the spirit and scope of the invention and appended claims. Therefore, the invention should not be limited to the embodiments disclosed, but the scope of the invention is defined by the appended claims.

Claims (5)

1. An immersive three-dimensional electronic sand table with coupled virtual reality and augmented reality, wherein the immersive three-dimensional electronic sand table is an arbitrarily-shaped platform, comprising: a peripheral positioning sensor group, an inner periphery positioning sensor group, a peripheral enhanced orientation identification tag group, an inner periphery enhanced orientation identification tag group, a virtual/enhanced menu area, a virtual environment software control and drive module, an enhanced environment software control and drive module, a virtual/enhanced electronic sand table coupling three-dimensional visual expression area, a virtual reality helmet and an augmented reality helmet; the inner and outer periphery enhanced orientation identification tag groups correspond to the inner and outer periphery positioning sensor groups;
wherein:
the peripheral positioning sensor group and the inner periphery positioning sensor group each comprise a plurality of positioning sensors; each positioning sensor communicates wirelessly with the drive module under the control of the virtual environment software and provides its position coordinates and equipment identification code; the positioning sensors of the peripheral positioning sensor group are respectively placed at the corner vertices of the outermost periphery of the platform and are used to determine the outer edge of the virtual electronic sand table platform and to realize the superposition of the virtual electronic sand table and the physical table top; the positioning sensors of the inner periphery positioning sensor group are respectively placed at the inner periphery corner vertices and are used to determine the edge of the visual area of the virtual electronic sand table; the peripheral enhanced orientation identification tag group and the inner periphery enhanced orientation identification tag group each comprise a plurality of enhanced orientation identification tags; each enhanced orientation identification tag carries the center point and orientation information of the augmented reality tag;
the virtual environment software control and drive module is used for acquiring sensor information, performing rapid calculation, rendering and generating the virtual environment, and realizing human-machine interaction control, and completes the functions of reading the inner and outer periphery positioning sensor information, converting real and virtual position coordinates, drawing the virtual console, drawing the virtual menu, visual expression and interactive control of the virtual electronic sand table, sending its own interaction information, and receiving the interaction information of other operators;
the enhanced environment software control and drive module is used for acquiring and calculating enhanced orientation tag information, generating the enhanced electronic sand table, and realizing human-machine interaction control, and completes the functions of reading the inner periphery enhanced orientation tag information, converting real and enhanced position coordinates, drawing the enhanced menu, interactive control of the enhanced environment, drawing and visual expression of the enhanced sand table, sending the enhanced environment's own interaction information, and receiving the interaction information of other operators;
the virtual/enhanced electronic sand table coupling three-dimensional visual expression area is a display area of the immersive three-dimensional electronic sand table, and the edges of the area are respectively determined by an inner periphery positioning sensor group and an inner periphery enhanced orientation identification tag group; the visual expression in the region has high coupling, and the space-time scale of the represented three-dimensional virtual space is consistent;
the virtual reality helmet realizes visualization of the virtual electronic sand table through virtual reality technology, and a virtual electronic sand table operator wearing the virtual reality helmet operates and uses the virtual electronic sand table under the action of the virtual environment software control and drive module; the augmented reality helmet is used for visualizing the enhanced electronic sand table; an enhanced electronic sand table operator wearing the augmented reality helmet operates and uses the enhanced electronic sand table under the action of the enhanced environment software control and drive module.
2. The immersive three-dimensional electronic sand table as claimed in claim 1, wherein the peripheral positioning sensor group, the inner positioning sensor group, the peripheral enhanced orientation identification tag group and the inner enhanced orientation identification tag group are embedded on a real object table top of the electronic sand table, or are placed in a bracket manner without being supported by the table top.
3. The immersive three-dimensional electronic sand table as claimed in claim 1, wherein the virtual/enhanced menu region is disposed between an inner peripheral edge and an outer peripheral edge formed by the positioning sensor group or the enhanced orientation identification tag group for providing a human-machine interaction interface for the operation of the virtual electronic sand table and the enhanced electronic sand table.
4. The immersive three-dimensional electronic sand table as claimed in claim 1, wherein the coupling three-dimensional visual expression area of the virtual/enhanced electronic sand table is defined by an inner periphery positioning sensor group and/or an inner periphery enhanced orientation identification tag group, and is used for three-dimensional visual expression of the virtual electronic sand table and the enhanced electronic sand table.
5. The immersive three-dimensional electronic sand table of claim 1, wherein said immersive three-dimensional electronic sand table is used in a plurality of modes of operation, including: multiple persons respectively using the virtual or enhanced modes for cooperative interactive use of the sand table, and a single person using the virtual or enhanced mode for interactive use.
CN201711136750.1A 2017-11-16 2017-11-16 Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled Active CN107967054B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711136750.1A CN107967054B (en) 2017-11-16 2017-11-16 Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711136750.1A CN107967054B (en) 2017-11-16 2017-11-16 Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled

Publications (2)

Publication Number Publication Date
CN107967054A CN107967054A (en) 2018-04-27
CN107967054B (en) 2020-11-27

Family

ID=62001132

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711136750.1A Active CN107967054B (en) 2017-11-16 2017-11-16 Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled

Country Status (1)

Country Link
CN (1) CN107967054B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111104470A (en) * 2019-11-19 2020-05-05 青岛海信网络科技股份有限公司 Method and system for linkage of electronic sand table and emergency platform
CN111199561B (en) * 2020-01-14 2021-05-18 上海曼恒数字技术股份有限公司 Multi-person cooperative positioning method and system for virtual reality equipment
WO2022047768A1 (en) * 2020-09-07 2022-03-10 桂林旅游学院 Virtual experience system and method combining hololens and cave
CN114531582B (en) * 2020-11-02 2023-06-13 华为技术有限公司 Augmented reality function control method and electronic equipment
CN117351797A (en) * 2023-10-23 2024-01-05 江苏安承科技有限公司 Position real-time linkage system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102884490A (en) * 2010-03-05 2013-01-16 索尼电脑娱乐美国公司 Maintaining multiple views on a shared stable virtual space
CN103460256A (en) * 2011-03-29 2013-12-18 高通股份有限公司 Anchoring virtual images to real world surfaces in augmented reality systems
CN105027190A (en) * 2013-01-03 2015-11-04 美达公司 Extramissive spatial imaging digital eye glass for virtual or augmediated vision
CN105425955A (en) * 2015-11-06 2016-03-23 中国矿业大学 Multi-user immersive full-interactive virtual reality engineering training system
CN105679169A (en) * 2015-12-31 2016-06-15 中国神华能源股份有限公司 Railway electronic sand table system
CN106383578A (en) * 2016-09-13 2017-02-08 网易(杭州)网络有限公司 Virtual reality system, and virtual reality interaction apparatus and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150123966A1 (en) * 2013-10-03 2015-05-07 Compedia - Software And Hardware Development Limited Interactive augmented virtual reality and perceptual computing platform

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102884490A (en) * 2010-03-05 2013-01-16 索尼电脑娱乐美国公司 Maintaining multiple views on a shared stable virtual space
CN103460256A (en) * 2011-03-29 2013-12-18 高通股份有限公司 Anchoring virtual images to real world surfaces in augmented reality systems
CN105027190A (en) * 2013-01-03 2015-11-04 美达公司 Extramissive spatial imaging digital eye glass for virtual or augmediated vision
CN105425955A (en) * 2015-11-06 2016-03-23 中国矿业大学 Multi-user immersive full-interactive virtual reality engineering training system
CN105679169A (en) * 2015-12-31 2016-06-15 中国神华能源股份有限公司 Railway electronic sand table system
CN106383578A (en) * 2016-09-13 2017-02-08 网易(杭州)网络有限公司 Virtual reality system, and virtual reality interaction apparatus and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Augmented Reality Electronic Sand Table and Its Key Technologies; Tan Shuren et al.; Journal of System Simulation; 2007-10-31; pp. 4727-4730 *

Also Published As

Publication number Publication date
CN107967054A (en) 2018-04-27

Similar Documents

Publication Publication Date Title
CN107967054B (en) Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled
CN103793060B (en) A kind of user interactive system and method
CN105981076B (en) Synthesize the construction of augmented reality environment
KR100953931B1 (en) System for constructing mixed reality and Method thereof
Portalés et al. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments
US11244511B2 (en) Augmented reality method, system and terminal device of displaying and controlling virtual content via interaction device
CN102509349B (en) Fitting method based on mobile terminal, fitting device based on mobile terminal and mobile terminal
CN104915979A (en) System capable of realizing immersive virtual reality across mobile platforms
CN104035760A (en) System capable of realizing immersive virtual reality over mobile platforms
CN109859538A (en) A kind of key equipment training system and method based on mixed reality
CN107850936A (en) For the method and system for the virtual display for providing physical environment
CN110163942B (en) Image data processing method and device
CN104050859A (en) Interactive digital stereoscopic sand table system
CN106791778A (en) A kind of interior decoration design system based on AR virtual reality technologies
CN108765576B (en) OsgEarth-based VIVE virtual earth roaming browsing method
CN106294918A (en) A kind of method for designing of virtual transparence office system
CN103064514A (en) Method for achieving space menu in immersive virtual reality system
CN106817568A (en) A kind of augmented reality display methods and device
CN106846237A (en) A kind of enhancing implementation method based on Unity3D
CN111862333A (en) Content processing method and device based on augmented reality, terminal equipment and storage medium
Thomas et al. Spatial augmented reality—A tool for 3D data visualization
CN106527719A (en) House for sale investigation system based on AR (Augmented Reality) technology and real-time three-dimensional modeling
CN104680532A (en) Object labeling method and device
CN109658518A (en) A kind of three-dimensional object display methods, storage medium and computer based on augmented reality
CN106204746A (en) A kind of augmented reality system realizing 3D model live paint

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant