CN113630593A - Multi-mode high-precision full-space hybrid positioning system

Multi-mode high-precision full-space hybrid positioning system

Info

Publication number
CN113630593A
CN113630593A
Authority
CN
China
Prior art keywords
equipment
environment
positioning
terminal equipment
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110941281.0A
Other languages
Chinese (zh)
Inventor
毛威
姜孝吾
朱锦腾
祝可欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Unknown Digital Information Technology Co ltd
Original Assignee
Ningbo Unknown Digital Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Unknown Digital Information Technology Co ltd filed Critical Ningbo Unknown Digital Information Technology Co ltd
Priority to CN202110941281.0A
Publication of CN113630593A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • H04L67/1001Protocols in which an application is distributed across nodes in the network for accessing one among a plurality of replicated servers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • H04L67/1095Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/52Network services specially adapted for the location of the user terminal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a multi-mode high-precision full-space hybrid positioning system, which comprises a system environment, terminal equipment, server-side equipment and edge-side equipment, wherein the terminal equipment, the server-side equipment and the edge-side equipment all rely on a communication system to complete bidirectional data interaction; the system environment comprises a physical environment and a virtual environment; the terminal equipment senses and processes physical environment information so as to control the execution system to provide feedback or to send the sensed information to the server-side equipment; the edge-side equipment assists the terminal equipment in positioning, acquires positioning data of the terminal equipment, and transmits the positioning data to the server-side equipment or to the terminal equipment for secondary processing/matching; the server-side equipment processes the received data so as to complete updating of the virtual environment and issuing of execution instructions. By adopting this hybrid positioning system, the invention provides high-precision, real-time pose acquisition for virtual reality or augmented reality technology, enhancing both experience and practicability.

Description

Multi-mode high-precision full-space hybrid positioning system
Technical Field
The invention relates to the technical field of mixed reality, in particular to a multi-mode high-precision full-space hybrid positioning system.
Background
In the current virtual reality, augmented reality and mixed reality fields, positioning is generally realized either by inside-out machine vision on the terminal alone or by outside-in external equipment alone; the applicable scenes are limited and the constraints are numerous.
In mixed reality scenes, existing equipment on the market cannot be accurately positioned in larger spaces, cannot embody the advantages and characteristics of mixed reality technology, and is difficult to apply to specific scenes.
In mixed reality scenes with multiple users and multiple virtual objects, existing mixed reality positioning technologies and interaction modes cannot meet the requirements, lacking the capability to respond with interaction-execution feedback to multiple requests within the same virtual scene.
Disclosure of Invention
The problem to be solved by the invention is to position one or more mixed reality devices in physical spaces of different scales, so that one or more operators can interact with the corresponding virtual objects within the same continuous physical space.
To achieve this purpose, the invention provides the following technical scheme:
a multi-mode high-precision full-space hybrid positioning system comprises a system environment, terminal equipment, server equipment and edge equipment, wherein the equipment terminal, the server equipment and the equipment terminal all rely on a communication system to complete bidirectional data interaction;
the system environment is divided into a physical environment and a virtual environment, the physical environment is composed of a visual environment, a sound environment and an operator, and the virtual environment comprises a prefabricated digital twin environment and a real-time digital twin model; the virtual environment is carried in the terminal equipment or the server-side equipment;
the terminal equipment is carried by an operator and is provided with a sensing system, an operation system, an execution system and a communication system; the sensing system consists of sensors and is used for realizing sensing and man-machine interaction of a physical environment; the operation system controls the execution system to complete feedback operation according to the sensing system data, and completes external two-way data transmission by depending on the communication system;
the server-side equipment is equipment deployed in a system management center, and is loaded with a cloud computing system and a communication system, the communication system completes external bidirectional data transmission, the cloud computing system performs computing processing on received data according to data types, and virtual environment updating and debugging of edge-side equipment and terminal equipment are completed according to the processed data;
the edge terminal equipment is equipment deployed in a physical environment and is provided with a positioning system, a communication system and an edge computing system; the positioning system is divided into an active positioning system and a passive positioning device, and the active positioning system transmits the acquired positioning data of the terminal equipment to the server-side equipment for calculation processing through an edge calculation system or a communication system, or transmits the positioning data to the terminal equipment for secondary processing/matching; the passive positioning tag is used for acquiring and processing data by a sensor in the terminal equipment.
Preferably, the active positioning system is built on UWB, Bluetooth, GPS base stations and infrared positioning, and the passive positioning device comprises an optical tag, a sound generator and a modulated-light generator.
Preferably, the virtual environment is established by three-dimensional scanning, simultaneous localization and mapping (SLAM), surveying and mapping, and three-dimensional reconstruction methods, and the real-time digital twin model includes a virtual object corresponding to a real object and a virtual object corresponding to an operator.
Preferably, the execution system comprises a display unit, a feedback unit and a media unit, and is built on actuators mainly comprising loudspeakers, displays and motors.
The hybrid positioning system with this structure has the following advantages:
1. The multi-mode hybrid positioning method provides high-precision, real-time pose acquisition for virtual reality and augmented reality technologies, enhancing their experience and practicability.
2. Different edge-side devices can be selected and adjusted according to the scene, and the accompanying software system adapts itself to the hardware configuration, so that a wide range of application requirements can be met.
Drawings
FIG. 1 is a component framework diagram of an embodiment of the present invention;
FIG. 2 is a diagram of an example scenario according to an embodiment of the present invention.
Detailed Description
The technical scheme of the invention is further explained below with reference to the drawings and an embodiment.
Fig. 1 shows a multi-mode high-precision full-space hybrid positioning system, which includes a system environment, a terminal device, a server-side device, and an edge-side device. The terminal device, the server-side device and the edge-side device all rely on a communication system to complete bidirectional data interaction.
1. System environment
The system environment is divided into a physical environment and a virtual environment.
The physical environment includes a visual environment, a sound environment, and an operator. The operator is the subject carrying the terminal equipment, and operates the terminal equipment to interact with the edge-side equipment. There may be multiple operators, but they must be connected to the network environment for interaction between operators to take place.
The virtual environment is a cloud mirror image of the physical environment and is built by methods such as three-dimensional scanning, simultaneous localization and mapping (SLAM), surveying and mapping, and three-dimensional reconstruction. The virtual environment comprises a prefabricated digital twin environment and real-time digital twin models: the prefabricated digital twin environment is presented as a two-dimensional or three-dimensional reconstructed map, while a real-time digital twin model is a virtual object corresponding to an entity in the physical environment, either a real object or an operator. The virtual environment is hosted on the terminal device or the server-side device.
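For illustration only, the sketch below shows one possible in-memory representation of the prefabricated digital twin environment and its real-time digital twin models; the class names, fields and example values are assumptions of this sketch, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class TwinObject:
    """One real-time digital twin model: a virtual object mirroring a real
    object or an operator in the physical environment."""
    object_id: str
    kind: str                                            # e.g. "operator", "exhibit", "optical_tag"
    position: Tuple[float, float, float]                 # coordinates in the map frame
    orientation: Tuple[float, float, float, float] = (0.0, 0.0, 0.0, 1.0)  # quaternion

@dataclass
class VirtualEnvironment:
    """Prefabricated digital twin environment (a reconstructed 2D/3D map)
    together with the live twin objects placed in it."""
    map_id: str
    twins: Dict[str, TwinObject] = field(default_factory=dict)

    def upsert(self, twin: TwinObject) -> None:
        # Called whenever fresh pose data for a real object or operator arrives.
        self.twins[twin.object_id] = twin

env = VirtualEnvironment(map_id="museum_Ea")
env.upsert(TwinObject("Pa", "optical_tag", (12.5, 4.0, 1.6)))
```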
2. Terminal device
The terminal device is a device carried by an operator, and is equipped with a sensing system, a computing system, an execution system, and a communication system.
The sensing system comprises physical-quantity, image and acoustic sensors and carries out human-machine interaction with the operator. Specifically, the physical-quantity sensor and the image sensor collect visual environment information, while the physical-quantity sensor and the acoustic sensor collect sound environment information.
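A minimal sketch of how the three kinds of sensor readings might be bundled into a single frame for the computing system follows; the field names and dictionary format are assumptions of this sketch.

```python
import time

def make_sensor_frame(imu_sample, image_bytes, audio_chunk):
    """Bundle one set of readings from the physical-quantity, image and
    acoustic sensors into the frame handed to the computing system."""
    return {
        "timestamp": time.time(),
        "imu": imu_sample,        # physical-quantity sensor: acceleration, angular rate, ...
        "image": image_bytes,     # image sensor: visual environment information
        "audio": audio_chunk,     # acoustic sensor: sound environment information
    }

frame = make_sensor_frame({"acc": (0.0, 0.0, 9.8)}, b"<jpeg bytes>", b"<pcm bytes>")
```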
The computing system is divided into an edge computing system and an upper computing system, which exchange data bidirectionally. The edge computing system receives data from the sensing system and issues control instructions to the execution system. The upper computing system performs secondary processing of the data passed up by the edge computing system, completes bidirectional data synchronization with the real-time digital twin model based on that processing, and passes the data to the communication system.
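The two-tier computing split could look roughly like the sketch below, where the edge tier reacts immediately to sensor frames and the upper tier handles secondary processing and twin synchronization; the class names, queue hand-off and frame fields are illustrative assumptions, not the patented implementation.

```python
import queue

class EdgeCompute:
    """Receives sensor frames and issues immediate control instructions."""
    def __init__(self, out_queue, executor):
        self.out_queue = out_queue     # hand-off to the upper computing system
        self.executor = executor       # execution system (display/feedback/media)

    def on_sensor_frame(self, frame):
        if frame.get("obstacle_close"):            # low-latency local decision
            self.executor("vibrate")
        self.out_queue.put(frame)                  # forward for secondary processing

class UpperCompute:
    """Secondary processing plus bidirectional sync with the real-time digital twin."""
    def __init__(self, in_queue, send):
        self.in_queue = in_queue
        self.send = send               # communication-system uplink

    def step(self, twin_state):
        frame = self.in_queue.get()
        pose = frame.get("pose")                   # e.g. pose refined from an optical tag
        if pose is not None:
            twin_state["position"] = pose          # update the local twin copy
            self.send({"type": "pose", "value": pose})   # synchronize to the server side

# Wiring the two tiers together with stand-in actuator/uplink callables:
q = queue.Queue()
edge = EdgeCompute(q, executor=print)
upper = UpperCompute(q, send=print)
edge.on_sensor_frame({"pose": [4.0, 3.0], "obstacle_close": False})
twin = {}
upper.step(twin)
print(twin)    # {'position': [4.0, 3.0]}
```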
The execution system receives instructions from the edge computing system and delivers feedback to the operator. It is divided into a display unit, a feedback unit and a media unit, realized by actuators such as loudspeakers, displays and motors. The feedback unit also provides basic data to the edge computing system.
The communication system transmits the data handed over by the upper computing system and delivers data received from other devices back to the upper computing system. It is realized with Wi-Fi, Bluetooth, RFID and various USB interface technologies or hardware.
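As one possible, entirely assumed realization of the uplink, the sketch below serializes an upper-computing-system message as JSON and sends it over UDP on the local Wi-Fi network; the patent only names Wi-Fi, Bluetooth, RFID and USB as candidate carriers and does not specify a transport or framing, so the address, port and message shape here are placeholders.

```python
import json
import socket

def send_to_server(payload, host="192.168.0.10", port=9000):
    """Serialize one upper-computing-system message and push it over UDP.
    The host, port and JSON framing are placeholders for this sketch."""
    datagram = json.dumps(payload).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(datagram, (host, port))

send_to_server({"device": "Ma", "type": "pose", "value": [4.0, 3.0]})
```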
3. Server-side equipment
The server-side device is deployed in the system management center and is loaded with a cloud computing system, the virtual environment (which may alternatively be hosted on the terminal device), and a communication system.
The cloud computing system receives data from the terminal equipment and the edge-side equipment through the communication system, processes the data according to its type, and forms execution instructions and state-update data. The execution instructions are transmitted back to the terminal equipment and/or the edge-side equipment, and the state-update data are used to update the virtual environment.
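A minimal sketch of the kind of type-based dispatch the cloud computing system could perform is shown below, assuming a simple dict-shaped message format; the message fields, command names and `downlink` callable are inventions of this sketch, not part of the disclosure.

```python
def cloud_process(message, twin_positions, downlink):
    """Process one received message according to its type.

    message:        dict received over the communication system.
    twin_positions: dict mapping twin object ids to coordinates (state data).
    downlink:       callable(target_id, instruction) that pushes execution
                    instructions back to terminal or edge-side equipment.
    """
    if message["type"] == "pose":
        # State-update data: refresh the virtual environment.
        twin_positions[message["device"]] = tuple(message["value"])
    elif message["type"] == "interaction":
        # Execution instruction: answer an interaction request from a terminal.
        for target in message.get("targets", []):
            downlink(target, {"cmd": "render_update", "payload": message["value"]})

state = {}
cloud_process({"type": "pose", "device": "Ma", "value": [4.0, 3.0]}, state, print)
print(state)   # {'Ma': (4.0, 3.0)}
```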
4. Edge-side device
The edge-side device is deployed in the physical environment and is loaded with an edge computing system, a communication system and a positioning system.
The positioning system acquires positioning data of the terminal equipment and comprises an active positioning system and passive positioning devices. The active positioning system is mainly built on UWB, Bluetooth, GPS base stations and infrared positioning; the passive positioning devices mainly comprise optical tags, sound generators and modulated-light generators. The active positioning system is deployed according to the size and spatial form of the physical environment and detects and locates the terminal equipment, whereas the data of a passive positioning device are acquired and processed by the sensors in the terminal equipment.
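The patent does not disclose how coordinates are computed from the active-positioning base stations; as one standard possibility, the sketch below estimates a terminal position from ranges to three or more anchors (e.g., UWB base stations) by linearized least-squares trilateration. The anchor layout and figures are made up for illustration.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Estimate a 2D/3D position from anchor coordinates and measured ranges.

    anchors: (N, D) array of known base-station positions (N >= D + 1).
    ranges:  (N,) array of distances from the terminal to each anchor.
    Subtracting the first range equation from the others removes the quadratic
    terms, leaving a linear system solved here by least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    p0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: three anchors Da, Db, Dc on the museum floor (coordinates in metres).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
true_pos = np.array([4.0, 3.0])
ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, ranges))   # ~ [4. 3.]
```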
The communication system transmits the positioning data, via the edge computing system or a wired or wireless link, to the server-side device for computation, or to the terminal device for secondary processing or matching.
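When a passive positioning device such as an optical tag is used, the secondary processing or matching on the terminal can be pictured as looking the decoded tag identifier up in a registry of known deployment coordinates; the registry contents, identifiers and Python representation below are assumptions of this sketch, not part of the disclosure.

```python
# Hypothetical registry of passive optical tags deployed in the physical space:
# tag identifier -> (x, y, z) coordinates, in metres, in the map frame of the
# prefabricated digital twin environment. All values here are made up.
OPTICAL_TAGS = {
    "Ta": (12.5, 4.0, 1.6),
    "Tb": (20.0, 9.5, 1.6),
}

def tag_world_coordinates(tag_id: str):
    """Return the known world coordinates of a decoded passive tag, or None
    if the tag is not registered for this environment."""
    return OPTICAL_TAGS.get(tag_id)

print(tag_world_coordinates("Ta"))   # (12.5, 4.0, 1.6)
```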
As shown in the example scenario of FIG. 2, operator A walks through a museum Ea (the physical environment), in which a positioning system is installed, carrying a mixed reality terminal Ma. The base stations Da, Db and Dc of positioning system D in the edge-side equipment acquire the coordinates of Ma within Ea and upload them to the server-side device Sa. The image sensor of Ma recognizes the optical tag Ta of the edge-side equipment; using the physical coordinates of Ta in Ea, Ma generates the corresponding virtual object Pa (a real-time digital twin model) at the corresponding coordinates of the virtual scene Eb (the prefabricated digital twin environment), while synchronizing to Sa the coordinates of the virtual object Pb in Eb. Sa then compares the coordinate data uploaded by Ma and by D, recalculates the coordinates of Pb in Eb, and synchronizes them to Eb.
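The embodiment says only that Sa compares the coordinates uploaded by Ma and by D and recalculates the twin coordinates; one simple way to realize such a comparison, purely an assumption of this sketch, is inverse-variance weighting of the two independent position estimates. The variance figures below are stand-ins.

```python
import numpy as np

def fuse_estimates(p_vision, var_vision, p_radio, var_radio):
    """Inverse-variance weighted fusion of two independent position estimates.

    p_vision: coordinate derived from the optical tag Ta seen by terminal Ma.
    p_radio:  coordinate reported by base stations Da/Db/Dc of system D.
    var_*:    scalar variances expressing how much each source is trusted.
    Returns the fused coordinate and its (smaller) fused variance.
    """
    w_v, w_r = 1.0 / var_vision, 1.0 / var_radio
    fused = (w_v * np.asarray(p_vision) + w_r * np.asarray(p_radio)) / (w_v + w_r)
    fused_var = 1.0 / (w_v + w_r)
    return fused, fused_var

# Hypothetical figures: vision is precise but drifts, radio is coarse but absolute.
p_fused, v_fused = fuse_estimates([4.02, 2.97], 0.01, [4.3, 3.2], 0.25)
print(p_fused, v_fused)
```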
The above is a specific embodiment of the present invention, but the scope of the present invention should not be limited thereto. Any changes or substitutions that can be easily made by those skilled in the art within the technical scope of the present invention are included in the protection scope of the present invention, and therefore, the protection scope of the present invention is subject to the protection scope defined by the appended claims.

Claims (4)

1. A multi-mode high-precision full-space hybrid positioning system, characterized in that: the system comprises a system environment, terminal equipment, server-side equipment and edge-side equipment, wherein the terminal equipment, the server-side equipment and the edge-side equipment all rely on a communication system to complete bidirectional data interaction;
the system environment is divided into a physical environment and a virtual environment, the physical environment is composed of a visual environment, a sound environment and an operator, and the virtual environment comprises a prefabricated digital twin environment and a real-time digital twin model; the virtual environment is hosted on the terminal equipment or the server-side equipment;
the terminal equipment is carried by an operator and is provided with a sensing system, a computing system, an execution system and a communication system; the sensing system consists of sensors and realizes sensing of the physical environment and human-machine interaction; the computing system controls the execution system to perform feedback operations according to the sensing-system data, and completes external bidirectional data transmission via the communication system;
the server-side equipment is deployed in a system management center and is loaded with a cloud computing system and a communication system; the communication system completes external bidirectional data transmission, the cloud computing system processes the received data according to data type, and updating of the virtual environment and debugging of the edge-side and terminal equipment are completed according to the processed data;
the edge-side equipment is deployed in the physical environment and is provided with a positioning system, a communication system and an edge computing system; the positioning system is divided into an active positioning system and a passive positioning device, and the active positioning system transmits the acquired positioning data of the terminal equipment, through the edge computing system or the communication system, to the server-side equipment for computation or to the terminal equipment for secondary processing/matching; the data of the passive positioning device are acquired and processed by the sensors in the terminal equipment.
2. The multi-mode high-precision full-space hybrid positioning system of claim 1, wherein: the active positioning system is built on UWB, Bluetooth, GPS base stations and infrared positioning, and the passive positioning device comprises an optical tag, a sound generator and a modulated-light generator.
3. The multi-mode high-precision full-space hybrid positioning system of claim 1, wherein: the virtual environment is established by three-dimensional scanning, simultaneous localization and mapping (SLAM), surveying and mapping, and three-dimensional reconstruction methods, and the real-time digital twin model comprises a virtual object corresponding to a real object and a virtual object corresponding to an operator.
4. The multi-mode high-precision full-space hybrid positioning system of claim 1, wherein: the execution system comprises a display unit, a feedback unit and a media unit, and is built on actuators mainly comprising loudspeakers, displays and motors.
CN202110941281.0A 2021-08-17 2021-08-17 Multi-mode high-precision full-space hybrid positioning system Pending CN113630593A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110941281.0A CN113630593A (en) 2021-08-17 2021-08-17 Multi-mode high-precision full-space hybrid positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110941281.0A CN113630593A (en) 2021-08-17 2021-08-17 Multi-mode high-precision full-space hybrid positioning system

Publications (1)

Publication Number Publication Date
CN113630593A 2021-11-09

Family

ID=78385985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110941281.0A Pending CN113630593A (en) 2021-08-17 2021-08-17 Multi-mode high-precision full-space hybrid positioning system

Country Status (1)

Country Link
CN (1) CN113630593A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103810353A (en) * 2014-03-09 2014-05-21 杨智 Real scene mapping system and method in virtual reality
CN107167132A (en) * 2016-03-07 2017-09-15 上海积杉信息科技有限公司 Indoor locating system based on augmented reality and virtual reality
US20190073831A1 (en) * 2016-07-09 2019-03-07 Doubleme, Inc. Electronic System and Method for Three-Dimensional Mixed-Reality Space and Experience Construction and Sharing
CN110531846A (en) * 2018-05-24 2019-12-03 明日基金知识产权控股有限公司 The two-way real-time 3D interactive operation of real-time 3D virtual objects in the range of real-time 3D virtual world representing real world
CN110794955A (en) * 2018-08-02 2020-02-14 广东虚拟现实科技有限公司 Positioning tracking method, device, terminal equipment and computer readable storage medium
US20200265644A1 (en) * 2018-09-12 2020-08-20 Limited Liability Company "Transinzhkom" Method and system for generating merged reality images
CN112102500A (en) * 2019-06-18 2020-12-18 明日基金知识产权控股有限公司 Virtual presence system and method through converged reality
US20210043005A1 (en) * 2019-08-07 2021-02-11 Magic Leap, Inc. Spatial instructions and guides in mixed reality
CN112954292A (en) * 2021-01-26 2021-06-11 北京航天创智科技有限公司 Digital museum navigation system and method based on augmented reality

Similar Documents

Publication Publication Date Title
CN106993181B (en) More VR/AR equipment collaboration systems and Synergistic method
KR20220027119A (en) System and method for monitoring field based augmented reality using digital twin
WO2015014018A1 (en) Indoor positioning and navigation method for mobile terminal based on image recognition technology
JP2020140696A (en) Method and apparatus for determining position attitude of bucket of drilling machine
WO2020042968A1 (en) Method for acquiring object information, device, and storage medium
WO2018076777A1 (en) Robot positioning method and device, and robot
KR20190059120A (en) Facility Inspection System using Augmented Reality based on IoT
CN107085857A (en) Power cable localization method, device and system
CN112655027A (en) Maintenance support system, maintenance support method, program, method for generating processed image, and processed image
CN110717994A (en) Method for realizing remote video interaction and related equipment
CN105373130A (en) Special device accident on-site information detection system based on stereo modeling
CN104569909A (en) Indoor positioning system and method
WO2019105009A1 (en) Method and system for synchronized scanning of space
KR20130134986A (en) Slam system and method for mobile robots with environment picture input from user
CN112528699A (en) Method and system for obtaining identification information of a device or its user in a scene
CN105183142A (en) Digital information reproduction method by means of space position nailing
CN204945671U (en) A kind of stage multimode synchronous control system
CN113630593A (en) Multi-mode high-precision full-space hybrid positioning system
CN112581630B (en) User interaction method and system
KR20220165948A (en) Method and system for remote collaboration
CN203135278U (en) Three-dimensional space real-time display transformer station robot inspection system
CN205610690U (en) Stage lighting controlling means based on thing networking
WO2024106661A1 (en) Robot and method for estimating position of robot
JP2020170482A (en) Work instruction system
CN113610987B (en) Mixed reality space labeling method and system based on three-dimensional reconstruction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211109