CN113190113A - Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction - Google Patents

Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction

Info

Publication number
CN113190113A
CN113190113A
Authority
CN
China
Prior art keywords
positioning
uwb
uwb positioning
virtual
virtual image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110384173.8A
Other languages
Chinese (zh)
Inventor
陈振骐
肖家幸
严蓬蓬
陈润
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Nuoruixin Technology Co ltd
Original Assignee
Shenzhen Nuoruixin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Nuoruixin Technology Co ltd filed Critical Shenzhen Nuoruixin Technology Co ltd
Priority to CN202110384173.8A priority Critical patent/CN113190113A/en
Publication of CN113190113A publication Critical patent/CN113190113A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/02 Systems for determining distance or velocity not using reflection or reradiation using radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an ultra-wideband positioning virtual reality system and a positioning method for determining position and direction, belonging to the technical fields of virtual reality and UWB positioning. The system comprises: one or more movable display devices, a virtual image processing device, a positioning data processing device, one or more UWB positioning anchors whose positions are fixed in advance, and one or more groups of movable UWB positioning tags, where each group of UWB positioning tags comprises two or more UWB positioning tags. In the method, each UWB positioning tag periodically sends a positioning broadcast message or exchanges positioning messages with the UWB positioning anchors; the position and direction of the movable display device are calculated from these messages, and exactly matched virtual picture data are then transmitted. The invention provides a more stable virtual reality visual scene, thereby avoiding the vertigo caused by long-term use and preventing an inconsistent sense of interaction among users of the virtual reality system.

Description

Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction
Technical Field
The invention belongs to the technical fields of virtual reality and UWB positioning, and provides an ultra-wideband positioning virtual reality system and a positioning method for determining position and direction.
Background
Virtual reality (VR) is a high technology in the field of graphics and imaging, also called smart environment technology or artificial environment technology. Virtual reality uses a computer to simulate and generate a three-dimensional virtual world, providing the user with simulated visual, auditory, tactile and other sensations, so that the user can observe objects in the three-dimensional space freely and in real time, as if present in the environment. Besides virtual reality, there are several similar visual-scene concepts, such as augmented reality (AR), mixed reality (MR) and extended reality (XR). Augmented reality technology (AR) calculates the position and angle of a camera image in real time and adds corresponding images, videos and 3D models, aiming to overlay a virtual world onto the real world on a screen and interact with it. Mixed reality technology (MR), which includes both augmented reality and augmented virtuality, is a further development of virtual reality technology: it introduces real-scene information into the virtual environment to build an interactive feedback loop among the virtual world, the real world and the user, thereby enhancing the sense of reality experienced by the user. Extended reality (XR) refers to a real-and-virtual combined, human-machine interactive environment created through computer technology and wearable devices; it covers multiple forms including virtual reality VR, augmented reality AR and mixed reality MR. In other words, XR is a generic term encompassing VR, AR and MR, used to avoid confusion of concepts.
A virtual reality system (VR), also called a Virtual Reality Platform (VR-Platform or VRP), generally comprises three parts: 1) a head-mounted display device (HMD), 2) a VR host device, and 3) a tracking system. The head-mounted display device HMD is a hardware device, generally a head-mounted frame with a display screen installed at a suitable position in front of the user's eyes, allowing the user to see the AR or VR effect. The VR host device provides the various functions behind the HMD and consists of a virtual environment database, an image processor and an operating system; an intelligent terminal such as a smartphone or a PC is typically used, generally placed in the physical space where VR is implemented. The tracking system, typically a peripheral of the HMD or integrated into it, usually includes an acceleration sensor, a gyroscope and a magnetometer. A virtual reality system (VR) may also include 4) a VR handheld device, primarily a controller worn on the user's hand, through which the system can track the user's movements and gestures.
According to the classification method in the "General Specifications for Virtual Reality Head-Mounted Display Devices", virtual reality head-mounted display devices HMD fall mainly into three categories: external (tethered) virtual reality head-mounted displays, integrated (all-in-one) virtual reality head-mounted displays, and shell-type virtual reality head-mounted displays. An external virtual reality head-mounted display must be connected to an external host: it contains no significant computing power of its own, only the display system and sensing system. An integrated virtual reality head-mounted display integrates the display, sensing and computing systems into a single head-mounted device; the computing unit and sensing unit may alternatively be mounted off the head-mounted device and connected to the display part on the head-mounted device by wire or wirelessly. A shell-type virtual reality head-mounted display has no display or computing system of its own: it is a head-mounted enclosure with only an optical system and an optional sensing system, and achieves the complete head-mounted display function in combination with an intelligent terminal such as a smart mobile communication terminal.
Existing virtual reality systems suffer from visual-scene instability. For example, because the positioning technology used by the tracking system of the virtual reality system has insufficient precision, the picture pushed by the host according to the positioning result is unstable; that is, the visual scene seen by the user of the virtual reality system is unstable, so that a user wearing virtual reality glasses for a long time feels dizzy.
Ultra-wideband (UWB) technology is a wireless carrier communication technology that transmits data not with a sinusoidal carrier but with nanosecond-level non-sinusoidal narrow pulses, and therefore occupies a wide spectrum. UWB technology has the advantages of low system complexity, low power spectral density of the transmitted signal, strong anti-interference performance, high security and high positioning precision, and is particularly suitable for high-speed wireless access in dense multipath environments such as indoor spaces. UWB positioning uses a positioning data processing module or device implementing various positioning algorithms, together with UWB positioning anchors and UWB positioning tags, to achieve accurate position fixing of the UWB positioning tags. The positioning delay of a real-time positioning system based on UWB positioning technology is about 2 ms, so positioning data can theoretically be updated at a frequency of 500 Hz; this refresh frequency far exceeds the requirement of the current VR Industry Alliance specification T/IVRA 0001 supplement 2017 that the sampling frequency of the position tracking system be no less than 60 Hz. Many wireless positioning algorithms can be used in UWB positioning technology; the more common ones include the TOF ranging algorithm, the TDOA positioning algorithm, the AOA positioning algorithm and the PDOA positioning algorithm.
The positioning data required differ between positioning algorithms. Besides the commonly required positioning anchor coordinate data and positioning tag identification information: for the TOF positioning algorithm, the positioning data also include the times at which the positioning initiator sends the positioning message and receives the positioning feedback message, and the times at which the positioned party receives the positioning message and sends the positioning feedback message; for the TDOA positioning algorithm, the positioning data also include the coordinate data of each UWB positioning anchor participating in positioning and the absolute arrival time of the UWB positioning broadcast signal at each UWB positioning anchor; for the AOA positioning algorithm, the positioning data include the position coordinates of UWB positioning anchor antenna 1 and antenna 2, and the two angles of arrival of the positioning broadcast message detected at antenna 1 and antenna 2 respectively; for the PDOA positioning algorithm, the positioning data further include the separation distance of the two antennas on a UWB positioning anchor, the phases with which the wireless signal sent by the UWB positioning tag arrives at the two antennas of the anchor, the times at which the positioning initiator sends the positioning message and receives the positioning feedback message, and the times at which the positioning passive party receives the positioning message and sends the positioning feedback message.
TOF positioning is a ranging algorithm that calculates the distance between two points from the time of flight of the wireless signal in the air, and is also called the time-of-flight ranging method. The ranging formula is as follows:
D = c × (T_round − T_reply) / 2
In the above formula, D represents the distance between the signal transmitting end and the signal receiving end; T_round represents the time interval between the TOF positioning initiator transmitting the signal and receiving the return signal; T_reply represents the time interval between the TOF positioning passive party receiving the positioning signal and transmitting the feedback positioning signal (if the signal is retransmitted immediately upon reception without delay, then T_reply = 0); and c represents the speed of light.
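As an illustration of the formula above, the single-sided two-way ranging calculation can be sketched in a few lines of Python (the helper name and timestamp values are hypothetical, not part of the patent):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(t_round: float, t_reply: float) -> float:
    """Single-sided two-way ranging: D = c * (T_round - T_reply) / 2."""
    return C * (t_round - t_reply) / 2.0

# Synthetic example: a tag 3 m away, with a 1 microsecond reply delay.
one_way_flight = 3.0 / C              # ~10 ns one-way flight time
t_reply = 1e-6                        # passive party's processing delay
t_round = 2 * one_way_flight + t_reply
d = tof_distance(t_round, t_reply)    # recovers ~3.0 m
```

Note that the formula cancels the passive party's processing delay but not clock drift between the two devices; practical systems extend this to double-sided two-way ranging for that reason.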
TDOA positioning is a method that uses time differences. The algorithm measures the arrival times of the UWB positioning tag's positioning broadcast signal at UWB positioning anchors whose clocks are fully synchronized, and calculates the differences in arrival time at different anchors. If the distance from the signal source (the UWB positioning tag) to each anchor were known, the position of the source could be determined by drawing a circle around each anchor with that distance as radius. However, the absolute times are generally difficult to measure. By comparing the differences in arrival time of the signal at the anchors, one can instead construct hyperbolas with the anchors as foci and the distance differences as parameters; the intersection of the hyperbolas is the position of the signal source, i.e., the coordinates of the unknown target UWB positioning tag can be obtained from the TDOA equations. In the case of three UWB positioning anchors, the TDOA equations are as follows:
sqrt((x1 − xM)² + (y1 − yM)²) − sqrt((x2 − xM)² + (y2 − yM)²) = c · t12
sqrt((x1 − xM)² + (y1 − yM)²) − sqrt((x3 − xM)² + (y3 − yM)²) = c · t13
sqrt((x2 − xM)² + (y2 − yM)²) − sqrt((x3 − xM)² + (y3 − yM)²) = c · t23
In the above equations, the coordinates of UWB positioning anchor 1 are (x1, y1), the coordinates of UWB positioning anchor 2 are (x2, y2), the coordinates of UWB positioning anchor 3 are (x3, y3), and the coordinates of the UWB positioning tag to be located are (xM, yM); t12 is the difference between the times at which anchor 1 and anchor 2 receive the positioning broadcast signal sent by the UWB positioning tag, t13 is the corresponding difference between anchor 1 and anchor 3, t23 is the corresponding difference between anchor 2 and anchor 3, and c represents the speed of light.
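The hyperbola equations above have no simple closed form, so TDOA systems typically solve them numerically. The following sketch is a hypothetical Gauss-Newton solver (not the patent's implementation) that recovers a 2-D tag position from the range differences c·t12 and c·t13:

```python
import math

def tdoa_solve(anchors, range_diffs, x0, iters=25):
    """Gauss-Newton solution of
         |P - A1| - |P - Ai| = c * t_1i   for i = 2, 3,
    where range_diffs = [c*t12, c*t13] in metres."""
    x, y = x0
    for _ in range(iters):
        r = [math.hypot(x - ax, y - ay) for ax, ay in anchors]
        F, J = [], []
        for i, rd in zip((1, 2), range_diffs):
            F.append((r[0] - r[i]) - rd)
            J.append(((x - anchors[0][0]) / r[0] - (x - anchors[i][0]) / r[i],
                      (y - anchors[0][1]) / r[0] - (y - anchors[i][1]) / r[i]))
        a, b = J[0]
        c2, d2 = J[1]
        det = a * d2 - b * c2
        # Newton step: solve the 2x2 system J * delta = -F
        dx = (-F[0] * d2 + F[1] * b) / det
        dy = (F[0] * c2 - F[1] * a) / det
        x, y = x + dx, y + dy
    return x, y

# Synthetic example: three anchors, tag truly at (3, 4).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
d12 = math.hypot(3, 4) - math.hypot(3 - 10, 4)
d13 = math.hypot(3, 4) - math.hypot(3, 4 - 10)
xM, yM = tdoa_solve(anchors, [d12, d13], x0=(5.0, 5.0))
```

With exact range differences the solver converges to the true position; with noisy measurements, more than three anchors and a least-squares step would be used instead.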
The AOA positioning algorithm is an angle-of-arrival positioning algorithm, a typical ranging-based positioning algorithm: the UWB positioning anchor senses the direction of arrival of the signal transmitted by the UWB positioning tag, calculates the relative bearing or angle between the tag and the anchor, and then computes the position of the unknown UWB positioning tag by triangulation or other methods. In the case of a single UWB positioning anchor whose two receiving antennas have known position coordinates, the AOA positioning equations are as follows:
tan(α1) = (yM − y1) / (xM − x1)
tan(α2) = (yM − y2) / (xM − x2)
In the above equations, the coordinates of UWB positioning anchor antenna 1 are (x1, y1), the coordinates of UWB positioning anchor antenna 2 are (x2, y2), the coordinates of the UWB positioning tag to be located are (xM, yM), the angle of arrival at antenna 1 is α1, and the angle of arrival at antenna 2 is α2.
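Solving the two equations above for (xM, yM) amounts to intersecting two bearing lines. A minimal sketch in Python (the function name and test geometry are illustrative, not from the patent):

```python
import math

def aoa_locate(ant1, ant2, alpha1, alpha2):
    """Intersect the bearing lines y - yi = tan(alpha_i) * (x - xi)
    from the two anchor antennas (angles in radians)."""
    (x1, y1), (x2, y2) = ant1, ant2
    t1, t2 = math.tan(alpha1), math.tan(alpha2)
    xm = (y2 - y1 + x1 * t1 - x2 * t2) / (t1 - t2)
    ym = y1 + t1 * (xm - x1)
    return xm, ym

# Synthetic check: antennas at (0, 0) and (2, 0), tag truly at (3, 4).
a1 = math.atan2(4 - 0, 3 - 0)   # bearing from antenna 1
a2 = math.atan2(4 - 0, 3 - 2)   # bearing from antenna 2
xm, ym = aoa_locate((0.0, 0.0), (2.0, 0.0), a1, a2)
```

The intersection is undefined when tan(α1) = tan(α2), i.e., when the tag lies on the line through both antennas; real systems must handle that degenerate geometry.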
PDOA is a phase-difference-of-arrival ranging algorithm: it measures the phase difference between two receiving antennas and converts it into a path difference to determine the position of an unknown node. Two identical antennas separated by a distance d < λ/2 (λ being the wavelength of the wireless signal) are mounted on the UWB positioning anchor, and the phase difference between the UWB positioning tag's signal arriving at the two antennas lies in the range −180° to 180°. The measured phase difference is converted into a path difference p, the distance r between the UWB positioning tag and the UWB positioning anchor is obtained from the time of flight, and the coordinates of the UWB positioning tag are finally obtained as:
(xM, yM) = (r · (p / d), r · sqrt(1 − (p / d)²))
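Under the usual geometric interpretation (assumed here, since only the final expression survives from the original), p/d is the sine of the bearing angle measured from the broadside of the antenna pair, so the tag coordinates follow directly. A small Python sketch with hypothetical values:

```python
import math

def pdoa_locate(phase_diff, d, wavelength, r):
    """Phase difference (radians) -> path difference p -> bearing,
    returning (xM, yM) = (r * p/d, r * sqrt(1 - (p/d)**2))."""
    p = phase_diff * wavelength / (2 * math.pi)  # path difference
    s = p / d                                    # sine of the bearing angle
    return r * s, r * math.sqrt(1.0 - s * s)

# Synthetic example: antennas half a wavelength apart, 90 degree phase
# difference => bearing of 30 degrees; tag 10 m from the anchor.
wavelength = 0.0461    # roughly a 6.5 GHz UWB channel (illustrative)
xM, yM = pdoa_locate(math.pi / 2, wavelength / 2, wavelength, 10.0)
```

The d < λ/2 spacing mentioned above is what keeps the phase difference unambiguous over the −180° to 180° range; a larger spacing would alias multiple bearings onto one phase reading.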
at present, some UWB positioning technologies are used to solve the problem of positioning a tracking system of a VR system, for example, patent application No. 201610474511.6, "VR positioning and tracking system based on ultra-wideband positioning and tracking method thereof", where the disclosed positioning system includes more than three ultra-wideband positioning base stations (or referred to as UWB positioning anchor points), VR infrastructure equipment, ultra-wideband positioning modules, positioning engines, and VR infrastructure operation, as shown in fig. 1; the ultra-wideband positioning base station comprises a base station code ID generating unit, a time synchronizing unit, a data processing unit, an ultra-wideband (UWB) transceiver, a high-precision clock, an antenna and a network data transmission unit; the VR basic equipment comprises a display screen, a processor, a direction sensor, a VR and VR handheld controller connecting line, a battery, a lens and an Ultra Wide Band (UWB) positioning module; the ultra-wideband (UWB) positioning module (or called UWB positioning tag) comprises an antenna, an ultra-wideband (UWB) transmitter and a high-precision clock; the ultra-wideband (UWB) positioning module comprises an image processing unit, a positioning engine and an image transmission unit. The method for tracking the VR basic equipment in the patent comprises the following steps:
Step 1: each ultra-wideband positioning base station performs wireless or wired clock synchronization through its time synchronization unit, keeping the clocks of all positioning base stations synchronized;
Step 2: the VR basic equipment controls the ultra-wideband positioning module to send positioning broadcast messages to the ultra-wideband positioning base stations: at regular intervals, staggered in time, the module broadcasts signals containing base station ID information;
Step 3: each ultra-wideband positioning base station receives the positioning broadcast message; its high-precision clock provides a high-precision UWB reception timestamp, its ultra-wideband (UWB) transceiver passes the reception timestamp and the broadcast message to the data processing unit to obtain the base station ID information, and the reception timestamp and base station ID information are transmitted to the VR base unit;
Step 4: from the received reception timestamps, the positioning engine first calculates the timestamp differences from the ultra-wideband positioning module to the ultra-wideband positioning base stations, i.e., the TDOA (time difference of arrival) measurements, from which the specific coordinates of the ultra-wideband positioning module can be calculated;
Step 5: the VR base unit transmits image information to the VR basic equipment: after the position information of the VR basic equipment is acquired, the VR scene and picture are processed by the processor according to the position information, and the image information is then transmitted to the VR basic equipment through the connecting line of the VR handheld controller. The direction sensor collects the user's orientation, and the VR-to-handheld-controller connecting line provides the transmission interface for image data and control information.
Based on the above description of the prior art, the ultra-wideband positioning module in that tracking method only performs position fixing; the direction is determined by the direction sensor. A direction sensor (also called an orientation sensor) typically works by measuring the component forces of a weight (the weight being integrated with piezoelectric plates) in directions orthogonal to gravity to determine the horizontal orientation. "Orientation sensor" generally refers to a gyroscope or another instrument that helps identify orientation; such an instrument requires calibration before or during use to determine its initial orientation or to correct the deviation accumulated during use, and this calibration step is required every time it is used, which degrades the user experience. In addition, current orientation sensors such as gyroscopes, acceleration sensors and magnetometers suffer from accumulated error during use: after long-term use, the error in the direction estimate becomes large, which undermines the realism of the virtual picture and causes vertigo during prolonged use. Because the method in the aforementioned prior patent "VR positioning and tracking system based on ultra-wideband positioning and tracking method thereof" also uses a direction sensor, it cannot avoid the accumulation of direction-sensor error in the direction estimate; after long-term use, the experience of the virtual reality user deteriorates and a feeling of dizziness arises.
For example, in an existing VR application scenario, a single-user virtual reality visual scene is shown in Figs. 2a and 2b. After a user wearing a head-mounted display device HMD rotates in place from an initial position A to a target position B, as shown in Fig. 2a, the direction sensor accumulates a certain error during the movement, which causes a stationary virtual object in the virtual reality visual scene to deviate from position A' to position B', as shown in Fig. 2b. The displacement perceived by the user is thus inconsistent with the unintended movement of the stationary object in the visual scene, producing a feeling of visual vertigo.
As another example, in an existing VR application scenario, a multi-user virtual reality visual scene is shown in Figs. 3a and 3b. The position and orientation sensors of user A and user B, each wearing a head-mounted display device HMD, are independent, as shown in Fig. 3a. After a period of use, a certain directional error accumulates between them, so that user A sees an object in the virtual reality visual scene at position A' while user B sees the same object at position B', as shown in Fig. 3b. Different users therefore perceive the same object at inconsistent locations, which spoils the interactive feel of multi-user virtual reality scenes; under such conditions, interaction between different users cannot be realized, i.e., multi-user interactive virtual scene games and other VR applications requiring multi-user interaction become impossible.
Because current direction sensors suffer from accumulated error in direction positioning during use, the problem must be addressed by adding a calibration step, and that calibration step degrades the user experience; this creates a certain obstacle to the popularization of virtual reality systems that rely on direction sensors.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an ultra-wideband positioning virtual reality system and a method for realizing the position and the direction.
The invention provides an ultra-wideband positioning virtual reality system, characterized by comprising: one or more movable display devices, a virtual image processing device, a positioning data processing device, one or more UWB positioning anchors whose positions are fixed in advance, and one or more groups of movable UWB positioning tags, where each group of UWB positioning tags comprises two or more UWB positioning tags. The movable display device is directly connected to the virtual image processing device by wire; the virtual image processing device is directly connected to the positioning data processing device by wire. A group of UWB positioning tags is arranged at different spatial positions on each movable display device, and each UWB positioning tag of the group is directly or indirectly connected by wire to the virtual image processing device and directly or indirectly connected by wire to the positioning data processing device. The UWB positioning anchor is directly connected to the positioning data processing device by wire; each UWB positioning tag and each UWB positioning anchor is equipped with an antenna, through which each UWB positioning tag is directly wirelessly connected to the UWB positioning anchors. A group of UWB positioning tags is fixed in position relative to the positioning target on the movable display device.
The invention provides a positioning method for realizing position and direction based on the system, which is characterized by comprising the following steps:
Step 1: start the ultra-wideband positioning virtual reality system and ensure clock synchronization within the system; the clock synchronization process is realized through interaction, according to a network time synchronization protocol or a precision time protocol, between the clock synchronization client modules in the UWB positioning tags and UWB positioning anchors of the system and the clock synchronization server module of the virtual image processing device;
Step 2: each UWB positioning tag periodically sends a positioning broadcast message, or exchanges positioning messages with a UWB positioning anchor, according to the selected positioning algorithm; the positioning messages contain the positioning anchor coordinate data, the identification information of each positioning tag and the other positioning data required by the positioning algorithm. The UWB positioning anchor acquires the required information and sends it to the positioning data processing device for position calculation. Within the same positioning period, the positioning data processing device locates in real time the accurate position coordinates of the group of UWB positioning tags on the movable display device, and calculates from these spatial data the position and direction of the positioning target on the movable display device.
Step 3: the positioning data processing equipment sends the positioning result of the position and the direction of the movable display equipment to the virtual image processing equipment in real time, and the virtual image processing equipment sends accurately matched virtual picture data to the movable display equipment in real time based on the received positioning result data of the position and the direction of the movable display equipment and a virtual reality space coordinate system.
Step 4: and the movable display equipment receives the virtual picture data sent by the virtual image processing equipment and projects the virtual picture data to a display screen of the movable display equipment.
Step 5: and repeating the processing in steps 2-4 according to the set refreshing frequency until the virtual reality system is closed, so that the virtual picture obtained by the movable display equipment in real time at any moment can accurately reflect the visual scene corresponding to the real-time moving state of the movable display equipment.
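The essence of step 2, deriving a direction as well as a position from a group of tags, can be sketched as follows. With two tags whose mounting positions on the movable display device are known (say one "front" and one "rear" tag), the device pose follows from the tags' UWB position fixes alone, with no direction sensor and hence no accumulated drift. The names and geometry here are illustrative, not the patent's:

```python
import math

def pose_from_tags(front_tag, rear_tag):
    """Device position = midpoint of the two tag fixes;
    heading = angle of the rear->front vector, in radians,
    measured counter-clockwise from the +x axis."""
    (xf, yf), (xr, yr) = front_tag, rear_tag
    position = ((xf + xr) / 2.0, (yf + yr) / 2.0)
    heading = math.atan2(yf - yr, xf - xr)
    return position, heading

# Two tag fixes from the same positioning period:
position, heading = pose_from_tags((2.0, 3.0), (1.0, 2.0))
# midpoint (1.5, 2.5), heading 45 degrees
```

Because the heading is recomputed from fresh position fixes at every refresh, errors do not accumulate between refreshes, which is the property the invention relies on in place of a gyroscope or magnetometer.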
The invention has the following characteristics and beneficial effects:
1. The virtual reality system of the invention needs neither the direction sensor nor the tracking system of a traditional VR system and has a simple structure. Merely by arranging several UWB positioning tags on the movable display device or controller, it can more accurately determine the position and direction of the positioning target, so that the position and direction are more accurately matched with the virtual picture: the virtual reality visual scene stays fixed, and the visual scene no longer shifts with the movement of the VR head-mounted display device. The direction sensor and tracking system of a conventional virtual reality VR system are therefore unnecessary, and the effect is better;
2. the method can save the step of calibrating the position of the direction sensor in the existing virtual reality system, and is beneficial to improving the experience satisfaction degree in the use of the virtual reality VR;
3. the UWB positioning method has no accumulated error, so that the consistency of the direction and the position of each virtual reality VR visual scene can be kept in real time in a multi-user virtual reality VR scene and an interactive scene under the condition of no real-time calibration, and the consistency of the interactive experience effect of the multi-user virtual reality VR scene is ensured.
4. Because the UWB-based positioning method has a higher positioning frequency than sensor-based tracking, the virtual reality VR picture can be refreshed faster, making playback of the virtual picture smoother for the virtual reality VR user.
Drawings
FIG. 1 is a block diagram of a positioning and tracking system in the "VR positioning and tracking system based on ultra-wideband positioning and its positioning and tracking method";
FIGS. 2a and 2b are schematic diagrams of a single-user virtual reality visual scene of the prior art;
FIGS. 3a and 3b are schematic diagrams of a multi-user virtual reality visual scene of the prior art;
FIG. 4 is a diagram of a virtual reality system according to a first embodiment of the present invention;
FIG. 5 is a complete flow chart of a first implementation of the method of the present invention;
FIG. 6 is a virtual reality system composition diagram according to embodiment 1 of the present invention;
FIG. 7 is a schematic diagram of a single-user virtual reality environment scene in embodiment 1 of the present invention;
FIG. 8 is a schematic view of a visual scene of a virtual reality environment scene in embodiment 1 of the present invention;
FIG. 9 is a virtual reality system composition diagram according to embodiment 2 of the present invention;
FIG. 10 is a schematic diagram of a single-user virtual reality environment scene in embodiment 2 of the present invention;
FIG. 11 is a schematic view of a visual scene of a virtual reality environment scene in embodiment 2 of the present invention;
FIG. 12 is a virtual reality system composition diagram according to embodiment 3 of the present invention;
FIG. 13 is a schematic view of a multi-user virtual reality environment scene according to embodiment 3 of the present invention.
Detailed Description
The invention provides an ultra-wideband positioning virtual reality system and a positioning method for realizing position and direction, which are described below with reference to the accompanying drawings and specific embodiments:
As shown in fig. 4, the first ultra-wideband positioning virtual reality system provided by the present invention includes: one or more movable display devices, a virtual image processing device, a positioning data processing device, one or more UWB positioning anchor points fixed in position in advance, and one or more groups of movable UWB positioning tags, where a group of UWB positioning tags comprises two or more UWB positioning tags.
The movable display device is directly wired to the virtual image processing device; the virtual image processing device is directly wired to the positioning data processing device; a group of UWB positioning tags is arranged in different spatial directions on each movable display device, and each UWB positioning tag of the group is directly or indirectly wired to the virtual image processing device and to the positioning data processing device; each UWB positioning anchor point is directly wired to the positioning data processing device; each UWB positioning tag and each UWB positioning anchor point is provided with an antenna, through which they connect directly and wirelessly; a group of UWB positioning tags is fixed in position relative to the positioning target on the movable display device.
The specific functions and module compositions of each component of the VR system are described as follows:
The UWB positioning anchor point is a wireless communication service device that sends or receives the positioning information required by the selected positioning algorithm and cooperates with the UWB positioning tags to locate the position and direction of the positioning target on the movable display device.
A UWB positioning tag is a wireless communication terminal device that can transmit or receive positioning messages carrying the information required by the selected positioning algorithm. A group of UWB positioning tags in the system of the invention is two or more UWB positioning tags arranged on a movable display device (one group of UWB positioning tags is provided for each movable display device in the virtual reality system). Each group of UWB positioning tags cooperates with all UWB positioning anchor points to locate the positioning target on the movable display device. The tag mainly comprises a positioning communication terminal module, a wireless transceiver module, an antenna module, a clock synchronization client module, a power supply module, a wired communication interface and other devices.
The movable display device is an output device that shows the received virtual image on a display screen. It mainly comprises an image output module (including the display screen and lenses, which serve as the positioning target), a communication module for receiving and transmitting virtual reality data, a power supply module, and one or more communication interfaces. It displays the received virtual image on its display screen, which is fixed in front of the user's eyes to output the virtual picture.
The virtual image processing device obtains the positioning result of the position and direction of the positioning target, based on the UWB positioning tags and UWB positioning anchor points, matches it against the virtual environment coordinate system, and generates the virtual environment image corresponding to that positioning result. It mainly comprises a virtual image processing module that renders virtual images from position and direction positioning results, a virtual environment database module, a UWB communication module for receiving and transmitting data, a clock synchronization server module, a power supply module, one or more communication interfaces, and the like. It can be realized by an intelligent terminal device such as a smartphone or computer host equipped with a UWB module.
The components are basically the same as those of the known VR system based on ultra-wideband positioning in composition and function.
The positioning data processing device of the invention calculates the position of each UWB positioning tag in a group in real time, using the positioning data received from the UWB positioning anchor points or UWB positioning tags and the calculation formula of the positioning algorithm; it then derives the position and direction of the positioning target on the movable display device from the pre-configured spatial data of the UWB positioning tags, whose positions are fixed relative to that target, and sends the positioning result to the virtual image processing device. The spatial data here refers to the number identification of the movable display device and the number identification of each UWB positioning tag's position relative to that device, or a single number identification combining the two. The positioning data processing device mainly comprises a storage module storing the wireless positioning algorithm, a database module storing the spatial data in advance, a positioning data processing module, a communication module, a power supply module, one or more communication interfaces, and the like. The positioning algorithms stored in the storage module may include any one or more indoor positioning algorithms such as TOF, TDOA, AOA and PDOA.
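As an illustration of the spatial data described above, the following sketch shows how each UWB positioning tag's identification can be mapped to its movable display device and fixed relative position. The tag identifiers, device numbers and layout here are assumptions for illustration, not the patent's actual data format.

```python
# Hypothetical spatial-data table: each tag id maps to
# (movable display device number, relative-position number).
SPATIAL_DATA = {
    "T101-1": (101, "front"),
    "T101-2": (101, "back"),
    "T101-3": (101, "left"),
    "T101-4": (101, "right"),
}

def tags_of_device(device_id):
    """Collect the tags of one movable display device, keyed by
    their relative position on that device."""
    return {pos: tag for tag, (dev, pos) in SPATIAL_DATA.items()
            if dev == device_id}

# The positioning data processing device can then pair each tag's
# real-time coordinates with its relative position to derive the
# position and direction of the positioning target.
print(tags_of_device(101)["front"])  # T101-1
```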
The virtual reality system components in the invention can form different final VR system embodiment forms by different combination and integration modes, such as: the mobile display device and each UWB positioning tag in a corresponding set of UWB positioning tags may be integrated in one device, and the virtual image processing device and the positioning data processing device may be integrated in one device.
The overall flow of the method for realizing position and direction positioning with the first virtual reality system is shown in fig. 5 and comprises the following detailed steps:
Step 1: start the ultra-wideband positioning virtual reality system and ensure clock synchronization within the system. Clock synchronization is achieved through interaction between the clock synchronization client modules in the UWB positioning tags and UWB positioning anchor points and the clock synchronization server module of the virtual image processing device, using existing clock synchronization technologies such as the Network Time Protocol (NTP) or the IEEE 1588 Precision Time Protocol;
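The synchronization exchange in step 1 can be illustrated with the classic NTP four-timestamp estimate. This is a minimal sketch of the standard NTP offset/delay calculation, not code from the patent:

```python
def ntp_offset(t1, t2, t3, t4):
    """Classic NTP estimate from four timestamps:
    t1 client send, t2 server receive, t3 server send, t4 client receive.
    Returns (clock offset of server relative to client, round-trip delay)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: server clock 0.5 s ahead, 0.1 s one-way delay.
offset, delay = ntp_offset(0.0, 0.6, 0.7, 0.3)
print(offset, delay)  # 0.5 0.2
```

In the system above, a clock synchronization client module in a tag or anchor point would play the client role and the virtual image processing device's server module the server role.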
Step 2: each UWB positioning tag periodically sends positioning broadcast messages, or exchanges positioning messages with the UWB positioning anchor points, according to the selected positioning algorithm; the positioning messages comprise anchor point coordinate data, the identification information of each positioning tag, and the other positioning data required by the algorithm. The UWB positioning anchor points collect the required information and send it to the positioning data processing device for positioning calculation. Within the same positioning period, the positioning data processing device locates the accurate position coordinates of the group of UWB positioning tags on the movable display device in real time and calculates the position and direction of the positioning target on the movable display device from the spatial data. (Because the UWB positioning tag identification information includes the number of the corresponding movable display device and the position number of each positioning tag, and the position number represents the spatially fixed relationship between each positioning tag and the positioning target (display screen) on the movable display device, the coordinates and orientation of the positioning target (display screen) can be determined in real time from the real-time position coordinates of the group's UWB positioning tags.)
The positioning algorithms selected by this step may include TDOA, AOA, PDOA, and other positioning algorithms.
Step 3: the positioning data processing equipment sends the positioning result of the position and the direction of the movable display equipment to the virtual image processing equipment in real time, and the virtual image processing equipment sends accurately matched virtual picture data to the movable display equipment in real time based on the received positioning result data of the position and the direction of the movable display equipment and a virtual reality space coordinate system.
Step 4: and the movable display equipment receives the virtual picture data sent by the virtual image processing equipment and projects the virtual picture data to a display screen of the movable display equipment.
Step 5: repeat the processing of steps 2-4 at the set refresh frequency until the virtual reality system is closed, so that the virtual picture obtained by the movable display device in real time at any moment accurately reflects the visual scene corresponding to the real-time movement state of the movable display device.
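The steps 2-4 refresh loop can be sketched as follows. All function bodies are hypothetical stand-ins for the devices described above, and the refresh rate is an arbitrary example:

```python
import time

def locate_display():
    # step 2: positioning data processing device (stubbed pose)
    return {"pos": (1.0, 2.0, 1.6), "dir": (1.0, 0.0, 0.0)}

def render_frame(pose):
    # step 3: virtual image processing device
    return f"frame@{pose['pos']}/{pose['dir']}"

def display(frame):
    # step 4: movable display device projects the frame
    pass

def run(refresh_hz, n_frames):
    period = 1.0 / refresh_hz
    frames = []
    for _ in range(n_frames):  # in practice: until the system is closed
        start = time.monotonic()
        pose = locate_display()
        frame = render_frame(pose)
        display(frame)
        frames.append(frame)
        # sleep off the remainder of the refresh period
        time.sleep(max(0.0, period - (time.monotonic() - start)))
    return frames

frames = run(refresh_hz=90, n_frames=3)
```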
The method can bring an immersive virtual reality space visual perception effect to a user using the virtual reality system.
The invention has the following beneficial effects:
1. the invention arranges several UWB positioning tags on the movable display device, so that the position of the device and the direction of its display screen can be matched more accurately to the virtual picture image, namely: the virtual reality visual scene stays fixed and does not shift as the movable display device moves;
2. the virtual reality system replaces the position sensor and direction sensor of the original virtual reality system and overcomes the motion-sickness (dizziness) problem of virtual reality;
3. the invention has no accumulated-error problem and therefore needs no calibration step;
4. because the UWB positioning frequency is higher than that of a direction sensor, the VR picture can be refreshed faster, making playback of the virtual picture smoother.
The invention also provides a second virtual reality system based on UWB positioning and an implementation method thereof, which adds one or more groups of controllers to the first system. A group of controllers can comprise one or more controllers; position changes of a controller and the operation of its instructions are fed back to the virtual reality world to realize interaction and similar actions. A group of controllers is supplied as a kit together with a movable display device; each controller is directly wired to the virtual image processing device; and a group of UWB positioning tags, comprising one or more UWB positioning tags, is mounted on each controller and wired directly to it.
The functions of the second system and the added devices of the first system are described as follows:
The controller is a handheld or wearable device with which the virtual reality system tracks the actions and gestures or body postures of a virtual reality user, such as a virtual reality data glove, a virtual reality three-dimensional controller mouse or a virtual reality controller ring; position changes of the controller and the operation of its instructions are fed back to the virtual reality world to realize interaction and similar actions. The controller mainly comprises a control input module, a control processing module, a control output module, a power supply module, a communication interface and other devices. A group of controllers is supplied as a kit together with a movable display device and may contain one or more controllers depending on the application scenario. Each controller is directly wired to the virtual image processing device; a group of UWB positioning tags, comprising one or more UWB positioning tags, is mounted on each controller and wired to it.
In addition, the virtual image processing apparatus and the positional data processing apparatus of the above system may be plural.
The plurality of virtual image processing devices respectively correspond to the plurality of movable display devices, namely: each movable display device is bound with a virtual image processing device, and each virtual image processing device only processes the processing of the virtual image related to the corresponding movable display device. The functions and constituent modules of the single virtual image processing apparatus are the same as those of the virtual image processing apparatus in the first system.
The plurality of positioning data processing devices also correspond to the plurality of movable display devices, namely: each movable display device is bound with a positioning data processing device, each positioning data processing device only processes corresponding movable display device related positioning data, and a plurality of positioning data processing devices are connected with one or more UWB positioning anchors in the system through wires and acquire corresponding movable display device related positioning data from the UWB positioning anchors. The function and the constituent modules of the single positioning data processing device are the same as those of the positioning data processing device in the first system.
The components of the second ultra-wideband positioning virtual reality system of the present invention can form different final virtual reality system embodiments through different combinations and integrations, for example: in addition to the integration options of the first UWB-positioning-based virtual reality system, a movable display device, its corresponding virtual image processing device, positioning data processing device and group of UWB positioning tags can be integrated into one device (if the system includes several movable display devices, a corresponding number of such integrated devices can be provided), and a controller and its corresponding group of UWB positioning tags can be integrated into one device.
The method for implementing position and direction positioning with the second virtual reality system differs from the first implementation method as follows:
Step 1 further comprises clock synchronization between the virtual image processing device and the group of UWB positioning tags installed on each controller in each group of controllers;
Step 2 further comprises numbering, in advance, the relative position of each UWB positioning tag of the group on the controller; the attitude of the controller, i.e. its position and direction positioning result, is then located from the positioning results of the group of UWB positioning tags and their corresponding position numbers and mapped into the virtual reality visual scene, realizing a realistic visual perception in which the virtual reality matches expectation.
Step 3 further sends the position and direction positioning result of the controller to the virtual image processing device, which sends accurately matched virtual picture data to the movable display device in real time based on the received controller positioning result and the virtual reality space coordinate system; control instructions input to the controller are processed by the controller and sent to the virtual image processing device over the wired connection, and the virtual image processing device generates virtual picture data for the corresponding operation from the received control instruction and sends it to the movable display device;
Step 4: the movable display device receives the accurately matched virtual picture data corresponding to the controller, generates the virtual image of the operation based on the control instruction sent by the controller, and projects it onto its display screen.
Step 5: as in step 5 of the first implementation method.
The implementation method adds the following effects on the basis of the first implementation method:
Arranging a group of UWB positioning tags on the controller locates the position and direction of the controller more accurately; combined with the instructions sent by the controller in the prior art, a corresponding virtual effect can be formed in virtual reality, realizing more accurate matching of virtual picture images and more vivid interactive operation effects, and this can also be used in multi-user virtual reality scenes.
On the basis of the above two kinds of virtual reality systems and implementation methods based on UWB positioning, the present invention specifically describes the systems and implementation methods through the following 3 embodiments, and the following various embodiments are used to illustrate the content of the present invention, but not to limit the scope of the present invention.
Embodiment 1 is a virtual reality system and an implementation method for a single-user scene, described below with reference to the accompanying drawings. The VR system of embodiment 1 is configured as shown in fig. 6 and includes: a movable display device, a virtual image processing device, a positioning data processing device, three UWB positioning anchor points and four UWB positioning tags. The four UWB positioning tags are connected to the movable display device with fixed relative positions, and the three UWB positioning anchor points are fixed in position with known coordinates.
The movable display device is directly connected with the virtual image processing device through a wired communication interface; each UWB positioning anchor point is directly connected with the positioning data processing device through a wired communication interface; the positioning data processing device is connected with the virtual image processing device through a wired communication interface, so each UWB positioning anchor point is indirectly connected with the virtual image processing device; each UWB positioning tag is connected with the movable display device through a wired communication interface and thereby indirectly wired to the virtual image processing device; and each UWB positioning tag is connected with each UWB positioning anchor point through a wireless communication interface. The four UWB positioning tags are installed on the movable display device in the front, back, left and right directions of its display screen and identified accordingly; the UWB positioning tags at the different positions are bound in advance, through their corresponding identifications, to the front, back, left and right directions of the virtual reality environment. In addition, the distance between every two of the four UWB positioning tags must be more than twice the positioning accuracy of the UWB positioning technology used; for example, if the UWB positioning accuracy is 5 cm, the distance between every two UWB positioning tags must be more than 10 cm.
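The tag-spacing requirement can be checked with a short sketch. The coordinates and accuracy values below are illustrative assumptions:

```python
import math

def spacing_ok(tag_positions, accuracy_m):
    """Return True if every pair of tags is more than
    2 * accuracy apart, as the embodiment requires."""
    min_spacing = 2 * accuracy_m
    tags = list(tag_positions)
    for i in range(len(tags)):
        for j in range(i + 1, len(tags)):
            if math.dist(tags[i], tags[j]) <= min_spacing:
                return False
    return True

# Four tags mounted front/back/left/right of the display screen,
# 15 cm from its centre (coordinates in metres).
tags = [(0.15, 0, 0), (-0.15, 0, 0), (0, 0.15, 0), (0, -0.15, 0)]
print(spacing_ok(tags, 0.05))  # True: closest pair is ~21 cm apart, > 10 cm
```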
The specific components and implementation functions of each component of the VR system are introduced as follows:
The composition and function of the movable display device, the virtual image processing device, the positioning data processing device, the UWB positioning anchor points and the UWB positioning tags in this embodiment 1 are substantially the same as in the first UWB-positioned virtual reality VR system; the differences are described below:
the positioning algorithm supported by the UWB positioning anchor point on the basis of the corresponding entity of the first UWB positioning virtual reality VR system is specifically a TDOA positioning algorithm, wherein the positioning communication base station module is a positioning communication base station module for determining and supporting the processing of the TDOA positioning algorithm.
The UWB positioning tag is a wireless communication terminal device which can send a positioning broadcast message carrying a TDOA-based positioning algorithm on the basis of a first UWB positioning virtual reality VR system corresponding entity, wherein the positioning communication terminal module is a positioning communication terminal module which is determined to support the TDOA positioning algorithm processing.
On the basis of its counterpart in the first UWB-positioned virtual reality VR system, the movable display device is determined to have five wired communication interfaces: one connected to the virtual image processing device and the other four connected to the UWB positioning tags.
The virtual image processing device is specifically determined to be composed of two wired communication interfaces connected with the movable display device and the positioning data processing device on the basis of a first UWB-positioned virtual reality VR system corresponding entity.
The positioning data processing equipment specifically determines a stored positioning algorithm as a TDOA (time difference of arrival) positioning algorithm on the basis of a corresponding entity of a first UWB-positioned virtual reality VR (virtual reality) system, the number of wired communication interfaces supported by the positioning data processing equipment is determined to be four, one wired communication interface is used for being connected with virtual image processing equipment, and the remaining three wired communication interfaces are directly connected with three UWB positioning anchors.
The virtual scene of this embodiment is shown schematically in fig. 7, where the gridded blocks represent virtual images generated in the virtual reality environment (absent from the actual reality environment) and the solid-line parts are the real, physical components of the VR system in the actual environment. In fig. 7, 101 and 101' represent the movable display device, 111 the virtual image processing device, 121-123 the three UWB positioning anchor points, 124 the positioning data processing device, 131-134 and 131'-134' the four UWB positioning tags U1, U2, U3, U4, 141 the VR user, 151-157 the visual scenes at certain positions during the VR user's movement, 161 the VR user's movement track, 171-175 virtual obstacles in the virtual reality world that do not actually exist, and 181 and 181' the direction of the display screen of the movable display device (i.e., the VR user's eye viewing direction).
The 4 UWB positioning tags are fixed in advance at relative positions on the movable display device so as to obtain the spatial data of their positional relationships, including the corresponding position numbers: the 4 UWB positioning tags fixed in front of, behind, to the left of and to the right of the display screen lie uniformly on a circle whose center is exactly the display screen U0. With the coordinates of the front tag U1, rear tag U2, left tag U3 and right tag U4 denoted VR_Loc_Front(x1, y1, z1), VR_Loc_Back(x2, y2, z2), VR_Loc_Left(x3, y3, z3) and VR_Loc_Right(x4, y4, z4), the coordinates of the display screen center point U0(x0, y0, z0) are expressed as follows:

x0 = (x1 + x2 + x3 + x4)/4, y0 = (y1 + y2 + y3 + y4)/4, z0 = (z1 + z2 + z3 + z4)/4
The orientation of the display screen on the movable display device may then be determined by the direction of the line from U0(x0, y0, z0) to U1(x1, y1, z1), or by the direction of the line from U2(x2, y2, z2) to U1(x1, y1, z1).
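The center-point and orientation computation just described can be sketched as follows. The coordinates are illustrative, and the function name is an assumption:

```python
import math

def screen_pose(front, back, left, right):
    """Centre U0 = average of the four tag coordinates;
    orientation = unit vector from U0 toward the front tag U1
    (equivalently the U2 -> U1 direction)."""
    tags = (front, back, left, right)
    u0 = tuple(sum(t[i] for t in tags) / 4 for i in range(3))
    d = tuple(front[i] - u0[i] for i in range(3))
    norm = math.sqrt(sum(x * x for x in d))
    direction = tuple(x / norm for x in d)
    return u0, direction

# Tags 15 cm from the screen centre, user at (1, 2) facing +x.
u0, direction = screen_pose((1.15, 2.0, 1.6), (0.85, 2.0, 1.6),
                            (1.0, 2.15, 1.6), (1.0, 1.85, 1.6))
print(u0, direction)  # (1.0, 2.0, 1.6) (1.0, 0.0, 0.0)
```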
The implementation method of the embodiment specifically comprises the following steps:
Step 1: after the VR system starts, the clock synchronization client modules on the three UWB positioning anchor points 121-123 automatically and periodically synchronize with the clock synchronization server module on the virtual image processing device 111. The three UWB positioning anchor points 121-123 receive the positioning broadcast messages sent by the UWB positioning tags 131-134 in front of, behind, left of and right of the display screen on the movable display device, record the corresponding receiving timestamps, and send the recorded timestamps to the positioning data processing device 124. The positioning data processing device 124 calculates the time differences between the receiving timestamps recorded by the three anchor points and from them calculates the accurate position coordinates VR_Loc_Front(x1, y1, z1), VR_Loc_Back(x2, y2, z2), VR_Loc_Left(x3, y3, z3) and VR_Loc_Right(x4, y4, z4) of the four tags;
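The TDOA quantities used in this step can be illustrated with a forward-model sketch (assumed anchor/tag geometry; this simulates the timestamps rather than implementing the patent's solver): the receiving-timestamp differences, multiplied by the speed of light, equal range differences to the anchors, which constrain the tag position.

```python
import math

C = 299_792_458.0  # speed of light, m/s

# Assumed positions (metres): three fixed anchors and one tag.
anchors = [(0.0, 0.0, 2.5), (8.0, 0.0, 2.5), (4.0, 6.0, 2.5)]
tag = (3.0, 2.0, 1.6)  # unknown in practice; known here only to simulate

t0 = 1.0e-3  # tag's transmit time -- unknown to the anchors
arrivals = [t0 + math.dist(a, tag) / C for a in anchors]  # receive timestamps

# Differencing against anchor 0 cancels the unknown transmit time t0.
tdoa = [arrivals[i] - arrivals[0] for i in (1, 2)]
range_diffs = [dt * C for dt in tdoa]
# Each range difference equals |anchor_i - tag| - |anchor_0 - tag|;
# the positioning data processing device intersects the corresponding
# hyperboloids to recover the tag coordinates.
```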
Step 2: the positioning data processing device 124 obtains, through calculation, the positioning results VR_Loc_Front(x1, y1, z1), VR_Loc_Back(x2, y2, z2), VR_Loc_Left(x3, y3, z3) and VR_Loc_Right(x4, y4, z4) of the front, back, left and right UWB positioning tags 131-134 of the movable display device 101, and identifies the position of the movable display device 101 worn by the VR user 141 as the center point of the four tag positions:

x0 = (x1 + x2 + x3 + x4)/4, y0 = (y1 + y2 + y3 + y4)/4, z0 = (z1 + z2 + z3 + z4)/4

It also identifies the orientation the VR user 141 faces through the movable display device 101 from the positioning results of the front, back, left and right UWB positioning tags 131-134, namely the direction represented by the front UWB positioning tag 131: from the display screen center point U0(x0, y0, z0) to the front tag U1(x1, y1, z1), or from the rear tag U2(x2, y2, z2) to the front tag U1(x1, y1, z1). The positioning data processing device 124 sends the U0 position result and the direction result of the movable display device 101 to the virtual image processing device 111, and the virtual image processing device 111 sends accurately matched VR visual scene virtual picture data to the movable display device 101 in real time according to the recognized position and direction of the movable display device 101;
Step 3: the movable display device 101 receives the VR virtual picture data from the virtual image processing device 111 and projects it onto the VR display screen, so that the VR user 141 in the VR system perceives the realistic 3D effect of the virtual environment according to his real-time position and facing direction, namely: the movable display device 101 of the VR user 141 receives the 3D virtual picture according to its own real-time position and attitude, producing an immersive visual effect;
step 4: in the VR environment scene shown in fig. 7, when the VR user 141 moves to the destination position 141' according to the path line 161, the movable display device 101 continuously receives the VR visual scene virtual image data fed back by real-time positioning in real time, and in detail, refer to fig. 8 for the VR visual scene 151-157 during the movement of the VR user 141.
In this embodiment, after the VR user 141 turns left or right at any point on the path 161, the virtual reality visual scenes 151-157 of the VR system remain fixed, as shown in fig. 8, namely: objects originally in the north remain in the north, objects in the south remain in the south, objects in the east remain in the east and objects in the west remain in the west; they do not rotate with the VR user 141. Specifically, the UWB positioning tags on the left, right, front and back of the movable display device locate the user's position and eye-facing direction in real time and feed them back to the host device, and the host device feeds back and plays the VR visual scene data corresponding to the changes in the user's position and eye-facing direction on the movable display device, so that originally static objects in the virtual scene stay in place, giving the VR user 141 a vivid, realistic visual effect in the VR system.
The effects of this embodiment 1 are as follows:
1. four UWB positioning tags are arranged on the movable display device, so that the position and the facing vector direction of the movable display device can be determined more accurately, and a virtual picture matching that more accurate position and direction can be rendered, that is: the virtual reality visual scene stays fixed and does not shift with the movement of the virtual reality user, thereby replacing the position sensor and direction sensor of the original VR system;
2. the method removes the direction-sensor calibration steps of existing virtual reality systems, which helps improve the experience satisfaction of VR users;
3. the positioning frequency of UWB is higher than that of a direction sensor, so the VR picture can be refreshed faster and playback of the virtual picture feels smoother to the VR user.
The invention provides embodiment 2 of a VR system based on UWB positioning and its implementation method. The architecture of the VR system is shown in fig. 9, and its components are: a movable display device, a positioning data processing device, a virtual image processing device, two controllers, a UWB positioning anchor point and four UWB positioning tags. Relative to the system of embodiment 1, this embodiment removes two UWB positioning anchor points, adds two controllers, removes two of the UWB positioning tags originally connected to the movable display device, and adds one UWB positioning tag to each of the two controllers.
The movable display device is directly connected with the virtual image processing device; the UWB positioning anchor point is directly connected with the positioning data processing device; the positioning data processing device is directly connected with the virtual image processing device; two UWB positioning tags are directly connected with the movable display device, and the other two UWB positioning tags are directly connected with the controllers; each UWB positioning tag is wirelessly connected with each UWB positioning anchor point, and all four UWB positioning tags are indirectly connected with the virtual image processing device by wire. In addition, the distance requirement between each pair of the four UWB positioning tags is the same as in embodiment 1.
The components of this VR system have the same composition and functions as the identically named components in embodiment 1, with the following differences:
the positioning algorithm supported by the UWB positioning anchor point, the UWB positioning tag, and the positioning data processing device is not a TDOA positioning algorithm, but an AOA angle positioning algorithm and a TOF ranging algorithm.
The five wired communication interfaces on the movable display device are respectively one wired communication interface connected with the virtual image processing device, two wired communication interfaces connected with the two UWB positioning tags and two wired communication interfaces connected with the two controllers.
On the basis of the system of embodiment 1, the virtual image processing device can also dynamically generate a virtual image corresponding to the movement and operation of the controller according to the position of the controller in the virtual environment, for example: a virtual picture of the virtual user aiming and shooting, and of the bullet flying after the shot. The number of wired communication interfaces is reduced to two: one connected to the movable display device and one connected to the UWB positioning anchor point.
The added controller is a device with which the VR system tracks VR user actions or gestures, typically one in each of the left and right hands. It mainly comprises a control module, a communication module for receiving and transmitting control commands, a power supply module, and two communication interfaces connected with the movable display device and with a UWB positioning tag.
The two UWB positioning tags on the movable display device are fixed in relative position and numbered by position in advance: one tag is installed in front of the display screen and one is fixed behind it. With the coordinates of the front UWB positioning tag U1 and the rear UWB positioning tag U2 being VR_Loc_GlsFront(x1, y1, z1) and VR_Loc_GlsBack(x2, y2, z2), the display screen center point U0 is the midpoint of the line connecting U1 and U2, so the center point coordinates U0(x0, y0, z0) are:
x0 = (x1 + x2)/2, y0 = (y1 + y2)/2, z0 = (z1 + z2)/2
the orientation of the display screen on the movable display device may be determined by the direction of the line from U0(x0, y0, z0) to U1(x1, y1, z1), or equivalently the direction of the line from U2(x2, y2, z2) to U1(x1, y1, z1).
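The midpoint and direction computation above can be sketched as follows; the function name is hypothetical, and the tag coordinates are assumed to have been resolved already:

```python
def screen_center_and_direction(u1, u2):
    """U1 = front tag, U2 = back tag on the movable display device.
    Returns the display-screen center U0 (the midpoint of U1-U2) and
    the screen direction as a unit vector from U2 toward U1."""
    u0 = tuple((a + b) / 2.0 for a, b in zip(u1, u2))
    d = tuple(a - b for a, b in zip(u1, u2))
    n = sum(c * c for c in d) ** 0.5 or 1.0
    return u0, tuple(c / n for c in d)
```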
The two UWB positioning tags on the left and right controllers are fixed in relative position and numbered by position in advance, and their virtual images are stored in the virtual environment database. The coordinates of the two UWB positioning tags U3 and U4 are VR_Loc_CtlLeft(x3, y3, z3) and VR_Loc_CtlRight(x4, y4, z4), and the positions of the left and right controllers in the virtual reality world are determined by matching the positioning results of U3 and U4 against the virtual environment coordinate system.
Fig. 10 shows a schematic diagram of the virtual reality scene of embodiment 2, in which the meshed blocks represent virtual reality images generated in the virtual reality environment (not the real environment), and the solid-frame parts are the physical components of the VR system in the real environment. In fig. 10, 201 and 201' represent the movable display device before and after movement, 211 represents the virtual image processing device, 221 represents the UWB positioning anchor point, 222 represents the positioning data processing device, 231 to 234 and 231' to 234' represent the four UWB positioning tags on the movable display device and the VR controllers before and after movement, 241 represents the VR user, 251 to 257 represent visual scenes at positions along the VR user's movement, 261 represents the track route of the VR user's movement, 271 to 275 represent virtual obstructions that exist only in the virtual reality world, and 281 and 281' represent the directions of the display screen of the movable display device (i.e., the visual direction of the VR user's eyes).
Referring to fig. 10, the schematic diagram of the virtual reality scene in embodiment 2 of the present invention, this embodiment differs from embodiment 1 in that, instead of the TDOA positioning algorithm, a combination of the AOA angular positioning algorithm and the TOF ranging algorithm is used to locate the specific position of the VR user, the direction of the user's eyes, and the positions of the controllers. The method of this embodiment comprises the following steps:
step 1: after each device in the VR system is powered on, the UWB positioning anchor point 221, the UWB positioning tags 232 and 233 at the front and back of the movable display device 201, and the two UWB positioning tags 231 and 234 on the controllers automatically synchronize their clocks with the virtual image processing device 211. Each UWB positioning tag 231 to 234 sends positioning broadcast messages at a certain period and at staggered times; after the UWB positioning anchor point 221 receives a positioning broadcast message from a tag, it records the timestamps at which each of its two antennas received the message and sends the recorded data to the positioning data processing device 222, which, via the AOA angle positioning algorithm introduced in the background section of the invention, calculates the AOA arrival angle α of each of the UWB positioning tags 231 to 234 at the front and back of the movable display device and on the left and right controllers.
In addition, each UWB positioning tag also exchanges, at a certain periodic interval, the positioning messages required by the TOF positioning algorithm with the UWB positioning anchor point acting as positioning initiator. The UWB positioning anchor point 221 records the timestamp at which the positioning message was sent and the timestamp at which the positioning response message was received, obtains from the positioning response message the interval, recorded by the UWB positioning tag, between the tag receiving the positioning message and sending the response, and sends this information to the positioning data processing device 222, which calculates a highly accurate ranging result D between the UWB positioning tag and the UWB positioning anchor point 221 via the TOF positioning algorithm formula introduced in the background section of the present invention. Combining the AOA arrival angle α and the ranging result D yields the polar coordinates of the UWB positioning tag with the UWB positioning anchor point 221 as the pole (radius coordinate VR_Loc_Radius_r = D, angular coordinate VR_Loc_Angle_θ = α); through coordinate conversion these yield the position coordinates VR_Loc_GlsFront(x1, y1, z1) of UWB positioning tag 232 and VR_Loc_GlsBack(x2, y2, z2) of UWB positioning tag 233 on the movable display device 201, and the position coordinates VR_Loc_CtlLeft(x3, y3, z3) of UWB positioning tag 231 and VR_Loc_CtlRight(x4, y4, z4) of UWB positioning tag 234 on the controllers;
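As a rough illustration of the two measurements combined in this step, the sketch below implements a generic single-sided two-way TOF range and the polar-to-Cartesian conversion. It is a simplification under stated assumptions (2-D, ideal clocks), not the patent's exact formulas, and all names are illustrative:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(t_poll_tx, t_resp_rx, tag_turnaround):
    """Single-sided two-way ranging: round-trip time seen by the anchor,
    minus the tag's reported turnaround delay, halved, times c."""
    return C * ((t_resp_rx - t_poll_tx) - tag_turnaround) / 2.0

def polar_to_xy(distance_d, angle_alpha_deg):
    """Convert the (D, alpha) fix, with the anchor as the pole, into
    anchor-centred Cartesian coordinates (2-D simplification)."""
    a = math.radians(angle_alpha_deg)
    return distance_d * math.cos(a), distance_d * math.sin(a)
```

A 200 ns round trip with zero turnaround, for instance, corresponds to roughly a 30 m range.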
step 2: the positioning data processing device 222 sends, in real time, the polar-coordinate positioning results (VR_Loc_Angle = α, VR_Loc_Distance = D) or the position coordinates VR_Loc_GlsFront, VR_Loc_GlsBack, VR_Loc_CtlLeft and VR_Loc_CtlRight of the UWB positioning tags 231 to 234 on the movable display device 201 and the controllers to the virtual image processing device 211. From these positioning results, the virtual image processing device 211 identifies the position of the movable display device 201 as the display screen center point:
x0 = (x1 + x2)/2, y0 = (y1 + y2)/2, z0 = (z1 + z2)/2
the direction of the display screen as the direction of the line from U0(x0, y0, z0) to U1(x1, y1, z1), or the direction of the line from U2(x2, y2, z2) to U1(x1, y1, z1), and the positions of the left and right controllers 231 and 234, i.e.: VR_Loc_CtlLeft(x3, y3, z3) and VR_Loc_CtlRight(x4, y4, z4); according to the recognition result, it sends an accurately matching VR virtual picture to the movable display device of the VR user 241 in real time;
step 3: after receiving the VR virtual image sent by the virtual image processing device 211, the movable display device 201 projects the VR virtual image onto a VR display screen, and completes a round of updating of the VR virtual image.
Step 4: in the VR environment scene of this embodiment 2, when the VR user 241 moves to the destination location 241' along the route, the movable display device 201 continuously receives the VR visual scene virtual image data fed back by real-time positioning at the locations 251 to 257 in fig. 10; refer to fig. 11 for the VR visual scenes at the different locations 251 to 257.
In fig. 11, when the VR user 241 turns left or right at any of the points 251 to 257 on the path, the visual scene of the VR system remains unchanged, that is: the virtual article originally in the north is still in the north, the one originally in the south is still in the south, the one originally in the east is still in the east, the one originally in the west is still in the west, and the scene does not rotate with the rotation of the VR user 241. The specific process is that the front and back UWB positioning tags 232 and 233 on the movable display device 201 yield, in real time, the specific position and direction of the display screen on the movable display device, and the result is fed back to the virtual image processing device 211; the virtual image processing device 211 generates the VR visual scene data corresponding to the changes in the position and direction of the display screen, sends it to the movable display device 201 and plays it on the display screen, so that originally static objects in the virtual scene stay in place at a distance, giving the VR user 241 a realistic virtual reality visual effect in the VR system.
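The feedback cycle described above (position the tags, derive the screen pose, render and play the matching frame) can be sketched as one refresh round; all three callables are hypothetical stand-ins for the corresponding devices:

```python
def vr_update_round(get_tag_fixes, solve_pose, render_frame):
    """One refresh round: read the UWB tag fixes (positioning data
    processing device), derive the screen position/direction from them,
    and return the matching virtual frame (virtual image processing
    device) for the display screen to play."""
    fixes = get_tag_fixes()
    pose = solve_pose(fixes)
    return render_frame(pose)
```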
The effects of embodiment 2 are substantially the same as those of embodiment 1, with the following specific difference:
two UWB positioning tags are arranged on the controllers (one each), so that the positions of multiple controllers can be located more accurately and controller virtual images matching those more accurate positions and directions can be rendered, thereby replacing the position sensor and direction sensor on the controllers of the original VR system.
Embodiment 3 of the VR system based on UWB positioning and its implementation method provided by the invention is described below with reference to the accompanying drawings:
the VR system of this embodiment 3, as shown in fig. 12, comprises: three movable display devices, one virtual image processing device, three sets of controllers, three UWB positioning anchor points, one positioning data processing device, and three UWB positioning tag groups. Each set of controllers includes two controllers, held in the left and right hands respectively. In addition, each UWB positioning tag group is divided into two subgroups, one for the movable display device and one for the controllers. The subgroup supporting the movable display device has four UWB positioning tags, that is: one UWB positioning tag placed in each of the front, rear, left and right directions relative to the display screen of the movable display device. The subgroup matched with each set of controllers has two UWB positioning tags, fixed on the left-hand and right-hand controllers respectively.
The three movable display devices are directly connected with the one virtual image processing device; the UWB positioning tags in the three UWB positioning tag groups are directly connected with the corresponding movable display devices and controllers; the three UWB positioning anchor points are directly connected with the one positioning data processing device; the positioning data processing device is directly connected with the virtual image processing device; and each UWB positioning tag is wirelessly connected with each UWB positioning anchor point.
The VR system of this embodiment 3 has substantially the same composition and functions for each entity as the VR system of embodiment 1, except for the controllers, whose composition and functions are substantially the same as those of the controllers of embodiment 2. The specific differences between the entities are as follows:
the number of UWB positioning tags is eighteen, and the corresponding number in both embodiment 1 and embodiment 2 is four.
The positioning algorithm supported by the UWB positioning anchor, the UWB positioning tag, and the positioning data processing device is a TDOA positioning algorithm, which is the same as the positioning algorithm of embodiment 1, but is different from the positioning algorithm of embodiment 2.
The number of movable display devices is three, and the corresponding number in embodiment 1 and embodiment 2 is one.
The number of wired communication interfaces on the virtual image processing device is four (the corresponding number in embodiment 1 and embodiment 2 is two), and a scheduling module is additionally provided, which performs positioning scheduling for each UWB positioning tag and each UWB positioning anchor point.
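The scheduling module must stagger the broadcasts of the eighteen tags so they do not collide within one positioning window. A minimal TDMA-style sketch, with hypothetical names and slot width:

```python
def assign_slots(tag_ids, slot_us=500):
    """Give each tag a distinct transmit offset (in microseconds)
    within one positioning window, so positioning broadcasts from
    different tags are staggered in time and never collide."""
    return {tag: i * slot_us for i, tag in enumerate(sorted(tag_ids))}
```

For eighteen tags and a 500 µs slot, one full window spans 9 ms, comfortably inside a typical UWB positioning refresh period.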
The number of controllers is six, and the corresponding number in embodiment 2 is two.
In the virtual scene diagram of embodiment 3 in fig. 13, the dashed meshed blocks represent virtual reality images generated in the virtual reality environment (not the real environment), while the solid-frame parts are the physical components of the VR system in the real environment. In fig. 13, 301 to 303 represent movable display devices, 311 represents the virtual image processing device, 321 to 323 represent UWB positioning anchor points, 324 represents the positioning data processing device, 331-1 to 336-1, 331-2 to 336-2 and 331-3 to 336-3 represent the six UWB positioning tags on each of the three VR users' movable display devices and VR controllers, 341 to 343 represent the three VR users, and 351 to 355 represent virtual shelters that exist only in the virtual reality world.
The implementation method of four UWB positioning tags on a mobile display device for position and orientation positioning is described with reference to the corresponding parts in embodiment 1, and the position positioning method of a controller is described with reference to the corresponding parts in embodiment 2.
Referring to fig. 13, a schematic diagram of a virtual scenario of embodiment 3 of the present invention, which is different from the virtual scenario of the previous embodiment in that there are multiple VR users in the scenario, that is: VR user 341, VR user 342, and VR user 343. The method steps of this example 3 are as follows:
step 1: after the VR system is started, the clock positioning client modules on the three UWB positioning anchor points 321 to 323 and on all UWB positioning tags automatically perform clock synchronization with the clock positioning server module on the virtual image processing device 311 at a certain periodic time interval;
step 2: with the system clocks synchronized, the scheduling module of the virtual image processing device 311 schedules the front, rear, left and right UWB positioning tags 333-1/2/3 to 336-1/2/3 on the movable display devices 301 to 303 worn by VR users 341, 342 and 343, and the UWB positioning tags 331-1/2/3 and 332-1/2/3 on the left and right controllers, to send their positioning broadcast messages in staggered time slots within the same time window. The three UWB positioning anchor points 321 to 323 each record the corresponding reception timestamps and send the recorded timestamp information to the positioning data processing device 324. The positioning data processing device 324 calculates the differences between the reception timestamps of the positioning broadcast messages of UWB positioning tags 331-1/2/3 to 336-1/2/3 recorded by the anchor points and, combining the TDOA positioning algorithm formula with the coordinate information of the three UWB positioning anchor points 321 to 323, computes in real time the accurate position coordinates VR1/2/3_Loc_GlsFront, VR1/2/3_Loc_GlsBack, VR1/2/3_Loc_GlsLeft and VR1/2/3_Loc_GlsRight of the movable display devices of VR users 341, 342 and 343, and the accurate position coordinates VR1/2/3_Loc_CtlLeft and VR1/2/3_Loc_CtlRight of the controller UWB positioning tags 331-1/2/3 and 332-1/2/3;
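To illustrate the TDOA calculation in this step, the sketch below localizes one tag in 2-D by brute-force search over candidate positions, minimizing the mismatch between measured and predicted time differences. A real implementation would use a closed-form or least-squares solver; all names here are illustrative:

```python
import itertools, math

C = 299_792_458.0  # speed of light in m/s

def tdoa_residual(pos, anchors, tdoas):
    """Sum of squared errors between the measured TDOAs (taken
    relative to anchors[0]) and those predicted for position pos."""
    d = [math.dist(pos, a) for a in anchors]
    return sum((C * t - (d[i] - d[0])) ** 2 for i, t in enumerate(tdoas))

def locate_grid(anchors, tdoas, span=10.0, step=0.05):
    """Scan a span-by-span grid and keep the candidate whose predicted
    time differences best match the measurements (illustrative only)."""
    xs = [i * step for i in range(int(span / step) + 1)]
    return min(itertools.product(xs, xs),
               key=lambda p: tdoa_residual(p, anchors, tdoas))
```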
step 3: the positioning data processing device 324 sends, in real time, the positioning results VR1/2/3_Loc_GlsFront, VR1/2/3_Loc_GlsBack, VR1/2/3_Loc_GlsLeft, VR1/2/3_Loc_GlsRight, VR1/2/3_Loc_CtlLeft and VR1/2/3_Loc_CtlRight of the four UWB positioning tags 333-1/2/3 to 336-1/2/3 on the front, back, left and right of the movable display devices 301 to 303 of VR users 341, 342 and 343 and of the tags on the controllers to the virtual image processing device 311. Using these positioning results and the same method for identifying the position and direction of a positioning target as in embodiment 1 and embodiment 2, the virtual image processing device 311 identifies the positions of the movable display devices 301 to 303, the left and right controller positions of VR users 341, 342 and 343, and the directions of the display screens on the corresponding movable display devices, and updates, in real time according to the identification results, the accurately matching virtual image data of the VR scene to the movable display devices 301 to 303 of VR users 341, 342 and 343;
step 4: the movable display devices 301 to 303 of VR users 341, 342 and 343 receive the VR virtual image data of the virtual image processing device 311 and project it onto their respective display screens, so that the VR users 341, 342 and 343 in the same VR system can each perceive the realistic effect of the virtual environment in one visual scene according to their own positions and facing directions, that is: the movable display devices 301 to 303 receive virtual reality pictures in real time according to their positions and postures, giving an immersive visual effect. In addition, the virtual image processing device 311 also realizes interaction among different VR users according to control key operations on the controllers, such as: virtual shooting by VR users, virtual bodily contact and collision between VR users, and exchange of virtual articles between VR users, thereby realizing multi-user interaction in the VR system;
step 5: the visual scene effects of the VR user 341, the VR user 342, and the VR user 343 in fig. 13 of this embodiment are as follows:
based on the position of the VR user 341 and the user's eye direction represented by the display screen on the movable display device, the virtual image processing device 311 generates an accurately matching VR virtual picture, realistically presenting the front virtual obstacles 351, 352 and 354; near the virtual obstacle 354 in the virtual reality scene, the side-back of the virtual avatar pre-selected by VR user 343 and the virtual hand image at the position corresponding to that user's controller can be seen, as well as the frontmost virtual obstacle 355.
Based on the user eye orientation information represented by the VR user 342's position and the display screen orientation on the movable display device, the virtual image processing device 311 generates an accurately matching VR virtual frame with the front virtual obstacles 354 and 355 realistically appearing.
Based on the user eye orientation information represented by the position of the VR user 343 and the display screen orientation on the movable display device, the virtual image processing device 311 generates an accurately matching VR virtual frame, realistically presenting the front virtual obstacles 353 and 355.
Step 6: the VR system executes the processing of steps 2 to 5 in real time according to the movement and control operations of VR users 341, 342 and 343 until the VR system is closed.
The visual scene effects above allow different users in the same VR system to perceive, in real time, the realistic effect of the virtual reality environment according to their own positions and the directions their eyes are facing, that is: the movable display devices of different users receive virtual reality pictures in real time according to the users' positions and postures, giving an immersive visual effect; a user can even see the virtual avatars of other users, presented in the virtual reality world based on those users' positions and eye directions, so that VR user interaction in the virtual environment is realized within the same VR system.
Embodiment 3 includes the implementation effects of embodiments 1 and 2 and, in addition, the following effect: the consistency of the direction and position of virtual entities across the VR visual scenes of multiple VR users can be maintained in multi-user and interactive VR scenes, guaranteeing the interactive experience of multi-user VR scenes.
Although the above embodiments describe VR, they do not limit the applicable scope of the present invention; the corresponding visual-scene stabilization technique may also be used in other scenes, such as: augmented reality (AR), mixed reality (MR) and other extended reality (XR) visual scenes. In addition, although the above embodiments each use particular positioning algorithms, the present invention does not limit the indoor positioning algorithm used, and achieving positioning in the present invention with an indoor positioning algorithm other than those referenced is within the scope of the present invention. The indoor positioning algorithms may be used singly, or two or more positioning algorithms may be used in combination.
Although the UWB positioning tags on the movable display device in the embodiments of the present invention are arranged either as two tags (front and back) or as four tags (front, back, left and right), this does not limit the protection scope of the present invention; the number of UWB positioning tags on the movable display device may be three or more, for example: a total of eight UWB positioning tags embedded at the front, rear, left, right, front-left, front-right, rear-left and rear-right positions of the movable display device. More UWB positioning tags provide finer-grained auxiliary information for VR user position and direction positioning and allow the position and direction of the movable display device to be determined more accurately.

Claims (7)

1. An ultra-wideband positioned virtual reality system, the system comprising: one or more movable display devices, a virtual image processing device, a positioning data processing device, one or more UWB positioning anchor points with fixed positions in advance and one or more groups of movable UWB positioning labels; the group of UWB positioning tags comprises two or more UWB positioning tags; the mobile display equipment is directly connected with the virtual image processing equipment through a wire; the virtual image processing equipment is directly connected with the positioning data processing equipment through a wire; a group of UWB positioning tags are arranged on different spatial directions of each movable display device, and each UWB positioning tag of the group is directly or indirectly connected with the virtual image processing device through a wire and is directly or indirectly connected with the positioning data processing device through a wire; the UWB positioning anchor point is directly connected with the positioning data processing equipment through a wire, each UWB positioning label and UWB positioning anchor point are respectively provided with an antenna, and each UWB positioning label is directly and wirelessly connected with the UWB positioning anchor point through the antenna; a set of UWB positioning tags are fixed in position relative to a positioning object on the removable display device.
2. The ultra-wideband positioned virtual reality system of claim 1, wherein the positioning data processing device calculates the position of each UWB positioning tag of a group in real time, based on the positioning data received from the UWB positioning anchor point or UWB positioning tag and on a positioning algorithm calculation formula, and then calculates the positioning result of the position and direction of the positioning object of the movable display device through the pre-configured spatial data of the group of UWB positioning tags whose positions are fixed relative to the positioning object on the movable display device, and sends the positioning result to the virtual image processing device; the spatial data refers to the number identification information of the movable display device and the number identification information of the relative position of the UWB positioning tag on the movable display device, or number identification information formed by combining the two.
3. The ultra-wideband positioned virtual reality system of claim 1, further comprising one or more sets of controllers, wherein each set of controllers comprises one or more controllers; position changes of a controller and instruction operations are fed back to the virtual reality world to realize interaction and similar actions; each set of controllers comes as a kit with a movable display device, each controller is directly wired to the virtual image processing device, a group of UWB positioning tags is mounted on each controller, the group comprises one or more UWB positioning tags, and the controller is directly wired to each of them.
4. The ultra-wideband positioning virtual reality system as claimed in claim 3, wherein the number of the virtual image processing devices and the positioning data processing devices is plural, each of the mobile display devices is bound with one virtual image processing device and one positioning data processing device, and each of the virtual image processing devices and the positioning data processing devices only processes the corresponding virtual image related to the mobile display device.
5. A method for position location enabling location and orientation of a system according to claim 1, the method comprising the steps of:
step 1: starting an ultra-wideband positioning virtual reality system and ensuring clock synchronization in the system; the clock synchronization process is realized by the interaction between a UWB positioning label in the system and a clock synchronization client module in a UWB positioning anchor point and a clock synchronization server module of the virtual image processing equipment according to a network synchronization protocol or a precise clock synchronization protocol;
step 2: each UWB positioning tag sends positioning broadcast messages at the period of the selected positioning algorithm, or exchanges positioning messages with a UWB positioning anchor point; the positioning messages comprise positioning anchor point coordinate data, identification information of each positioning tag and the other positioning data required by the positioning algorithm; the UWB positioning anchor point acquires the required information and sends it to the positioning data processing device for positioning calculation; within the same positioning time period, the positioning data processing device locates in real time the accurate position coordinates of the group of UWB positioning tags on the movable display device, and the positioning result of the position and direction of the positioning target on the movable display device is calculated according to the spatial data;
step 3: the positioning data processing device sends the position and direction result of the mobile display device to the virtual image processing device in real time; based on the received position and direction data and the virtual reality space coordinate system, the virtual image processing device sends accurately matched virtual picture data to the mobile display device in real time;
step 4: the mobile display device receives the virtual picture data sent by the virtual image processing device and projects it onto its display screen;
step 5: repeating steps 2 to 4 at the set refresh frequency until the virtual reality system is shut down, so that the mobile display device obtains a virtual picture in real time at every moment, accurately reflecting the visual scene corresponding to its real-time movement state.
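Claim 5 does not specify how the network or precision time protocol exchange in step 1 is carried out. As one illustrative sketch (the function name and timestamps are hypothetical, not from the patent), the standard NTP-style four-timestamp exchange between a client module in a tag or anchor and the server module in the virtual image processing device yields the clock offset and round-trip delay:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Classic NTP-style clock offset and round-trip delay:
    t1 = client send time, t2 = server receive time,
    t3 = server send time, t4 = client receive time."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Client clock 0.5 s behind the server, 10 ms each way on the wire,
# 2 ms server processing time:
off, dly = ntp_offset_delay(100.000, 100.510, 100.512, 100.022)
# off == 0.5 s, dly == 0.02 s
```

The client then adds `off` to its local clock, which is what lets all anchors timestamp the same UWB frame on a common timebase.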
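Step 2 derives both position and direction from a group of tags at known relative positions on the mobile display device, but the patent does not give the computation. A minimal two-tag sketch (hypothetical names, 2D for brevity, assuming one tag at the front and one at the back of the device) takes the midpoint as position and the back-to-front vector as heading:

```python
import math

def pose_from_tags(front_tag, back_tag):
    """Estimate device position (tag midpoint) and facing direction
    (yaw in degrees) from two UWB tag coordinates mounted front
    and back on the mobile display device."""
    px = (front_tag[0] + back_tag[0]) / 2.0
    py = (front_tag[1] + back_tag[1]) / 2.0
    dx = front_tag[0] - back_tag[0]
    dy = front_tag[1] - back_tag[1]
    return (px, py), math.degrees(math.atan2(dy, dx))

# Front tag 1 m ahead of the back tag along +x:
pos, yaw = pose_from_tags((2.0, 1.0), (1.0, 1.0))
# pos == (1.5, 1.0); yaw == 0.0, i.e. facing along +x
```

With three or more non-collinear tags the same idea extends to a full 3D orientation, which is why the claims specify a "group" of tags rather than a single one.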
6. The positioning method of claim 5, wherein the positioning algorithm in step 2 comprises a TDOA, AOA or PDOA positioning algorithm.
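Of the algorithms named in claim 6, TDOA locates a tag from differences in the arrival times of one transmission at several clock-synchronized anchors, so the tag itself needs no synchronized clock. A closed-form multilateration solver is beyond the scope of the claim; the sketch below (all names hypothetical) simulates noiseless TDOA measurements and recovers the tag with a brute-force grid search, purely to illustrate the geometry:

```python
import itertools
import math

C = 299_792_458.0  # speed of light, m/s

def tdoa_measurements(tag, anchors):
    """Simulated TDOA: arrival-time differences relative to anchors[0]."""
    d0 = math.dist(tag, anchors[0])
    return [(math.dist(tag, a) - d0) / C for a in anchors[1:]]

def locate(meas, anchors, step=0.05, extent=5.0):
    """Brute-force grid search for the point whose predicted TDOAs
    best match the measurements (a stand-in for a real solver)."""
    best, best_err = None, float("inf")
    n = int(extent / step) + 1
    for i, j in itertools.product(range(n), repeat=2):
        p = (i * step, j * step)
        pred = tdoa_measurements(p, anchors)
        err = sum((m - q) ** 2 for m, q in zip(meas, pred))
        if err < best_err:
            best, best_err = p, err
    return best

anchors = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0), (5.0, 5.0)]
true_tag = (1.5, 2.0)
est = locate(tdoa_measurements(true_tag, anchors), anchors)
# est lies within one grid step of (1.5, 2.0)
```

AOA instead measures the bearing of the incoming signal at each anchor, and PDOA infers that bearing from the carrier-phase difference between two closely spaced antennas; both trade anchor hardware complexity against the clock-synchronization burden TDOA carries.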
7. The positioning method of claim 5, wherein:
in step 1, a group of UWB positioning tags is also installed on each controller in each group of controllers and clock-synchronized with the virtual image processing device;
in step 2, the group of UWB positioning tags on each controller is numbered in advance according to their relative positions on the controller; the posture of the controller, comprising its position and direction, is determined from the relative positions of the group of UWB positioning tags and the positioning results matched by these position numbers, and the posture is mapped into the virtual reality visual scene, producing a realistic visual experience in which the virtual reality matches expectation;
in step 3, the position and direction result of the controller is sent to the virtual image processing device; based on the received position and direction data of the controller and the virtual reality space coordinate system, the virtual image processing device sends accurately matched virtual picture data to the mobile display device in real time; control instructions input to the controller are processed by the controller and sent to the virtual image processing device over a wired connection, and the virtual image processing device generates the virtual picture data of the corresponding operation according to the received control instructions and sends it to the mobile display device;
in step 4, the mobile display device receives the virtual picture data accurately matched to the controller, generates the virtual image of the corresponding operation based on the control instructions sent by the controller, and projects it onto its display screen.
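Claim 7 determines a controller's posture from position-numbered tags whose offsets in the controller's own frame are known in advance. One standard way to realize this, not stated in the patent and shown here in 2D with hypothetical names, is a least-squares rigid-body fit between the local tag layout and the measured world coordinates:

```python
import math

def controller_pose(local_pts, world_pts):
    """Recover 2D position and heading of a controller from numbered tags:
    local_pts are tag offsets in the controller frame, world_pts the
    measured coordinates for the same tag numbers, in the same order."""
    n = len(local_pts)
    lcx = sum(p[0] for p in local_pts) / n
    lcy = sum(p[1] for p in local_pts) / n
    wcx = sum(p[0] for p in world_pts) / n
    wcy = sum(p[1] for p in world_pts) / n
    # Accumulate the 2x2 cross-covariance terms for the rotation angle.
    sxx = sxy = syx = syy = 0.0
    for (lx, ly), (wx, wy) in zip(local_pts, world_pts):
        lx, ly, wx, wy = lx - lcx, ly - lcy, wx - wcx, wy - wcy
        sxx += lx * wx; sxy += lx * wy
        syx += ly * wx; syy += ly * wy
    yaw = math.atan2(sxy - syx, sxx + syy)  # least-squares 2D rotation
    return (wcx, wcy), math.degrees(yaw)

local = [(0.1, 0.0), (-0.1, 0.0), (0.0, 0.05)]
# Controller rotated 90 degrees and translated to (2, 3):
world = [(2.0, 3.1), (2.0, 2.9), (1.95, 3.0)]
pos, yaw = controller_pose(local, world)
# pos is the tag centroid (about (1.983, 3.0)); yaw recovers 90 degrees
```

The position numbering in the claim is what makes the `local_pts`/`world_pts` correspondence unambiguous: without it, the fit would have to try every tag-to-measurement assignment.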
CN202110384173.8A 2021-04-09 2021-04-09 Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction Pending CN113190113A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110384173.8A CN113190113A (en) 2021-04-09 2021-04-09 Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction


Publications (1)

Publication Number Publication Date
CN113190113A true CN113190113A (en) 2021-07-30

Family

ID=76975442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110384173.8A Pending CN113190113A (en) 2021-04-09 2021-04-09 Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction

Country Status (1)

Country Link
CN (1) CN113190113A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160187969A1 (en) * 2014-12-29 2016-06-30 Sony Computer Entertainment America Llc Methods and Systems for User Interaction within Virtual Reality Scene using Head Mounted Display
US20160238692A1 (en) * 2015-02-13 2016-08-18 Position Imaging, Inc. Accurate geographic tracking of mobile devices
CN106199511A (en) * 2016-06-23 2016-12-07 郑州联睿电子科技有限公司 VR location tracking system based on ultra broadband location and location tracking method thereof
CN106249896A (en) * 2016-08-12 2016-12-21 浙江拓客网络科技有限公司 Based on sterically defined virtual reality interactive system
US20180107278A1 (en) * 2016-10-14 2018-04-19 Intel Corporation Gesture-controlled virtual reality systems and methods of controlling the same
CN208145442U (en) * 2018-04-18 2018-11-27 西安灵境科技有限公司 Dodgem game system based on virtual reality and ultra wideband location techniques
US10659679B1 (en) * 2017-08-16 2020-05-19 Disney Enterprises, Inc. Facial location determination
US20200380178A1 (en) * 2017-02-22 2020-12-03 Middle Chart, LLC Tracking safety conditions of an area


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113848771A (en) * 2021-08-11 2021-12-28 广州求远电子科技有限公司 UWB anchor point automatic configuration method, device, equipment and storage medium
CN113570907A (en) * 2021-09-23 2021-10-29 深圳华云时空技术有限公司 UWB-based pedestrian and vehicle anti-collision method and system in tunnel
CN113570907B (en) * 2021-09-23 2021-12-17 深圳华云时空技术有限公司 UWB-based pedestrian and vehicle anti-collision method and system in tunnel

Similar Documents

Publication Publication Date Title
JP7297028B2 (en) Systems and methods for augmented reality
US10678324B2 (en) Systems and methods for augmented reality
CN110650354B (en) Live broadcast method, system, equipment and storage medium for virtual cartoon character
EP2579128B1 (en) Portable device, virtual reality system and method
US11042028B1 (en) Relative pose data augmentation of tracked devices in virtual environments
CN106199511B (en) VR location tracking system and its location tracking method based on ultra wide band positioning
CN110140099B (en) System and method for tracking controller
CN102939139B (en) Calibration of portable devices in shared virtual space
CN109613983A (en) It wears the localization method of handle in display system, device and wears display system
CN104380347A (en) Video processing device, video processing method, and video processing system
TWI714054B (en) Tracking system for tracking and rendering virtual object corresponding to physical object and the operating method for the same
CN113190113A (en) Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction
EP3264228A1 (en) Mediated reality
JP2016122277A (en) Content providing server, content display terminal, content providing system, content providing method, and content display program
US11256090B2 (en) Systems and methods for augmented reality
CN107229055B (en) Mobile equipment positioning method and mobile equipment positioning device
CN112074705A (en) Method and system for optical inertial tracking of moving object
CN114373016A (en) Method for positioning implementation point in augmented reality technical scene
CN108261761B (en) Space positioning method and device and computer readable storage medium
JP2022175912A (en) Main terminal, program, system and method for maintaining relative position attitude with sub-terminal in real space in virtual space

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210730