CN105785373A - Virtual reality position identification system and method - Google Patents

Virtual reality position identification system and method

Info

Publication number
CN105785373A
CN105785373A (application CN201610263594.4A)
Authority
CN
China
Prior art keywords
virtual reality
ultrasound wave
axis
coordinate position
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610263594.4A
Other languages
Chinese (zh)
Inventor
周晓峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Wind Communication Technologies Co Ltd
Original Assignee
Shanghai Wind Communication Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Wind Communication Technologies Co Ltd filed Critical Shanghai Wind Communication Technologies Co Ltd
Priority to CN201610263594.4A priority Critical patent/CN105785373A/en
Publication of CN105785373A publication Critical patent/CN105785373A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06 - Systems determining the position data of a target

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Length Measuring Devices Characterised By Use Of Acoustic Means (AREA)

Abstract

The present invention discloses a virtual reality position identification system and method. The system comprises a virtual reality wearable device, external positioning devices and a host. An ultrasonic generator is arranged inside the virtual reality wearable device, and the external positioning devices are ultrasonic receivers placed on the X-axis, Y-axis and Z-axis of a preset coordinate system, respectively. The virtual reality wearable device emits an ultrasonic wave through the ultrasonic generator and records the transmission time; each ultrasonic receiver of the external positioning devices receives the ultrasonic wave and records the reception time; and the host calculates the coordinate position of the virtual reality wearable device from these times and the receiver coordinates. Because the ultrasonic generator and the ultrasonic receivers together form the position identification system, the virtual reality wearable device can be positioned accurately in a limited space by means of a plurality of ultrasonic sensors.

Description

Virtual reality position recognition system and method
Technical field
Embodiments of the present invention relate to virtual reality technology, and in particular to a virtual reality position recognition system and method.
Background technology
Virtual reality technology is a computer simulation technique that can create a virtual world and let users experience it. It uses a computer to generate a simulated environment: an interactive, three-dimensional dynamic scene that fuses multiple information sources and emulates entity behavior, immersing the user in that environment. A virtual reality system generally includes a virtual reality helmet, a host and other system peripherals, where the host can be a PC or a mobile terminal.
At present, position recognition in virtual reality systems mainly relies on many position-tracking sensors built into the body of the virtual reality helmet. The principle is to integrate several infrared lamps on the front of the helmet, which emit infrared signals to a receiver such as an infrared camera. The receiver is mounted above a PC display or fixed on a stand, and the user must stay within the receiver's visible range for real-time position tracking to work. The drawback of this scheme is that the system can only recognize distance in a single front-to-back direction and its range is limited.
The current SteamVR system uses the Lighthouse laser tracking system to achieve room-scale tracking and can accurately track the user's movement under certain conditions. Its defect is that the room must be cleared of objects before use; any occluding object introduces errors.
Summary of the invention
The present invention provides a virtual reality position recognition system and method, so that a virtual reality wearable device can be accurately positioned in multiple directions within a limited space.
In a first aspect, an embodiment of the present invention provides a virtual reality position recognition system, which includes:
a virtual reality wearable device, external positioning devices and a host, wherein an ultrasonic generator is arranged inside the virtual reality wearable device, and the external positioning devices are ultrasonic receivers placed on the X-axis, Y-axis and Z-axis of a preset coordinate system, respectively;
the virtual reality wearable device is configured to emit an ultrasonic wave through the ultrasonic generator and record the ultrasonic transmission time;
the external positioning devices are configured such that each ultrasonic receiver receives the ultrasonic wave and records the ultrasonic reception time;
the host is configured to calculate the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and from the coordinate positions of the ultrasonic receivers.
Preferably, the host is specifically configured to calculate the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers, and to send corresponding picture output data to the virtual reality wearable device according to the position coordinates.
Preferably, the host is specifically configured to obtain the coordinate position of the virtual reality wearable device by a three-sphere intersection positioning algorithm, from the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis and the differences between the ultrasonic reception times at those receivers and the ultrasonic transmission time.
Preferably, at least two ultrasonic receivers are placed in each of the X-axis, Y-axis and Z-axis directions.
Preferably, the host includes a preliminary coordinate position calculation module and a weighted average module;
the preliminary coordinate position calculation module is configured to derive one preliminary coordinate position from the coordinate positions of three ultrasonic receivers placed on the X-axis, Y-axis and Z-axis respectively and the differences between the ultrasonic reception times at those three receivers and the ultrasonic transmission time, and to repeat this until the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used, thereby obtaining multiple preliminary coordinate positions;
the weighted average module is configured to obtain the coordinate position of the virtual reality wearable device by taking a weighted average of all preliminary coordinate positions.
In a second aspect, an embodiment of the present invention further provides a virtual reality position recognition method, which includes:
the virtual reality wearable device emits an ultrasonic wave through the ultrasonic generator and records the ultrasonic transmission time;
each ultrasonic receiver of the external positioning devices receives the ultrasonic wave and records the ultrasonic reception time;
the host calculates the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers.
Preferably, after the coordinate position of the virtual reality wearable device is calculated, the method further includes:
the host sends corresponding picture output data to the virtual reality wearable device according to the position coordinates of the virtual reality wearable device.
Preferably, the host calculating the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers is specifically:
the host obtains the coordinate position of the virtual reality wearable device by the three-sphere intersection positioning algorithm, from the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis and the differences between the ultrasonic reception times at those receivers and the ultrasonic transmission time.
Preferably, at least two ultrasonic receivers are placed in each of the X-axis, Y-axis and Z-axis directions.
Preferably, the host calculating the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers specifically includes:
the host derives one preliminary coordinate position from the coordinate positions of three ultrasonic receivers placed on the X-axis, Y-axis and Z-axis respectively and the differences between the ultrasonic reception times at those three receivers and the ultrasonic transmission time, and repeats this until the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used, thereby obtaining multiple preliminary coordinate positions;
the host obtains the coordinate position of the virtual reality wearable device by taking a weighted average of all preliminary coordinate positions.
In the present invention, the ultrasonic generator and the ultrasonic receivers form a virtual reality position recognition system, so that the virtual reality wearable device can be accurately positioned along the X-axis, Y-axis and Z-axis of the preset coordinate system within a limited space by means of multiple ultrasonic sensors. Since ultrasonic propagation has a certain resistance to errors caused by object occlusion, accurate positioning in multiple directions within a limited space improves the positioning precision of the virtual reality system, and the precise positioning provided by this position recognition system improves the interactivity and immersion of the virtual reality experience.
Brief description of the drawings
Fig. 1 is a first structural schematic diagram of a virtual reality position recognition system in Embodiment 1 of the present invention;
Fig. 2 is a second structural schematic diagram of a virtual reality position recognition system in Embodiment 1 of the present invention;
Fig. 3 is a first structural schematic diagram of a virtual reality position recognition system in Embodiment 2 of the present invention;
Fig. 4 is a second structural schematic diagram of a virtual reality position recognition system in Embodiment 2 of the present invention;
Fig. 5 is a third structural schematic diagram of a virtual reality position recognition system in Embodiment 2 of the present invention;
Fig. 6 is a flow chart of a virtual reality position recognition method in Embodiment 3 of the present invention;
Fig. 7 is a first flow chart of a virtual reality position recognition method in Embodiment 4 of the present invention;
Fig. 8 is a second flow chart of a virtual reality position recognition method in Embodiment 4 of the present invention.
Detailed description of the invention
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
Embodiment 1
Fig. 1 and Fig. 2 show structural schematic diagrams of a virtual reality position recognition system provided by Embodiment 1 of the present invention. This embodiment is applicable to the situation in which a virtual reality wearable device is positioned within a limited space. A virtual reality system generally includes a virtual reality wearable device, a host and other system peripherals, and the peripherals and the wearable device each communicate with the host through wireless or wired communication equipment. The specific structure of the position recognition system is as follows:
The system includes a virtual reality wearable device 110, external positioning devices 120 and a host 130. An ultrasonic generator 111 is arranged inside the virtual reality wearable device 110, and the external positioning devices 120 are ultrasonic receivers 121 placed on the X-axis, Y-axis and Z-axis of a preset coordinate system, respectively.
The virtual reality wearable device 110 is usually a virtual reality head-mounted display, a virtual reality wristband or the like. In the present invention the virtual reality wearable device comprises a virtual reality head-mounted display and may also include a virtual reality wristband or similar devices; this is not limited. The ultrasonic generator 111 can be arranged in any part of the virtual reality wearable device 110. The host 130 can be a PC or a mobile terminal. The preset coordinate system is a predefined spatial coordinate system set up in the space in which the virtual reality wearable device 110 moves, with at least one ultrasonic receiver 121 placed on each of its X-axis, Y-axis and Z-axis.
The virtual reality wearable device 110 is configured to emit an ultrasonic wave through the ultrasonic generator 111 and record the ultrasonic transmission time.
The external positioning devices 120 are configured such that each ultrasonic receiver 121 receives the ultrasonic wave and records the ultrasonic reception time.
The transmission frequency of the ultrasonic generator 111 is the same as the reception frequency of the ultrasonic receivers 121. Since the ultrasonic frequencies typically used in ultrasonic equipment are 20 kHz, 25 kHz, 28 kHz, 33 kHz, 40 kHz or 60 kHz, the user cannot hear them and suffers no noise interference.
The host 130 is configured to calculate the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers.
The coordinate positions here are positions in the preset coordinate system, and the coordinate position of each ultrasonic receiver is stored in the host in advance. Multiplying the difference between the ultrasonic reception time and the ultrasonic transmission time by the ultrasonic propagation speed gives the distance between the virtual reality wearable device 110 and each ultrasonic receiver 121. From the coordinate positions of some or all of the ultrasonic receivers 121 and their distances to the virtual reality wearable device 110, the coordinate position of the virtual reality wearable device 110 can be calculated. Because ultrasonic receivers 121 on all three coordinate axes are used, the positioning is accurate.
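As a minimal illustration of the time-of-flight distance calculation described above (not part of the patent text; the receiver layout, the timestamps and the assumed speed of sound of about 343 m/s in air are illustrative values), a Python sketch might look like this:

```python
# Time-of-flight distance: (reception time - transmission time) * propagation speed.
# All names and numeric values below are illustrative assumptions.

SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in air at room temperature

# Coordinate positions of the ultrasonic receivers in the preset coordinate system,
# stored in the host in advance; these feed the positioning step in Example 1 below.
RECEIVERS = {
    "X": (2.0, 0.0, 0.0),
    "Y": (0.0, 2.0, 0.0),
    "Z": (0.0, 0.0, 2.0),
}

def receiver_distances(send_time, receive_times):
    """Distance from the wearable device to each receiver, from the time of flight."""
    return {
        axis: (recv - send_time) * SPEED_OF_SOUND
        for axis, recv in receive_times.items()
    }

# Transmission at t = 0.0 s, receptions a few milliseconds later (illustrative values).
print(receiver_distances(0.0, {"X": 0.0061, "Y": 0.0064, "Z": 0.0066}))
# e.g. {'X': 2.0923, 'Y': 2.1952, 'Z': 2.2638}
```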
In the technical solution of this embodiment, the ultrasonic generator and the ultrasonic receivers form a virtual reality position recognition system, so that the virtual reality wearable device can be accurately positioned along the X-axis, Y-axis and Z-axis of the preset coordinate system within a limited space by means of multiple ultrasonic sensors. Since ultrasonic propagation has a certain resistance to errors caused by object occlusion, accurate positioning in multiple directions within a limited space improves the positioning precision of the virtual reality system, and the precise positioning provided by this position recognition system improves the interactivity and immersion of the virtual reality experience.
On the basis of the above technical solution, the host 130 may preferably be specifically configured to obtain the coordinate position of the virtual reality wearable device by a three-sphere intersection positioning algorithm, from the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis and the differences between the ultrasonic reception times at those receivers and the ultrasonic transmission time.
Example 1: Let the differences between the times at which the ultrasonic wave from the positioned object reaches the ultrasonic receivers on the three coordinate axes and the ultrasonic transmission time be Δt1, Δt2 and Δt3, respectively; each such time difference is taken as one item of ultrasonic data. The product of an item of ultrasonic data and the ultrasonic propagation speed is the actual distance from the measuring point to the sound source, and with three items of ultrasonic data the target can be positioned on three axes. In three-axis positioning, the position of the target is determined as the intersection of three spheres, each centered on an ultrasonic receiver with a radius equal to the distance from that receiver to the measured target (the virtual reality wearable device 110). The equations of the three-sphere intersection positioning algorithm are:
(x - x1)² + (y - y1)² + (z - z1)² = (c·Δt1)²
(x - x2)² + (y - y2)² + (z - z2)² = (c·Δt2)²
(x - x3)² + (y - y3)² + (z - z3)² = (c·Δt3)²
where the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis are denoted (x1, y1, z1), (x2, y2, z2) and (x3, y3, z3), respectively, and c is the propagation speed of the ultrasonic wave in space. Solving the above system of equations simultaneously gives the position coordinates (x, y, z) of the positioned object.
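The patent does not prescribe a particular solver for this system of equations. As a hedged sketch under assumed receiver coordinates and time differences, the three sphere equations could be solved numerically, for example with SciPy's generic root finder:

```python
# Sketch of the three-sphere intersection positioning of Example 1.
# Receiver coordinates, the "true" position used to fabricate measurements and
# the choice of solver are illustrative assumptions, not taken from the patent.
import numpy as np
from scipy.optimize import fsolve

C = 343.0  # assumed ultrasonic propagation speed in air, m/s

def three_sphere_position(receivers, time_diffs, initial_guess=(1.0, 1.0, 1.0)):
    """Solve (x - xi)^2 + (y - yi)^2 + (z - zi)^2 = (C * dti)^2 for i = 1, 2, 3."""
    receivers = np.asarray(receivers, dtype=float)   # shape (3, 3), one row per receiver
    radii = C * np.asarray(time_diffs, dtype=float)  # distance to each receiver

    def residuals(p):
        # One residual per sphere equation; all three vanish at the intersection point.
        return np.sum((p - receivers) ** 2, axis=1) - radii ** 2

    return fsolve(residuals, np.asarray(initial_guess, dtype=float))

# Receivers on the X-, Y- and Z-axis of the preset coordinate system (assumed positions),
# and time differences fabricated from an assumed true device position.
receivers = np.array([(2.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 2.0)])
true_position = np.array([1.0, 1.2, 1.5])
time_diffs = np.linalg.norm(true_position - receivers, axis=1) / C  # dt1, dt2, dt3
print(three_sphere_position(receivers, time_diffs))  # expected to be close to [1.0, 1.2, 1.5]
```

Three spheres generally intersect in two points that mirror each other across the plane of the three receivers; a numerical root finder converges to the one nearest its initial guess, so in practice the guess would be seeded with the device's last known position, or the candidate lying outside the tracking space would be discarded.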
Embodiment 2
Fig. 3 and Fig. 4 show structural schematic diagrams of a virtual reality position recognition system provided by Embodiment 2 of the present invention. On the basis of the above embodiments, this embodiment preferably further refines the host.
The specific structure of this virtual reality position recognition system is as follows:
The system includes a virtual reality wearable device 210, external positioning devices 220 and a host 230. An ultrasonic generator 211 is arranged inside the virtual reality wearable device 210, and the external positioning devices 220 are ultrasonic receivers 221 placed on the X-axis, Y-axis and Z-axis of a preset coordinate system, respectively.
The virtual reality wearable device 210 is configured to emit an ultrasonic wave through the ultrasonic generator 211 and record the ultrasonic transmission time.
The external positioning devices 220 are configured such that each ultrasonic receiver 221 receives the ultrasonic wave and records the ultrasonic reception time.
The transmission frequency of the ultrasonic generator 211 is the same as the reception frequency of the ultrasonic receivers 221. Since the ultrasonic frequencies typically used in ultrasonic equipment are 20 kHz, 25 kHz, 28 kHz, 33 kHz, 40 kHz or 60 kHz, the user cannot hear them and suffers no noise interference.
The host 230 is configured to calculate the coordinate position of the virtual reality wearable device 210 from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers 221, and to send corresponding picture output data to the virtual reality wearable device according to the position coordinates.
The corresponding picture output data can be rendered by a 3D engine in the host according to the coordinate position of the virtual reality wearable device 210. Sending the corresponding picture output data to the virtual reality wearable device according to the position coordinates improves the interactivity and immersion of the virtual reality experience.
On the basis of the above technical solution, preferably, at least two ultrasonic receivers are placed in each of the X-axis, Y-axis and Z-axis directions.
Further preferably, as shown in Fig. 5, the host 230 includes a preliminary coordinate position calculation module 231 and a weighted average module 232.
The preliminary coordinate position calculation module 231 is configured to derive one preliminary coordinate position from the coordinate positions of three ultrasonic receivers placed on the X-axis, Y-axis and Z-axis respectively and the differences between the ultrasonic reception times at those three receivers and the ultrasonic transmission time, and to repeat this until the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used, thereby obtaining multiple preliminary coordinate positions.
The weighted average module 232 is configured to obtain the coordinate position of the virtual reality wearable device by taking a weighted average of all preliminary coordinate positions.
Example 2: Two ultrasonic receivers 221 are placed in each of the X-axis, Y-axis and Z-axis directions: receivers Xa and Xb on the X-axis, Ya and Yb on the Y-axis, and Za and Zb on the Z-axis. The preliminary coordinate position calculation module 231 first applies the three-sphere intersection positioning algorithm of Example 1 to the coordinate positions of receivers Xa, Ya and Za and the differences between their ultrasonic reception times and the ultrasonic transmission time, obtaining a preliminary coordinate position T1 (t1x, t1y, t1z). It then applies the same algorithm to the coordinate positions of receivers Xb, Yb and Zb and the differences between their ultrasonic reception times and the ultrasonic transmission time, obtaining a preliminary coordinate position T2 (t2x, t2y, t2z); at this point the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used. Finally, the weighted average module 232 takes a weighted average of the preliminary coordinate positions T1 and T2 and obtains the coordinate position of the virtual reality wearable device, [(t1x+t2x)/2, (t1y+t2y)/2, (t1z+t2z)/2].
Through the preliminary coordinate position calculation module 231 and the weighted average module 232, the at least two ultrasonic receivers on each coordinate axis are grouped for calculation and their results are weighted-averaged, so that a more accurate coordinate position of the virtual reality wearable device can be obtained, as sketched below.
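As a hedged sketch of this grouping-and-averaging scheme: the receiver names and positions are the assumed ones from Example 2, three_sphere_position is the illustrative solver sketched after Example 1, and equal weights reproduce the plain average used above.

```python
# Group the receivers into triples (one per axis), compute one preliminary
# position per triple, then combine the preliminary positions by weighted average.
import numpy as np

def device_position(groups, send_time, receive_times, weights=None):
    """groups: list of receiver triples, e.g. [["Xa", "Ya", "Za"], ["Xb", "Yb", "Zb"]].
       receive_times maps a receiver name to its reception time;
       RECEIVER_POSITIONS maps a receiver name to its stored coordinates."""
    preliminary = []
    for group in groups:
        coords = [RECEIVER_POSITIONS[name] for name in group]
        dts = [receive_times[name] - send_time for name in group]
        preliminary.append(three_sphere_position(coords, dts))  # sketch from Example 1
    # weights=None gives equal weights, i.e. the plain average of Example 2.
    return np.average(preliminary, axis=0, weights=weights)

# Assumed placement: two receivers per axis, as in Example 2.
RECEIVER_POSITIONS = {
    "Xa": (2.0, 0.0, 0.0), "Xb": (3.0, 0.0, 0.0),
    "Ya": (0.0, 2.0, 0.0), "Yb": (0.0, 3.0, 0.0),
    "Za": (0.0, 0.0, 2.0), "Zb": (0.0, 0.0, 3.0),
}

# Usage (illustrative): T1 from {Xa, Ya, Za}, T2 from {Xb, Yb, Zb}, then their average:
# position = device_position([["Xa", "Ya", "Za"], ["Xb", "Yb", "Zb"]], send_time, receive_times)
```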
Embodiment 3
Fig. 6 shows a flow chart of a virtual reality position recognition method provided by Embodiment 3 of the present invention. This embodiment is applicable to the situation in which a virtual reality wearable device is positioned within a limited space, and the method is applied in the virtual reality position recognition system provided in Embodiment 1 above. The method specifically includes the following steps:
Step 310: the host obtains the ultrasonic transmission time from the virtual reality wearable device, the ultrasonic transmission time being the time at which the virtual reality wearable device emits the ultrasonic wave through the ultrasonic generator.
Step 320: the host obtains the ultrasonic reception times from the external positioning devices, an ultrasonic reception time being the time at which an ultrasonic receiver of the external positioning devices receives the ultrasonic wave.
Step 330: the host calculates the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers.
The coordinate positions here are positions in the preset coordinate system, and the coordinate position of each ultrasonic receiver is stored in the host in advance. Multiplying the difference between the ultrasonic reception time and the ultrasonic transmission time by the ultrasonic propagation speed gives the distance between the virtual reality wearable device 110 and each ultrasonic receiver 121. From the coordinate positions of some or all of the ultrasonic receivers 121 and their distances to the virtual reality wearable device 110, the coordinate position of the virtual reality wearable device 110 can be calculated. Because ultrasonic receivers 121 on all three coordinate axes are used, the positioning is accurate.
In the technical solution of this embodiment, the ultrasonic generator and the ultrasonic receivers form a virtual reality position recognition system, so that according to this method the virtual reality wearable device can be accurately positioned along the X-axis, Y-axis and Z-axis of the preset coordinate system within a limited space by means of multiple ultrasonic sensors. Since ultrasonic propagation has a certain resistance to errors caused by object occlusion, accurate positioning in multiple directions within a limited space improves the positioning precision of the virtual reality system, and the precise positioning provided by this position recognition system improves the interactivity and immersion of the virtual reality experience.
On the basis of the above technical solution, step 330 specifically includes: the host obtains the coordinate position of the virtual reality wearable device by the three-sphere intersection positioning algorithm, from the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis and the differences between the ultrasonic reception times at those receivers and the ultrasonic transmission time.
Example 3: Let the differences between the times at which the ultrasonic wave from the positioned object reaches the ultrasonic receivers on the three coordinate axes and the ultrasonic transmission time be Δt1, Δt2 and Δt3, respectively; each such time difference is taken as one item of ultrasonic data. The product of an item of ultrasonic data and the ultrasonic propagation speed is the actual distance from the measuring point to the sound source, and with three items of ultrasonic data the target can be positioned on three axes. In three-axis positioning, the position of the target is determined as the intersection of three spheres, each centered on an ultrasonic receiver with a radius equal to the distance from that receiver to the measured target (the virtual reality wearable device). The equations of the three-sphere intersection positioning algorithm are the same as in Example 1:
(x - x1)² + (y - y1)² + (z - z1)² = (c·Δt1)²
(x - x2)² + (y - y2)² + (z - z2)² = (c·Δt2)²
(x - x3)² + (y - y3)² + (z - z3)² = (c·Δt3)²
where the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis are denoted (x1, y1, z1), (x2, y2, z2) and (x3, y3, z3), respectively, and c is the propagation speed of the ultrasonic wave in space. Solving the above system of equations simultaneously gives the position coordinates (x, y, z) of the positioned object.
Embodiment 4
Fig. 7 and Fig. 8 show flow charts of a virtual reality position recognition method provided by Embodiment 4 of the present invention; the method is applied in the virtual reality position recognition system provided in Embodiment 2 above. On the basis of Embodiment 3, this embodiment preferably adds step 440. The method specifically includes the following steps:
Step 410: the host obtains the ultrasonic transmission time from the virtual reality wearable device, the ultrasonic transmission time being the time at which the virtual reality wearable device emits the ultrasonic wave through the ultrasonic generator.
Step 420: the host obtains the ultrasonic reception times from the external positioning devices, an ultrasonic reception time being the time at which an ultrasonic receiver of the external positioning devices receives the ultrasonic wave.
Step 430: the host calculates the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers. This step is the same as step 330 in Embodiment 3 and is not repeated here.
Step 440: the host sends corresponding picture output data to the virtual reality wearable device according to the position coordinates of the virtual reality wearable device.
The corresponding picture output data can be rendered by a 3D engine in the host according to the coordinate position of the virtual reality wearable device. Sending the corresponding picture output data to the virtual reality wearable device according to the position coordinates improves the interactivity and immersion of the virtual reality experience.
On the basis of the above technical solution, preferably, at least two ultrasonic receivers are placed in each of the X-axis, Y-axis and Z-axis directions.
As shown in Fig. 8, step 430 may preferably include step 431 and step 432.
Step 431: the host derives one preliminary coordinate position from the coordinate positions of three ultrasonic receivers placed on the X-axis, Y-axis and Z-axis respectively and the differences between the ultrasonic reception times at those three receivers and the ultrasonic transmission time, and repeats this until the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used, thereby obtaining multiple preliminary coordinate positions.
Step 432: the host obtains the coordinate position of the virtual reality wearable device by taking a weighted average of all preliminary coordinate positions.
Example 4: Two ultrasonic receivers are placed in each of the X-axis, Y-axis and Z-axis directions: receivers Xa and Xb on the X-axis, Ya and Yb on the Y-axis, and Za and Zb on the Z-axis. In step 431, the host first applies the three-sphere intersection positioning algorithm of Example 3 to the coordinate positions of receivers Xa, Ya and Za and the differences between their ultrasonic reception times and the ultrasonic transmission time, obtaining a preliminary coordinate position T1 (t1x, t1y, t1z). It then applies the same algorithm to the coordinate positions of receivers Xb, Yb and Zb and the differences between their ultrasonic reception times and the ultrasonic transmission time, obtaining a preliminary coordinate position T2 (t2x, t2y, t2z); at this point the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used. In step 432, the host takes a weighted average of the preliminary coordinate positions T1 and T2 and obtains the coordinate position of the virtual reality wearable device, [(t1x+t2x)/2, (t1y+t2y)/2, (t1z+t2z)/2].
Through steps 431 and 432, the at least two ultrasonic receivers on each coordinate axis are grouped for calculation and their results are weighted-averaged, so that a more accurate coordinate position of the virtual reality wearable device can be obtained.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described here, and that various obvious changes, readjustments and substitutions can be made without departing from the protection scope of the present invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to them; without departing from the concept of the present invention it may include other equivalent embodiments, and its scope is determined by the appended claims.

Claims (10)

1. A virtual reality position recognition system, characterized in that it comprises a virtual reality wearable device, external positioning devices and a host, wherein an ultrasonic generator is arranged inside the virtual reality wearable device, and the external positioning devices are ultrasonic receivers placed on the X-axis, Y-axis and Z-axis of a preset coordinate system, respectively;
the virtual reality wearable device is configured to emit an ultrasonic wave through the ultrasonic generator and record the ultrasonic transmission time;
the external positioning devices are configured such that each ultrasonic receiver receives the ultrasonic wave and records the ultrasonic reception time;
the host is configured to calculate the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers.
2. The system according to claim 1, characterized in that the host is specifically configured to calculate the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers, and to send corresponding picture output data to the virtual reality wearable device according to the position coordinates.
3. The system according to claim 1, characterized in that the host is specifically configured to obtain the coordinate position of the virtual reality wearable device by a three-sphere intersection positioning algorithm, from the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis and the differences between the ultrasonic reception times at those receivers and the ultrasonic transmission time.
4. The system according to claim 1, characterized in that at least two ultrasonic receivers are placed in each of the X-axis, Y-axis and Z-axis directions.
5. The system according to claim 1, characterized in that the host includes a preliminary coordinate position calculation module and a weighted average module;
the preliminary coordinate position calculation module is configured to derive one preliminary coordinate position from the coordinate positions of three ultrasonic receivers placed on the X-axis, Y-axis and Z-axis respectively and the differences between the ultrasonic reception times at those three receivers and the ultrasonic transmission time, and to repeat this until the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used, thereby obtaining multiple preliminary coordinate positions;
the weighted average module is configured to obtain the coordinate position of the virtual reality wearable device by taking a weighted average of all preliminary coordinate positions.
6. A virtual reality position recognition method, applied in the system according to any one of claims 1-5, characterized in that it comprises:
the host obtaining the ultrasonic transmission time from the virtual reality wearable device, the ultrasonic transmission time being the time at which the virtual reality wearable device emits the ultrasonic wave through the ultrasonic generator;
the host obtaining the ultrasonic reception times from the external positioning devices, an ultrasonic reception time being the time at which an ultrasonic receiver of the external positioning devices receives the ultrasonic wave;
the host calculating the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers.
7. The method according to claim 6, characterized in that, after the coordinate position of the virtual reality wearable device is calculated, the method further comprises:
the host sending corresponding picture output data to the virtual reality wearable device according to the position coordinates of the virtual reality wearable device.
8. The method according to claim 6, characterized in that the host calculating the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers is specifically:
the host obtaining the coordinate position of the virtual reality wearable device by a three-sphere intersection positioning algorithm, from the coordinate positions of the ultrasonic receivers on the X-axis, Y-axis and Z-axis and the differences between the ultrasonic reception times at those receivers and the ultrasonic transmission time.
9. The method according to claim 6, characterized in that at least two ultrasonic receivers are placed in each of the X-axis, Y-axis and Z-axis directions.
10. The method according to claim 6 or 9, characterized in that the host calculating the coordinate position of the virtual reality wearable device from the difference between the ultrasonic reception time and the ultrasonic transmission time and the coordinate positions of the ultrasonic receivers specifically comprises:
the host deriving one preliminary coordinate position from the coordinate positions of three ultrasonic receivers placed on the X-axis, Y-axis and Z-axis respectively and the differences between the ultrasonic reception times at those three receivers and the ultrasonic transmission time, and repeating this until the coordinate positions and ultrasonic reception times of all ultrasonic receivers have been used, thereby obtaining multiple preliminary coordinate positions;
the host obtaining the coordinate position of the virtual reality wearable device by taking a weighted average of all preliminary coordinate positions.
CN201610263594.4A 2016-04-26 2016-04-26 Virtual reality position identification system and method Pending CN105785373A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610263594.4A CN105785373A (en) 2016-04-26 2016-04-26 Virtual reality position identification system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610263594.4A CN105785373A (en) 2016-04-26 2016-04-26 Virtual reality position identification system and method

Publications (1)

Publication Number Publication Date
CN105785373A true CN105785373A (en) 2016-07-20

Family

ID=56399553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610263594.4A Pending CN105785373A (en) 2016-04-26 2016-04-26 Virtual reality position identification system and method

Country Status (1)

Country Link
CN (1) CN105785373A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106569337A (en) * 2016-10-21 2017-04-19 北京小鸟看看科技有限公司 Virtual reality system and positioning method thereof
CN106646480A (en) * 2016-11-04 2017-05-10 乐视控股(北京)有限公司 Positioning system in enclosed space, correlation method and apparatus
CN108169713A (en) * 2017-12-26 2018-06-15 青岛小鸟看看科技有限公司 Localization method and device, the virtual reality device and system of external equipment
CN108196258A (en) * 2017-12-26 2018-06-22 青岛小鸟看看科技有限公司 Method for determining position and device, the virtual reality device and system of external equipment
CN108303698A (en) * 2016-12-29 2018-07-20 宏达国际电子股份有限公司 Tracing system, follow-up mechanism and method for tracing
CN108965712A (en) * 2018-07-30 2018-12-07 青岛小鸟看看科技有限公司 A kind of space positioning system and its synchronous method and device
CN109188413A (en) * 2018-10-18 2019-01-11 京东方科技集团股份有限公司 The localization method of virtual reality device, device and system
CN109274886A (en) * 2018-09-18 2019-01-25 成都泰盟软件有限公司 A kind of mixed reality video recording method based on OpenVR
CN110262667A (en) * 2019-07-29 2019-09-20 上海乐相科技有限公司 A kind of virtual reality device and localization method
TWI675217B (en) * 2016-12-26 2019-10-21 宏達國際電子股份有限公司 Positioning system and method thereof
CN112379774A (en) * 2020-11-12 2021-02-19 歌尔光学科技有限公司 Interaction method of VR glasses and wrist strap equipment and related components

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102200829A (en) * 2010-03-24 2011-09-28 南京大学 Body movement recognition device used for virtual reality input
CN103814304A (en) * 2011-09-20 2014-05-21 阿尔斯通技术有限公司 Method of locating an event transmitting a signal
CN105183161A (en) * 2015-09-02 2015-12-23 胡剑颖 Synchronized moving method for user in real environment and virtual environment
WO2016041088A1 (en) * 2014-09-19 2016-03-24 Sulon Technologies Inc. System and method for tracking wearable peripherals in augmented reality and virtual reality applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102200829A (en) * 2010-03-24 2011-09-28 南京大学 Body movement recognition device used for virtual reality input
CN103814304A (en) * 2011-09-20 2014-05-21 阿尔斯通技术有限公司 Method of locating an event transmitting a signal
WO2016041088A1 (en) * 2014-09-19 2016-03-24 Sulon Technologies Inc. System and method for tracking wearable peripherals in augmented reality and virtual reality applications
CN105183161A (en) * 2015-09-02 2015-12-23 胡剑颖 Synchronized moving method for user in real environment and virtual environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
韩新立 (Han Xinli): "Design and Implementation of a Three-Dimensional Ultrasonic Positioning System" (三维超声定位系统设计与实现), China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106569337B (en) * 2016-10-21 2019-11-08 北京小鸟看看科技有限公司 A kind of virtual reality system and its localization method
CN106569337A (en) * 2016-10-21 2017-04-19 北京小鸟看看科技有限公司 Virtual reality system and positioning method thereof
CN106646480A (en) * 2016-11-04 2017-05-10 乐视控股(北京)有限公司 Positioning system in enclosed space, correlation method and apparatus
TWI675217B (en) * 2016-12-26 2019-10-21 宏達國際電子股份有限公司 Positioning system and method thereof
CN108303698A (en) * 2016-12-29 2018-07-20 宏达国际电子股份有限公司 Tracing system, follow-up mechanism and method for tracing
CN108169713A (en) * 2017-12-26 2018-06-15 青岛小鸟看看科技有限公司 Localization method and device, the virtual reality device and system of external equipment
CN108196258A (en) * 2017-12-26 2018-06-22 青岛小鸟看看科技有限公司 Method for determining position and device, the virtual reality device and system of external equipment
CN108196258B (en) * 2017-12-26 2020-07-07 青岛小鸟看看科技有限公司 Method and device for determining position of external device, virtual reality device and system
CN108965712A (en) * 2018-07-30 2018-12-07 青岛小鸟看看科技有限公司 A kind of space positioning system and its synchronous method and device
CN108965712B (en) * 2018-07-30 2021-03-16 青岛小鸟看看科技有限公司 Space positioning system and synchronization method and device thereof
CN109274886B (en) * 2018-09-18 2020-09-25 成都泰盟软件有限公司 OpenVR-based mixed reality video recording method
CN109274886A (en) * 2018-09-18 2019-01-25 成都泰盟软件有限公司 A kind of mixed reality video recording method based on OpenVR
CN109188413A (en) * 2018-10-18 2019-01-11 京东方科技集团股份有限公司 The localization method of virtual reality device, device and system
CN110262667A (en) * 2019-07-29 2019-09-20 上海乐相科技有限公司 A kind of virtual reality device and localization method
CN110262667B (en) * 2019-07-29 2023-01-10 上海乐相科技有限公司 Virtual reality equipment and positioning method
CN112379774A (en) * 2020-11-12 2021-02-19 歌尔光学科技有限公司 Interaction method of VR glasses and wrist strap equipment and related components

Similar Documents

Publication Publication Date Title
CN105785373A (en) Virtual reality position identification system and method
KR102133105B1 (en) 3D spatial detection system, positioning method and system
CN112513711B (en) Method and system for resolving hemispherical ambiguities using position vectors
CN105608746B (en) A method of reality is subjected to Virtual Realization
EP2579128B1 (en) Portable device, virtual reality system and method
CN106406551A (en) Positioning system, positioning terminal and positioning network
CN105378801A (en) Holographic snap grid
CN108267715A (en) Localization method and device, the virtual reality device and system of external equipment
CN105824416A (en) Method for combining virtual reality technique with cloud service technique
CN106774901B (en) Remote PC body-sensing input method based on localization by ultrasonic
KR20110025216A (en) Method for producing an effect on virtual objects
CN106291455A (en) Positioner based on movement state information and method
CN110044357A (en) A kind of interior high-precision three-dimensional wireless location method
Shin et al. Application of precise indoor position tracking to immersive virtual reality with translational movement support
CN206249245U (en) A kind of alignment system, positioning terminal and positioning network
Carter et al. An ultrasonic indoor positioning system for harsh environments
CN110262667B (en) Virtual reality equipment and positioning method
CN105653035A (en) Virtual reality control method and system
CN106646480A (en) Positioning system in enclosed space, correlation method and apparatus
CN110928404B (en) Tracking system and related tracking method thereof
Thio et al. Experimental evaluation of the Forkbeard ultrasonic indoor positioning system
CN107229055B (en) Mobile equipment positioning method and mobile equipment positioning device
CN206876184U (en) A kind of indoor positioning device based on RSSI and inertial navigation
CN113190113A (en) Ultra-wideband positioning virtual reality system and positioning method for realizing position and direction
CN106843481A (en) A kind of three dimensions Freehandhand-drawing device and method based on gesture control

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160720

RJ01 Rejection of invention patent application after publication