CN107193380B - High-precision virtual reality positioning system - Google Patents

High-precision virtual reality positioning system

Info

Publication number
CN107193380B
CN107193380B (application CN201710382295.7A)
Authority
CN
China
Prior art keywords
positioning
head
positioning unit
unit
handle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710382295.7A
Other languages
Chinese (zh)
Other versions
CN107193380A (en)
Inventor
魏学华
廖巍巍
曾超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Seefeld Technology Co ltd
Original Assignee
Chengdu Seefeld Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Seefeld Technology Co ltd filed Critical Chengdu Seefeld Technology Co ltd
Priority to CN201710382295.7A
Publication of CN107193380A
Application granted
Publication of CN107193380B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a high-precision virtual reality positioning system comprising a head-mounted positioning unit, a handle positioning unit, and a wall or ground positioning point unit. An embodiment of the invention further provides a virtual reality positioning method comprising a spatial positioning calibration method, a spatial positioning implementation method, and a virtual interactive spatial positioning method. Compared with the prior art, the system locates the wearer of a virtual reality device more quickly and positions them accurately across a wide space. It is simple in structure and convenient to operate, providing a low-cost, high-efficiency virtual reality spatial positioning technology; combined with a virtual reality interaction system, it can give users a brand-new virtual reality interaction experience.

Description

High-precision virtual reality positioning system
Technical Field
The invention relates to the technical field of virtual reality space positioning and virtual reality interaction, in particular to a high-precision virtual reality positioning system.
Background
With the development of the times and the progress of science and technology, consumer-grade products are released continuously, and virtual reality has become a development hotspot attracting wide attention. Virtual reality technology uses a computer or other intelligent computing device to simulate a three-dimensional virtual world, supplying the user with simulated visual, auditory, tactile, and other sensory input so that the user feels present in the scene. The main purpose of a virtual reality device is to give people a sense of reality in a virtual environment. The device therefore needs to know the position and motion of the user's eyes, such as their relative position and orientation in space.
Disclosure of Invention
At present, virtual reality positioning technology in the industry mainly uses infrared optical positioning, laser positioning, or visible light positioning. Infrared optical positioning covers the indoor space with multiple infrared cameras; its precision is very high and its range large, but the cost of several high-frame-rate cameras is very high. Laser positioning uses a positioning light tower and several laser-sensing receivers on the tracked object to compute position from laser angle differences; its precision is high and its range wide, but the equipment is complex and relatively expensive. Visible light positioning uses a camera to capture actively lit visible-light tracking points; its accuracy is low and its range narrow, but the cost is low.
The technical problem to be solved by the embodiments of the invention is to provide a low-cost, high-precision virtual reality positioning and interaction system that positions quickly over a wide range and overcomes the spatial positioning problems of existing virtual reality systems.
To this end, a low-cost, high-precision virtual reality positioning and interaction system is realized. The system simulates human positioning and comprises a head-mounted positioning unit, a handle positioning unit, and a wall or ground positioning point unit.
The head-mounted positioning unit comprises a power supply, two global-shutter high-speed cameras, a central processing unit, a gravity-direction gyroscope three-in-one sensor, a memory, and a wireless signal receiver.
Preferably, the two global-shutter high-speed cameras are fitted with optical filters admitting light at a wavelength of 850 nanometers, and can capture more than 120 frames per second.
Preferably, each global-shutter high-speed camera is connected to the central processing unit over MIPI (Mobile Industry Processor Interface), which offers higher bandwidth than a conventional USB interface; after processing, the unit can transmit spatial position information directly to devices such as a computer.
The handle positioning unit comprises a power supply, a wireless signal transmitter, a gravity-direction gyroscope three-in-one sensor, and a light-emitting diode.
Preferably, the light-emitting diode has a wavelength of 850 nanometers and flashes 120 times per second.
The wall or ground positioning point unit comprises a power supply and a light-emitting diode.
Preferably, this light-emitting diode has a wavelength of 850 nanometers and remains constantly lit.
An embodiment of the invention further provides a virtual reality positioning method comprising a spatial positioning calibration method, a spatial positioning implementation method, and a virtual interactive spatial positioning method.
In the spatial positioning calibration method, the head-mounted positioning unit is placed at the center of the space, and the cameras are pointed perpendicular to each of the four surrounding surfaces in turn, recording the data of the current surface each time, to complete the calibration.
Preferably, the head-mounted positioning unit records the current surface's data as follows: using its two cameras and the central processing unit, it calculates the distance from each light spot to the midpoint between the two cameras, together with each spot's position in the camera imaging data; it obtains its current attitude and orientation from the gravity-direction gyroscope three-in-one sensor; finally, it stores the spot distance and position data and the unit's attitude and orientation in the memory.
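The recording flow above can be sketched in Python. The record layout, field names, and the synthetic spot values below are illustrative assumptions, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class SpotObservation:
    distance_m: float      # distance from the light spot to the camera-pair midpoint
    left_px: tuple         # pixel position of the spot in the left camera image
    right_px: tuple        # pixel position of the spot in the right camera image

@dataclass
class CalibrationRecord:
    wall_id: int           # which of the four calibrated surfaces this record covers
    spots: list            # one SpotObservation per wall light spot
    attitude: tuple        # (pitch, roll, yaw) from the gyroscope three-in-one sensor

def record_wall(memory, wall_id, spots, attitude):
    """Store one surface's calibration data, mirroring the recording flow."""
    memory[wall_id] = CalibrationRecord(wall_id, list(spots), attitude)
    return memory[wall_id]

# Hypothetical calibration pass over one wall carrying three light spots
memory = {}
spots = [SpotObservation(2.0, (310, 242), (296, 242)),
         SpotObservation(2.1, (412, 250), (398, 250)),
         SpotObservation(1.9, (355, 180), (341, 180))]
rec = record_wall(memory, wall_id=0, spots=spots, attitude=(0.0, 0.0, 0.0))
```

The positioning implementation method later reads these records back by the unit's current attitude.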
The spatial positioning implementation method is as follows: using its two cameras and the central processing unit, the head-mounted positioning unit calculates the distance from each light spot to the midpoint between the two cameras, together with each spot's position in the camera imaging data; it obtains its current attitude and orientation from the gravity-direction gyroscope three-in-one sensor; the central processing unit then reads the relevant calibration data from the memory according to the current attitude and orientation, and computes the device's current position from each spot's position in the imaging data and the unit's attitude and orientation.
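The per-spot distance used in these methods can be obtained with standard stereo triangulation from the two cameras. The sketch below assumes the 6.5 cm camera baseline given in the detailed description and a hypothetical focal length; the patent does not spell out the exact formula it uses:

```python
# Depth from stereo disparity: Z = f * B / d, where f is the focal length
# in pixels, B the camera baseline, and d the horizontal disparity.
BASELINE_M = 0.065   # 6.5 cm between the two global-shutter cameras (from the description)
FOCAL_PX = 800.0     # assumed focal length in pixels (not specified in the patent)

def spot_depth(left_x, right_x):
    """Distance (metres) of a light spot from the camera pair, computed
    from its horizontal pixel position in the left and right images."""
    disparity = left_x - right_x
    if disparity <= 0:
        raise ValueError("spot must have positive disparity")
    return FOCAL_PX * BASELINE_M / disparity

print(round(spot_depth(330.0, 304.0), 3))  # 26 px disparity -> 2.0 m
```

A wider baseline or longer focal length increases depth resolution at a given pixel pitch, which is one reason the cameras are mounted a fixed 6.5 cm apart on a rigid board.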
The virtual interactive spatial positioning method is as follows: the handle positioning unit sends the data from its gravity-direction gyroscope three-in-one sensor to the head-mounted positioning unit through the wireless signal transmitter; the head-mounted positioning unit receives the handle data through its wireless signal receiver, and the central processing unit computes the handle's attitude relative to the head-mounted unit from the two units' attitude data; the cameras pass the infrared LED data to the central processing unit, which distinguishes the handle positioning unit from the positioning point units by flash frequency, obtains the handle and positioning point light spot information through the two cameras, and computes from the handle light spot information the handle's position and its distance and position relative to the head-mounted positioning unit.
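One simple way to realize the flash-frequency discrimination step — separating the constantly lit anchor LEDs from the flashing handle LED — is to track how often each blob appears across recent frames. The threshold below is an assumption, not a value from the patent:

```python
def classify_blob(presence_history):
    """Classify a tracked light blob from its per-frame visibility over the
    last N frames: wall anchor LEDs stay lit, so their blobs appear in
    (nearly) every frame, while the flashing handle LED is absent in a
    large fraction of frames.  The 0.9 cutoff is an assumed threshold."""
    on_ratio = sum(presence_history) / len(presence_history)
    return "anchor" if on_ratio > 0.9 else "handle"

print(classify_blob([1] * 12))      # lit in every frame -> anchor
print(classify_blob([1, 0] * 6))    # lit in alternate frames -> handle
```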
The invention provides a low-cost, high-precision virtual reality positioning and interaction system suitable for head-mounted virtual reality equipment. Compared with the prior art, it locates the wearer of a virtual reality device more quickly and positions them accurately across a wide space; it is simple in structure and convenient to operate, providing a low-cost, high-efficiency virtual reality spatial positioning technology; combined with a virtual reality interaction system, it can give users a brand-new virtual reality interaction experience.
Drawings
FIG. 1 is a schematic diagram of a low-cost high-precision virtual reality positioning and interaction system according to the present invention.
Fig. 2 is a schematic view of a head mounted positioning unit of the present invention.
Fig. 3 is a schematic view of a handle positioning unit of the present invention.
Fig. 4 is a schematic view of a wall or ground anchor point unit of the present invention.
Detailed Description
The invention is described in further detail below with reference to the figures and the detailed description.
FIG. 1 shows the low-cost, high-precision virtual reality positioning and interaction system of the present invention. The system comprises three units: a head-mounted positioning unit 101, a handle positioning unit 102, and a wall or ground positioning point unit 103.
The head-mounted positioning unit 101 is the main unit of the virtual reality positioning and interaction system and is generally integrated into a virtual reality head-mounted display device. As shown in fig. 2, the unit comprises global-shutter high-speed cameras 201, a central processing unit 202, a gravity-direction gyroscope three-in-one sensor 203, a memory 204, and a wireless receiver 205. All hardware of the head-mounted positioning unit is mounted on a single circuit board, with a distance of 6.5 cm between the two global-shutter high-speed cameras 201; the board is installed in the head-mounted display so that the cameras point in the same direction as the wearer's gaze.
The global-shutter high-speed cameras 201 in fig. 2 carry optical filters that admit light at 850 nm. They work with the same-wavelength light-emitting diodes in the wall or ground positioning point units to record the data of each light spot accurately, allowing the positioning system to calculate the position of the head-mounted positioning unit; they also work with the same-wavelength light-emitting diode 303 in the handle positioning unit to obtain the position of the handle relative to the head-mounted unit. Each camera 201 adds a sample-and-hold unit at every pixel: all pixels are sampled at the appointed instant and then read out in sequence, so the camera can capture 120 frames per second without distorting objects in fast motion. The cameras 201 connect to the central processing unit over a MIPI interface, which offers higher bandwidth than a conventional USB interface, so that after processing the spatial position information can be transmitted directly to devices such as a computer.
The central processing unit 202 in fig. 2 is the main unit for the spatial positioning calculation of the invention. It receives the distances between the wall positioning light spots 401 and the global-shutter high-speed cameras 201, and reads the attitude and orientation information from the gravity-direction gyroscope three-in-one sensor 203 together with the calibration data in the memory 204, to calculate the head-mounted positioning unit's actual three-dimensional position in the whole space. The central processing unit 202 can also read the attitude data of the handle's gravity-direction gyroscope three-in-one sensor 302 and the light spot information of the handle's light-emitting diode 303 to calculate the position of the handle positioning unit relative to the head-mounted positioning unit.
The gravity-direction gyroscope three-in-one sensor 203 in fig. 2 obtains the attitude information of the head-mounted positioning unit and passes it to the central processing unit 202 to assist the spatial positioning calculation.
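As an illustration of the kind of attitude estimate such a sensor can supply, pitch and roll can be recovered from the gravity direction alone. The axis conventions below are assumptions, not something the patent specifies:

```python
import math

def attitude_from_gravity(ax, ay, az):
    """Pitch and roll (radians) from a gravity-direction measurement,
    with x forward, y left, z up when the device is level.  When the
    device is level, gravity lies entirely along z and both angles are 0."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

p, r = attitude_from_gravity(0.0, 0.0, 9.81)   # level device -> (0.0, 0.0)
```

Yaw cannot be observed from gravity, which is why the three-in-one sensor also carries a gyroscope (and, typically, a magnetometer) for the full orientation the positioning calculation needs.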
The memory 204 in fig. 2 stores the distances from the twelve wall positioning light spots 401 to the midpoint between the two global-shutter high-speed cameras 201, keeping the current light spot distance information available for the spatial positioning calculation.
The wireless receiver 205 in fig. 2 pairs with the wireless signal transmitter 301 of the handle positioning unit for data transmission, helping the central processing unit 202 obtain the attitude information from the handle's gravity-direction gyroscope three-in-one sensor 302.
FIG. 3 is a schematic diagram of the handle positioning unit, which comprises a wireless signal transmitter 301, a gravity-direction gyroscope three-in-one sensor 302, and a light-emitting diode 303 with a wavelength of 850 nanometers. After putting on the virtual reality head-mounted display, the user holds the handle positioning unit with the infrared LED pointing forward in the direction of the thumb. The handle is used mainly in the virtual interactive spatial positioning method.
Fig. 4 is a schematic diagram of the wall or ground positioning point unit, with twelve wall positioning light spots 401 consisting of constantly lit 850 nm light-emitting diodes. The units are installed on the four walls of the actual space, with three wall positioning light spots placed around the center of each wall; no two light spots are at the same height or vertically aligned, and the distance between any two light spots is greater than 15 cm.
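The placement constraints just described (no two spots at the same height, none vertically aligned, pairwise spacing above 15 cm) can be checked mechanically. A minimal sketch, with spots given as (horizontal, height) coordinates in metres on the wall plane:

```python
import math

MIN_SPACING_M = 0.15   # the description requires > 15 cm between any two spots

def valid_wall_layout(spots):
    """Check one wall's anchor spots against the layout constraints:
    no two spots share a height, no two are vertically aligned
    (same horizontal position), and every pair is more than 15 cm apart."""
    for i in range(len(spots)):
        for j in range(i + 1, len(spots)):
            (x1, y1), (x2, y2) = spots[i], spots[j]
            if y1 == y2 or x1 == x2:
                return False
            if math.hypot(x2 - x1, y2 - y1) <= MIN_SPACING_M:
                return False
    return True

print(valid_wall_layout([(0.0, 1.4), (0.3, 1.6), (0.6, 1.2)]))  # True
print(valid_wall_layout([(0.0, 1.4), (0.3, 1.4), (0.6, 1.2)]))  # False: shared height
```

These constraints keep the three spots in general position, so each orientation of the spot triangle in the camera image maps to a unique camera pose.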
The invention provides a virtual reality positioning method comprising a spatial positioning calibration method, a spatial positioning implementation method, and a virtual interactive spatial positioning method.
The spatial positioning calibration method proceeds as follows. First, install three wall positioning light spots 401 on each of the four walls of the actual space; if the space is small, install the spots on the four walls, and if it is large, install them on the ground around the head-mounted positioning unit. Then place the head-mounted positioning unit at the center of the whole space, point the global-shutter high-speed cameras 201 perpendicular to each of the four surrounding surfaces in turn, and record each wall's data in sequence to complete the calibration.
The head-mounted positioning unit records a wall's data as follows: using the two global-shutter high-speed cameras 201 and the central processing unit 202, it calculates the distance from each wall positioning light spot 401 to the midpoint between the two cameras, together with each spot's position in the camera imaging data; it obtains its current attitude and orientation from the gravity-direction gyroscope three-in-one sensor 203; finally, the spot distance and position data and the unit's attitude and orientation are stored in the memory 204.
The spatial positioning implementation method is as follows: using the two global-shutter high-speed cameras 201 and the central processing unit 202, the head-mounted positioning unit calculates the distance from each wall positioning light spot 401 to the midpoint between the two cameras, together with each spot's position in the camera imaging data; it obtains its current attitude and orientation from the gravity-direction gyroscope three-in-one sensor 203; the central processing unit 202 then reads the relevant calibration data in the memory 204 according to the current attitude and orientation, and computes the device's current position from each spot's current position in the imaging data and the unit's attitude and orientation.
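Selecting which surface's calibration record to read for the current orientation can be sketched as a lookup from yaw angle to wall index. The 90-degree indexing scheme below is an assumption about how the four records might be organized, not something the patent specifies:

```python
def wall_for_yaw(yaw_deg):
    """Pick which of the four calibrated walls the cameras face, given the
    current yaw angle in degrees.  Walls are assumed to be numbered 0-3 at
    90-degree steps, with wall 0 centred on yaw 0."""
    return int(((yaw_deg % 360) + 45) // 90) % 4

print(wall_for_yaw(0))     # facing wall 0
print(wall_for_yaw(92))    # facing wall 1
print(wall_for_yaw(-10))   # wraps to 350 deg, still wall 0
```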
The virtual interactive spatial positioning method is as follows: the handle positioning unit sends the data from its gravity-direction gyroscope three-in-one sensor 302 to the head-mounted positioning unit through the wireless signal transmitter 301; the head-mounted positioning unit receives the handle data through the wireless signal receiver 205, and the central processing unit 202 computes the handle's attitude relative to the head-mounted unit from the two units' attitude data; the global-shutter high-speed cameras 201 pass the infrared LED data to the central processing unit 202, which distinguishes the handle positioning unit from the positioning point units by flash frequency, obtains the handle and positioning point light spot information through the two cameras 201, and computes from the handle light spot information the handle's position and its distance and position relative to the head-mounted positioning unit.

Claims (3)

1. A high-precision virtual reality positioning system, characterized by comprising a head-mounted positioning unit, a handle positioning unit, and a wall or ground positioning point unit; the head-mounted positioning unit comprises a power supply, two global-shutter high-speed cameras, a central processing unit, a gravity-direction gyroscope three-in-one sensor, a memory, and a wireless signal receiver; the handle positioning unit comprises a power supply, a wireless signal transmitter, a gravity-direction gyroscope three-in-one sensor, and a light-emitting diode; the wall or ground positioning point unit comprises a power supply and a light-emitting diode;
the virtual reality positioning system provides a virtual reality positioning method comprising a spatial positioning calibration method, a spatial positioning implementation method, and a virtual interactive spatial positioning method;
the spatial positioning implementation method is as follows: using the two cameras and the central processing unit, the head-mounted positioning unit calculates the distance from each light spot to the midpoint between the two cameras, together with each spot's position in the camera imaging data; it obtains its current attitude and orientation from the gravity-direction gyroscope three-in-one sensor; the spot distance and position data and the unit's attitude and orientation are stored in the memory, and the central processing unit reads the relevant calibration data from the memory according to the current attitude and orientation and computes the device's current position from each spot's position in the imaging data and the unit's attitude and orientation;
the virtual interactive spatial positioning method is as follows: the handle positioning unit sends the data from its gravity-direction gyroscope three-in-one sensor to the head-mounted positioning unit through the wireless signal transmitter; the head-mounted positioning unit receives the handle data through the wireless signal receiver, and the central processing unit computes the handle's attitude relative to the head-mounted unit from the two units' attitude data; the cameras pass the infrared LED data to the central processing unit, which distinguishes the handle positioning unit from the positioning point units by flash frequency, obtains the handle and positioning point light spot information through the two cameras, and computes from the handle light spot information the handle's position and its distance and position relative to the head-mounted positioning unit.
2. The high-precision virtual reality positioning system according to claim 1, wherein the two global-shutter high-speed cameras are fitted with optical filters admitting light at a wavelength of 850 nanometers, can capture more than 120 frames per second, and are connected directly to the central processing unit through a Mobile Industry Processor Interface; the light-emitting diode of the handle positioning unit has a wavelength of 850 nanometers and flashes 120 times per second; the light-emitting diode of the wall or ground positioning point unit has a wavelength of 850 nanometers and remains constantly lit.
3. The high-precision virtual reality positioning system according to claim 1, wherein the spatial positioning calibration method comprises: placing the head-mounted positioning unit at the center of the space, pointing the cameras perpendicular to each of the four surrounding surfaces in turn, and recording the data of the current surface in sequence to complete the calibration.
CN201710382295.7A 2017-05-26 2017-05-26 High-precision virtual reality positioning system Active CN107193380B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710382295.7A CN107193380B (en) 2017-05-26 2017-05-26 High-precision virtual reality positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710382295.7A CN107193380B (en) 2017-05-26 2017-05-26 High-precision virtual reality positioning system

Publications (2)

Publication Number Publication Date
CN107193380A CN107193380A (en) 2017-09-22
CN107193380B (en) 2020-04-03

Family

ID=59875013

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710382295.7A Active CN107193380B (en) 2017-05-26 2017-05-26 High-precision virtual reality positioning system

Country Status (1)

Country Link
CN (1) CN107193380B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110134226B (en) * 2018-02-09 2022-05-10 深圳市掌网科技股份有限公司 Auxiliary positioning device and virtual reality operation platform adopting same
CN108595023A (en) * 2018-04-27 2018-09-28 宁波视睿迪光电有限公司 A kind of interactive handle and system
CN109375764B (en) * 2018-08-28 2023-07-18 北京凌宇智控科技有限公司 Head-mounted display, cloud server, VR system and data processing method
CN109407834B (en) 2018-10-08 2021-12-03 京东方科技集团股份有限公司 Electronic equipment, computer equipment, space positioning system and method
CN110567451A (en) * 2019-09-20 2019-12-13 深圳市丰之健电子科技有限公司 Human body posture recognition instrument device and use method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1603944A (en) * 2003-09-30 2005-04-06 Tdk株式会社 Calibration jig for a stereoscopic camera and calibrating method for the camera
CN103185577A (en) * 2011-12-29 2013-07-03 盛乐信息技术(上海)有限公司 Method and system used for acquiring spatial attitude data
CN105445937A (en) * 2015-12-27 2016-03-30 深圳游视虚拟现实技术有限公司 Mark point-based multi-target real-time positioning and tracking device, method and system
CN106445159A (en) * 2016-09-30 2017-02-22 乐视控股(北京)有限公司 Virtual reality system and positioning method
CN106597864A (en) * 2016-12-15 2017-04-26 北京国承万通信息科技有限公司 Virtual reality system and intelligent household system
CN106643699A (en) * 2016-12-26 2017-05-10 影动(北京)科技有限公司 Space positioning device and positioning method in VR (virtual reality) system


Also Published As

Publication number Publication date
CN107193380A (en) 2017-09-22

Similar Documents

Publication Publication Date Title
CN107193380B (en) High-precision virtual reality positioning system
JP7297028B2 (en) Systems and methods for augmented reality
US10678324B2 (en) Systems and methods for augmented reality
US10972715B1 (en) Selective processing or readout of data from one or more imaging sensors included in a depth camera assembly
CN105138135B (en) Wear-type virtual reality device and virtual reality system
JP7519742B2 Method and system for resolving hemispheric ambiguity using position vectors
KR102331164B1 (en) Systems and Methods for Augmented Reality
CN106774844B (en) Method and equipment for virtual positioning
US20160292924A1 (en) System and method for augmented reality and virtual reality applications
US20150097719A1 (en) System and method for active reference positioning in an augmented reality environment
WO2016184255A1 (en) Visual positioning device and three-dimensional mapping system and method based on same
JP7546116B2 Systems and methods for augmented reality
CN105190703A (en) 3D environmental modeling using photometric stereo
US20140085204A1 (en) Correlating Pupil Position to Gaze Location Within a Scene
CN108257177B (en) Positioning system and method based on space identification
WO2018113759A1 (en) Detection system and detection method based on positioning system and ar/mr
WO2015048890A1 (en) System and method for augmented reality and virtual reality applications
CN110572635A (en) Method, equipment and system for tracking and positioning handheld control equipment
EP4134917A1 (en) Imaging systems and methods for facilitating local lighting
US11256090B2 (en) Systems and methods for augmented reality
US11494997B1 (en) Augmented reality system with display of object with real world dimensions
CN107544549B (en) Positioning and data transmission method and system suitable for VR equipment
Dong et al. SEVAR: a stereo event camera dataset for virtual and augmented reality
CN109117000A (en) A kind of interactive holographic blackboard system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant