WO2018196216A1 - Coordinate alignment method and system, and virtual reality system - Google Patents

Coordinate alignment method and system, and virtual reality system

Info

Publication number
WO2018196216A1
WO2018196216A1 (PCT/CN2017/096120)
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
control device
binocular
hmd
angle
Prior art date
Application number
PCT/CN2017/096120
Other languages
English (en)
French (fr)
Inventor
黄嗣彬
戴景文
贺杰
Original Assignee
广东虚拟现实科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东虚拟现实科技有限公司
Publication of WO2018196216A1
Priority to US16/236,488 (published as US10802606B2)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/64Analysis of geometric attributes of convexity or concavity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • G06T2207/10021Stereoscopic video; Stereoscopic image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Definitions

  • the present invention relates to the field of VR/MR/AR technology, and in particular, to a coordinate alignment method, a system, and a virtual reality system.
  • VR: Virtual Reality
  • AR: Augmented Reality
  • MR: Mixed Reality
  • the existing VR interaction device includes a Head Mount Display (HMD) and an internal or external tracking transmitting or receiving device (such as a stereo camera, an infrared receiver or transmitter, or a signal transceiver).
  • HMD: Head Mount Display
  • Controller: the handheld control device
  • the action can be captured by the cooperation between the HMD, the tracking transmitting or receiving device, and the Controller.
  • IO: Inside-Out tracking
  • OI: Outside-In tracking
  • examples of IO systems include LeapMotion gesture recognition, Ximmerse Xhawk, etc.
  • examples of OI systems include Oculus Constellation, HTC VIVE, Sony PSVR, and so on.
  • IMU: Inertial Measurement Unit
  • the IMU module includes a gyroscope, an accelerometer, and a magnetometer; the IMU coordinate system is aligned with the optical coordinates (the coordinates of the device used to track and locate the Controller).
  • the current alignment scheme often uses multi-spot target pose estimation to estimate the pose of the object directly by optical means.
  • because multiple spots are used, this scheme requires multiple light spots or receivers on the upper part of the HMD, facing various directions.
  • the deployment of the light spots is very complicated, and the installation of the light spots or receivers requires high precision and very complicated calibration.
  • the cost is high, mass production is difficult, and the product configuration is so complicated that even a professional needs an hour or two to configure it successfully.
  • embodiments of the present invention provide a coordinate alignment method, a system, and a virtual reality system, to achieve alignment between the positive direction of a control device and the positive direction of the virtual world coordinate system.
  • the embodiment of the present invention provides the following technical solutions:
  • a coordinate alignment method is applied to a virtual reality device, an augmented reality device, or a mixed reality device, and the method includes:
  • the binocular coordinate system has a preset mapping relationship with the virtual world coordinate system of the virtual world.
  • the method further includes:
  • the acquiring real-time coordinate data in the binocular coordinate system during the reciprocating motion of the control device based on the light spot on the control device includes:
  • real-time coordinate data in the binocular coordinate system is acquired, based on the light spot on the control device, during a linear reciprocating motion or a circular rotational motion of the control device.
  • calculating the angle between the positive direction of the control device and the positive direction of the binocular coordinate system, recorded as the first angle, includes:
  • when the motion trajectory is a straight line, the motion trajectory is straight-line fitted, the straightness of the motion trajectory is calculated, and a straight-line direction perpendicular to the fitted motion trajectory is calculated;
  • segmenting the motion trajectory to obtain multiple sub-motion trajectories and calculating the slope K_i of the linear direction corresponding to each sub-motion trajectory in the binocular coordinate system, wherein i corresponds to each sub-motion trajectory;
  • the sum of the included angle β and the included angle γ is taken as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
  • estimating the position and direction of the HMD device in the binocular coordinate system by the HMD dual spot on the HMD device includes:
  • calculating the position of the double spot in the binocular coordinate system through stereo measurement and image processing of the HMD double spot on the HMD device; calculating the position of the HMD device in the binocular coordinate system from the position of the double spot; calculating the positive direction of the double spot from the positions of the two spots in the binocular coordinate system; and recording the positive direction of the double spot as the positive direction of the HMD device.
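The stereo measurement of the double spot can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes a rectified parallel stereo pair with focal length f (pixels) and baseline b (metres), so that depth = f·b/disparity; all function names, the default parameters, and the choice of which normal to the spot segment counts as the "positive" direction are assumptions of this sketch.

```python
import math

def triangulate(xl, yl, xr, f, b):
    """One spot from a rectified stereo pair: pixel coordinates are
    relative to the image centre and rows are aligned, so
    depth = f * b / disparity."""
    d = xl - xr                  # disparity in pixels
    z = f * b / d                # depth in metres
    return (xl * z / f, yl * z / f, z)

def hmd_position_and_direction(spot_a, spot_b, f=800.0, b=0.06):
    """spot_a / spot_b: ((xl, yl), xr) measurements of the two HMD spots.
    Position = midpoint of the two spots; yaw = heading of a normal to
    the segment joining them, taken as the HMD positive direction."""
    (xl_a, yl_a), xr_a = spot_a
    (xl_b, yl_b), xr_b = spot_b
    pa = triangulate(xl_a, yl_a, xr_a, f, b)
    pb = triangulate(xl_b, yl_b, xr_b, f, b)
    pos = tuple((u + v) / 2.0 for u, v in zip(pa, pb))
    dx, dz = pb[0] - pa[0], pb[2] - pa[2]   # segment in the X-Z plane
    # which of the two normals is "positive" is a labelling convention
    yaw = math.atan2(-dz, dx)
    return pos, yaw
```

With two spots placed symmetrically at depth 1 m, the midpoint and heading come out as expected for this convention.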
  • a coordinate alignment system, applied to a virtual reality device, an augmented reality device, or a mixed reality device, includes:
  • a control device direction alignment subsystem configured to align the positive direction of the virtual item corresponding to the control device in the virtual world with the positive direction of the virtual world;
  • the control device direction alignment subsystem includes:
  • a control device posture calculation unit configured to acquire sensor data of the control device and calculate posture data of the control device according to the sensor data;
  • a motion trajectory calculation unit configured to acquire, while acquiring the posture data of the control device, real-time coordinate data of the control device in the binocular coordinate system during the reciprocation based on the light spot on the control device, and to generate the motion trajectory of the control device during the motion from the real-time coordinate data and the corresponding attitude data;
  • a first angle calculation unit configured to calculate the posture of the control device in the control device coordinate system based on the motion trajectory, record the positive direction of the control device in the binocular coordinate system, and calculate the angle between the control device coordinate system and the binocular coordinate system, recorded as the first angle;
  • a binocular coordinate adjustment unit configured to correct the binocular coordinate system according to the first angle, so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero and the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world; wherein the binocular coordinate system and the virtual world coordinate system of the virtual world have a preset mapping relationship.
  • the system further includes:
  • an HMD device direction alignment subsystem configured to adjust the position and direction of the object corresponding to the HMD device in the virtual world coordinates according to the position and direction of the HMD device;
  • the HMD device direction alignment subsystem includes:
  • the HMD position and direction calculation unit is configured to estimate the position and direction of the HMD device in the binocular coordinate system by using the HMD dual spot on the HMD device, and record the direction of the HMD device in the binocular coordinate system as the HMD direction vector;
  • a second angle calculation unit configured to calculate the angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, recorded as the second angle;
  • a virtual user adjustment unit configured to adjust, according to the second angle, the angle between the positive direction of the object corresponding to the HMD device in the virtual world and the positive direction of the virtual world coordinates, and to adjust, according to the position of the HMD device in the binocular coordinate system, the position of the object corresponding to the HMD device in the virtual world.
  • the motion track calculation unit is specifically configured to:
  • the first angle calculation unit is specifically configured to:
  • when the motion trajectory is a straight line, the motion trajectory is straight-line fitted, the straightness of the motion trajectory is calculated, and a straight-line direction perpendicular to the fitted motion trajectory is calculated;
  • segmenting the motion trajectory to obtain multiple sub-motion trajectories and calculating the slope K_i of the linear direction corresponding to each sub-motion trajectory in the binocular coordinate system, wherein i corresponds to each sub-motion trajectory;
  • the sum of the included angle β and the included angle γ is taken as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
  • the HMD position and direction calculation unit is specifically configured to:
  • calculating the position of the double spot in the binocular coordinate system through stereo measurement and image processing of the HMD double spot on the HMD device; calculating the position of the HMD device in the binocular coordinate system from the position of the double spot; calculating the positive direction of the double spot from the positions of the two spots in the binocular coordinate system; and recording the positive direction of the double spot as the positive direction of the HMD device.
  • a virtual reality system comprising:
  • a dual spot output device fixedly connected to the head display device, for outputting two light spots each having a unique ID identifier;
  • a control device connected to the head display device wirelessly or by wire;
  • a light spot output device disposed on the control device, for outputting a light spot having a unique ID identifier;
  • a light spot tracking device for capturing the light spots, wherein the light spot tracking device is configured with the motion trajectory calculation unit and the HMD position and direction calculation unit described in the above embodiments, and sends the calculated motion trajectory and HMD direction vector to the head display device through the dual spot output device.
  • the head display device is configured with the control device posture calculation unit, the first angle calculation unit, the binocular coordinate adjustment unit, the second angle calculation unit, and the virtual user adjustment unit described in the above embodiments.
  • FIG. 1 is a schematic flowchart of a coordinate alignment method according to an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of a coordinate alignment method according to another embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of calculating an angle between a positive direction of a control device and a positive direction of a binocular coordinate system according to an embodiment of the present invention
  • FIG. 4 is a schematic structural diagram of a coordinate alignment system of a virtual reality device according to an embodiment of the present invention.
  • the present invention discloses a coordinate alignment method, which can be applied to a VR device, an AR device, or an MR device.
  • the method shown in FIG. 1 may include:
  • Step S101 acquiring sensor data of the control device
  • the control device may be a remote control device, such as the handle of a virtual reality device, an augmented reality device, or a mixed reality device. Each device may be provided with multiple control devices; generally, two control devices are provided.
  • each control device is internally configured with an IMU device, and the IMU device is configured with a plurality of sensors for measuring the posture of the device; from the sensor data, the attitude data O1 (yaw, pitch, roll) and O2 (yaw, pitch, roll) of the two control devices can be obtained;
  • Step S102 calculating posture data of the control device according to the sensor data
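Step S102 can be illustrated with a minimal static attitude estimate. This is a sketch, not the patent's implementation: it derives pitch and roll from the accelerometer's gravity reading and a tilt-compensated yaw from the magnetometer; a shipping controller would also fuse the gyroscope (e.g. with a complementary or Kalman filter). The function name and axis conventions are assumptions.

```python
import math

def attitude_from_imu(accel, mag):
    """Static attitude from accelerometer (in g) and magnetometer
    readings.  Returns (yaw, pitch, roll) in radians; the sign and
    axis conventions here are illustrative only."""
    ax, ay, az = accel
    mx, my, mz = mag
    # pitch and roll from the direction of gravity
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    # tilt-compensated magnetometer heading
    mx2 = mx * math.cos(pitch) + mz * math.sin(pitch)
    my2 = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-my2, mx2)
    return yaw, pitch, roll
```

A level device whose magnetometer points along +x reads (0, 0, 0) under this convention.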
  • Step S103 acquiring real-time coordinate data in the binocular coordinate system during the reciprocating motion of the control device based on the light spot on the control device while acquiring the sensor attitude data;
  • a light spot output device is disposed on each control device and outputs a light spot signal with an ID identifier; the light spot tracking device monitors and tracks the output light spot signal to obtain real-time coordinate data in the binocular coordinate system during the reciprocating motion. Each real-time coordinate datum corresponds to posture data of the control device; by processing the real-time coordinate data and the attitude data, the motion trajectory of the control device in the binocular coordinate system during the reciprocating motion can be obtained;
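Pairing each optical sample with its corresponding attitude sample can be sketched as below. The timestamped-sample representation and nearest-neighbour matching are assumptions of this sketch; the patent does not specify the synchronisation scheme.

```python
import bisect

def build_trajectory(optical, attitude):
    """Pair each optical sample with the nearest-in-time attitude sample.

    optical  : list of (t, (x, y, z)) spot coordinates, binocular frame
    attitude : list of (t, (yaw, pitch, roll)), sorted by t
    returns  : list of (t, (x, y, z), (yaw, pitch, roll))
    """
    times = [t for t, _ in attitude]
    traj = []
    for t, xyz in optical:
        i = bisect.bisect_left(times, t)
        # pick whichever neighbour is closer in time
        if i == 0:
            j = 0
        elif i == len(times) or t - times[i - 1] <= times[i] - t:
            j = i - 1
        else:
            j = i
        traj.append((t, xyz, attitude[j][1]))
    return traj
```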
  • Step S104 Generate a motion trajectory of the control device during the motion according to the real-time coordinate data and the attitude data;
  • Step S105 Calculating a posture of the control device in a control device coordinate system based on the motion trajectory, and recording the positive direction of the control device in a binocular coordinate system;
  • the positive direction of the control device in the binocular coordinate system refers to the orientation direction of the device.
  • the positive direction of the control device in the binocular coordinate system is calculated according to the posture data of the control device and the motion trajectory, and recorded as the positive direction of the control device;
  • Step S106 calculating an angle between the control device coordinate system and the binocular coordinate system, which is recorded as a first angle
  • Step S107 correcting the binocular coordinate system according to the first angle, so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero, so that the virtual world is The positive direction of the virtual item corresponding to the control device is aligned with the positive direction of the virtual world;
  • in a virtual reality device, an augmented reality device, or a mixed reality device, the virtual world coordinate system and the binocular coordinate system have a preset coordinate mapping relationship; when the coordinates of one coordinate system change, the coordinates of the other coordinate system change accordingly.
  • the positive direction of the binocular coordinate system is adjusted according to the first angle, so that it is aligned with the positive direction of the control device of the virtual reality device, the augmented reality device, or the mixed reality device.
  • the virtual world coordinate system follows the coordinate change of the binocular coordinate system, so that the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world;
  • the binocular coordinate system has a preset mapping relationship with the virtual world coordinate system of the virtual world.
  • when alignment is performed via the IMU of the control device, only one light spot needs to be set on the control device to achieve alignment between the positive direction of the control device and the positive direction of the virtual world coordinate system; the scheme is simple and the cost is low.
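Correcting the binocular coordinate system by the first angle amounts to a rotation about the vertical axis, after which the angle between the two positive directions is zero. A minimal sketch follows; the rotation sign and axis labels are assumptions of this illustration:

```python
import math

def correct_binocular_frame(points, first_angle):
    """Rotate binocular-frame points about the vertical (Y) axis by
    the first angle, so that the control device's positive direction
    coincides with the binocular positive direction afterwards."""
    c, s = math.cos(first_angle), math.sin(first_angle)
    out = []
    for x, y, z in points:
        # standard 2-D rotation applied in the X-Z (horizontal) plane
        out.append((c * x + s * z, y, -s * x + c * z))
    return out
```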
  • the reciprocating motion in the above embodiment of the present invention refers to a reciprocating motion with a regular pattern, for example, a reciprocating motion of the control device in a linear direction or a circular rotational motion of the control device; that is, step S103 can be further defined as:
  • acquiring, based on the light spot on the control device, real-time coordinate data in the binocular coordinate system during a linear reciprocating motion or a circular rotational motion of the control device.
  • the present invention also discloses a specific process for calculating the first angle.
  • steps S105 and S106 may further include:
  • Step S301 Acquire a motion trajectory of the control device during the motion, where the motion trajectory includes real-time coordinate data and device posture data;
  • Step S302 Perform pattern recognition on the motion track to determine whether the motion track is a linear track or a circular track.
  • when the motion track is a linear track, step S303 is performed; when the motion track is a circular track, step S305 is performed;
  • Step S303 when the motion trajectory is a straight line, straight line fitting the motion trajectory, step S304 is performed;
  • Step S304 calculating a linear direction perpendicular to the motion trajectory after the straight line fitting
  • Step S305 when the motion trajectory is circle-like, circle-fitting the motion trajectory, step S306 is performed;
  • Step S306 calculating a linear direction passing through the center of the circle obtained by the fitting;
  • Step S307 calculating a slope K of the linear direction in the binocular coordinate system
  • the slope K refers to the slope of the linear direction, relative to the positive direction of the binocular coordinate system, in the horizontal plane of the binocular coordinate system;
  • Step S308 calculating an angle ⁇ between the linear direction and the positive direction of the binocular coordinate system according to the slope;
  • Step S309 segmenting the motion trajectory to obtain multiple sub-motion trajectories, calculating the slope K_i of the linear direction corresponding to each sub-motion trajectory in the binocular coordinate system, and performing step S310, wherein i corresponds to each sub-motion trajectory;
  • the present invention further corrects the above-mentioned angle β by segmenting the motion trajectory; the segmentation may be performed according to a preset interval.
  • the preset interval may refer to the distance between two sampling points on the motion trajectory; that is, the motion trajectory between every two sampling points serves as a sub-motion trajectory. Each sub-motion trajectory obtained by the division is processed according to steps S302-S308 to obtain the slope K_i corresponding to each sub-motion trajectory;
  • Step S310 respectively calculating the similarity D_i between the slope K_i corresponding to each sub-motion trajectory and the slope K;
  • Step S311 calculating, according to Formula 1, the weighted average horizontal direction angle γ (the angle between the natural orientation of the handle and the binocular plane, recorded as the azimuth angle; it can also be calculated from the attitude data), wherein N is the total number of segments of the motion trajectory;
  • Step S312 The sum of the angle ⁇ and the angle ⁇ is used as an angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
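Steps S301-S312 for a straight-line trajectory can be sketched as below. Two loud caveats: the similarity measure D_i and the weighting in Formula 1 are not disclosed in this text, so the sketch assumes D_i = 1/(1 + |K_i − K|) and a D_i-weighted mean of the per-segment angles; it also treats both β and γ as slope angles against the same binocular axis, which is only one possible reading of the patent.

```python
import math

def first_angle(track_xz, n_segments=4):
    """Sketch of steps S301-S312 for a (non-vertical) straight-line
    trajectory.  track_xz: list of (x, z) spot positions in the
    horizontal plane of the binocular frame."""

    def slope(pts):
        # least-squares slope dz/dx of a point set
        n = len(pts)
        mx = sum(p[0] for p in pts) / n
        mz = sum(p[1] for p in pts) / n
        num = sum((p[0] - mx) * (p[1] - mz) for p in pts)
        den = sum((p[0] - mx) ** 2 for p in pts)
        return num / den

    K = slope(track_xz)                  # S303/S307: global line fit
    beta = math.atan(K)                  # S308: angle beta from slope K

    # S309: split into sub-trajectories and fit each one
    size = max(2, len(track_xz) // n_segments)
    segs = [track_xz[i:i + size] for i in range(0, len(track_xz), size)]
    segs = [s for s in segs if len(s) >= 2]

    gammas, weights = [], []
    for seg in segs:
        Ki = slope(seg)                  # per-segment slope K_i
        Di = 1.0 / (1.0 + abs(Ki - K))   # S310: assumed similarity form
        gammas.append(math.atan(Ki))
        weights.append(Di)
    # S311: assumed weighted average standing in for Formula 1
    gamma = sum(w * g for w, g in zip(weights, gammas)) / sum(weights)
    return beta + gamma                  # S312: first angle = beta + gamma
```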
  • the present invention also provides a scheme for aligning the object corresponding to the HMD device in the virtual world coordinate system.
  • the scheme may include:
  • Step S201 estimating the position and direction of the HMD device in the binocular coordinate system by using the double light spot on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as the HMD direction vector;
  • the HMD device is provided with a dual spot output device configured to output a double spot; after detecting the double spot, the spot tracking device can estimate the position and direction of the double spot in the binocular coordinate system (the direction corresponding to the double light spot), and the direction of the HMD device in the binocular coordinate system is recorded as the HMD direction vector;
  • Step S202 Calculate an angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, and record it as a second angle;
  • Step S203 adjusting an angle between a positive direction of an object corresponding to the HMD device and a positive direction of virtual world coordinates in the virtual world according to the second angle;
  • an object corresponding to the HMD device exists in the virtual world and may be regarded as a virtual user. The virtual user is adjusted according to the second angle, so that the angle between the positive direction of the virtual user and the positive direction of the virtual world coordinates is the second angle, thereby achieving direction alignment of the HMD device in the virtual world coordinate system;
  • Step S204 adjusting the position of the object corresponding to the HMD device according to the position of the HMD device in the binocular coordinate system, thereby achieving position alignment of the HMD device in the virtual world coordinate system.
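In the horizontal plane, the second angle of step S202 reduces to the signed angle between two 2-D vectors, which can be sketched as follows. The default preset binocular direction of +z and the sign convention are assumptions of this illustration:

```python
import math

def second_angle(hmd_dir, binocular_dir=(0.0, 1.0)):
    """Signed horizontal-plane angle between the HMD direction vector
    and the preset binocular direction vector, both given as (x, z)."""
    hx, hz = hmd_dir
    bx, bz = binocular_dir
    # atan2 of cross product and dot product gives the signed angle
    return math.atan2(hx * bz - hz * bx, hx * bx + hz * bz)
```

The virtual user's heading is then adjusted by this angle (step S203), which in a typical engine is a single yaw rotation of the avatar.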
  • when estimating the position and direction of the HMD device in the binocular coordinate system through the HMD dual spot on the HMD device, the specific process may include:
  • the present invention also discloses a coordinate alignment system corresponding to the above method; the technical features of the two may reference each other.
  • the system can be applied to a VR device, an AR device, or an MR device, including:
  • a control device direction alignment subsystem 100 configured to align the positive direction of the virtual item corresponding to the control device in the virtual world with the positive direction of the virtual world;
  • the control device direction alignment subsystem includes:
  • a control device posture calculation unit 101, corresponding to step S101 in the above method, configured to acquire sensor data of the control device and calculate posture data of the control device according to the sensor data;
  • a motion trajectory calculation unit 102, for acquiring, while acquiring the sensor attitude data, real-time coordinate data of the control device in the binocular coordinate system during the reciprocating motion based on the light spot on the control device, and generating the motion trajectory of the control device during the motion according to the real-time coordinate data;
  • a first angle calculation unit 103, corresponding to steps S105 and S106 in the above method, configured to calculate the posture of the control device in the control device coordinate system based on the motion trajectory, record the positive direction of the control device in the binocular coordinate system, and calculate the angle between the control device coordinate system and the binocular coordinate system, recorded as the first angle;
  • a binocular coordinate adjustment unit 104, corresponding to step S107 in the above method, for correcting the binocular coordinate system according to the first angle, so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero and the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world; wherein the binocular coordinate system and the virtual world coordinate system of the virtual world have a preset mapping relationship.
  • the first angle calculation unit 103 is specifically configured to:
  • when the motion trajectory is a straight line, the motion trajectory is straight-line fitted, the straightness of the motion trajectory is calculated, and a straight-line direction perpendicular to the fitted motion trajectory is calculated;
  • segmenting the motion trajectory to obtain multiple sub-motion trajectories and calculating the slope K_i of the linear direction corresponding to each sub-motion trajectory in the binocular coordinate system, wherein i corresponds to each sub-motion trajectory;
  • the sum of the included angle β and the included angle γ is taken as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
  • the above system may further comprise a system for implementing HMD alignment:
  • the HMD device direction alignment subsystem 200 is configured to adjust the position and direction of the object corresponding to the HMD device in the virtual world coordinates according to the position and direction of the HMD device;
  • the HMD device direction alignment subsystem 200 includes:
  • an HMD position and direction calculation unit 201, corresponding to step S201 in the above method, for estimating the position and direction of the HMD device in the binocular coordinate system by the HMD dual spot on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as the HMD direction vector;
  • a second angle calculation unit 202 which corresponds to step S202 in the above method, and is used for calculating an angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, and is recorded as a second clip. angle;
  • a virtual user adjustment unit 203 corresponding to step S203 to step S204 in the foregoing method, configured to adjust a positive direction and a virtual world coordinate of an object corresponding to the HMD device in the virtual world according to the second angle The angle between the positive directions, according to the HMD device in the binocular A position in the coordinate system that adjusts a position of the object corresponding to the HMD device in the virtual world in the binocular coordinate system.
  • the motion trajectory calculation unit 102 is specifically configured to: acquire, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system while it reciprocates along a straight line or moves in a rotational circular motion, and generate the motion trajectory of the control device from the real-time coordinate data;
  • the HMD position and direction calculation unit 201 is specifically configured to: compute the positions of the dual light spots in the binocular coordinate system by stereo-measurement estimation with image processing of the HMD dual light spots on the HMD device; compute the position of the HMD device in the binocular coordinate system from the positions of the dual light spots; compute the positive direction of the dual light spots from their positions in the binocular coordinate system; and record the positive direction of the dual light spots as the positive direction of the HMD device.
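The dual-spot pose estimation can be sketched as follows. The geometry is an assumption for illustration only: the patent does not fix the convention, so here the HMD position is taken as the midpoint of the two spot positions and the HMD positive direction as the horizontal direction perpendicular to the P1-P2 baseline; the function name is our own.

```python
import math

def hmd_pose_from_spots(p1, p2):
    """Estimate HMD position and forward direction in the binocular frame
    from the two tracked spot positions P1 and P2 (each an (x, y, z) tuple)."""
    # Position: midpoint of the two spots (assumed symmetric mounting).
    mid = tuple((a + b) / 2.0 for a, b in zip(p1, p2))
    # Baseline between the spots, projected onto the horizontal X-Z plane.
    bx, bz = p2[0] - p1[0], p2[2] - p1[2]
    norm = math.hypot(bx, bz) or 1e-12
    # Rotate the baseline by 90 degrees in the plane to get the forward vector.
    forward = (-bz / norm, 0.0, bx / norm)
    return mid, forward
```

With P1 and P2 on the user's left and right, the returned unit vector is the horizontal facing direction recorded as the HMD direction vector.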
  • the present invention also provides a virtual reality system, augmented reality system or mixed reality system, which may include: a head-mounted display device, a control device, a dual-light-spot output device, a light-spot output device and a light-spot tracking device, with the units of the coordinate alignment system disclosed in the embodiment arranged in the corresponding devices,
  • wherein the dual-light-spot output device is fixedly connected to the head-mounted display device and is used for outputting two light spots each carrying a unique ID;
  • the control device is connected to the head-mounted display device wirelessly or by wire;
  • the light-spot output device is arranged on the control device and is used for outputting one light spot carrying a unique ID;
  • the light-spot tracking device is used for capturing the light spots;
  • the light-spot tracking device is configured with the motion trajectory calculation unit according to claim 6 and the HMD position and direction calculation unit according to claim 7, and sends the computed motion trajectory and HMD direction vector to the head-mounted display device through the dual-light-spot output device;
  • the head-mounted display device is configured with the control-device attitude calculation unit, the first angle calculation unit and the binocular coordinate adjustment unit according to claim 6, and the second angle calculation unit and the virtual user adjustment unit according to claim 7.
  • In use, the dual-light-spot output device is mounted on the head-mounted display device; if the headset is not an all-in-one device, a phone is inserted into it. The headset is worn on the head in front of the binocular camera (optoelectronic tracking device), where the binocular tracking device can see the dual light spots output by the dual-light-spot device on the headset; since the dual spots carry IDs, they can be distinguished as spots P1 and P2.
  • The binocular tracking device performs stereo-measurement estimation with image processing on the dual light spots and obtains their positions in the binocular coordinate system, P1 (X, Y, Z) and P2 (X, Y, Z); the positions of the dual light spots are transmitted through a wireless or wired channel to the dual-light-spot output device, and then sent to the processor of the headset for the algorithmic computation.
  • After obtaining the position data of the dual light spots, the headset estimates their positive direction, which is the positive direction of the headset, and places the display center in the positive direction of the binocular device to locate the direction of the HMD in the binocular coordinate system from the dual light spots, forming a direction vector (the HMD direction vector); by computing the angle between the HMD direction vector and the direction vector of the positive direction of the binocular coordinate system, the position and direction of the virtual user corresponding to the HMD in the virtual world can be obtained directly through angle correction.
  • When aligning the control devices, the process can be as follows:
  • the headset connects two handles through Bluetooth (each handle represents a control device); each handle performs sensor data acquisition and data-fusion computation through its internal computing unit (MCU) to obtain the handle attitude data O1 (yaw, pitch, roll) and O2 (yaw, pitch, roll), which are transmitted to the headset through Bluetooth or a data cable.
  • the coordinates of the handles in the binocular coordinate system are obtained through binocular recognition and localization of the handle light spots by the light-spot tracking device: P3 (X, Y, Z) and P4 (X, Y, Z). The user then picks up the handles and performs left-right reciprocating motion, laterally or rotationally, in front of the binocular tracking device.
  • through image processing and stereo vision, the binocular tracking device recognizes the light spots of the left and right handles by image color, and then obtains the 3D positions of the handle spots in the binocular coordinate system through stereo vision.
  • the 3D position information of the handle spots in the binocular coordinate system is sent through the wireless channel or data cable to the light-spot output device, which sends it on to the headset.
  • the handles and the HMD are synchronized by timestamp to obtain 6DOF data on a unified time axis.
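The timestamp synchronization step can be sketched as a nearest-neighbour pairing of the two sample streams. This is a minimal illustration under assumptions of our own (a tolerance window and per-sample timestamps); the patent only states that handle and HMD data are synchronized onto a unified time axis.

```python
def align_by_timestamp(handle_samples, hmd_samples, tol=0.005):
    """Pair handle and HMD samples onto a unified time axis.

    Each sample is (timestamp_seconds, data). For each handle sample,
    the nearest-in-time HMD sample within `tol` seconds is matched.
    Both streams are assumed sorted by timestamp.
    """
    pairs = []
    j = 0
    for t, h in handle_samples:
        # Advance while the next HMD sample is at least as close in time.
        while j + 1 < len(hmd_samples) and \
                abs(hmd_samples[j + 1][0] - t) <= abs(hmd_samples[j][0] - t):
            j += 1
        if abs(hmd_samples[j][0] - t) <= tol:
            pairs.append((t, h, hmd_samples[j][1]))
    return pairs
```

The paired tuples give the unified 6DOF record (timestamp, handle data, HMD data) used in the subsequent angle estimation.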
  • After receiving the position and attitude data of a handle, the headset estimates, from the handle's motion trajectory (such as left-right straight motion or rotational circular motion), the angle between the positive direction of the handle and the positive direction of the binocular coordinate system, and rotates the positive direction of the binocular coordinate system by that angle to align the handle's positive direction with it; that is, in the virtual world, the positive direction of the hand or the hand-held item is aligned with the positive direction of the virtual world.
  • the headset has a built-in computing unit.
  • its built-in algorithm detects the motion trajectory of the handle spot and, using the IMU data as a constraint on the motion, estimates the direction within the handle trajectory, thereby obtaining the handle's azimuth angle (the angle between the handle's positive direction and the positive direction of the binocular coordinate system). Through this azimuth angle, the handle's positive direction is aligned to the positive direction of the binocular coordinate system, i.e. the orientation of the person in the virtual world.
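The correction itself can be sketched as a single rotation. This is a minimal sketch under the assumption, suggested by the description above, that the first angle is applied as a pure yaw rotation about the vertical axis of the binocular frame; the function names are our own.

```python
import math

def yaw_rotate(point, angle):
    """Rotate a 3D point (x, y, z) about the vertical Y axis by `angle` radians."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def correct_binocular_frame(points, first_angle):
    """Re-express binocular-frame points after correcting the frame by the
    first angle, so the handle's positive direction coincides with the
    frame's positive direction (assumed to be a pure yaw correction)."""
    return [yaw_rotate(p, -first_angle) for p in points]
```

Because the virtual world coordinate system has a preset mapping to the binocular frame, applying this yaw correction to the binocular frame also aligns the corresponding virtual item with the virtual world's positive direction.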
  • the steps of a method or algorithm described in connection with the embodiments disclosed herein can be implemented directly in hardware, a software module executed by a processor, or a combination of both.
  • the software module can reside in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Processing Or Creating Images (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A coordinate alignment method, a coordinate alignment system and a virtual reality system. The method includes: computing attitude data of a control device from sensor data of the control device (S102); acquiring, based on a light spot on the control device, real-time coordinate data of the control device in a binocular coordinate system during reciprocating motion (S103); generating a motion trajectory of the control device from the real-time coordinate data and the corresponding attitude data (S104); computing, based on the motion trajectory, the attitude of the control device in the control-device coordinate system, recorded as the positive direction of the control device in the binocular coordinate system (S105); computing the angle between the control-device coordinate system and the binocular coordinate system, recorded as the first angle (S106); correcting the binocular coordinate system according to the first angle so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero (S107); and thereby aligning the positive direction of the virtual item corresponding to the control device in the virtual world with the positive direction of the virtual world. This aligns the positive direction of the control device with the positive direction of the virtual world coordinate system; the scheme is simple to implement and low in cost.

Description

Coordinate alignment method and system, and virtual reality system — Technical Field
The present invention relates to the field of VR/MR/AR technology, and in particular to a coordinate alignment method, a coordinate alignment system and a virtual reality system.
Background Art
The development of the virtual reality (VR), augmented reality (AR) and mixed reality (MR) industries requires not only the viewing of content, but also bridging the virtual and real worlds through various optical, sensor and other schemes, so that the virtual world can be operated and controlled through real devices.
Taking VR as an example, existing VR interaction equipment includes a head-mounted display (HMD), internal or external tracking transmitters or receivers (such as stereo cameras, infrared receivers or emitters, and signal transceivers), and hand-held controllers; motion capture is performed through the cooperation of the HMD, the tracking transmitters or receivers, and the controllers.
Two classes of motion-capture schemes are mainly used in the industry: Inside-Out (IO) and Outside-In (OI). IO schemes include LeapMotion gesture recognition and Ximmerse's Xhawk; OI schemes include Oculus Constellation, HTC VIVE and Sony PSVR.
When using the above schemes, the coordinates of the controller's IMU (Inertial Measurement Unit — a device measuring an object's three-axis attitude angles (or angular rates) and acceleration; an IMU module contains gyroscopes, accelerometers and magnetometers) must be aligned with the optical coordinates (the coordinates of the device that tracks and locates the controller). Current alignment schemes usually estimate the target pose from multiple light spots, deriving the object's attitude directly by optical means. Because multiple light spots are used, many emitting spots or receivers must be placed on the HMD and light spots must be deployed in every direction; the complexity is very high, the spots or receivers require high installation precision and very complicated calibration, the cost is high, mass production is difficult, and product configuration is so complex that a professional needs one to two hours to configure it successfully.
Summary of the Invention
In view of this, embodiments of the present invention provide a coordinate alignment method, a coordinate alignment system and a virtual reality system, so as to align the positive direction of a control device with the positive direction of the virtual world coordinate system.
To achieve the above objective, embodiments of the present invention provide the following technical solutions:
A coordinate alignment method, applied in a virtual reality device, an augmented reality device or a mixed reality device, the method comprising:
acquiring sensor data of a control device;
computing attitude data of the control device from the sensor data;
while acquiring the sensor attitude data, acquiring, based on a light spot on the control device, real-time coordinate data of the control device in a binocular coordinate system during reciprocating motion;
generating a motion trajectory of the control device from the real-time coordinate data and the attitude data corresponding to the real-time coordinate data;
computing, based on the motion trajectory, the attitude of the control device in the control-device coordinate system, recorded as the positive direction of the control device in the binocular coordinate system;
computing the angle between the control-device coordinate system and the binocular coordinate system, recorded as the first angle;
correcting the binocular coordinate system according to the first angle so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero, whereby the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world;
wherein a preset mapping relationship exists between the binocular coordinate system and the virtual world coordinate system of the virtual world.
Preferably, the above coordinate alignment method further comprises:
estimating the position and direction of an HMD device in the binocular coordinate system from HMD dual light spots on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as an HMD direction vector;
computing the angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, recorded as the second angle;
adjusting, according to the second angle, the angle between the positive direction of the object corresponding to the HMD device in the virtual world and the positive direction of the virtual world coordinates;
adjusting, according to the position of the HMD device in the binocular coordinate system, the position in the binocular coordinate system of the object corresponding to the HMD device in the virtual world.
Preferably, in the above coordinate alignment method, acquiring, based on the light spot on the control device, the real-time coordinate data of the control device in the binocular coordinate system during reciprocating motion comprises:
acquiring, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system while the control device reciprocates along a straight line or moves in a rotational circular motion.
Preferably, in the above coordinate alignment method, computing the angle between the positive direction of the control device and the positive direction of the binocular coordinate system, recorded as the first angle, comprises:
acquiring the motion trajectory of the control device during motion;
when the motion trajectory is a straight line, fitting a line to the motion trajectory, computing the straightness of the motion trajectory, and computing the line direction perpendicular to the line-fitted motion trajectory;
when the motion trajectory is approximately circular, fitting a circle to the motion trajectory, computing the curvature of the motion trajectory, and computing the line direction passing through the center of the circle fitted to the motion trajectory;
computing the slope K of the line direction in the binocular coordinate system;
computing, from the slope, the angle θ between the line direction and the positive direction of the binocular coordinate system;
segmenting the motion trajectory into multiple sub-trajectories, and computing the slope Ki, in the binocular coordinate system, of the line direction corresponding to each sub-trajectory, where i indexes the sub-trajectories;
computing the similarity Di between the slope Ki of each sub-trajectory and the slope K;
according to the formula
Figure PCTCN2017096120-appb-000001
computing the weighted average horizontal angle δ, where N is the total number of sub-trajectories;
taking the sum of the angle θ and the angle δ as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
Preferably, in the above coordinate alignment method, estimating the position and direction of the HMD device in the binocular coordinate system from the HMD dual light spots on the HMD device comprises:
computing the positions of the dual light spots in the binocular coordinate system by stereo-measurement estimation with image processing of the HMD dual light spots on the HMD device; computing the position of the HMD device in the binocular coordinate system from the positions of the dual light spots; computing the positive direction of the dual light spots from their positions in the binocular coordinate system; and recording the positive direction of the dual light spots as the positive direction of the HMD device.
A coordinate alignment system, applied in a virtual reality device, an augmented reality device or a mixed reality device, comprising:
a control-device direction alignment subsystem for aligning the positive direction of the virtual item corresponding to the control device in the virtual world with the positive direction of the virtual world;
the control-device direction alignment subsystem comprising:
a control-device attitude calculation unit for acquiring sensor data of the control device and computing attitude data of the control device from the sensor data;
a motion trajectory calculation unit for acquiring, while the attitude data of the control device is acquired, real-time coordinate data of the control device in the binocular coordinate system during reciprocating motion based on the light spot on the control device, and generating the motion trajectory of the control device from the real-time coordinate data and the attitude data corresponding to the real-time coordinate data;
a first angle calculation unit for computing, based on the motion trajectory, the attitude of the control device in the control-device coordinate system, recorded as the positive direction of the control device in the binocular coordinate system, and computing the angle between the control-device coordinate system and the positive direction of the binocular coordinate system, recorded as the first angle;
a binocular coordinate adjustment unit for correcting the binocular coordinate system according to the first angle so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero, whereby the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world; wherein a preset mapping relationship exists between the binocular coordinate system and the virtual world coordinate system of the virtual world.
Preferably, the above coordinate alignment system further comprises:
an HMD device direction alignment subsystem for adjusting, according to the position and direction of the HMD device, the position and direction in virtual world coordinates of the object corresponding to the HMD device in the virtual world;
the HMD device direction alignment subsystem comprising:
an HMD position and direction calculation unit for estimating the position and direction of the HMD device in the binocular coordinate system from the HMD dual light spots on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as the HMD direction vector;
a second angle calculation unit for computing the angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, recorded as the second angle;
a virtual user adjustment unit for adjusting, according to the second angle, the angle between the positive direction of the object corresponding to the HMD device in the virtual world and the positive direction of the virtual world coordinates, and adjusting, according to the position of the HMD device in the binocular coordinate system, the position in the binocular coordinate system of the object corresponding to the HMD device in the virtual world.
Preferably, in the above coordinate alignment system, the motion trajectory calculation unit is specifically configured to:
acquire, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system while it reciprocates along a straight line or moves in a rotational circular motion, and generate the motion trajectory of the control device from the real-time coordinate data.
Preferably, in the above coordinate alignment system, the first angle calculation unit is specifically configured to:
acquire the motion trajectory of the control device during motion;
when the motion trajectory is a straight line, fit a line to the motion trajectory, compute the straightness of the motion trajectory, and compute the line direction perpendicular to the line-fitted motion trajectory;
when the motion trajectory is approximately circular, fit a circle to the motion trajectory, compute the curvature of the motion trajectory, and compute the line direction passing through the center of the fitted circle;
compute the slope K of the line direction in the binocular coordinate system;
compute, from the slope, the angle θ between the line direction and the positive direction of the binocular coordinate system;
segment the motion trajectory into multiple sub-trajectories, and compute the slope Ki, in the binocular coordinate system, of the line direction corresponding to each sub-trajectory, where i indexes the sub-trajectories;
compute the similarity Di between the slope Ki of each sub-trajectory and the slope K;
according to the formula
Figure PCTCN2017096120-appb-000002
compute the weighted average horizontal angle δ, where N is the total number of sub-trajectories;
take the sum of the angle θ and the angle δ as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
Preferably, in the above coordinate alignment system, the HMD position and direction calculation unit is specifically configured to:
compute the positions of the dual light spots in the binocular coordinate system by stereo-measurement estimation with image processing of the HMD dual light spots on the HMD device; compute the position of the HMD device in the binocular coordinate system from the positions of the dual light spots; compute the positive direction of the dual light spots from their positions in the binocular coordinate system; and record the positive direction of the dual light spots as the positive direction of the HMD device.
A virtual reality system, comprising:
a head-mounted display device;
a dual-light-spot output device fixedly connected to the head-mounted display device, for outputting two light spots each carrying a unique ID;
a control device connected to the head-mounted display device wirelessly or by wire;
a light-spot output device arranged on the control device, for outputting one light spot carrying a unique ID;
a light-spot tracking device for capturing the light spots, the light-spot tracking device being configured with the motion trajectory calculation unit and the HMD position and direction calculation unit described in the above embodiments, for sending the computed motion trajectory and HMD direction vector to the head-mounted display device through the dual-light-spot output device,
the head-mounted display device being configured with the control-device attitude calculation unit, the first angle calculation unit, the binocular coordinate adjustment unit, the second angle calculation unit and the virtual user adjustment unit described in the above embodiments.
Based on the above technical solutions, the scheme provided by the embodiments of the present invention only needs a single light spot on the control device to perform IMU alignment of the control device and align the positive direction of the control device with the positive direction of the virtual world coordinate system; the implementation is simple and the cost is low.
Brief Description of the Drawings
To illustrate the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are merely embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of a coordinate alignment method disclosed in an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a coordinate alignment method disclosed in another embodiment of the present invention;
Fig. 3 is a schematic flowchart of computing the angle between the positive direction of the control device and the positive direction of the binocular coordinate system in an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of the coordinate alignment system of a virtual display device disclosed in an embodiment of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The present invention discloses a coordinate alignment method, which can be applied to VR devices, AR devices, MR devices and the like in the prior art. Referring to Fig. 1, the method may include:
Step S101: acquiring sensor data of a control device;
The control device may be a remote-control device such as the handle of a virtual reality device, augmented reality device or mixed reality device. Each set of equipment may be provided with multiple control devices, most commonly two. An IMU device is configured inside each control device and contains multiple sensors for measuring the device's attitude; fusing the sensing data of these sensors yields the attitude data of the two control devices, O1 (yaw, pitch, roll) and O2 (yaw, pitch, roll);
Step S102: computing attitude data of the control device from the sensor data;
Step S103: while acquiring the sensor attitude data, acquiring, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system during reciprocating motion;
A light-spot output device is arranged on each control device and outputs a light-spot signal carrying an ID. A tracking device monitors and tracks the output light-spot signal to obtain the control device's real-time coordinate data in the binocular coordinate system during its reciprocating motion; each piece of real-time coordinate data corresponds to a piece of attitude data of the control device, and processing the real-time coordinate data and attitude data yields the motion trajectory of the control device in the binocular coordinate system during its reciprocating motion;
Step S104: generating the motion trajectory of the control device from the real-time coordinate data and the attitude data;
Step S105: computing, based on the motion trajectory, the attitude of the control device in the control-device coordinate system, recorded as the positive direction of the control device in the binocular coordinate system;
The positive direction of the control device in the binocular coordinate system is the direction the device is facing. In this step, the positive direction of the control device in the binocular coordinate system is computed from the attitude data of the control device and the motion trajectory, and recorded as the control-device positive direction;
Step S106: computing the angle between the control-device coordinate system and the binocular coordinate system, recorded as the first angle;
When there are multiple control devices, multiple first angles are computed. If the binocular coordinate system were adjusted once for every first angle obtained, the coordinate system would become inconsistent; therefore, when multiple first angles are computed, they are averaged, and the average is used as the first angle for adjusting the binocular coordinate system in the subsequent steps;
Step S107: correcting the binocular coordinate system according to the first angle so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero, whereby the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world;
In a virtual reality device, augmented reality device or mixed reality device, a preset coordinate mapping relationship exists between the virtual world coordinate system and the binocular coordinate system; when the coordinates in one system change, the coordinates in the other follow. In this step, once the first angle between the control-device positive direction and the positive direction of the binocular coordinate system has been computed, the positive direction of the binocular coordinate system is adjusted according to the first angle so as to align with the positive direction of the virtual reality device, augmented reality device or mixed reality device; the virtual world coordinate system then follows the coordinate change of the binocular coordinate system, so that the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world;
wherein a preset mapping relationship exists between the binocular coordinate system and the virtual world coordinate system of the virtual world.
As can be seen from the technical solution disclosed in the above embodiment, when performing IMU alignment of the control device, only a single light spot needs to be provided on the control device to align the positive direction of the control device with the positive direction of the virtual world coordinate system; the implementation is simple and the cost is low.
In step S103 disclosed in the above embodiment, to make the computed first angle more accurate, the reciprocating motion in the above embodiment refers to reciprocating motion along a regular figure, for example reciprocating motion of the control device along a straight line or rotational circular motion of the control device; that is, step S103 may be further limited to:
acquiring, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system while it reciprocates along a straight line or moves in a rotational circular motion.
In addition, to further improve the accuracy of the first-angle computation result, the present invention also discloses a specific process for computing the first angle. Referring to Fig. 3, steps S105 and S106 further include:
Step S301: acquiring the motion trajectory of the control device during motion, the motion trajectory including real-time coordinate data and device attitude data;
Step S302: performing shape recognition on the motion trajectory to determine whether it is a straight-line trajectory or an approximately circular trajectory; when it is a straight-line trajectory, executing step S303; when it is an approximately circular trajectory, executing step S305;
Step S303: when the motion trajectory is a straight line, fitting a line to the motion trajectory, then executing step S304;
Step S304: computing the line direction perpendicular to the line-fitted motion trajectory;
Step S305: when the motion trajectory is approximately circular, fitting a circle to the motion trajectory, then executing step S306;
Step S306: computing the line direction passing through the center of the circle fitted to the motion trajectory;
Step S307: computing the slope K of the line direction in the binocular coordinate system;
The slope K refers to the slope, relative to the positive direction of the binocular coordinate system, within the plane formed by the perpendicular of the line direction and the positive direction of the binocular coordinate system;
Step S308: computing, from the slope, the angle θ between the line direction and the positive direction of the binocular coordinate system;
Step S309: segmenting the motion trajectory into multiple sub-trajectories, computing the slope Ki, in the binocular coordinate system, of the line direction corresponding to each sub-trajectory, then executing step S310, where i indexes the sub-trajectories;
To ensure the accuracy of the computation result, the present invention further corrects the above angle θ. The motion trajectory is segmented, and the segmentation may be performed at a preset interval, which may refer to the distance between two sampling points on the motion trajectory; that is, the portion of the motion trajectory between every two sampling points serves as one sub-trajectory. Each sub-trajectory obtained by the segmentation is processed according to steps S302-S308 to obtain its slope Ki;
Step S310: computing the similarity Di between the slope Ki of each sub-trajectory and the slope K;
Step S311: according to the formula
Figure PCTCN2017096120-appb-000003
(Formula 1), computing the weighted average horizontal angle δ (the angle between the handle's natural orientation and the perpendicular of the binocular plane, recorded as the azimuth angle; it can also be computed from the attitude data), where N is the total number of sub-trajectories;
Step S312: taking the sum of the angle θ and the angle δ as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
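The line-versus-circle decision of step S302 can be sketched by comparing least-squares fit residuals. This is an assumed decision rule for illustration only: the text states that the trajectory is recognized as a straight line or a near-circle, but does not specify the recognition criterion, and the function name is our own.

```python
import math

def classify_trajectory(points):
    """Decide whether a 2D trajectory (list of (x, z) points) is a straight
    line or roughly circular by comparing fit residuals."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    mz = sum(p[1] for p in points) / n
    # Line fit residual: smallest eigenvalue of the 2x2 scatter matrix
    # equals the residual energy of the total-least-squares line.
    sxx = sum((x - mx) ** 2 for x, _ in points)
    szz = sum((z - mz) ** 2 for _, z in points)
    sxz = sum((x - mx) * (z - mz) for x, z in points)
    tr, det = sxx + szz, sxx * szz - sxz * sxz
    line_res = (tr - math.sqrt(max(tr * tr - 4 * det, 0.0))) / 2.0
    # Circle fit residual: spread of distances from the centroid
    # (a crude circle fit assuming the centroid approximates the center).
    radii = [math.hypot(x - mx, z - mz) for x, z in points]
    mr = sum(radii) / n
    circle_res = sum((r - mr) ** 2 for r in radii)
    return "line" if line_res <= circle_res else "circle"
```

A collinear trajectory yields a near-zero line residual and is classified as a line; points on a circle have constant centroid distance and a near-zero circle residual.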
Besides providing a method for aligning the positive direction of the control device with the positive direction of the virtual world coordinate system, the present invention also provides a scheme for aligning the HMD with a preset object in the virtual world coordinate system. Referring to Fig. 2, the scheme may include:
Step S201: estimating the position and direction of the HMD device in the binocular coordinate system from the dual light spots on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as the HMD direction vector;
A dual-light-spot output device for outputting dual light spots is arranged on the HMD device. After detecting the dual light spots, the light-spot tracking device can estimate the position and direction of the dual light spots in the binocular coordinate system (the direction corresponding to the dual light spots), and the direction of the HMD device in the binocular coordinate system is marked as the HMD direction vector;
Step S202: computing the angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, recorded as the second angle;
Step S203: adjusting, according to the second angle, the angle between the positive direction of the object corresponding to the HMD device in the virtual world and the positive direction of the virtual world coordinates;
An object corresponding to the HMD device exists in the virtual world and can be regarded as a virtual user. After the second angle is obtained, the virtual user is adjusted according to the second angle so that the angle between the positive direction of the virtual user and the positive direction of the virtual world coordinates equals the second angle, thereby achieving the direction alignment of the HMD device in the virtual world coordinate system;
Step S204: adjusting, according to the position of the HMD device in the binocular coordinate system, the position in the binocular coordinate system of the object corresponding to the HMD device in the virtual world, thereby achieving the position alignment of the HMD device in the virtual world coordinate system.
In another embodiment disclosed by the present invention, the specific process of estimating the position and direction of the HMD device in the binocular coordinate system from the HMD dual light spots on the HMD device may include:
computing the positions of the dual light spots in the binocular coordinate system by stereo-measurement estimation with image processing of the HMD dual light spots on the HMD device; computing the position of the HMD device in the binocular coordinate system from the positions of the dual light spots, and hence the position of the virtual user corresponding to the HMD in the virtual world coordinate system; computing the positive direction of the dual light spots in the binocular coordinate system from their positions; and recording the positive direction of the dual light spots as the positive direction of the HMD device in the binocular coordinate system, and hence the positive direction of the virtual user corresponding to the HMD in the virtual world coordinate system.
Corresponding to the above method, the present invention also discloses a coordinate alignment system; the technical features of the two can be cross-referenced. Referring to Fig. 4, the system can be applied in VR devices, AR devices or MR devices and includes:
a control-device direction alignment subsystem 100, for aligning the positive direction of the virtual item corresponding to the control device in the virtual world with the positive direction of the virtual world;
the control-device direction alignment subsystem comprising:
a control-device attitude calculation unit 101, corresponding to step S101 of the above method, for acquiring sensor data of the control device and computing attitude data of the control device from the sensor data;
a motion trajectory calculation unit 102, corresponding to steps S103 and S104 of the above method, for acquiring, while the sensor attitude data is acquired, real-time coordinate data of the control device in the binocular coordinate system during reciprocating motion based on the light spot on the control device, and generating the motion trajectory of the control device from the real-time coordinate data;
a first angle calculation unit 103, corresponding to steps S105 and S106 of the above method, for computing, based on the motion trajectory, the attitude of the control device in the control-device coordinate system, recorded as the positive direction of the control device in the binocular coordinate system, and computing the angle between the control-device coordinate system and the binocular coordinate system, recorded as the first angle;
a binocular coordinate adjustment unit 104, corresponding to step S107 of the above method, for correcting the binocular coordinate system according to the first angle so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero, whereby the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world; wherein a preset mapping relationship exists between the binocular coordinate system and the virtual world coordinate system of the virtual world.
Corresponding to the above method, the first angle calculation unit 103 is specifically configured to:
acquire the motion trajectory of the control device during motion;
when the motion trajectory is a straight line, fit a line to the motion trajectory, compute the straightness of the motion trajectory, and compute the line direction perpendicular to the line-fitted motion trajectory;
when the motion trajectory is approximately circular, fit a circle to the motion trajectory, compute the curvature of the motion trajectory, and compute the line direction passing through the center of the fitted circle;
compute the slope K of the line direction in the binocular coordinate system;
compute, from the slope, the angle θ between the line direction and the positive direction of the binocular coordinate system;
segment the motion trajectory into multiple sub-trajectories, and compute the slope Ki, in the binocular coordinate system, of the line direction corresponding to each sub-trajectory, where i indexes the sub-trajectories;
compute the similarity Di between the slope Ki of each sub-trajectory and the slope K;
according to the formula
Figure PCTCN2017096120-appb-000004
compute the weighted average horizontal angle δ, where N is the total number of sub-trajectories;
take the sum of the angle θ and the angle δ as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
Corresponding to the above method, the above system may further include a subsystem for implementing HMD alignment:
an HMD device direction alignment subsystem 200, for adjusting, according to the position and direction of the HMD device, the position and direction in virtual world coordinates of the object corresponding to the HMD device in the virtual world;
the HMD device direction alignment subsystem 200 comprising:
an HMD position and direction calculation unit 201, corresponding to step S201 of the above method, for estimating the position and direction of the HMD device in the binocular coordinate system from the HMD dual light spots on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as the HMD direction vector;
a second angle calculation unit 202, corresponding to step S202 of the above method, for computing the angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, recorded as the second angle;
a virtual user adjustment unit 203, corresponding to steps S203-S204 of the above method, for adjusting, according to the second angle, the angle between the positive direction of the object corresponding to the HMD device in the virtual world and the positive direction of the virtual world coordinates, and adjusting, according to the position of the HMD device in the binocular coordinate system, the position in the binocular coordinate system of the object corresponding to the HMD device in the virtual world.
Corresponding to the above method, the motion trajectory calculation unit 102 is specifically configured to:
acquire, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system while it reciprocates along a straight line or moves in a rotational circular motion, and generate the motion trajectory of the control device from the real-time coordinate data.
Corresponding to the above method, the HMD position and direction calculation unit 201 is specifically configured to:
compute the positions of the dual light spots in the binocular coordinate system by stereo-measurement estimation with image processing of the HMD dual light spots on the HMD device; compute the position of the HMD device in the binocular coordinate system from the positions of the dual light spots; compute the positive direction of the dual light spots from their positions in the binocular coordinate system; and record the positive direction of the dual light spots as the positive direction of the HMD device.
Corresponding to the above system, the present invention also provides a virtual reality system, augmented reality system or mixed reality system, which may include: a head-mounted display device, a control device, a dual-light-spot output device, a light-spot output device and a light-spot tracking device, with the units of the coordinate alignment system disclosed in the above embodiments arranged in the corresponding devices,
wherein the dual-light-spot output device is fixedly connected to the head-mounted display device and is used for outputting two light spots each carrying a unique ID;
the control device is connected to the head-mounted display device wirelessly or by wire;
the light-spot output device is arranged on the control device and is used for outputting one light spot carrying a unique ID;
the light-spot tracking device is used for capturing the light spots; the light-spot tracking device is configured with the motion trajectory calculation unit of claim 6 and the HMD position and direction calculation unit of claim 7, and sends the computed motion trajectory and HMD direction vector to the head-mounted display device through the dual-light-spot output device;
the head-mounted display device is configured with the control-device attitude calculation unit, first angle calculation unit and binocular coordinate adjustment unit of claim 6, and the second angle calculation unit and virtual user adjustment unit of claim 7.
In use, the dual-light-spot output device is mounted on the head-mounted display device; if the headset is not an all-in-one device, a phone is inserted into it. The headset is worn on the head in front of the binocular camera (optoelectronic tracking device), where the binocular tracking device can see the dual light spots output by the dual-light-spot device on the headset; since the dual spots carry IDs, they can be distinguished as spots P1 and P2. The binocular tracking device performs stereo-measurement estimation with image processing on the dual light spots and obtains their positions in the binocular coordinate system, P1 (X, Y, Z) and P2 (X, Y, Z); the positions of the dual light spots are transmitted through a wireless or wired channel to the dual-light-spot output device, and then sent to the processor of the headset for the algorithmic computation. After obtaining the position data of the dual light spots, the headset estimates their positive direction, which is the positive direction of the headset, and places the display center in the positive direction of the binocular device to locate the direction of the HMD in the binocular coordinate system from the dual light spots, forming a direction vector (the HMD direction vector); by computing the angle between the HMD direction vector and the direction vector of the positive direction of the binocular coordinate system, the position and direction of the virtual user corresponding to the HMD in the virtual world can be obtained directly through angle correction.
When aligning the control devices, the process may be as follows:
The headset connects two handles through Bluetooth (each handle represents a control device); each handle performs sensor data acquisition and data-fusion computation through its internal computing unit (MCU) to obtain the handle attitude data O1 (yaw, pitch, roll) and O2 (yaw, pitch, roll), which are transmitted to the headset through Bluetooth or a data cable. Through binocular recognition and localization of the handle light spots by the light-spot tracking device, the coordinates of the handles in the binocular coordinate system can be obtained: P3 (X, Y, Z) and P4 (X, Y, Z). The user then picks up the handles and performs left-right reciprocating motion, laterally or rotationally, in front of the binocular tracking device. Through image processing and stereo vision, the binocular tracking device recognizes the light spots of the left and right handles by image color, and then obtains the 3D positions of the handle spots in the binocular coordinate system through stereo vision. The 3D position information of the handle spots in the binocular coordinate system is sent through the wireless channel or data cable to the light-spot output device, which sends it on to the headset. The handles and the HMD are synchronized by timestamp to obtain 6DOF data on a unified time axis. After receiving the position and attitude data of a handle, the headset estimates, from the handle's motion trajectory, such as left-right straight motion or rotational circular motion, the angle between the positive direction of the handle and the positive direction of the binocular coordinate system, and rotates the positive direction of the binocular coordinate system by that angle to align the handle's positive direction with it; that is, in the virtual world, the positive direction of the hand or the hand-held item is aligned with the positive direction of the virtual world. The headset has a built-in computing unit whose algorithm detects the motion trajectory of the handle spot and, using the IMU data as a constraint on the motion, estimates the direction within the handle trajectory, thereby obtaining the handle's azimuth angle (the angle between the handle's positive direction and the positive direction of the binocular coordinate system). Through this azimuth angle, the handle's positive direction is aligned to the positive direction of the binocular coordinate system, i.e. the orientation of the person in the virtual world.
For convenience of description, the functions of the above system are described as divided into various modules. Of course, when implementing the present invention, the functions of the modules may be implemented in one or more pieces of software and/or hardware.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the others. In particular, since the system embodiments are substantially similar to the method embodiments, they are described relatively simply; for relevant details, refer to the description of the method embodiments. The systems and system embodiments described above are merely illustrative; units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment, which a person of ordinary skill in the art can understand and implement without creative effort.
A person skilled in the art may further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of the examples have been described above generally in terms of their functions. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled persons may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention.
The steps of the methods or algorithms described in connection with the embodiments disclosed herein can be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
It should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes the element.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

  1. A coordinate alignment method, applied in a virtual reality device, an augmented reality device or a mixed reality device, comprising:
    acquiring sensor data of a control device;
    computing attitude data of the control device from the sensor data;
    while acquiring the sensor attitude data, acquiring, based on a light spot on the control device, real-time coordinate data of the control device in a binocular coordinate system during reciprocating motion;
    generating a motion trajectory of the control device from the real-time coordinate data and the attitude data corresponding to the real-time coordinate data;
    computing, based on the motion trajectory, the attitude of the control device in the control-device coordinate system, recorded as the positive direction of the control device in the binocular coordinate system;
    computing the angle between the control-device coordinate system and the binocular coordinate system, recorded as the first angle;
    correcting the binocular coordinate system according to the first angle so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero, whereby the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world;
    wherein a preset mapping relationship exists between the binocular coordinate system and the virtual world coordinate system of the virtual world.
  2. The coordinate alignment method according to claim 1, further comprising:
    estimating the position and direction of an HMD device in the binocular coordinate system from HMD dual light spots on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as an HMD direction vector;
    computing the angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, recorded as the second angle;
    adjusting, according to the second angle, the angle between the positive direction of the object corresponding to the HMD device in the virtual world and the positive direction of the virtual world coordinates;
    adjusting, according to the position of the HMD device in the binocular coordinate system, the position in the binocular coordinate system of the object corresponding to the HMD device in the virtual world.
  3. The coordinate alignment method according to claim 1, wherein acquiring, based on the light spot on the control device, the real-time coordinate data of the control device in the binocular coordinate system during reciprocating motion comprises:
    acquiring, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system while the control device reciprocates along a straight line or moves in a rotational circular motion.
  4. The coordinate alignment method according to claim 1, wherein computing the angle between the positive direction of the control device and the positive direction of the binocular coordinate system, recorded as the first angle, comprises:
    acquiring the motion trajectory of the control device during motion;
    when the motion trajectory is a straight line, fitting a line to the motion trajectory, computing the straightness of the motion trajectory, and computing the line direction perpendicular to the line-fitted motion trajectory;
    when the motion trajectory is approximately circular, fitting a circle to the motion trajectory, computing the curvature of the motion trajectory, and computing the line direction passing through the center of the circle fitted to the motion trajectory;
    computing the slope K of the line direction in the binocular coordinate system;
    computing, from the slope, the angle θ between the line direction and the positive direction of the binocular coordinate system;
    segmenting the motion trajectory into multiple sub-trajectories, and computing the slope Ki, in the binocular coordinate system, of the line direction corresponding to each sub-trajectory, where i indexes the sub-trajectories;
    computing the similarity Di between the slope Ki of each sub-trajectory and the slope K;
    according to the formula
    Figure PCTCN2017096120-appb-100001
    computing the weighted average horizontal angle δ, where N is the total number of sub-trajectories;
    taking the sum of the angle θ and the angle δ as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
  5. The coordinate alignment method according to claim 1, wherein estimating the position and direction of the HMD device in the binocular coordinate system from the HMD dual light spots on the HMD device comprises:
    computing the positions of the dual light spots in the binocular coordinate system by stereo-measurement estimation with image processing of the HMD dual light spots on the HMD device; computing the position of the HMD device in the binocular coordinate system from the positions of the dual light spots; computing the positive direction of the dual light spots from their positions in the binocular coordinate system; and recording the positive direction of the dual light spots as the positive direction of the HMD device.
  6. A coordinate alignment system, applied in a virtual reality device, an augmented reality device or a mixed reality device, comprising:
    a control-device direction alignment subsystem for aligning the positive direction of the virtual item corresponding to the control device in the virtual world with the positive direction of the virtual world;
    the control-device direction alignment subsystem comprising:
    a control-device attitude calculation unit for acquiring sensor data of the control device and computing attitude data of the control device from the sensor data;
    a motion trajectory calculation unit for acquiring, while the attitude data of the control device is acquired, real-time coordinate data of the control device in the binocular coordinate system during reciprocating motion based on the light spot on the control device, and generating the motion trajectory of the control device from the real-time coordinate data and the attitude data corresponding to the real-time coordinate data;
    a first angle calculation unit for computing, based on the motion trajectory, the attitude of the control device in the control-device coordinate system, recorded as the positive direction of the control device in the binocular coordinate system, and computing the angle between the control-device coordinate system and the positive direction of the binocular coordinate system, recorded as the first angle;
    a binocular coordinate adjustment unit for correcting the binocular coordinate system according to the first angle so that the angle between the positive direction of the control device and the positive direction of the binocular coordinate system is zero, whereby the positive direction of the virtual item corresponding to the control device in the virtual world is aligned with the positive direction of the virtual world; wherein a preset mapping relationship exists between the binocular coordinate system and the virtual world coordinate system of the virtual world.
  7. The coordinate alignment system according to claim 6, further comprising:
    an HMD device direction alignment subsystem for adjusting, according to the position and direction of the HMD device, the position and direction in virtual world coordinates of the object corresponding to the HMD device in the virtual world;
    the HMD device direction alignment subsystem comprising:
    an HMD position and direction calculation unit for estimating the position and direction of the HMD device in the binocular coordinate system from the HMD dual light spots on the HMD device, and recording the direction of the HMD device in the binocular coordinate system as the HMD direction vector;
    a second angle calculation unit for computing the angle between the HMD direction vector and a binocular direction vector preset in the binocular coordinate system, recorded as the second angle;
    a virtual user adjustment unit for adjusting, according to the second angle, the angle between the positive direction of the object corresponding to the HMD device in the virtual world and the positive direction of the virtual world coordinates, and adjusting, according to the position of the HMD device in the binocular coordinate system, the position in the binocular coordinate system of the object corresponding to the HMD device in the virtual world.
  8. The coordinate alignment system according to claim 6, wherein the motion trajectory calculation unit is specifically configured to:
    acquire, based on the light spot on the control device, real-time coordinate data of the control device in the binocular coordinate system while it reciprocates along a straight line or moves in a rotational circular motion, and generate the motion trajectory of the control device from the real-time coordinate data.
  9. The coordinate alignment system according to claim 6, wherein the first angle calculation unit is specifically configured to:
    acquire the motion trajectory of the control device during motion;
    when the motion trajectory is a straight line, fit a line to the motion trajectory, compute the straightness of the motion trajectory, and compute the line direction perpendicular to the line-fitted motion trajectory;
    when the motion trajectory is approximately circular, fit a circle to the motion trajectory, compute the curvature of the motion trajectory, and compute the line direction passing through the center of the fitted circle;
    compute the slope K of the line direction in the binocular coordinate system;
    compute, from the slope, the angle θ between the line direction and the positive direction of the binocular coordinate system;
    segment the motion trajectory into multiple sub-trajectories, and compute the slope Ki, in the binocular coordinate system, of the line direction corresponding to each sub-trajectory, where i indexes the sub-trajectories;
    compute the similarity Di between the slope Ki of each sub-trajectory and the slope K;
    according to the formula
    Figure PCTCN2017096120-appb-100002
    compute the weighted average horizontal angle δ, where N is the total number of sub-trajectories;
    take the sum of the angle θ and the angle δ as the angle between the positive direction of the control device and the positive direction of the binocular coordinate system.
  10. The coordinate alignment system according to claim 7, wherein the HMD position and direction calculation unit is specifically configured to:
    compute the positions of the dual light spots in the binocular coordinate system by stereo-measurement estimation with image processing of the HMD dual light spots on the HMD device; compute the position of the HMD device in the binocular coordinate system from the positions of the dual light spots; compute the positive direction of the dual light spots from their positions in the binocular coordinate system; and record the positive direction of the dual light spots as the positive direction of the HMD device.
  11. A virtual reality system, comprising:
    a head-mounted display device;
    a dual-light-spot output device fixedly connected to the head-mounted display device, for outputting two light spots each carrying a unique ID;
    a control device connected to the head-mounted display device wirelessly or by wire;
    a light-spot output device arranged on the control device, for outputting one light spot carrying a unique ID;
    a light-spot tracking device for capturing the light spots, the light-spot tracking device being configured with a motion trajectory calculation unit and an HMD position and direction calculation unit, for sending the computed motion trajectory and HMD direction vector to the head-mounted display device through the dual-light-spot output device,
    the head-mounted display device being configured with a control-device attitude calculation unit, a first angle calculation unit, a binocular coordinate adjustment unit, a second angle calculation unit and a virtual user adjustment unit.
PCT/CN2017/096120 2017-04-25 2017-08-04 一种坐标对齐方法、系统和虚拟现实系统 WO2018196216A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/236,488 US10802606B2 (en) 2017-04-25 2018-12-29 Method and device for aligning coordinate of controller or headset with coordinate of binocular system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710278094.2A CN108733206B (zh) 2017-04-25 2017-04-25 一种坐标对齐方法、系统和虚拟现实系统
CN201710278094.2 2017-04-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/236,488 Continuation US10802606B2 (en) 2017-04-25 2018-12-29 Method and device for aligning coordinate of controller or headset with coordinate of binocular system

Publications (1)

Publication Number Publication Date
WO2018196216A1 true WO2018196216A1 (zh) 2018-11-01

Family

ID=63918680

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/096120 WO2018196216A1 (zh) 2017-04-25 2017-08-04 一种坐标对齐方法、系统和虚拟现实系统

Country Status (3)

Country Link
US (1) US10802606B2 (zh)
CN (1) CN108733206B (zh)
WO (1) WO2018196216A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10802606B2 (en) 2017-04-25 2020-10-13 Guangdong Virtual Reality Technology Co., Ltd. Method and device for aligning coordinate of controller or headset with coordinate of binocular system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10664993B1 (en) * 2017-03-13 2020-05-26 Occipital, Inc. System for determining a pose of an object
CN109908583B (zh) * 2019-02-25 2022-09-20 成都秘灵互动科技有限公司 基于vr的角色控制方法及装置
US11624757B2 (en) * 2019-03-04 2023-04-11 Meta Platforms, Inc. Modeling poses of tracked objects by predicting sensor data
CN111179679B (zh) * 2019-12-31 2022-01-28 广东虚拟现实科技有限公司 射击训练方法、装置、终端设备及存储介质
CN112113462B (zh) * 2020-04-24 2023-04-07 南京钧和瑞至电子科技有限公司 直瞄武器射击效果检测方法、系统及虚拟靶标射击系统
KR20210157708A (ko) * 2020-06-22 2021-12-29 삼성전자주식회사 밝기 조절 방법 및 hmd 장치
CN113721767B (zh) * 2021-08-30 2024-02-02 歌尔科技有限公司 一种手柄的追踪方法、装置、系统及介质
CN113573203B (zh) * 2021-09-26 2022-03-04 广州朗国电子科技股份有限公司 一种非接触式耳机的控制方法、智能耳机及存储介质
CN114161453B (zh) * 2021-12-30 2024-05-10 上海钛米机器人股份有限公司 基于双手柄的机器人控制方法、装置、系统和电子设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105898275A (zh) * 2016-04-28 2016-08-24 乐视控股(北京)有限公司 虚拟现实图像校准方法及装置
CN106095102A (zh) * 2016-06-16 2016-11-09 深圳市金立通信设备有限公司 一种虚拟现实显示界面处理的方法及终端
CN106249918A (zh) * 2016-08-18 2016-12-21 南京几墨网络科技有限公司 虚拟现实图像显示方法、装置及应用其的终端设备
US20170024935A1 (en) * 2015-03-17 2017-01-26 Colopl, Inc. Computer and computer system for controlling object manipulation in immersive virtual space

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2351636B (en) * 1999-01-20 2003-03-19 Canon Kk Video conferencing apparatus
US20100013765A1 (en) * 2008-07-18 2010-01-21 Wei Gu Methods for controlling computers and devices
US9519968B2 (en) * 2012-12-13 2016-12-13 Hewlett-Packard Development Company, L.P. Calibrating visual sensors using homography operators
TWI617948B (zh) * 2015-07-24 2018-03-11 由田新技股份有限公司 用於眼部追蹤的校正模組及其方法及電腦可讀取紀錄媒體
US10089778B2 (en) * 2015-08-07 2018-10-02 Christie Digital Systems Usa, Inc. System and method for automatic alignment and projection mapping
US10249090B2 (en) * 2016-06-09 2019-04-02 Microsoft Technology Licensing, Llc Robust optical disambiguation and tracking of two or more hand-held controllers with passive optical and inertial tracking
CN108733206B (zh) 2017-04-25 2020-10-30 广东虚拟现实科技有限公司 一种坐标对齐方法、系统和虚拟现实系统

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170024935A1 (en) * 2015-03-17 2017-01-26 Colopl, Inc. Computer and computer system for controlling object manipulation in immersive virtual space
CN105898275A (zh) * 2016-04-28 2016-08-24 乐视控股(北京)有限公司 虚拟现实图像校准方法及装置
CN106095102A (zh) * 2016-06-16 2016-11-09 深圳市金立通信设备有限公司 一种虚拟现实显示界面处理的方法及终端
CN106249918A (zh) * 2016-08-18 2016-12-21 南京几墨网络科技有限公司 虚拟现实图像显示方法、装置及应用其的终端设备

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10802606B2 (en) 2017-04-25 2020-10-13 Guangdong Virtual Reality Technology Co., Ltd. Method and device for aligning coordinate of controller or headset with coordinate of binocular system

Also Published As

Publication number Publication date
US10802606B2 (en) 2020-10-13
CN108733206B (zh) 2020-10-30
US20190138114A1 (en) 2019-05-09
CN108733206A (zh) 2018-11-02

Similar Documents

Publication Publication Date Title
WO2018196216A1 (zh) 一种坐标对齐方法、系统和虚拟现实系统
US10095031B1 (en) Non-overlapped stereo imaging for virtual reality headset tracking
CN106774844B (zh) 一种用于虚拟定位的方法及设备
CN108700947B (zh) 用于并发测距和建图的系统和方法
US11386611B2 (en) Assisted augmented reality
EP3323109B1 (en) Camera pose estimation for mobile devices
US9325969B2 (en) Image capture environment calibration method and information processing apparatus
JP5443134B2 (ja) シースルー・ディスプレイに現実世界の対象物の位置をマークする方法及び装置
US20190033988A1 (en) Controller tracking for multiple degrees of freedom
WO2017126172A1 (ja) 情報処理装置、情報処理方法、及び記録媒体
WO2018090692A1 (zh) 基于空间定位的虚拟现实防晕眩系统及方法
WO2017161660A1 (zh) 增强现实设备、系统、图像处理方法及装置
JP2015513662A (ja) 深度カメラを使用した頭部姿勢トラッキング
CN110782492B (zh) 位姿跟踪方法及装置
EP3413165A1 (en) Wearable system gesture control method and wearable system
CN111899276A (zh) 一种基于双目事件相机的slam方法及系统
US9445015B2 (en) Methods and systems for adjusting sensor viewpoint to a virtual viewpoint
CN111031468A (zh) 一种基于个体化hrtf立体声的视觉辅助方法与设备
US11694409B1 (en) Augmented reality using a split architecture
JP2013120150A (ja) 人間位置検出システム及び人間位置検出方法
KR20120090866A (ko) 모바일 기기를 이용하는 증강현실 환경에서 복수 객체 추적방법 및 이를 이용한 시스템
TWI814624B (zh) 環景影像的地標識別標註系統及其方法
KR102260754B1 (ko) 증강현실용 가이드 도구의 자세 추정을 위한 캘리브레이션 기구 및 이를 이용한 캘리브레이션 방법
CN112166594A (zh) 视频的处理方法和装置
TWI460683B (zh) The way to track the immediate movement of the head

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17908062

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: 1205A 26.03.2020

122 Ep: pct application non-entry in european phase

Ref document number: 17908062

Country of ref document: EP

Kind code of ref document: A1