CN106990836B - Method for measuring spatial position and attitude of head-mounted human input device - Google Patents
- Publication number: CN106990836B (application CN201710103330.7A)
- Authority
- CN
- China
- Legal status: Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
Abstract
The invention discloses a method for measuring the spatial position and attitude of a head-mounted human input device, comprising the following steps: collect the signals of an inertial sensor and of the first and second infrared area-array cameras, and detect whether an infrared area-array camera has detected the infrared light feature structure. When the infrared light feature structure is detected, the system back-calculates the spatial position and attitude parameters of the head-mounted device from the image information and updates the state parameters of the inertial sensor; when it is not detected, the system measures the spatial position and attitude parameters of the head-mounted device from the most recently updated inertial sensor state parameters. Finally, the system transmits the parameters to an upper computer, which carries out the specific human-computer interaction operations.
Description
Technical Field
The invention belongs to the technical field of sensors, and particularly relates to a method for measuring the spatial position and the attitude of a head-mounted human input device.
Background
With the rise of new technologies such as virtual reality, people are no longer satisfied with controlling a computer through a traditional mouse. At present, to perform actions in the virtual world, users still mainly rely on traditional human input devices such as the mouse, keyboard and handheld controller. When immersed in virtual reality, however, the user cannot see the mouse or the keyboard keys, which degrades the user experience.
Currently disclosed head-mounted computer cursor control devices rely mainly on multi-axis inertial navigation to measure the spatial position and attitude parameters of the head-mounted device and thereby calculate the cursor's coordinate position on the screen. Because the output signal of an inertial navigation sensor contains an offset that is difficult to remove completely, the positioning error of a human input device that measures the headset's attitude and spatial parameters by inertial navigation alone accumulates over time, causing the cursor to drift and degrading the user experience.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a method for measuring the spatial position and attitude of a head-mounted human input device, so as to solve the problems in current virtual-reality human-computer interaction.
A method for measuring the spatial position and attitude of a head-mounted human input device comprises the following steps:
step 11, performing edge detection on pixel points (u, v) in an image acquired by the infrared area-array camera under an image coordinate system to obtain gray values G (u, v) corresponding to the pixel points (u, v) in the image after the edge detection:
wherein f (u, v) is a gray value corresponding to the pixel point (u, v) in the image before edge detection;
the image coordinate system takes the top left corner of the image as an origin, the horizontal direction is a u axis, and the vertical direction is a v axis;
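The edge-detection step above can be sketched as follows; since the patent's formulas (1)-(3) are not reproduced in the text, a central-difference gradient-magnitude operator is assumed here:

```python
import numpy as np

def edge_magnitude(f):
    """Gradient-magnitude edge map G(u, v) for a grayscale image f(u, v);
    a central-difference gradient operator is an assumption, since the
    patent's formulas (1)-(3) are not reproduced in the text."""
    f = f.astype(float)
    gu = np.zeros_like(f)  # derivative along the u axis (columns)
    gv = np.zeros_like(f)  # derivative along the v axis (rows)
    gu[:, 1:-1] = (f[:, 2:] - f[:, :-2]) / 2.0
    gv[1:-1, :] = (f[2:, :] - f[:-2, :]) / 2.0
    return np.hypot(gu, gv)
```

Pixels whose G(u, v) exceeds the threshold T of step 12 are then taken as edge points of the bright spots.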
Step 13: calculate the circle centers of the 4 infrared bright spots through formulas (4) and (5), where the center of the ith bright spot is (u_oi, v_oi);
where i = 1, 2, 3, 4; j = 1, 2, …, n; u_Ej is the set of u-axis coordinate values of the bright spot's edge points, and v_Ej the set of their v-axis coordinate values; n is the number of edge points of the bright spot, a natural number with n ≥ 1;
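Formulas (4) and (5) are not reproduced in the text; assuming they average the edge-point coordinates, the centers can be computed as:

```python
def spot_center(u_edge, v_edge):
    """Center (u_oi, v_oi) of one bright spot from its n edge points
    (u_Ej, v_Ej); a plain centroid average is assumed here for the
    patent's formulas (4) and (5)."""
    n = len(u_edge)
    return sum(u_edge) / n, sum(v_edge) / n
```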
The world coordinate system takes the infrared light source at the lower left of the infrared light source array in the infrared light emitting device as origin O_w; the projection of the long side of the base's front view on the horizontal plane is the positive X_w direction; the projection of the short side of the base's left view on the horizontal plane is the positive Y_w direction; the Z_w axis is perpendicular to the horizontal plane and points downward;
The reference coordinate system takes the infrared light source at the upper left of the infrared light source array in the infrared light emitting device as origin O_r; from O_r, the long-axis direction of the infrared light emitting device is the positive X_r direction and the short-axis direction is the positive Y_r direction; the Z_r axis completes the reference coordinate system by the right-hand rule;
The camera coordinate system takes the optical center of the camera as origin O_c; the Z_c axis coincides with the camera's optical axis, perpendicular to the camera's imaging plane, with the shooting direction positive; the X_c axis is parallel to the X axis of the camera's physical imaging plane coordinate system and the Y_c axis is parallel to its Y axis;
The normalized imaging plane containing the normalized imaging plane coordinate system is parallel to the camera's physical imaging plane; the origin O_1c of the normalized imaging plane coordinate system is the intersection of the camera's optical axis with the normalized imaging plane; the X_1c axis is parallel to the X axis of the camera's physical imaging plane coordinate system and the Y_1c axis is parallel to its Y axis;
Step 21: using formula (6), transform the bright-spot centers (u_oi, v_oi) obtained in step 1 onto the normalized imaging plane;
In formula (6), x_1ci and y_1ci are the projections of the center of the ith bright spot on the normalized imaging plane, i = 1, 2, 3, 4; M_in is the camera intrinsic parameter matrix;
where k_u is the magnification factor along the u axis, k_v the magnification factor along the v axis, and (u_0, v_0) are the image coordinates of the intersection of the camera's optical axis with the imaging plane;
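Formula (6) can be sketched as the inverse intrinsic mapping, using the 4-parameter matrix M_in defined by k_u, k_v, u_0, v_0 (the exact form of (6) is an assumption, since the equation itself is not reproduced in the text):

```python
import numpy as np

def to_normalized_plane(u_oi, v_oi, ku, kv, u0, v0):
    """Project an image-coordinate center (u_oi, v_oi) onto the
    normalized imaging plane via the inverse of the intrinsic matrix
    M_in = [[ku, 0, u0], [0, kv, v0], [0, 0, 1]] (assumed form)."""
    M_in = np.array([[ku, 0.0, u0],
                     [0.0, kv, v0],
                     [0.0, 0.0, 1.0]])
    x1c, y1c, _ = np.linalg.inv(M_in) @ np.array([u_oi, v_oi, 1.0])
    return x1c, y1c
```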
Step 22: transform the coordinates of the infrared bright spots in the reference coordinate system to the camera coordinate system through formula (7):
where (x_ci, y_ci, z_ci) are the coordinates of the center of the ith bright spot in the camera coordinate system and (x_ri, y_ri, z_ri) its coordinates in the reference coordinate system; ^cM_r is the pose matrix of the camera coordinate system relative to the reference coordinate system:
where ^cR_r is the rotation matrix of the camera coordinate system relative to the reference coordinate system and ^cP_r the translation vector of the camera coordinate system relative to the reference coordinate system;
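Formula (7)'s change of frame can be sketched as the usual homogeneous transform built from ^cR_r and ^cP_r:

```python
import numpy as np

def reference_to_camera(c_R_r, c_P_r, p_r):
    """Map a reference-frame point (x_ri, y_ri, z_ri) into the camera
    frame with the pose matrix ^cM_r = [[^cR_r, ^cP_r], [0, 1]]."""
    c_M_r = np.eye(4)
    c_M_r[:3, :3] = c_R_r  # rotation block
    c_M_r[:3, 3] = c_P_r   # translation block
    x_ci, y_ci, z_ci, _ = c_M_r @ np.array([*p_r, 1.0])
    return x_ci, y_ci, z_ci
```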
The method further includes:
Step 31: from the three-axis accelerometer mounted on the infrared light emitting device, acquire the components of the gravitational acceleration g in the reference coordinate system: g_xr on the X_r axis, g_yr on the Y_r axis, and g_zr on the Z_r axis;
Step 32: using g_xr, g_yr and g_zr, obtain the pose matrix ^rM_w of the reference coordinate system relative to the world coordinate system through formulas (14) and (15);
In formula (15), the yaw angle ψ is taken as 0;
The yaw angle ψ is the rotation of the reference coordinate system about the Z_r axis: the angle between the projection of Y_r on the horizontal plane and the world coordinate system's Y_w axis, positive clockwise;
The pitch angle θ is the rotation of the reference coordinate system about its horizontal axis X_r: the angle between its longitudinal axis Y_r and the world coordinate system's Y_w axis, positive upward;
The roll angle φ is the rotation of the reference coordinate system about its longitudinal axis Y_r: the angle between the vertical axis Z_r and the plumb plane, positive when tilted to the right;
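With yaw fixed to 0, the pitch and roll of the base can be recovered from the measured gravity components; since formulas (14) and (15) are not reproduced in the text, the sign conventions below are assumptions:

```python
import math

def base_attitude_from_gravity(gxr, gyr, gzr):
    """Estimate (psi, theta, phi) of the reference frame from the
    gravity components measured by the base's three-axis accelerometer;
    psi = 0 as in formula (15). The sign conventions are assumptions."""
    g = math.sqrt(gxr * gxr + gyr * gyr + gzr * gzr)
    theta = math.asin(max(-1.0, min(1.0, gyr / g)))  # pitch about X_r
    phi = math.atan2(-gxr, gzr)                      # roll about Y_r
    return 0.0, theta, phi
```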
Step 33: the pose matrix of the camera coordinate system relative to the world coordinate system is ^cM_w = ^cM_r · ^rM_w;
The method further includes:
Step 41: obtain through formula (16) the pose matrix ^dM_dr_V of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system:
In formula (16), ^dM_c_V is the pose matrix from the camera coordinate system to the head-mounted device coordinate system,
^wM_dr_V is the pose matrix of the world coordinate system relative to the head-mounted device reference coordinate system;
^wM_dr_V = ^wM_d_V · ^dM_dr_V, where ^dM_dr_V is the pose matrix of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system, and ^wM_d_V is the pose matrix of the world coordinate system relative to the head-mounted device coordinate system, ^wM_d_V = (^dM_c_V · ^cM_w_V)^(-1);
The head-mounted device coordinate system takes the geometric center of the six-axis inertial navigation sensor's mounting position as origin O_d; its vertical axis Z_d is parallel to the camera X_c axis, its X_d axis is parallel to the camera coordinate system's Z_c axis, and its Y_d axis is parallel to the camera coordinate system's Y_c axis;
The head-mounted device reference coordinate system takes the geometric center of the six-axis inertial navigation sensor's mounting position as origin O_dr; its vertical axis Z_dr is perpendicular to the horizontal plane in the positive direction; the X_dr axis is the projection on the horizontal plane of the head-mounted device's X_d axis at the initial instant of operation, and the Y_dr axis the projection on the horizontal plane of the Y_d axis at the initial instant of operation;
Step 42: let the pose matrix of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system obtained by the six-axis inertial navigation sensor be ^dM_dr_I;
where ψ, θ, φ are the Euler angles obtained from the camera image, and ψ′, θ′, φ′ are the Euler angles obtained from the six-axis inertial navigation sensor;
Step 43: let Angles_I = [ψ′, θ′, φ′]^T and Angles_V = [ψ, θ, φ]^T; the deviation Euler angle vector Angles_E is then obtained by formula (17);
Step 44: correct using the deviation Euler angle vector Angles_E through formula (18):
In formula (18), α, β and γ are weighted correction coefficients; Angles_E_integral is the error proportional-integral term of Angles_E, with initial value zero;
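Formula (18)'s weighted correction can be sketched as a proportional-integral blend; the exact roles assigned to α, β and γ below are assumptions:

```python
class EulerCorrector:
    """PI-style correction of the INS Euler angles using the deviation
    Angles_E; a sketch of formula (18) in which alpha weights the
    proportional term and gamma the integral term (an assumption)."""
    def __init__(self, alpha, beta, gamma):
        self.alpha, self.beta, self.gamma = alpha, beta, gamma
        self.integral = [0.0, 0.0, 0.0]  # Angles_E_integral, initial value zero

    def correct(self, angles_ins, angles_e):
        out = []
        for k in range(3):
            self.integral[k] += self.beta * angles_e[k]  # accumulate error
            out.append(angles_ins[k]
                       - self.alpha * angles_e[k]
                       - self.gamma * self.integral[k])
        return out
```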
Step 45: for the translation vectors ^dP_dr_I and ^dP_dr_V, first obtain the translation deviation vector P_E:
P_E = ^dP_dr_I − ^dP_dr_V (19)
then make the correction using P_E.
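The translation correction of step 45 can be sketched in the same way; the gain k applied to P_E is an assumption:

```python
def correct_translation(p_ins, p_vision, k=1.0):
    """Correct the INS translation vector ^dP_dr_I with the deviation
    P_E = ^dP_dr_I - ^dP_dr_V of formula (19); the gain k is an
    assumption (k = 1 snaps fully to the camera estimate)."""
    p_e = [i - v for i, v in zip(p_ins, p_vision)]  # formula (19)
    return [i - k * e for i, e in zip(p_ins, p_e)]
```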
Finally, by the above method, all 6 parameters describing the head-mounted human input device are obtained: ψ, θ, φ and dp_cx, dp_cy, dp_cz.
(1) The invention combines the image information with the inertial sensor information, avoiding the accumulated integration error of inertial-sensor-only measurement;
(2) The position and attitude of the head-mounted device are calculated from the position information of the four infrared bright spots, which avoids visual interference to the user;
(3) Because the device carries an inertial sensor, it can still measure the device's spatial position and attitude parameters when image information cannot be acquired.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is an external view of the head-mounted human input device;
FIG. 3 is a schematic view of an infrared light spot emitting device according to the present invention;
FIG. 4 is a schematic coordinate diagram of a camera coordinate system, a normalized imaging plane coordinate system, and a physical imaging plane;
FIG. 5 is a flowchart of a method for fusion processing of camera image processing and a six-axis inertial navigation sensor;
FIG. 6 is an image taken with an infrared area-array camera;
FIG. 7 is the outline and circle centers of the highlighted infrared bright spots;
FIG. 8 is a graph of the dynamic variation of yaw, pitch and roll angles;
the invention is explained in more detail below with reference to the figures and examples.
Detailed Description
The invention is further illustrated below with reference to specific figures.
Example 1
Referring to figs. 1, 2 and 3, the present invention is implemented by the following technical solution, an apparatus for measuring the spatial position and attitude of a head-mounted human input device, comprising:
the infrared light emitting device includes:
the device comprises a three-axis accelerometer (1), an MCU (micro control unit) 1(2), a wireless transmission module 1(3) and an infrared diode (9);
the head-mounted device includes:
the device comprises an infrared area array camera (4), an MCU (micro control unit) 2(5), a six-axis inertial navigation sensor (6), a wireless transmission module 2(7) and an upper computer (8).
Wherein:
the three-axis accelerometer (1) is arranged in the infrared light emission base and is used for detecting the pose relation between the base and a world coordinate system;
the MCU 1(2) is used for calculating a pose matrix of the infrared light emission base and a world coordinate system according to data measured by the triaxial accelerometer (1);
the wireless transmission module 1(3) is used for realizing the communication between the infrared light emitting base and the head-mounted equipment;
the infrared area-array camera (4) is used for detecting the position of the infrared light spot;
the MCU2(5) is used for calculating the space attitude and position parameters of the head-mounted equipment;
the six-axis inertial navigation sensor (6) is used for acquiring the rotation angular rate and the acceleration of the head-mounted human input device in the space;
the wireless transmission module 2(7) is used for realizing the communication between the head-mounted equipment and the upper computer;
the upper computer (8) is used for providing a software interface for other software in the operating system of the computing system to acquire the head space posture and position parameters of the user;
the infrared light diode (9) is used for providing an infrared light spot for positioning the device.
Specifically, 4 infrared light diodes are arranged on the infrared light emitting base, and the diodes are coplanar;
specifically, the infrared area-array camera (4) is arranged in the center of the front end of the head-mounted equipment;
specifically, the three-axis accelerometer (1) and the six-axis inertial navigation sensor (6) are respectively strapdown connected with the infrared light emitting base and the head-mounted equipment;
specifically, the six-axis inertial navigation sensor (6) comprises a three-axis accelerometer and a three-axis gyroscope.
Example 2
step 11, performing edge detection on pixel points (u, v) in an image acquired by the infrared area-array camera under an image coordinate system to obtain gray values G (u, v) corresponding to the pixel points (u, v) in the image after the edge detection:
wherein f (u, v) is a gray value corresponding to the pixel point (u, v) in the image before edge detection;
the image coordinate system takes the top left corner of the image as an origin, the horizontal direction is a u axis, and the vertical direction is a v axis;
in this embodiment, the Otsu method is used to calculate the dynamic segmentation threshold T;
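The Otsu dynamic threshold T mentioned here is the gray level that maximizes the between-class variance of the image histogram; a minimal sketch:

```python
import numpy as np

def otsu_threshold(image):
    """Dynamic segmentation threshold T by the Otsu method: choose the
    gray level maximizing the between-class variance of the histogram."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()  # class probabilities
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0  # class means
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2       # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t
```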
Step 13: calculate the circle centers of the 4 infrared bright spots through formulas (4) and (5), where the center of the ith bright spot is (u_oi, v_oi);
where j = 1, 2, …, n; u_Ej is the set of u-axis coordinate values of the bright spot's edge points, and v_Ej the set of their v-axis coordinate values; n is the number of edge points of the bright spot, a natural number with n ≥ 1;
fig. 6 is an image captured by an infrared area-array camera, and fig. 7 is an outline and a circle center of an infrared bright spot subjected to highlight identification, where the calculated coordinates of the circle centers of 4 infrared bright spots in an image coordinate system are:
| Spot | 1 | 2 | 3 | 4 |
| Coordinates | (500,362) | (664,378) | (482,582) | (644,598) |
The world coordinate system takes the infrared light source at the lower left of the infrared light source array in the infrared light emitting device as origin O_w; the projection of the long side of the base's front view on the horizontal plane is the positive X_w direction; the projection of the short side of the base's left view on the horizontal plane is the positive Y_w direction; the Z_w axis is perpendicular to the horizontal plane and points downward;
The reference coordinate system takes the infrared light source at the upper left of the infrared light source array in the infrared light emitting device as origin O_r; from O_r, the long-axis direction of the infrared light emitting device is the positive X_r direction and the short-axis direction is the positive Y_r direction; the Z_r axis completes the reference coordinate system by the right-hand rule;
The camera coordinate system takes the optical center of the camera as origin O_c; the Z_c axis coincides with the camera's optical axis, perpendicular to the camera's imaging plane, with the shooting direction positive; the X_c axis is parallel to the X axis of the camera's physical imaging plane coordinate system and the Y_c axis is parallel to its Y axis;
The normalized imaging plane containing the normalized imaging plane coordinate system is parallel to the camera's physical imaging plane; the origin O_1c of the normalized imaging plane coordinate system is the intersection of the camera's optical axis with the normalized imaging plane; the X_1c axis is parallel to the X axis of the camera's physical imaging plane coordinate system and the Y_1c axis is parallel to its Y axis;
Step 21: using formula (6), transform the bright-spot centers (u_oi, v_oi) obtained in step 1 onto the normalized imaging plane;
In formula (6), x_1ci and y_1ci are the projections of the bright-spot centers on the normalized imaging plane, and M_in is the camera intrinsic parameter matrix; this embodiment adopts a 4-parameter intrinsic camera model.
where k_u is the magnification factor along the u axis, k_v the magnification factor along the v axis, and (u_0, v_0) are the image coordinates of the intersection of the camera's optical axis with the imaging plane;
step 22, transforming the center coordinates of the infrared light bright spots in the reference coordinate system to the camera coordinate system by the formula (8):
where ^cM_r is the pose matrix of the camera coordinate system relative to the reference coordinate system:
where ^cR_r is the rotation matrix of the camera coordinate system relative to the reference coordinate system and ^cP_r the translation vector of the camera coordinate system relative to the reference coordinate system;
Writing formulas (6) and (7) as systems of equations yields, respectively,
Substituting formula (10) into formula (9) yields
Each point gives a set of equations of the form (11). Combining the equations corresponding to all the feature points and writing them in matrix form gives
A1H1+A2H2=0 (12)
where
H_1 = [n_x n_y n_z]^T
H_2 = [o_x o_y o_z p_x p_y p_z]^T
The parameters in the matrices A_1 and A_2 are all measured or known; H_1 and H_2 contain the unknowns to be solved. Once the vector H formed by H_1 and H_2 is solved, columns 1, 2 and 4 of the transformation matrix ^cM_r of the camera coordinate system relative to the reference coordinate system are known, and column 3 is obtained as the cross product of columns 1 and 2.
In this embodiment, the coordinates (x_ri, y_ri, z_ri) of the centers of the four infrared bright spots in the reference coordinate system are, counterclockwise starting from the bright spot in the upper left corner: (0,0,0), (124,0,0), (0,68,0), (124,68,0), in mm.
H_1 and H_2 are given by the following equation:
in the formula (13), the reaction mixture is,λ is the minimum eigenvalue of matrix B.
In this embodiment, the calculated camera intrinsic parameter matrix is:
Calculation then proceeds from the coordinates of the infrared bright spots in the image coordinate system obtained by the preceding measurement.
Example 3
Since the infrared light emitting device may not be perpendicular to the horizontal plane, to avoid the resulting error this embodiment constructs a world coordinate system and calculates the pose matrix of the camera coordinate system relative to the world coordinate system. On the basis of Embodiment 2, this embodiment further includes:
Step 31: from the three-axis accelerometer mounted on the infrared light emitting device, acquire the components of the gravitational acceleration g in the reference coordinate system: g_xr on the X_r axis, g_yr on the Y_r axis, and g_zr on the Z_r axis;
Step 32: since under gravity the direction of g always coincides with the world coordinate system's Z_w axis, when the reference coordinate system is rotated relative to the world coordinate system, g_xr, g_yr and g_zr can be used to obtain the pose matrix ^rM_w of the reference coordinate system relative to the world coordinate system through formulas (14) and (15);
at this time (g_xr, g_yr, g_zr)^T is the third column of the matrix ^rR_w, i.e.
In formula (15), the yaw angle ψ of the reference coordinate system relative to the world coordinate system is taken as 0;
the euler angles are defined as follows:
The yaw angle ψ is the rotation of the reference coordinate system about the Z_r axis: the angle between the projection of Y_r on the horizontal plane and the world coordinate system's Y_w axis, positive clockwise;
The pitch angle θ is the rotation of the reference coordinate system about its horizontal axis X_r: the angle between its longitudinal axis Y_r and the world coordinate system's Y_w axis, positive upward;
The roll angle φ is the rotation of the reference coordinate system about its longitudinal axis Y_r: the angle between the vertical axis Z_r and the plumb plane, positive when tilted to the right;
Step 33: since the reference coordinate system coincides with the origin of the world coordinate system, the translation vector P in the pose matrix is 0, which yields ^rM_w; further, the pose matrix of the camera coordinate system relative to the world coordinate system is obtained as ^cM_w = ^cM_r · ^rM_w;
Example 4
The present embodiment further includes, on the basis of embodiment 2 or embodiment 3:
Step 41: obtain through formula (16) the pose matrix ^dM_dr_V of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system:
In formula (16), ^dM_c_V is the pose matrix from the camera coordinate system to the head-mounted device coordinate system,
^wM_dr_V is the pose matrix of the world coordinate system relative to the head-mounted device reference coordinate system;
^wM_dr_V = ^wM_d_V · ^dM_dr_V, where ^dM_dr_V is the pose matrix of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system, and ^wM_d_V is the pose matrix of the world coordinate system relative to the head-mounted device coordinate system, ^wM_d_V = (^dM_c_V · ^cM_w_V)^(-1);
The head-mounted device coordinate system takes the geometric center of the six-axis inertial navigation sensor's mounting position as origin O_d; its vertical axis Z_d is parallel to the camera X_c axis, its X_d axis is parallel to the camera coordinate system's Z_c axis, and its Y_d axis is parallel to the camera coordinate system's Y_c axis;
The head-mounted device reference coordinate system takes the geometric center of the six-axis inertial navigation sensor's mounting position as origin O_dr; its vertical axis Z_dr is perpendicular to the horizontal plane in the positive direction; the X_dr axis is the projection on the horizontal plane of the head-mounted device's X_d axis at the initial instant of operation, and the Y_dr axis the projection on the horizontal plane of the Y_d axis at the initial instant of operation;
Step 42: initialize the infrared area-array camera and the six-axis inertial navigation sensor: set the bus transmission rate for communication with the infrared area-array camera, and the strapdown inertial navigation sensor's parameters such as data update frequency and the measurement ranges of acceleration and angular rate.
The system collects data of six-axis inertial navigation sensors; the data includes angular velocities of the headset about three axes of the headset coordinate system, and accelerations in the directions indicated by the three axes.
After the system collects the six-axis inertial navigation sensor signals, it checks whether the camera pose matrix ^cM_w has been updated. If it has, the pose matrix given by the camera image and the pose matrix given by the six-axis inertial navigation sensor are combined to calculate the pose matrix of the head-mounted device relative to the world coordinate system; otherwise, the pose matrix of the head-mounted device is estimated from the six-axis inertial navigation sensor signals alone.
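When no camera update is available, the INS-only branch must dead-reckon the pose; a first-order sketch of gyro integration for the rotation part (the integration scheme below is an assumption, not the patent's exact filter):

```python
import numpy as np

def propagate_rotation(R, omega, dt):
    """One dead-reckoning step of the headset rotation matrix R from
    the gyro rates omega = (wx, wy, wz) in rad/s:
    R <- R (I + [omega dt]_x), then SVD re-orthonormalization so the
    result stays a valid rotation matrix."""
    wx, wy, wz = (w * dt for w in omega)
    dR = np.array([[1.0, -wz,  wy],
                   [ wz, 1.0, -wx],
                   [-wy,  wx, 1.0]])  # I + skew(omega * dt)
    u, _, vt = np.linalg.svd(R @ dR)
    return u @ vt
```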
Calculating a pose matrix of the head-mounted device relative to a world coordinate system in combination with a pose matrix given by the camera image and a pose matrix given by the six-axis inertial navigation sensor, comprising:
the position matrix of the coordinate system of the head-mounted equipment relative to the reference coordinate system of the head-mounted equipment obtained by the six-axis inertial navigation sensor is set as
Wherein the values of ψ, θ,is the euler angle obtained from the camera image; psi ', theta',is an Euler angle obtained by a six-axis inertial navigation sensor;
step 43, setThen the deviation Euler angle quantity Angles is obtained by the equation (17)E;
step 44, correcting the deviation Euler angle vector Angles by the formula (18)E:
In the formula (18), α, β and γ are weighted correction coefficients, AnglesE _ integralIs an AnglesEThe error proportional integral term of (3), the initial value of which is zero;
As shown in fig. 8, the curves show the change over time of the Euler angles calculated by the above method; the black curve is the Euler angle corrected by the above method.
Step 45: for the translation vectors ^dP_dr_I and ^dP_dr_V, first obtain the translation deviation vector P_E:
P_E = ^dP_dr_I − ^dP_dr_V (19)
then make the correction using P_E.
Claims (3)
1. A method for measuring the spatial position and attitude of a head-mounted human input device, characterized by comprising the following steps:
Step 1: locate four infrared bright spots emitted by an infrared light emitting device through an infrared area-array camera mounted on the head-mounted human input device; the infrared light emitting device comprises an array of 4 infrared light sources;
step 11, performing edge detection on pixel points (u, v) in an image acquired by the infrared area-array camera under an image coordinate system to obtain gray values G (u, v) corresponding to the pixel points (u, v) in the image after the edge detection:
wherein f (u, v) is a gray value corresponding to the pixel point (u, v) in the image before edge detection;
the image coordinate system takes the top left corner of the image as an origin, the horizontal direction is a u axis, and the vertical direction is a v axis;
step 12, setting a threshold value T, and if G (u, v) > T, setting the pixel point (u, v) as an edge point of the infrared light bright point;
Step 13: calculate the circle centers of the 4 infrared bright spots through formulas (4) and (5), where the center of the ith bright spot is (u_oi, v_oi);
where i = 1, 2, 3, 4; j = 1, 2, …, n; u_Ej is the set of u-axis coordinate values of the bright spot's edge points, and v_Ej the set of their v-axis coordinate values; n is the number of edge points of the bright spot, a natural number with n ≥ 1;
Step 2: determine the coordinates (x_r, y_r, z_r) of the infrared bright spots in the reference coordinate system; combining these with the bright-spot centers (u_oi, v_oi) in the image coordinate system, calculate the pose matrix of the head-mounted device coordinate system relative to the world coordinate system to obtain the spatial position and attitude parameters of the head-mounted device, comprising:
the world coordinate system takes an infrared light source at the left lower part of an infrared light source array in the infrared light emitting device as an origin OwThe projection of the long side of the front view of the base on the horizontal plane is XwIn the positive axial direction, the projection of the short edge of the left view of the base on the horizontal plane is YwPositive direction of axis, ZwThe axis is vertical to the horizontal plane and downward;
the reference coordinate system takes the infrared light source at the upper left of the infrared light source array in the infrared light emitting device as its origin O_r; from the origin O_r, the long axis of the infrared light emitting device gives the positive X_r axis direction and the short axis gives the positive Y_r axis direction, and the Z_r axis of the reference coordinate system follows from the right-hand rule;
the camera coordinate system takes the optical center of the camera as its origin O_c; the Z_c axis coincides with the optical axis of the camera and is perpendicular to the camera's imaging plane, with the shooting direction of the camera as the positive direction; the X_c axis is parallel to the X axis of the camera's physical imaging plane coordinate system, and the Y_c axis is parallel to its Y axis;
the normalized imaging plane containing the normalized imaging plane coordinate system is parallel to the camera's physical imaging plane; the origin O_1c of the normalized imaging plane coordinate system is the intersection of the camera's optical axis with the normalized imaging plane, the X_1c axis is parallel to the X axis of the camera's physical imaging plane coordinate system, and the Y_1c axis is parallel to its Y axis;
step 21, using formula (6), transforming the circle centers (u_oi, v_oi) of the infrared bright spots obtained in step 1 onto the normalized imaging plane;
in formula (6), x_1ci and y_1ci are the projections of the circle center of the ith infrared light spot on the normalized imaging plane, where i = 1, 2, 3, 4, and the matrix in formula (6) is the camera intrinsic parameter matrix;
where k_u is the magnification factor along the u-axis direction, k_v is the magnification factor along the v-axis direction, and (u_0, v_0) are the coordinates, in the image coordinate system, of the intersection of the camera's optical axis with the imaging plane;
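Assuming the standard pinhole intrinsic matrix built from k_u, k_v, u_0, v_0, step 21's mapping of a pixel-space circle center onto the normalized imaging plane can be sketched as the inverse of that matrix applied to the homogeneous pixel coordinate (formula (6) itself is not reproduced in the text):

```python
import numpy as np

def to_normalized_plane(u, v, ku, kv, u0, v0):
    """Step 21 (sketch): map a pixel-space circle center (u_oi, v_oi)
    onto the normalized imaging plane by inverting the assumed pinhole
    intrinsic matrix built from k_u, k_v, u_0, v_0."""
    K = np.array([[ku, 0.0, u0],
                  [0.0, kv, v0],
                  [0.0, 0.0, 1.0]])              # intrinsic parameter matrix
    x1c, y1c, _ = np.linalg.solve(K, np.array([u, v, 1.0]))
    return x1c, y1c                               # (x_1ci, y_1ci)
```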
step 22, transforming the coordinates of the infrared light bright spot in the reference coordinate system to the coordinates of the camera coordinate system by the formula (7):
where x_ci, y_ci, z_ci are the coordinates of the circle center of the ith infrared light spot in the camera coordinate system; x_ri, y_ri, z_ri are the coordinates of the circle center of the ith infrared light spot in the reference coordinate system; and cMr is the pose matrix of the camera coordinate system relative to the reference coordinate system:
where cRr is the rotation matrix of the camera coordinate system relative to the reference coordinate system, and cPr is the translation matrix of the camera coordinate system relative to the reference coordinate system;
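Formula (7) is a standard homogeneous-coordinate transform. A sketch, assuming cMr is assembled from the rotation cRr and translation cPr in the usual 4x4 form (function name illustrative):

```python
import numpy as np

def ref_to_camera(cRr, cPr, p_r):
    """Step 22 / formula (7) (sketch): transform a circle center from
    the reference frame to the camera frame using the homogeneous pose
    matrix cMr assembled from rotation cRr and translation cPr."""
    cMr = np.eye(4)
    cMr[:3, :3] = cRr                    # rotation part
    cMr[:3, 3] = np.ravel(cPr)           # translation part
    p = cMr @ np.append(np.asarray(p_r, float), 1.0)
    return p[:3]                         # (x_ci, y_ci, z_ci)
```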
2. The head-mounted human input device spatial position and attitude measurement method of claim 1, further comprising:
step 31, acquiring, from a three-axis accelerometer mounted on the infrared light emitting device, the components of the gravitational acceleration g in the reference coordinate system: g_xr on the X_r axis, g_yr on the Y_r axis, and g_zr on the Z_r axis;
step 32, using g_xr, g_yr, g_zr, obtaining the pose matrix rMw of the reference coordinate system relative to the world coordinate system through formulas (14) and (15);
in formula (14), the pose matrix is parameterized by the Euler angles defined below; in formula (15), the yaw angle ψ is 0;
the yaw angle ψ is the angle of rotation of the reference coordinate system about the Z_r axis, i.e., the angle between the projection of its Y_r axis on the horizontal plane and the Y_w axis of the world coordinate system, positive clockwise;
the pitch angle θ is the angle, as the reference coordinate system rotates about its horizontal axis X_r, between its longitudinal axis Y_r and the Y_w axis of the world coordinate system;
the roll angle φ is the angle, as the reference coordinate system rotates about its longitudinal axis Y_r, between its vertical axis Z_r and the plumb plane, positive when the vertical axis tilts to the right;
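Steps 31-32 can be sketched as below. Formulas (14) and (15) are not reproduced in the text, so a common accelerometer-tilt convention is assumed (pitch and roll recovered from the gravity components, yaw ψ fixed at 0 because gravity alone carries no heading information); the patent's exact sign conventions may differ.

```python
import numpy as np

def ref_to_world_rotation(gxr, gyr, gzr):
    """Steps 31-32 (sketch): rotation part of rMw from the measured
    gravity components.  Convention assumed: pitch theta and roll phi
    from gravity, yaw psi = 0 per formula (15)."""
    theta = np.arctan2(-gxr, np.hypot(gyr, gzr))   # pitch (assumed convention)
    phi = np.arctan2(gyr, gzr)                     # roll (assumed convention)
    psi = 0.0                                      # yaw, formula (15)
    cp, sp = np.cos(psi), np.sin(psi)
    ct, st = np.cos(theta), np.sin(theta)
    cf, sf = np.cos(phi), np.sin(phi)
    Rz = np.array([[cp, -sp, 0.0], [sp, cp, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[ct, 0.0, st], [0.0, 1.0, 0.0], [-st, 0.0, ct]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cf, -sf], [0.0, sf, cf]])
    return Rz @ Ry @ Rx                            # Z-Y-X Euler composition
```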
step 33, the pose matrix of the camera coordinate system relative to the world coordinate system is cMw = cMr · rMw;
3. The head-mounted human input device spatial position and attitude measurement method of claim 1 or 2, further comprising:
step 41, obtaining, through formula (16), the pose matrix dMdr_V of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system:
in formula (16), dMc_V is the pose matrix from the camera coordinate system to the head-mounted device coordinate system,
and wMdr_V is the pose matrix of the world coordinate system relative to the head-mounted device reference coordinate system;
wMdr_V = wMd_V · dMdr_V, where dMdr_V is the pose matrix of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system as measured by the six-axis inertial navigation sensor, and wMd_V is the pose matrix of the world coordinate system relative to the head-mounted device, wMd_V = (dMc_V · cMw_V)^(-1);
the head-mounted device coordinate system takes the geometric center of the mounting position of the six-axis inertial navigation sensor as its origin O_d; its vertical axis Z_d is parallel to the camera's X_c axis, its X_d axis is parallel to the Z_c axis of the camera coordinate system, and its Y_d axis is parallel to the Y_c axis of the camera coordinate system;
the head-mounted device reference coordinate system takes the geometric center of the mounting position of the six-axis inertial navigation sensor as its origin O_dr; the positive direction of its vertical axis Z_dr is perpendicular to the horizontal plane, the X_dr axis is the projection on the horizontal plane of the head-mounted device's X_d axis at the initial instant of device operation, and the Y_dr axis is the projection on the horizontal plane of the head-mounted device's Y_d axis at the initial instant of device operation;
step 42, letting the pose matrix of the head-mounted device coordinate system relative to the head-mounted device reference coordinate system, obtained by the six-axis inertial navigation sensor, be dMdr_I,
where ψ is the yaw angle obtained from the camera image, θ is the pitch angle obtained from the camera image, and φ is the roll angle obtained from the camera image; ψ′ is the yaw angle obtained from the six-axis inertial navigation sensor, θ′ is the pitch angle obtained from the six-axis inertial navigation sensor, and φ′ is the roll angle obtained from the six-axis inertial navigation sensor;
step 43, from these two sets of Euler angles, obtaining the deviation Euler angle vector Angles_E through formula (17);
in formula (17), × is the cross product operator;
step 44, correcting the deviation Euler angle vector Angles_E through formula (18):
in formula (18), α, β and γ are weighted correction coefficients; Angles_E_integral is the error proportional-integral term of Angles_E, with an initial value of zero; i denotes any time instant, and T_i denotes the temperature correction coefficient at the ith instant;
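Steps 43-44 can be sketched as a complementary-filter-style correction. Formulas (17) and (18) are not reproduced in the text, so the deviation is taken here as a plain difference of the two Euler angle vectors (the patent's formula (17) uses a cross-product form), and the weighting by α, β, γ and T_i is an assumption:

```python
import numpy as np

def correct_euler(angles_V, angles_I, integral, alpha, beta, gamma, Ti):
    """Steps 43-44 (sketch): fuse vision-derived (angles_V) and
    inertia-derived (angles_I) Euler angles.  The deviation Angles_E,
    the PI term (initial value zero), and the weighting scheme are
    assumptions standing in for the unreproduced formulas (17)-(18)."""
    angles_E = np.asarray(angles_V, float) - np.asarray(angles_I, float)
    integral = integral + beta * angles_E          # error PI term
    corrected = np.asarray(angles_I, float) + Ti * (alpha * angles_E + gamma * integral)
    return corrected, integral
```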
step 45, for the translation vectors dPdr_I and dPdr_V, first obtaining the translation deviation vector P_E:
P_E = dPdr_I − dPdr_V (19)
and then making a correction using P_E:
in formula (20), i denotes any time instant and T_i denotes the temperature correction coefficient at the ith instant; finally, by the above method, all 6 parameters describing the head-mounted human input device can be derived: ψ, θ, φ, dp_cx, dp_cy, dp_cz.
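Step 45 admits a similar sketch. Formulas (19) and (20) are not reproduced in the text, so the feedback gain k and the use of T_i below are assumptions:

```python
import numpy as np

def correct_translation(P_I, P_V, Ti, k=0.5):
    """Step 45 (sketch): translation deviation P_E = dPdr_I - dPdr_V
    (formula (19)), fed back scaled by T_i; the gain k is an assumed
    stand-in for the unreproduced formula (20)."""
    P_E = np.asarray(P_I, float) - np.asarray(P_V, float)   # formula (19)
    return np.asarray(P_I, float) - Ti * k * P_E            # corrected translation
```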
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710103330.7A CN106990836B (en) | 2017-02-24 | 2017-02-24 | Method for measuring spatial position and attitude of head-mounted human input device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106990836A CN106990836A (en) | 2017-07-28 |
CN106990836B true CN106990836B (en) | 2020-01-07 |
Family
ID=59412547
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710103330.7A Expired - Fee Related CN106990836B (en) | 2017-02-24 | 2017-02-24 | Method for measuring spatial position and attitude of head-mounted human input device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106990836B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107314778B (en) * | 2017-08-04 | 2023-02-10 | 广东工业大学 | Calibration method, device and system for relative attitude |
CN111148970A (en) | 2017-09-13 | 2020-05-12 | 聂小春 | System and method for calibrating imaging and spatial orientation sensors |
EP3460394B1 (en) * | 2017-09-26 | 2020-06-03 | Hexagon Technology Center GmbH | Surveying instrument, augmented reality (ar)-system and method for referencing an ar-device relative to a reference system |
CN108957505A (en) * | 2018-06-27 | 2018-12-07 | 四川斐讯信息技术有限公司 | A kind of localization method, positioning system and portable intelligent wearable device |
CN109785381B (en) * | 2018-12-06 | 2021-11-16 | 苏州炫感信息科技有限公司 | Optical inertia fusion space positioning method and positioning system |
CN114237537B (en) * | 2021-12-10 | 2023-08-04 | 杭州海康威视数字技术股份有限公司 | Head-mounted equipment, remote assistance method and system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103914145A (en) * | 2013-01-08 | 2014-07-09 | 三星电子株式会社 | Input Device, Display Device And Method Of Controlling Thereof |
CN104346813A (en) * | 2014-10-28 | 2015-02-11 | 南京理工大学 | Method for calibrating camera parameters in flame emission tomography system |
CN106295512A (en) * | 2016-07-27 | 2017-01-04 | 哈尔滨工业大学 | Many correction line indoor vision data base construction method based on mark and indoor orientation method |
Non-Patent Citations (2)
Title |
---|
3D scene reconstruction by multiple structured-light based commodity depth cameras; Jianfeng Wang, et al.; 2012 IEEE International Conference on Acoustics, Speech and Signal Processing; 20120831; pp. 5329-5432 *
Development of a vehicle attitude measurement system based on MEMS devices (基于MEMS器件的车辆姿态测量系统开发); Wang Jianfeng, et al.; China Sciencepaper (中国科技论文); 20151031; Vol. 10, No. 19; pp. 2281-2286 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106990836B (en) | Method for measuring spatial position and attitude of head-mounted human input device | |
JP4593968B2 (en) | Position and orientation measurement method and apparatus | |
US10353482B2 (en) | Systems and methods for tracking motion and gesture of heads and eyes | |
JP5388932B2 (en) | Information processing apparatus and control method thereof | |
KR100947405B1 (en) | Implement for Optically Inferring Information from a Planar Jotting Surface | |
CN111415387B (en) | Camera pose determining method and device, electronic equipment and storage medium | |
US8593402B2 (en) | Spatial-input-based cursor projection systems and methods | |
TWI512548B (en) | Moving trajectory generation method | |
CN109544630B (en) | Pose information determination method and device and visual point cloud construction method and device | |
US20060262141A1 (en) | Position and orientation measuring method and apparatus | |
US9261953B2 (en) | Information processing apparatus for displaying virtual object and method thereof | |
JP2004233334A (en) | Method for measuring position and orientation | |
WO2023160694A1 (en) | Virtualization method and apparatus for input device, device, and storage medium | |
US20230325009A1 (en) | Methods, devices, apparatuses, and storage media for mapping mouse models for computer mouses | |
JP4566786B2 (en) | Position and orientation measurement method and information processing apparatus | |
US12026326B2 (en) | Pen state detection circuit and method, and input system | |
JP5726024B2 (en) | Information processing method and apparatus | |
JP2012220271A (en) | Attitude recognition apparatus, attitude recognition method, program and recording medium | |
JP6966777B2 (en) | Input system | |
US20220005225A1 (en) | Systems and methods for calibrating imaging and spatial orientation sensors | |
JP2017027472A (en) | Coordinate input system, coordinate input device, coordinate input method, and program | |
CN112181135A (en) | 6-DOF visual touch interaction method based on augmented reality | |
JP2004108836A (en) | Azimuth angle computing method of imaging apparatus and system, attitude sensor of imaging apparatus, tilt sensor of imaging device and computer program and three-dimentional model configuration device | |
WO2023218886A1 (en) | Information input device, information input method, and information input program | |
CN117806468A (en) | IMU-based cursor control method and device, storage medium and computer equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20210520; Address after: Room 502, 5th floor, bishuiyunxuan, No.9, Xingqing South Road, Beilin District, Xi'an City, Shaanxi Province, 710000; Patentee after: Shaanxi shuoxuan Information Technology Co.,Ltd.; Address before: 710064 No. 126 central section of South Ring Road, Yanta District, Xi'an, Shaanxi; Patentee before: CHANG'AN University |
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20200107 |