CN114114688A - AR (augmented reality) glasses positioning method and system based on optical tracker - Google Patents
- Publication number
- CN114114688A CN114114688A CN202111360840.5A CN202111360840A CN114114688A CN 114114688 A CN114114688 A CN 114114688A CN 202111360840 A CN202111360840 A CN 202111360840A CN 114114688 A CN114114688 A CN 114114688A
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- tracker
- glasses
- marker
- virtual world
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Abstract
The invention relates to an optical-tracker-based AR (augmented reality) glasses positioning method and system. The method comprises the following steps: a marker ball support is mounted on the AR glasses, and a marker ball group is arranged on the support; the virtual world coordinate system of the AR glasses and the coordinate system of the optical Tracker are constructed, and a conversion matrix between the Tracker coordinate system and the virtual world coordinate system is acquired; the optical Tracker tracks the marker ball support in real time to obtain the coordinates of the marker ball group in the Tracker coordinate system; these coordinates are converted into the virtual world coordinate system using the conversion matrix; and the position transformation matrix between the marker ball group measured in the current frame and that measured in the previous frame is calculated in real time in the virtual world coordinate system. Compared with the prior art, the invention measures changes in the pose of the AR glasses more accurately, so that virtual objects are displayed more stably.
Description
Technical Field
The invention relates to the technical field of spatial positioning, and in particular to an optical-tracker-based AR (augmented reality) glasses positioning method and system.
Background
Augmented Reality (AR) is a comparatively recent technology that merges real-world information with virtual-world content. Entity information that is difficult to experience directly in the real world is simulated by computers and related technologies, and the resulting virtual content is superimposed on the real world, where it can be perceived by the human senses, providing a sensory experience beyond reality. Once the real environment and the virtual objects are superimposed, they coexist in the same picture and the same space.
AR glasses are a representative product in the AR field: they can project virtual objects into space without obstructing the user's view of the real environment, offering the user an experience of interacting with that environment. Merging the display of a virtual object into the real scene depends on the SLAM (simultaneous localization and mapping) technology of the AR glasses: when the user observes a static virtual object from different angles, the object should be rendered from the corresponding viewpoints, giving the impression that it is fixed at a position in real space. However, because the accuracy of the AR glasses' SLAM is insufficient, the display of virtual objects is not stable enough. When the user changes the observation pose, the pose transformation computed by SLAM is not accurate enough, so the relative angle and distance at which the virtual object is displayed change inaccurately, and the user perceives the object as drifting slightly.
Disclosure of Invention
The present invention aims to overcome the above drawbacks of the prior art by providing an optical-tracker-based AR glasses positioning method and system.
The object of the invention is achieved by the following technical solution:
an AR glasses positioning method based on an optical tracker, the method comprising:
a marker ball support is mounted on the AR glasses, and a marker ball group is arranged on the marker ball support;
constructing a virtual world coordinate system and an optical Tracker coordinate system of the AR glasses, and acquiring a transformation matrix between the Tracker coordinate system and the virtual world coordinate system;
the optical Tracker tracks the marker ball support in real time to obtain the coordinates of the marker ball group under a Tracker coordinate system;
converting the coordinates of the marker ball group in the Tracker coordinate system into the coordinates in the virtual world coordinate system based on the conversion matrix;
and calculating in real time the position transformation matrix, in the virtual world coordinate system, between the marker ball group measured in the current frame and that measured in the previous frame.
Preferably, the marker ball group at least comprises 4 coplanar marker balls, and the distance between any two adjacent marker balls is used as a marker distance.
Preferably, the distance between the marks is not less than 40 mm.
Preferably, the obtaining of the transformation matrix between the Tracker coordinate system and the virtual world coordinate system specifically includes:
setting a plurality of non-coplanar virtual points in a virtual world coordinate system of the AR glasses, and determining the coordinates of the virtual points in the virtual world coordinate system;
calibrating the coordinates of the virtual point in a Tracker coordinate system by using a probe of an optical Tracker;
and performing singular value decomposition on the coordinates of the virtual points in the virtual world coordinate system and in the Tracker coordinate system, solving the conversion matrix that converts the Tracker coordinate system into the virtual world coordinate system.
Preferably, the position transformation matrix of the marker ball group in the virtual world coordinate system between two adjacent frames represents the real-time rotation and translation of the AR glasses, and is used to replace the spatial pose variables measured by the SLAM technology of the AR glasses, realizing real-time positioning of the AR glasses.
An optical tracker-based AR glasses positioning system, comprising:
the marker ball support, on which a marker ball group is arranged, the support being mounted on the AR glasses;
an optical tracker;
a coordinate system calibration module: the system is used for constructing an AR glasses virtual world coordinate system and an optical Tracker coordinate system and acquiring a conversion matrix between the Tracker coordinate system and the virtual world coordinate system;
a marker ball group coordinate acquisition module: the optical Tracker tracks the marker ball support in real time to obtain the coordinates of the marker ball group under a Tracker coordinate system;
a marker ball group coordinate conversion module: converting the coordinates of the marker ball group in the Tracker coordinate system into coordinates in the virtual world coordinate system based on the conversion matrix;
a positioning module: calculating in real time the position transformation matrix, in the virtual world coordinate system, between the marker ball group measured in the current frame and that measured in the previous frame.
Preferably, the marker ball group at least comprises 4 coplanar marker balls, and the distance between any two adjacent marker balls is used as a marker distance.
Preferably, the distance between the marks is not less than 40 mm.
Preferably, the coordinate system calibration module includes:
virtual point determination submodule: setting a plurality of non-coplanar virtual points in a virtual world coordinate system of the AR glasses, and determining the coordinates of the virtual points in the virtual world coordinate system;
virtual point tracking submodule: calibrating the coordinates of the virtual point in a Tracker coordinate system by using a probe of an optical Tracker;
a conversion submodule: performing singular value decomposition on the coordinates of the virtual points in the virtual world coordinate system and in the Tracker coordinate system, and solving the conversion matrix that converts the Tracker coordinate system into the virtual world coordinate system.
Preferably, the coordinate system calibration module, the marker ball group coordinate acquisition module, the marker ball group coordinate conversion module and the positioning module are integrated in a processor with a digital processing function.
Compared with the prior art, the invention has the following advantages:
(1) the optical tracker is used for tracking the three-dimensional coordinates of the marker ball group on the AR glasses in real time to calculate the real-time pose and offset of the AR glasses, so that the space positioning and pose data acquired by the AR glasses by the equipment is replaced, the precision reaches the precision level of the optical tracker, and the positioning precision is high;
(2) the invention can more accurately measure the change of the pose of the AR glasses, namely the change of the distance and the visual angle of an object in an observation scene of a wearer of the AR glasses, thereby enabling the virtual object to be displayed more stably and displaying the virtual object to an observer at more accurate distance and angle in each frame.
Drawings
Fig. 1 is a block flow diagram of an AR glasses positioning method based on an optical tracker according to the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. Note that the following description of the embodiments is merely a substantial example, and the present invention is not intended to be limited to the application or the use thereof, and is not limited to the following embodiments.
Examples
As shown in fig. 1, the present embodiment provides an optical-tracker-based AR glasses positioning method, including:
a marker ball support is mounted on the AR glasses, and a marker ball group is arranged on the marker ball support;
constructing a virtual world coordinate system and an optical Tracker coordinate system of the AR glasses, and acquiring a transformation matrix between the Tracker coordinate system and the virtual world coordinate system;
the optical Tracker tracks the marker ball support in real time to obtain the coordinates of the marker ball group under a Tracker coordinate system;
converting the coordinates of the marker ball group in the Tracker coordinate system into the coordinates in the virtual world coordinate system based on the conversion matrix;
and calculating in real time the position transformation matrix, in the virtual world coordinate system, between the marker ball group measured in the current frame and that measured in the previous frame.
The marker ball group comprises at least 4 coplanar marker balls; the distance between any two adjacent marker balls is taken as a marker distance, and each marker distance is not less than 40 mm. The relative positions of the marker balls are designed to comply with the requirements of the tracker manufacturer's documentation. With the provided algorithm, the position of each marker ball at each moment can be detected. A new tool option is created in the manufacturer's software so that the optical tracker can identify which marker balls correspond to each other from frame to frame.
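The embodiment does not spell out how corresponding marker balls are identified between frames. One simple approach, sketched here purely as an assumption (greedy nearest-neighbour matching, valid when inter-frame motion is small relative to the 40 mm minimum marker distance; the function name is illustrative, not from the patent), is:

```python
import numpy as np

def match_markers(prev_pts, curr_pts):
    """Greedily match each marker in the previous frame to its
    nearest unclaimed marker in the current frame.

    Assumes inter-frame motion is small compared with the
    minimum spacing (>= 40 mm) between marker balls."""
    prev_pts = np.asarray(prev_pts, dtype=float)
    curr_pts = np.asarray(curr_pts, dtype=float)
    order = np.empty(len(prev_pts), dtype=int)
    taken = np.zeros(len(curr_pts), dtype=bool)
    for i, p in enumerate(prev_pts):
        d = np.linalg.norm(curr_pts - p, axis=1)  # distances to all candidates
        d[taken] = np.inf                         # skip already-matched markers
        j = int(np.argmin(d))
        order[i] = j
        taken[j] = True
    return order  # curr_pts[order] is aligned with prev_pts
```

In practice the optical tracker's own software (the "new tool option" mentioned above) performs this identification; the sketch only illustrates the idea.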
The method for acquiring the transformation matrix between the Tracker coordinate system and the virtual world coordinate system specifically comprises the following steps:
setting a plurality of non-coplanar virtual points in a virtual world coordinate system of the AR glasses, and determining the coordinates of the virtual points in the virtual world coordinate system;
calibrating the coordinates of the virtual point in a Tracker coordinate system by using a probe of an optical Tracker;
and performing singular value decomposition on the coordinates of the virtual points in the virtual world coordinate system and in the Tracker coordinate system, solving the conversion matrix that converts the Tracker coordinate system into the virtual world coordinate system.
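The three calibration steps above amount to the classical SVD-based solution of a rigid transform between two paired point sets (the Kabsch algorithm). A minimal Python/NumPy sketch, with illustrative names not taken from the patent:

```python
import numpy as np

def rigid_transform(src, dst):
    """Solve R, t such that dst_i ~ R @ src_i + t for paired 3-D points,
    using singular value decomposition (Kabsch algorithm)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)       # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Applied to the probe-calibrated Tracker coordinates of the virtual points (after the Z-axis inversion described in this embodiment) and their known virtual world coordinates, R and t together form the conversion matrix.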
In this embodiment, the optical Tracker measurement system uses a right-handed coordinate system (i.e., the Tracker coordinate system is right-handed), whereas the virtual world coordinate system established in the scene by the AR glasses software is left-handed, so the Z-axis coordinate of every point measured by the optical Tracker is negated to convert it into the left-handed system. The coordinates of the virtual point group of the AR glasses in the virtual world coordinate system are known, and the coordinates of the same points in the Tracker coordinate system can be measured by aligning the probe tip with each virtual point. Solving the two corresponding coordinate sets then yields the conversion matrix between the two coordinate systems (coordinates in meters).
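The right-to-left-handed conversion described here is a single sign flip on the Z coordinate; a sketch (the function name is illustrative):

```python
import numpy as np

def tracker_to_left_handed(points):
    """Negate Z to map right-handed Tracker coordinates into the
    left-handed virtual world convention; the input is left unchanged."""
    q = np.array(points, dtype=float)  # np.array copies the input
    q[..., 2] *= -1.0
    return q
```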
For example, the coordinates of the 4 virtual points in the virtual space of the AR glasses are:
the coordinates of the 4 virtual points, which are calibrated by the probe and correspond to the coordinates in the Tracker coordinate system, are as follows:
Inverting the measured Z-axis coordinates converts them from the right-handed to the left-handed coordinate system:
decomposition of P by singular valuesV,PTr' to obtain:
the optical tracker tracks the three-dimensional coordinates of the marker ball set in real time, and also inverts the Z-axis coordinates. Four coordinate point groups are measured per frameAnd converting to obtain coordinates of the four marker balls corresponding to the virtual world coordinate device established by the AR glasses. Then, SVD (singular value decomposition) is carried out on the marker sphere coordinate points under the virtual space corresponding to the four measured markers in the previous frame to solve the rotation and translation matrix. The solved data are sent to the AR glasses for processing in real time through wireless communication. The space three-dimensional variable quantity obtained by the data is used for replacing pose transformation of the AR glasses measured by a sensor and a camera through an SLAM technology in the AR glasses, so that the pose of the AR glasses is obtained in real time. For example: tracker measures the coordinates of the marker ball group in two spaced frames:
similarly, the measured Z-axis coordinate is inverted, and the coordinates of the marker ball group corresponding to the two spaced frames under the left-hand system are obtained as follows:
coordinates of two frame marker ball groups are compared withAnd (3) calculating to obtain coordinates in the virtual space coordinate system of the AR glasses:
and solving a space position transformation matrix of the marker ball group in the virtual coordinate system of the AR glasses as follows:
will TmovSending the data to AR glasses equipment under wireless communication, and passing through TmovThe rotation and translation changes of the AR glasses in the virtual space coordinate system established by the AR glasses can be known, and the AR glasses can replace the space pose variable measured by the sensors and the cameras based on the SLAM technology to realize real-time positioning. The AR glasses apparatus refers to the T received for each framemovTherefore, the corresponding virtual scene picture can be displayed under the coordinate system of the AR glasses.
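Putting the per-frame steps together (negate Z, map into the virtual world frame with the calibration result, then solve the inter-frame rotation and translation by SVD) can be sketched as follows; R_cal and t_cal denote the calibration rotation and translation, and all names are illustrative, not from the patent:

```python
import numpy as np

def frame_update(prev_tracker_pts, curr_tracker_pts, R_cal, t_cal):
    """Return the 4x4 inter-frame pose delta T_mov of the marker ball
    group, expressed in the virtual world coordinate system."""
    def to_virtual(pts):
        q = np.array(pts, dtype=float)
        q[:, 2] *= -1.0               # right- to left-handed
        return q @ R_cal.T + t_cal    # Tracker frame -> virtual world frame
    prev_v = to_virtual(prev_tracker_pts)
    curr_v = to_virtual(curr_tracker_pts)
    # Kabsch: rigid transform mapping previous-frame markers onto current ones
    pc, cc = prev_v.mean(axis=0), curr_v.mean(axis=0)
    H = (prev_v - pc).T @ (curr_v - cc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cc - R @ pc
    T_mov = np.eye(4)
    T_mov[:3, :3], T_mov[:3, 3] = R, t
    return T_mov
```

Each frame's T_mov would then be sent to the glasses over the wireless link, as described above.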
Based on the above method, the present embodiment further provides an AR glasses positioning system based on an optical tracker, including:
the marker ball support, on which a marker ball group is arranged, the support being mounted on the AR glasses; the marker ball group comprises at least 4 coplanar marker balls, the distance between any two adjacent marker balls is taken as a marker distance, and the marker distance is not less than 40 mm;
an optical tracker;
a coordinate system calibration module: the system is used for constructing an AR glasses virtual world coordinate system and an optical Tracker coordinate system and acquiring a conversion matrix between the Tracker coordinate system and the virtual world coordinate system;
a marker ball group coordinate acquisition module: the optical Tracker tracks the marker ball support in real time to obtain the coordinates of the marker ball group under a Tracker coordinate system;
a marker ball group coordinate conversion module: converting the coordinates of the marker ball group in the Tracker coordinate system into coordinates in the virtual world coordinate system based on the conversion matrix;
a positioning module: calculating in real time the position transformation matrix, in the virtual world coordinate system, between the marker ball group measured in the current frame and that measured in the previous frame.
The coordinate system calibration module includes:
virtual point determination submodule: setting a plurality of non-coplanar virtual points in a virtual world coordinate system of the AR glasses, and determining the coordinates of the virtual points in the virtual world coordinate system;
virtual point tracking submodule: calibrating the coordinates of the virtual point in a Tracker coordinate system by using a probe of an optical Tracker;
a conversion submodule: performing singular value decomposition on the coordinates of the virtual points in the virtual world coordinate system and in the Tracker coordinate system, and solving the conversion matrix that converts the Tracker coordinate system into the virtual world coordinate system.
The coordinate system calibration module, the marker ball group coordinate acquisition module, the marker ball group coordinate conversion module and the positioning module are integrated in a processor with a digital processing function.
The above embodiments are merely examples and do not limit the scope of the present invention. They may also be implemented in various other manners, and various omissions, substitutions and changes may be made without departing from the technical spirit of the present invention.
Claims (10)
1. An AR glasses positioning method based on an optical tracker is characterized by comprising the following steps:
a marker ball support is mounted on the AR glasses, and a marker ball group is arranged on the marker ball support;
constructing a virtual world coordinate system and an optical Tracker coordinate system of the AR glasses, and acquiring a transformation matrix between the Tracker coordinate system and the virtual world coordinate system;
the optical Tracker tracks the marker ball support in real time to obtain the coordinates of the marker ball group under a Tracker coordinate system;
converting the coordinates of the marker ball group in the Tracker coordinate system into the coordinates in the virtual world coordinate system based on the conversion matrix;
and calculating in real time the position transformation matrix, in the virtual world coordinate system, between the marker ball group measured in the current frame and that measured in the previous frame.
2. The method as claimed in claim 1, wherein the marker ball group comprises at least 4 coplanar marker balls, and the distance between any two adjacent marker balls is used as the marker distance.
3. The method as claimed in claim 2, wherein the marker distance is not less than 40 mm.
4. The method as claimed in claim 1, wherein the obtaining of the transformation matrix between the Tracker coordinate system and the virtual world coordinate system is as follows:
setting a plurality of non-coplanar virtual points in a virtual world coordinate system of the AR glasses, and determining the coordinates of the virtual points in the virtual world coordinate system;
calibrating the coordinates of the virtual point in a Tracker coordinate system by using a probe of an optical Tracker;
and performing singular value decomposition on the coordinates of the virtual points in the virtual world coordinate system and in the Tracker coordinate system, solving the conversion matrix that converts the Tracker coordinate system into the virtual world coordinate system.
5. The AR glasses positioning method based on the optical tracker according to claim 1, wherein the position transformation matrix of the marker ball group in the virtual world coordinate system between two adjacent frames represents the real-time rotation and translation of the AR glasses, and is used to replace the spatial pose variables measured by the SLAM technology of the AR glasses, realizing real-time positioning of the AR glasses.
6. An optical tracker-based AR glasses positioning system, comprising:
the marker ball support, on which a marker ball group is arranged, the support being mounted on the AR glasses;
an optical tracker;
a coordinate system calibration module: the system is used for constructing an AR glasses virtual world coordinate system and an optical Tracker coordinate system and acquiring a conversion matrix between the Tracker coordinate system and the virtual world coordinate system;
a marker ball group coordinate acquisition module: the optical Tracker tracks the marker ball support in real time to obtain the coordinates of the marker ball group under a Tracker coordinate system;
a marker ball group coordinate conversion module: converting the coordinates of the marker ball group in the Tracker coordinate system into coordinates in the virtual world coordinate system based on the conversion matrix;
a positioning module: calculating in real time the position transformation matrix, in the virtual world coordinate system, between the marker ball group measured in the current frame and that measured in the previous frame.
7. The optical tracker based AR glasses positioning system of claim 6, wherein said marker ball set comprises at least 4 coplanar marker balls, and a distance between any two adjacent marker balls is used as a marker distance.
8. The optical tracker based AR glasses positioning system of claim 7, wherein said marker distance is not less than 40 mm.
9. The optical tracker based AR glasses positioning system of claim 6, wherein the coordinate system calibration module comprises:
virtual point determination submodule: setting a plurality of non-coplanar virtual points in a virtual world coordinate system of the AR glasses, and determining the coordinates of the virtual points in the virtual world coordinate system;
virtual point tracking submodule: calibrating the coordinates of the virtual point in a Tracker coordinate system by using a probe of an optical Tracker;
a conversion submodule: performing singular value decomposition on the coordinates of the virtual points in the virtual world coordinate system and in the Tracker coordinate system, and solving the conversion matrix that converts the Tracker coordinate system into the virtual world coordinate system.
10. The optical tracker based AR glasses positioning system of claim 6, wherein said coordinate system calibration module, said marker ball group coordinate acquisition module, said marker ball group coordinate conversion module and said positioning module are integrated into a processor with a digital processing function.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111360840.5A CN114114688A (en) | 2021-11-17 | 2021-11-17 | AR (augmented reality) glasses positioning method and system based on optical tracker |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111360840.5A CN114114688A (en) | 2021-11-17 | 2021-11-17 | AR (augmented reality) glasses positioning method and system based on optical tracker |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114114688A true CN114114688A (en) | 2022-03-01 |
Family
ID=80397513
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111360840.5A Pending CN114114688A (en) | 2021-11-17 | 2021-11-17 | AR (augmented reality) glasses positioning method and system based on optical tracker |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114114688A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117434571A (en) * | 2023-12-21 | 2024-01-23 | 绘见科技(深圳)有限公司 | Method for determining absolute pose of equipment based on single antenna, MR equipment and medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111207747A (en) * | 2018-11-21 | 2020-05-29 | 中国科学院沈阳自动化研究所 | Spatial positioning method based on HoloLens glasses |
CN113648061A (en) * | 2021-07-15 | 2021-11-16 | 上海交通大学医学院附属第九人民医院 | Head-mounted navigation system based on mixed reality and navigation registration method |
- 2021-11-17: CN application CN202111360840.5A, published as CN114114688A, status Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111207747A (en) * | 2018-11-21 | 2020-05-29 | 中国科学院沈阳自动化研究所 | Spatial positioning method based on HoloLens glasses |
CN113648061A (en) * | 2021-07-15 | 2021-11-16 | 上海交通大学医学院附属第九人民医院 | Head-mounted navigation system based on mixed reality and navigation registration method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117434571A (en) * | 2023-12-21 | 2024-01-23 | 绘见科技(深圳)有限公司 | Method for determining absolute pose of equipment based on single antenna, MR equipment and medium |
CN117434571B (en) * | 2023-12-21 | 2024-03-15 | 绘见科技(深圳)有限公司 | Method for determining absolute pose of equipment based on single antenna, MR equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||