CN100416336C - Calibrating real and virtual views - Google Patents

Calibrating real and virtual views

Info

Publication number
CN100416336C
CN100416336C CNB2004800161027A CN200480016102A
Authority
CN
China
Prior art keywords
view
virtual
display
optical
real reference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CNB2004800161027A
Other languages
Chinese (zh)
Other versions
CN1802586A (en)
Inventor
F. Sauer
Y. Genc
N. Navab
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Medical Solutions USA Inc
Original Assignee
Siemens Medical Solutions USA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Medical Solutions USA Inc filed Critical Siemens Medical Solutions USA Inc
Publication of CN1802586A
Application granted
Publication of CN100416336C

Abstract

A method for calibrating real and virtual views includes tracking a calibration screen, wherein a real reference point, generated by a real reference point generator, is projected on the calibration screen, aligning a view of a virtual reference point to a view of the real reference point in a display, wherein the real reference point generator and the display have a fixed relative position, determining a point correspondence between the virtual reference point and the real reference point, and determining one or more parameters for rendering a virtual object in the real scene.

Description

Calibrating real and virtual views
Technical field
The present invention relates to augmented reality and, more particularly, to a system and method for augmented-reality calibration of optical see-through head-mounted displays.
Background art
Augmented vision (also referred to as augmented reality or augmented-reality vision) enhances a user's view of the real world by superimposing computer-generated graphical information. This information may include, for example, text labels attached to objects in the scene, or a three-dimensional (3D) model of a patient's brain, derived from an MRI scan, aligned with the real view of the patient's head.
The user can observe the real world through his or her own eyes, with the additional graphical information blended in via a semi-transparent display placed between the observer and the real world. Such a display device may be, for example, an optical see-through head-mounted display.
The display may also be opaque, such as a computer screen or a non-see-through head-mounted display. The display then presents the complete augmented view, a combination of the real-world view and the graphics overlay, to the user. A video camera captures the real-world view in place of the observer's eyes; two cameras may be used to enable stereoscopic vision. A computer combines the live video with the graphics enhancement. Such a display device is, for example, a video see-through head-mounted display (HMD).
The graphics must be positioned, oriented, scaled, and rendered in correct perspective so that they align with the real-world view. To obtain accurate alignment of the real and virtual views, the graphics can be anchored to a real-world object. This requires knowledge of the position and orientation of the user's viewpoint relative to the object, and of the object's own pose. Accordingly, the relationship between two coordinate systems must be defined: one attached to the user's head and one attached to the object.
Tracking denotes the process of maintaining knowledge of this relationship. Commercial tracking systems based on optical, magnetic, ultrasonic, and mechanical means are available.
Calibration is needed to achieve alignment between the virtual graphical objects and the real objects in the scene. With a video see-through HMD, the real and virtual images are combined in the computer, so the calibration can be performed in an objective, user-independent way. By contrast, with an optical see-through HMD the real and virtual images are combined only in the user's eyes, and the position of the user's eyes behind the semi-transparent screen has a significant influence on the alignment.
Various methods for calibrating optical see-through HMDs are known in the prior art. All known calibration methods require the user to align a virtual structure with a real reference structure. In the SPAAM method, for example, a series of fixed graphical markers is shown to the user on the screen, and the user moves his or her head so that each fixed marker aligns with a reference marker in the real scene. Jitter of the user's head hinders this alignment: because of head jitter, the position of the displayed marker jitters with respect to the scene, making it difficult to align the virtual and real markers accurately.
For augmented-reality applications that require accurate measurement and comfortable use, such as in an operating room, no adequate system is currently known. There is therefore a need for a system and method for augmented-reality calibration of see-through head-mounted displays.
Summary of the invention
An augmented reality system according to the present invention comprises: a real reference point generator for displaying a real reference point on a calibration screen; an optical see-through display having a fixed position relative to the real reference point generator; a virtual reference point generator for displaying a virtual reference point on the optical see-through display; an input device for aligning a view of the virtual reference point with a view of the real reference point through the optical see-through display, wherein the virtual reference point is moved on the optical see-through display; a processor for determining one or more parameters for rendering a virtual object as part of the real scene seen through the optical see-through display; and a tracking device for enabling the alignment of the real and virtual reference points.
The augmented reality system may comprise a tracking camera for tracking the pose of the calibration screen relative to the virtual reference point.
The augmented reality system may comprise a tracking camera, having a fixed position relative to the real reference point generator, for capturing a view of the calibration screen. The augmented reality system may further comprise a processor, wherein an optical marker configuration is fixed to the calibration screen and imaged by the tracking camera, and wherein the processor determines the positional relationship between the calibration screen and a head-mounted display from the position of the optical marker configuration in the images captured by the tracking camera, the head-mounted display comprising the real reference point generator and the optical see-through display.
The augmented reality system may comprise: at least one tracking camera for capturing a view of the calibration screen; and a head-mounted display comprising the real reference point generator and the optical see-through display. The augmented reality system may further comprise a processor, wherein an optical marker configuration is fixed to each of the calibration screen and the head-mounted display and tracked by the at least one tracking camera, and wherein the processor determines the positional relationship between the calibration screen and the head-mounted display from the position of each optical marker configuration in the views captured by the at least one tracking camera.
A system for calibrating real and virtual views according to the present invention comprises: a real reference point generator for displaying a real reference point on a calibration screen; an optical display having a fixed position relative to the real reference point generator; a virtual reference point generator for generating a virtual reference point on the optical display; an input device for aligning a view of the virtual reference point with a view of the real reference point, wherein the virtual reference point is moved on the optical display relative to the view of the real reference point; a processor for determining one or more parameters for rendering a virtual object in the real scene seen in the optical display; and a tracking device for enabling the alignment of the real and virtual reference points.
The system may further comprise a camera for capturing a view of the real reference point, wherein the real reference point is displayed in the optical display and the virtual reference point is superimposed on it. The system may further comprise: a tracking camera, having a fixed position relative to the real reference point generator, for capturing a view of the calibration screen; and a processor, wherein an optical marker configuration is fixed to the calibration screen and tracked by the tracking camera, and wherein the processor determines the positional relationship between the calibration screen and a head-mounted display from the position of the optical marker configuration in the views captured by the tracking camera, the head-mounted display comprising the real reference point generator and the optical display.
The system may comprise a tracking camera, connected to the real reference point generator, for capturing a view of the calibration screen. The system may further comprise a processor, wherein an optical marker configuration is fixed to the calibration screen and tracked by the tracking camera, and wherein the processor determines the positional relationship between the calibration screen and a head-mounted display from the position of the optical marker configuration in the views captured by the tracking camera, the head-mounted display comprising the real reference point generator and the optical display.
The system may comprise: at least one tracking camera for capturing a view of the calibration screen; and a head-mounted display comprising the real reference point generator and the optical display. The system may further comprise a processor, wherein an optical marker configuration is fixed to each of the calibration screen and the head-mounted display and tracked by the at least one tracking camera, and wherein the processor determines the positional relationship between the calibration screen and the head-mounted display from the position of each optical marker configuration in the views captured by the at least one tracking camera.
A method for calibrating real and virtual views according to the present invention comprises: tracking a calibration screen, wherein a real reference point generated by a real reference point generator is projected onto the calibration screen; aligning a virtual reference point with a view of the real reference point in a display, wherein the real reference point generator and the display have a fixed relative position; determining a point correspondence between the virtual reference point and the real reference point; and determining one or more parameters for rendering a virtual object in the real scene, wherein the virtual reference point is displayed on an optical see-through display through which the real reference point is visible.
The method may comprise: capturing a view of the real scene that includes the real reference point, and displaying this view of the real scene augmented with the virtual reference point.
Brief description of the drawings
The preferred embodiments of the present invention are described in more detail below with reference to the accompanying drawings:
Fig. 1 is an illustration of a calibration system according to an embodiment of the present disclosure;
Fig. 2A is an illustration of an augmented reality calibration system according to an embodiment of the present disclosure;
Fig. 2B is an illustration of an augmented reality calibration system according to an embodiment of the present disclosure;
Fig. 2C is an illustration of a video see-through augmented reality calibration system according to an embodiment of the present disclosure;
Fig. 3 is a flow chart of a method according to an embodiment of the present disclosure;
Fig. 4 is a flow chart of a method according to an embodiment of the present disclosure.
Detailed description of the embodiments
A system and method for calibrating an optical see-through head-mounted display (HMD) are implemented with real reference points that originate as light spots from an illuminator attached to the HMD. These light spots "jitter along" with the HMD, so the user perceives no jitter between these reference markers and virtual markers displayed at fixed positions on the semi-transparent screen of the HMD.
Referring to Fig. 1, to calibrate an optical see-through system, a user 100 aligns a virtual reference 101, displayed as graphics on the semi-transparent screen 103 of the HMD, with a real reference structure 102 observed through the screen 103. The real reference structure 102 is realized as light and/or an image projected onto a calibration screen 104 or another substrate. The real reference 102 and the virtual reference 101 can be, for example, one or more points or shapes.
It is to be understood that the present invention may be implemented in various forms of hardware, software, firmware, special-purpose processors, or a combination thereof. In one embodiment, the present invention may be implemented in software as an application program tangibly embodied on a program storage device. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
It is to be further understood that, because some of the constituent system components and method steps depicted in the accompanying figures may be implemented in software, the actual connections between the system components (or the process steps) may differ depending upon the manner in which the present invention is programmed. Given the teachings of the present invention provided herein, one of ordinary skill in the related art will be able to contemplate these and similar implementations or configurations of the present invention.
Referring to Figs. 2A and 2B, the real reference 102 originates from an illuminator 200 rigidly attached to the HMD 201 and is likewise observed through the screen 103. When the user moves the HMD 201, the real reference 102 moves on the calibration screen 104; for small head movements, the real reference 102 appears fixed relative to the virtual reference 101 when viewed through the semi-transparent screen 103 of the HMD. Because the jitter between the real reference 102 and the virtual reference 101 is substantially reduced, the alignment task is now easy from the vantage point of the user 100.
The real reference 102 is observed on a flat screen 104; the user can hold the screen 104 at arm's length in one hand, place it on a table, and so forth.
The screen 104 is tracked with respect to the user's head or the HMD 201. The tracking arrangement comprises either an external (see Fig. 2A) or a head-mounted (see Fig. 2B) tracking camera 202. The screen 104 (in the case of optical tracking) and the HMD 201 (in the case of the external tracking camera of Fig. 2A) comprise optical markers 203. The optical markers 203 can be, for example, retroreflective flat discs or retroreflective spheres.
The illuminator 200 projects a light image 102 comprising one, or preferably several, points. The illuminator 200 can be constructed with a single light source and an optical system that generates the light image 102. For example, a laser can be used with a lens system and a mask or a diffractive optical element to produce a group of light spots.
Alternatively, a group of light sources (such as an LED array) can be used, with the advantage that the light image can be made switchable: individual LEDs can be switched on and off. The LEDs can be combined with a lens system or with micro-optical components (e.g., a lens array).
The illuminator 200 can also include a scanning device or a beam-deflection device that switches between different beam directions.
The screen 103 can be, for example, a monocular or a binocular arrangement. In a binocular arrangement, the two screens are preferably calibrated separately, one after the other. The semi-transparent display 103, in conjunction with suitable optical components 204, produces the image of the virtual reference 101. Because the user looks through the semi-transparent display 103, the virtual reference 101 is visible to the user. Alternatively, the see-through display can be implemented with an image projector and an optical system that includes a beam splitter.
To perform the calibration, the user 100 moves the virtual reference 101 displayed on the semi-transparent screen 103 into alignment with the reference light image 102 as seen from the user's viewpoint (e.g., 205). The user 100 operates an interface 206 (e.g., a processor 207 and an input device 208) to move the virtual reference 101 on the screen 103. The input device 208 can be, for example, a trackball or a mouse. The processor 207 can serve as the virtual reference point generator, comprising a processor and a graphics card that render the virtual reference displayed on the semi-transparent screen.
To complete the calibration process, the user aligns the virtual reference 101 with several real reference light spots (e.g., 102). For better calibration accuracy, the user can assume different poses (e.g., distances and/or orientations) with respect to the calibration screen 104.
A processor (e.g., 207) determines the spatial relationship between the calibration screen 104 and the HMD 201 from the positions of the markers 203 and from the user-established alignments of the virtual reference 101 with the real reference 102. Fig. 2B shows an example of an HMD that includes a tracking camera 202. As depicted, with the tracking camera fixed to the HMD, the spatial relationship between the calibration screen 104 and the HMD 201 can be determined using only the optical markers 203 fixed to the calibration screen 104. Further, the pose of the calibration screen 104 can be determined from the relationship of the different optical markers 203 fixed to the screen 104.
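The determination of the screen pose from a fixed marker layout can be sketched with the standard Kabsch (orthogonal Procrustes) algorithm, which recovers the least-squares rigid transform between corresponding 3D point sets. This is a generic illustration of the underlying geometry, not the patented procedure; the marker layout and pose values below are invented for the example.

```python
import numpy as np

def rigid_pose(model_pts, observed_pts):
    """Least-squares rigid transform (R, t) mapping the known marker layout
    on the calibration screen to the marker positions observed by the
    tracking camera (Kabsch algorithm)."""
    A, B = np.asarray(model_pts, float), np.asarray(observed_pts, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Example: four retroreflective markers on the screen, observed after a
# 90-degree rotation about z plus a translation (values are assumptions).
markers = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]])
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
t_true = np.array([0.5, 0.2, 1.0])
observed = markers @ R_true.T + t_true
R, t = rigid_pose(markers, observed)
# R, t give the screen pose in the tracking-camera frame.
```

With noise-free marker observations the transform is recovered exactly; with real measurements the same formula gives the least-squares pose.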
Fig. 2C shows a video see-through augmented reality system according to an embodiment of the present disclosure, wherein a camera (e.g., 202) captures images of the real scene including the real reference 102. The tracking and video functions of the camera 202 can also be performed by separate cameras. The image of the real scene is displayed to the user 100. The virtual view is superimposed on the real view (e.g., 209), and the user 100 perceives both the real and virtual views. The user can align the virtual reference with the view of the real reference in the real scene.
Referring to Fig. 3, consider the case of a head-mounted tracking camera. The tracking camera and the illuminator are mechanically fixed with respect to each other, and the positions and orientations of the light beams that generate the reference points are determined in the coordinate system of the tracking camera. By tracking the calibration screen 301, the 3D coordinates of the reference points in the tracking-camera coordinate system can be determined as the intersections of the corresponding beams with the screen plane.
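The beam/screen-plane intersection that yields the 3D coordinates of a reference point can be sketched as follows. This is a minimal illustrative computation in the tracking-camera frame; the coordinates and function name are assumptions for the example, not the patented implementation.

```python
import numpy as np

def beam_plane_intersection(origin, direction, plane_point, plane_normal):
    """3D coordinates of a reference light spot: the intersection of an
    illuminator beam with the tracked calibration-screen plane, all
    expressed in the tracking-camera coordinate system."""
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    plane_point = np.asarray(plane_point, float)
    plane_normal = np.asarray(plane_normal, float)
    denom = plane_normal @ direction
    if abs(denom) < 1e-9:
        raise ValueError("beam is parallel to the screen plane")
    t = plane_normal @ (plane_point - origin) / denom
    return origin + t * direction

# Example: a slightly off-axis beam from the illuminator hits a tracked
# screen that passes through z = 1 m in front of the camera.
spot = beam_plane_intersection(
    origin=[0.0, 0.0, 0.0],
    direction=[0.1, 0.0, 1.0],
    plane_point=[0.0, 0.0, 1.0],
    plane_normal=[0.0, 0.0, 1.0],
)
# spot -> [0.1, 0.0, 1.0]
```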
During the calibration process, the user aligns the real and virtual references 302, and the system records a set of 3D-2D point correspondences 303. Each point correspondence is established by a user-performed alignment and consists of the 3D coordinates of a reference light spot and the 2D coordinates of the corresponding virtual marker. This set of point correspondences makes it possible to determine one or more parameters for rendering virtual objects correctly aligned with the real scene 304. For example, camera parameters are determined for the user's view of the virtual world displayed on the semi-transparent screen that match the camera parameters of the user's view of the real world seen through the screen. Such determinations are known in the art, for example as described in U.S. Patent Application No. 20020105484, filed September 25, 2001, entitled "System and method for calibrating a monocular optical see-through head-mounted display system for augmented reality," wherein calibration can include matching the internal representation to the physical environment by determining initial parameter values of a mathematical model so that the computer's internal model matches the physical environment. These parameters include, for example, the optical characteristics of a physical camera, and position and orientation (pose) information of various entities such as the camera, the markers used for tracking, and various objects.
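The recovery of rendering parameters from the recorded 3D-2D correspondences can be illustrated with the standard direct linear transformation (DLT), which estimates a 3x4 projection matrix from six or more point pairs. This generic sketch stands in for the calibration mathematics referenced above; the synthetic camera matrix and point data are assumptions for the example.

```python
import numpy as np

def dlt_projection(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P (with image point x ~ P X) from
    at least six 3D-2D correspondences, via SVD of the DLT equations."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 4)   # smallest-singular-value vector = P

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic check: generate correspondences from a known projection matrix
# (focal length, principal point, and points are invented), then recover it.
P_true = np.array([[800, 0, 320, 10], [0, 800, 240, 20], [0, 0, 1, 0.5]], float)
pts3d = np.array([[0, 0, 1], [.2, 0, 1.2], [0, .2, 1.5],
                  [.3, .1, 2], [-.2, .1, 1.1], [.1, -.2, 1.8]])
pts2d = np.array([project(P_true, X) for X in pts3d])
P_est = dlt_projection(pts3d, pts2d)
# P_est reprojects the calibration points onto their 2D marker positions.
```

In practice the recorded alignments play the role of `pts2d`, and the beam/screen intersections play the role of `pts3d`; the estimated matrix then drives the rendering of virtual objects.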
After the optical see-through augmented reality system has been successfully calibrated for an individual user, the system can render 3D graphical objects in such a way that they appear rigidly anchored in the displayed scene. A tracking system follows the user's changing viewpoint, and the virtual view of the graphical objects is updated accordingly.
An alternative to the head-mounted tracking camera is an external tracking device used together with markers or sensors fixed on the HMD rigidly with respect to the illuminator. The tracking device tracks the HMD and the calibration screen (401). Again, the 3D coordinates of the calibration light spots can be determined (402) as the intersections of the beams with the screen plane (403). The virtual reference points are aligned with the real reference points displayed as light on the screen, and the correspondences are recorded (404).
The system comprises a head-mounted display, a tracking device, computation and graphics-rendering devices, a light projection device, and a trackable screen.
The calibration measurements for each point can be averaged over several alignments of the real and virtual reference structures. Note that the virtual and real markers exhibit no jitter relative to each other, so averaging can reduce errors in the calibration. Compared with prior calibration procedures, averaging is easy to apply here: because of the reduced jitter between the real and virtual markers, the user can hold an alignment for a second or for several seconds.
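The benefit of averaging repeated alignments can be illustrated numerically: averaging N independent alignment samples reduces the random alignment error by roughly a factor of sqrt(N). The pixel coordinates, noise level, and sample count below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ideal 2D position of one virtual marker (pixels), and 50 noisy alignment
# samples recorded while the user holds the alignment (assumed 2 px noise).
true_position = np.array([320.0, 240.0])
samples = true_position + rng.normal(0.0, 2.0, size=(50, 2))

averaged = samples.mean(axis=0)
averaged_error = np.linalg.norm(averaged - true_position)
# Averaging 50 samples shrinks the standard error by about sqrt(50) ~ 7x
# on average compared with a single alignment sample.
```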
Having described embodiments of a system and method for calibrating real and virtual views, it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments of the invention disclosed which are within the scope and spirit of the invention as defined by the appended claims. Having thus described the invention with the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.

Claims (15)

1. An augmented reality system, comprising:
a real reference point generator for displaying a real reference point on a calibration screen;
an optical see-through display having a fixed position relative to said real reference point generator;
a virtual reference point generator for displaying a virtual reference point on said optical see-through display;
an input device for aligning a view of said virtual reference point with a view of said real reference point through said optical see-through display, wherein said virtual reference point is moved on said optical see-through display;
a processor for determining one or more parameters for rendering a virtual object as part of a real scene seen through said optical see-through display; and
a tracking device for enabling the alignment of the real reference point and the virtual reference point.
2. The augmented reality system of claim 1, further comprising a tracking camera for tracking a pose of said calibration screen relative to said virtual reference point.
3. The augmented reality system of claim 1, further comprising a tracking camera, having a fixed position relative to said real reference point generator, for capturing a view of said calibration screen.
4. The augmented reality system of claim 3, further comprising a processor, wherein an optical marker configuration is fixed to said calibration screen and imaged by said tracking camera, and wherein said processor determines a positional relationship between said calibration screen and a head-mounted display from a position of said optical marker configuration in images captured by said tracking camera, said head-mounted display comprising said real reference point generator and said optical see-through display.
5. The augmented reality system of claim 1, further comprising: at least one tracking camera for capturing a view of said calibration screen; and a head-mounted display comprising said real reference point generator and said optical see-through display.
6. The augmented reality system of claim 5, further comprising a processor, wherein an optical marker configuration is fixed to each of said calibration screen and said head-mounted display and tracked by said at least one tracking camera, and wherein said processor determines a positional relationship between said calibration screen and said head-mounted display from a position of each optical marker configuration in views of said calibration screen captured by said at least one tracking camera.
7. A system for calibrating real and virtual views, comprising:
a real reference point generator for displaying a real reference point on a calibration screen;
an optical display having a fixed position relative to said real reference point generator;
a virtual reference point generator for generating a virtual reference point on said optical display;
an input device for aligning a view of said virtual reference point with a view of said real reference point, wherein said virtual reference point is moved on said optical display relative to the view of said real reference point;
a processor for determining one or more parameters for rendering a virtual object in a real scene seen in said optical display; and
a tracking device for enabling the alignment of the real reference point and the virtual reference point.
8. The system for calibrating real and virtual views of claim 7, further comprising a camera for capturing a view of said real reference point, wherein said real reference point is displayed in said optical display and said virtual reference point is superimposed on said real reference point.
9. The system for calibrating real and virtual views of claim 8, further comprising:
a tracking camera, having a fixed position relative to said real reference point generator, for capturing a view of said calibration screen; and
a processor, wherein an optical marker configuration is fixed to said calibration screen and tracked by said tracking camera, and wherein said processor determines a positional relationship between said calibration screen and a head-mounted display from a position of said optical marker configuration in views captured by said tracking camera, said head-mounted display comprising said real reference point generator and said optical display.
10. The system for calibrating real and virtual views of claim 7, further comprising a tracking camera, connected to said real reference point generator, for capturing a view of said calibration screen.
11. The system for calibrating real and virtual views of claim 10, further comprising a processor, wherein an optical marker configuration is fixed to said calibration screen and tracked by said tracking camera, and wherein said processor determines a positional relationship between said calibration screen and a head-mounted display from a position of said optical marker configuration in views captured by said tracking camera, said head-mounted display comprising said real reference point generator and said optical display.
12. The system for calibrating real and virtual views of claim 7, further comprising: at least one tracking camera for capturing a view of said calibration screen; and a head-mounted display comprising said real reference point generator and said optical display.
13. The system for calibrating real and virtual views of claim 12, further comprising a processor, wherein an optical marker configuration is fixed to each of said calibration screen and said head-mounted display and tracked by said at least one tracking camera, and wherein said processor determines a positional relationship between said calibration screen and said head-mounted display from a position of each optical marker configuration in views of said calibration screen captured by said at least one tracking camera.
14. A method for calibrating real and virtual views, comprising:
tracking a calibration screen, wherein a real reference point generated by a real reference point generator is projected onto said calibration screen;
aligning a virtual reference point with a view of said real reference point in a display, wherein said real reference point generator and said display have a fixed relative position;
determining a point correspondence between said virtual reference point and said real reference point; and
determining one or more parameters for rendering a virtual object in a real scene,
wherein said virtual reference point is displayed on an optical see-through display through which said real reference point is visible.
15. The method for calibrating real and virtual views of claim 14, further comprising:
capturing a view of a real scene that includes said real reference point; and
displaying the view of said real scene augmented with said virtual reference point.
CNB2004800161027A 2003-06-12 2004-06-09 Calibrating real and virtual views Expired - Fee Related CN100416336C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US47786103P 2003-06-12 2003-06-12
US60/477,861 2003-06-12
US10/863,414 2004-06-08

Publications (2)

Publication Number Publication Date
CN1802586A CN1802586A (en) 2006-07-12
CN100416336C true CN100416336C (en) 2008-09-03

Family

ID=36811829

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2004800161027A Expired - Fee Related CN100416336C (en) 2003-06-12 2004-06-09 Calibrating real and virtual views

Country Status (1)

Country Link
CN (1) CN100416336C (en)

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9495386B2 (en) 2008-03-05 2016-11-15 Ebay Inc. Identification of items depicted in images
KR20100121690A (en) 2008-03-05 2010-11-18 이베이 인크. Method and apparatus for image recognition services
CN101702233B (en) * 2009-10-16 2011-10-05 电子科技大学 Three-dimension locating method based on three-point collineation marker in video frame
US9164577B2 (en) 2009-12-22 2015-10-20 Ebay Inc. Augmented reality system, method, and apparatus for displaying an item image in a contextual environment
US8576276B2 (en) * 2010-11-18 2013-11-05 Microsoft Corporation Head-mounted display device which provides surround video
CN106774880B (en) * 2010-12-22 2020-02-21 Z空间股份有限公司 Three-dimensional tracking of user control devices in space
US8570320B2 (en) * 2011-01-31 2013-10-29 Microsoft Corporation Using a three-dimensional environment model in gameplay
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US9600934B2 (en) * 2011-06-30 2017-03-21 Orange Augmented-reality range-of-motion therapy system and method of operation thereof
KR101971948B1 (en) * 2011-07-28 2019-04-24 삼성전자주식회사 Marker-less augmented reality system using plane feature and method thereof
US9449342B2 (en) 2011-10-27 2016-09-20 Ebay Inc. System and method for visualization of items in an environment using augmented reality
JP5838747B2 (en) * 2011-11-11 2016-01-06 ソニー株式会社 Information processing apparatus, information processing method, and program
US9240059B2 (en) 2011-12-29 2016-01-19 Ebay Inc. Personal augmented reality
JP5919899B2 (en) * 2012-03-08 2016-05-18 セイコーエプソン株式会社 Virtual image display device and method for adjusting position of virtual image display device
JP5966510B2 (en) * 2012-03-29 2016-08-10 ソニー株式会社 Information processing system
US10846766B2 (en) 2012-06-29 2020-11-24 Ebay Inc. Contextual menus based on image recognition
US9857470B2 (en) 2012-12-28 2018-01-02 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
CN106951074B (en) * 2013-01-23 2019-12-06 青岛海信电器股份有限公司 method and system for realizing virtual touch calibration
US9940553B2 (en) 2013-02-22 2018-04-10 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
GB201305402D0 (en) * 2013-03-25 2013-05-08 Sony Comp Entertainment Europe Head mountable display
US9239460B2 (en) * 2013-05-10 2016-01-19 Microsoft Technology Licensing, Llc Calibration of eye location
US9524580B2 (en) * 2014-01-06 2016-12-20 Oculus Vr, Llc Calibration of virtual reality systems
US10198865B2 (en) * 2014-07-10 2019-02-05 Seiko Epson Corporation HMD calibration with direct geometric modeling
CN116527860A (en) * 2015-06-15 2023-08-01 依视路国际公司 Method for calibrating a binocular display device
CN105301777B (en) * 2015-12-05 2018-06-26 中国航空工业集团公司洛阳电光设备研究所 A kind of HUD adjusting process and the device for being exclusively used in implementing this method
CN106773054A (en) * 2016-12-29 2017-05-31 北京乐动卓越科技有限公司 A kind of device and method for realizing that augmented reality is interactive
WO2018156321A1 (en) * 2017-02-27 2018-08-30 Thomson Licensing Method, system and apparatus for visual effects
CN109313811B (en) * 2017-05-18 2021-11-05 深圳配天智能技术研究院有限公司 Automatic correction method, device and system based on vibration displacement of vision system
JP2021523503A (en) * 2018-05-09 2021-09-02 ドリームスケイプ・イマーシブ・インコーポレイテッド User Selectable Tool for Optical Tracking Virtual Reality Systems
CN108830939B (en) * 2018-06-08 2022-06-10 杭州群核信息技术有限公司 Scene roaming experience method and experience system based on mixed reality
CN110874135B (en) * 2018-09-03 2021-12-21 广东虚拟现实科技有限公司 Optical distortion correction method and device, terminal equipment and storage medium
CN110874868A (en) * 2018-09-03 2020-03-10 广东虚拟现实科技有限公司 Data processing method and device, terminal equipment and storage medium
CN110160749B (en) * 2019-06-05 2022-12-06 歌尔光学科技有限公司 Calibration device and calibration method applied to augmented reality equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4281241A (en) * 1977-02-21 1981-07-28 Australasian Training Aids (Pty.) Ltd. Firing range
US4439755A (en) * 1981-06-04 1984-03-27 Farrand Optical Co., Inc. Head-up infinity display and pilot's sight
GB2259213A (en) * 1991-08-29 1993-03-03 British Aerospace Variable resolution view-tracking display
EP0827337A1 (en) * 1996-02-26 1998-03-04 Seiko Epson Corporation Wearable information displaying device and information displaying method using the same
WO2001078015A2 (en) * 2000-04-07 2001-10-18 Carnegie Mellon University Computer-aided bone distraction
US20020105484A1 (en) * 2000-09-25 2002-08-08 Nassir Navab System and method for calibrating a monocular optical see-through head-mounted display system for augmented reality


Also Published As

Publication number Publication date
CN1802586A (en) 2006-07-12

Similar Documents

Publication Publication Date Title
CN100416336C (en) Calibrating real and virtual views
US7369101B2 (en) Calibrating real and virtual views
US11928838B2 (en) Calibration system and method to align a 3D virtual scene and a 3D real world for a stereoscopic head-mounted display
CN105320271B (en) It is calibrated using the head-mounted display of direct Geometric Modeling
CN106535806B (en) The quantitative three-dimensional imaging of surgical scene from multiport visual angle
US8957948B2 (en) Geometric calibration of head-worn multi-camera eye tracking system
US7774044B2 (en) System and method for augmented reality navigation in a medical intervention procedure
US20060176242A1 (en) Augmented reality device and method
CN106456267A (en) Quantitative three-dimensional visualization of instruments in a field of view
EP1369769A2 (en) System and method for measuring the registration accuracy of an augmented reality system
CN107991775B (en) Head-mounted visual equipment capable of tracking human eyes and human eye tracking method
Figl et al. A fully automated calibration method for an optical see-through head-mounted operating microscope with variable zoom and focus
Azimi et al. Alignment of the virtual scene to the tracking space of a mixed reality head-mounted display
Livingston Vision-based tracking with dynamic structured light for video see-through augmented reality
US10992928B1 (en) Calibration system for concurrent calibration of device sensors
US11071453B2 (en) Systems and methods for reflection-based positioning relative to an eye
Hua et al. Calibration of an HMPD-based augmented reality system
TW202222271A (en) Method for real-time positioning compensation of image positioning system and image positioning system capable of real-time positioning compensation
Ferrari et al. Wearable light field optical see-through display to avoid user dependent calibrations: A feasibility study
US20240115325A1 (en) Camera tracking system for computer assisted surgery navigation
Bianchi Exploration of augmented reality technology for surgical training simulators
Hua et al. A systematic framework for on‐line calibration of a head‐mounted projection display for augmented‐reality systems
LaRose A fast, affordable system for augmented reality
Falcão Surgical Navigation using an Optical See-Through Head Mounted Display
CN117957479A (en) Compact imaging optics with distortion compensation and image sharpness enhancement using spatially positioned freeform optics

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: SIEMENS MEDICAL SYSTEMS, INC.

Free format text: FORMER OWNER: SIEMENS MEDICAL SOLUTIONS

Effective date: 20061110

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20061110

Address after: Pennsylvania, USA

Applicant after: Siemens Medical Solutions USA, Inc.

Address before: New Jersey, USA

Applicant before: Siemens Corporate Research, Inc.

C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20080903

Termination date: 20200609

CF01 Termination of patent right due to non-payment of annual fee