CN109348209B - Augmented reality display device and vision calibration method - Google Patents


Info

Publication number
CN109348209B
Authority
CN
China
Prior art keywords
camera
augmented reality
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811184383.7A
Other languages
Chinese (zh)
Other versions
CN109348209A (en)
Inventor
王耀彰
郑昱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Journey Technology Ltd
Original Assignee
Journey Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Journey Technology Ltd filed Critical Journey Technology Ltd
Priority to CN201811184383.7A
Publication of CN109348209A
Application granted
Publication of CN109348209B
Status: Active

Abstract

The invention discloses an augmented reality display device and a vision calibration method, and relates to the technical field of optics. The device includes augmented reality glasses and a camera, a display, a processor and a position sensor located in the glasses. The camera is used for acquiring a single-frame image of an actual scene at a specific moment; the processor is used for selecting a calibration point in the single-frame image; the display is used for displaying the single-frame image and the calibration point; when the displayed calibration point coincides with the corresponding point in the actual scene, the position sensor is used for acquiring the position M_cam of the calibration point in the camera and the position M_dis of the calibration point in the display; and the processor is further configured to determine, according to M_cam and M_dis, the conversion relation between the image acquired by the camera and the image displayed by the display. The invention is applicable to visual calibration in three-dimensional space, covers a wider range of application scenarios, and yields a more accurate visual calibration result.

Description

Augmented reality display device and vision calibration method
Technical Field
The invention relates to the technical field of optics, in particular to an augmented reality display device and a vision calibration method.
Background
The purpose of augmented reality displays is to achieve a visual overlay of virtual information with real information. The observer can directly watch the virtual information displayed by the display device by wearing the augmented reality display device, and simultaneously, the observer can directly observe the external environment through the display device, so that the virtual information and the real world are superposed.
The augmented reality display device needs to display the virtual information on the display device in such a way that the observer perceives the virtual information as perfectly integrated with the real space. However, since the position observed by the observer's eyes is dynamic and varies from person to person, the augmented reality display device needs visual calibration to deduce, from a position point in the real environment, the display position of the corresponding virtual information on the display device. In the prior art, a standard planar calibration plate, i.e. a flat grid plate, is usually used to perform this visual calibration.
However, the calibration plate can only perform visual calibration on the image of the plane where the calibration plate is located, and cannot be used for situations outside the plane.
Disclosure of Invention
The embodiments of the invention provide an augmented reality display device and a vision calibration method, which aim to overcome the defect that visual calibration in the prior art is only applicable to the planar case. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended to neither identify key/critical elements nor delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
According to a first aspect of embodiments of the present invention, there is provided an augmented reality display device comprising augmented reality glasses, a camera, a display, a processor and a position sensor in the augmented reality glasses, wherein,
the camera is used for acquiring a single-frame image of an actual scene at a specific moment;
the processor is used for selecting a calibration point in the single-frame image;
a display for displaying the single-frame image and the calibration point;
when the displayed calibration point coincides with the corresponding point in the actual scene, the position sensor is used for acquiring the position M_cam of the calibration point in the camera and the position M_dis of the calibration point in the display;
the processor is further configured to determine, according to M_cam and M_dis, the conversion relation between the image acquired by the camera and the image displayed by the display.
Optionally,
the position sensor is further used for acquiring the position P_cam and attitude R_cam of the camera;
the processor is further configured to determine the conversion relation according to M_cam, M_dis, P_cam and R_cam.
Optionally,
the position sensor is further used for acquiring the position P_glass and attitude R_glass of the augmented reality glasses;
the processor is further used for determining the position P_cam and attitude R_cam of the camera according to the positional relation between the augmented reality glasses and the camera.
Optionally,
the position sensor is further used for acquiring the position P_eL of the observer's left eye and the position P_eR of the right eye;
the processor is further configured to determine the conversion relation according to M_cam, M_dis, P_cam, R_cam, P_eL and P_eR.
Optionally,
the processor is further used for selecting a plurality of calibration points in the single-frame image;
and for determining the conversion relation from the M_cam and M_dis of the plurality of calibration points by means of data fitting.
According to a second aspect of the embodiments of the present invention, there is provided an augmented reality display visual calibration method, including:
acquiring a single-frame image of an actual scene at a specific moment;
selecting a calibration point in a single-frame image;
displaying the single-frame image and the calibration point;
when the displayed calibration point coincides with the corresponding point in the actual scene, acquiring the position M_cam of the calibration point in the camera and the position M_dis of the calibration point in the display;
determining, according to M_cam and M_dis, the conversion relation between the image acquired by the camera and the image displayed by the display.
Optionally, the method further includes:
acquiring the position P_cam and attitude R_cam of the camera;
wherein determining the conversion relation between the image acquired by the camera and the image displayed by the display comprises:
determining the conversion relation according to M_cam, M_dis, P_cam and R_cam.
Optionally, acquiring the position P_cam and attitude R_cam of the camera comprises:
acquiring the position P_glass and attitude R_glass of the augmented reality glasses;
determining the position P_cam and attitude R_cam of the camera according to the positional relation between the augmented reality glasses and the camera.
Optionally, the method further includes:
acquiring the position P_eL of the observer's left eye and the position P_eR of the right eye;
wherein determining the conversion relation between the image acquired by the camera and the image displayed by the display comprises:
determining the conversion relation according to M_cam, M_dis, P_cam, R_cam, P_eL and P_eR.
Optionally, the method further includes:
selecting a plurality of calibration points in the single-frame image;
determining the conversion relation from the M_cam and M_dis of the plurality of calibration points by means of data fitting.
The technical solutions disclosed in the embodiments of the invention are applicable to visual calibration in three-dimensional space, cover a wider range of application scenarios, and yield a more accurate visual calibration result.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic diagram of an augmented reality display apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a single frame image according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a display image according to an embodiment of the present invention;
FIG. 4 is a schematic illustration of another display image disclosed in embodiments of the present invention;
fig. 5 is a flowchart of an augmented reality display visual calibration method disclosed in an embodiment of the present invention.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the invention to enable those skilled in the art to practice them. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of embodiments of the invention encompasses the full ambit of the claims, as well as all available equivalents of the claims. Embodiments may be referred to herein, individually or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed. The embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the parts that are the same or similar between embodiments, reference may be made from one to another. For the structures, products and the like disclosed in the embodiments, the description is relatively brief because they correspond to the method parts disclosed in the embodiments; for the relevant details, reference may be made to the description of the method.
The embodiment of the invention discloses an augmented reality display device 10, as shown in fig. 1, comprising augmented reality glasses 101, a camera 102, a display 103, a processor 104 and a position sensor 105 which are positioned in the augmented reality glasses 101, wherein,
the camera 102 may be configured to obtain a single frame image of an actual scene at a specific time;
a processor 104 operable to select a calibration point in a single frame image;
a display 103 that can be used to display a single frame image and a calibration point;
when the displayed calibration point coincides with a corresponding point in the actual scene, the position sensor 105 may be configured to obtain the position M_cam of the calibration point in the camera 102 and the position M_dis of the calibration point in the display 103.
The processor 104 may also be configured to determine, according to M_cam and M_dis, a conversion relation between the image acquired by the camera 102 and the image displayed by the display 103.
The conversion relation F may be defined as M_dis = F(M_cam); that is, for any point M_cam in the image of the camera 102, the conversion relation F gives the coordinate M_dis on the display 103 that corresponds to that point when the observer views the actual scene through the augmented reality glasses 101. The conversion relation F may be a simple function, an implicit function, or some other form of set mapping.
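For illustration only (the patent leaves the concrete form of F open), the sketch below models F as a planar homography between camera-image coordinates and display coordinates; the helper name `apply_F` and the 3x3 matrix H are assumptions introduced here, not part of the disclosed device.

```python
import numpy as np

def apply_F(H, m_cam):
    """Map a camera-image point M_cam = (u, v) to a display point M_dis,
    assuming F is modelled as a 3x3 homography H (an illustrative choice)."""
    u, v = m_cam
    x, y, w = H @ np.array([u, v, 1.0])   # homogeneous mapping
    return np.array([x / w, y / w])        # M_dis = F(M_cam)
```

With the identity matrix as H this reduces to M_dis = M_cam; in practice H (or whatever form F actually takes) would be recovered from calibration data as described below.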
For example, a single-frame image of the actual scene at time T acquired by the camera 102 is shown in fig. 2, and the image viewed by the observer through the augmented reality glasses 101 is shown in fig. 3; the image displayed by the augmented reality glasses 101 overlaps the actual scene, and the dashed frame marks the extent of the image displayed by the augmented reality glasses 101.
In fig. 3, the displayed image and the actual scene are not aligned with each other, and therefore the position of the image displayed on the display 103 needs to be corrected.
For example, the processor 104 may select one or more calibration points from a single frame of image acquired by the camera 102, as shown in fig. 4. When the processor 104 selects multiple calibration points, the points may be arranged as a standard grid, or distinctive feature points may be chosen according to the specific content of the frame; the more calibration points there are, the longer the calibration process takes, but the more accurate the calibration result becomes. The calibration points marked in fig. 4 are selected according to the specific content of the frame.
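A minimal sketch of the grid-based selection mentioned above; the helper name and the row, column and margin parameters are assumptions for illustration, and a feature-based selector could equally well be used.

```python
import numpy as np

def grid_calibration_points(width, height, rows=4, cols=6, margin=0.1):
    """Pick calibration points on a regular grid inside a single frame
    of size (width, height) pixels, keeping a relative border `margin`."""
    us = np.linspace(margin * width, (1.0 - margin) * width, cols)
    vs = np.linspace(margin * height, (1.0 - margin) * height, rows)
    return [(float(u), float(v)) for v in vs for u in us]  # candidate M_cam points
```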
When the observer changes the observation posture, that is, changes the posture of the glasses by turning the head or body, each displayed calibration point can be made to coincide with its counterpart in the real world. At the moment of coincidence, the processor 104 records a data set including M_cam, M_dis and related quantities.
Further, the processor 104 may determine, from M_cam and M_dis, the conversion relation F between the image acquired by the camera 102 and the image displayed by the display 103, namely:
M_dis = F(M_cam)
where, for any point M_cam in the image formed by the camera 102, the conversion relation F gives the coordinate M_dis on the display 103 at which the point should appear in the virtual picture when the observer views the actual scene through the augmented reality device.
Thus, according to the conversion relation F, the display 103 can place the image to be displayed at the correct position, so that the virtual image seen by the observer coincides with the actual scene.
Optionally, the position sensor 105 may also be used to obtain the position P_cam and attitude R_cam of the camera 102.
The processor 104 may also be configured to determine the conversion relation F according to M_cam, M_dis, P_cam and R_cam.
Once the position sensor 105 has acquired the position P_cam and attitude R_cam of the camera 102, the following relation holds between M_cam and the corresponding position P_obj in the real scene:
P_obj = G(M_cam, (P_cam, R_cam))
the conversion relation G is determined by the attribute of the camera 102 and the mounting position thereof, and is a parameter inherent to the augmented reality display device 10.
When the displayed image point M_dis is required to coincide with the actual scene point P_obj, i.e. M_dis = P_obj, we have:
M_dis = G(M_cam, (P_cam, R_cam))
from the above equation, it can be obtained through further calculation:
M_dis = F(M_cam)
thereby determining the conversion relation F.
In this embodiment of the invention, by further taking into account the position P_cam and attitude R_cam of the camera 102 and the intrinsic conversion relation G of the augmented reality display device 10, the conversion relation F can be determined more accurately.
Optionally, the position sensor 105 is further configured to obtain the position P_glass and attitude R_glass of the augmented reality glasses 101.
The processor 104 is further configured to determine the position P_cam and attitude R_cam of the camera 102 according to the positional relation between the augmented reality glasses and the camera.
In general, in the augmented reality display device 10, the positional relation between the augmented reality glasses 101 and the camera 102 is fixed, i.e. it can be determined once the augmented reality display device 10 has been assembled. Therefore:
(P_cam, R_cam) = T_GC(P_glass, R_glass)
where the positional relation T_GC is an intrinsic parameter of the augmented reality display device 10.
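Since T_GC is a fixed rigid transform measured at assembly time, deriving (P_cam, R_cam) from (P_glass, R_glass) amounts to a simple pose composition. The sketch below assumes R_glass maps the glasses frame to the world frame and that (R_gc, t_gc) is the camera pose expressed in the glasses frame; these conventions are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def T_GC(P_glass, R_glass, R_gc, t_gc):
    """Illustrative T_GC: compose the glasses pose with the fixed
    glasses-to-camera offset to obtain the camera pose in the world frame."""
    R_cam = R_glass @ R_gc               # compose rotations
    P_cam = R_glass @ t_gc + P_glass     # express the camera offset in the world frame
    return P_cam, R_cam
```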
This embodiment of the invention thus provides an alternative way of obtaining the position P_cam and attitude R_cam of the camera 102: in a concrete implementation, a person skilled in the art may either obtain P_cam and R_cam directly, or obtain P_glass and R_glass and then derive P_cam and R_cam from the device-intrinsic parameter T_GC.
Optionally, the position sensor 105 may also be used to obtain the position P_eL of the observer's left eye and the position P_eR of the right eye.
The processor 104 may also be configured to determine the conversion relation F according to M_cam, M_dis, P_cam, R_cam, P_eL and P_eR.
The image position P_vir seen by the observer depends on the positions P_eL and P_eR of the observer's eyes relative to the augmented reality display device 10 and on the optical properties of the augmented reality glasses 101:
P_vir = H(M_dis, P_eL, P_eR)
the conversion relation H is determined by the attributes of the augmented reality glasses 101 and is a parameter unique to the augmented reality display device 10.
When the displayed image point M_dis is required to coincide with the actual scene point P_obj, i.e. P_vir = P_obj, we have:
H(M_dis, P_eL, P_eR) = G(M_cam, (P_cam, R_cam))
from the above equation, it can be obtained through further calculation:
M_dis = F(M_cam)
thereby determining the conversion relation F.
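Numerically, the equation H(M_dis, P_eL, P_eR) = G(M_cam, (P_cam, R_cam)) can be solved per point for M_dis, and repeating this over many M_cam traces out F. The sketch below treats H as an opaque callable and uses a generic least-squares solver; the function names and the choice of solver are assumptions, since the patent does not prescribe how the "further calculation" is carried out.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_M_dis(P_obj, P_eL, P_eR, H_func, m0=(0.0, 0.0)):
    """Find the display coordinate M_dis at which the virtual point seen by
    the eyes, H(M_dis, P_eL, P_eR), lands on the real point P_obj from G."""
    def residual(m_dis):
        # want P_vir == P_obj
        return np.asarray(H_func(m_dis, P_eL, P_eR)) - np.asarray(P_obj)
    return least_squares(residual, m0).x  # best-fit M_dis
```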
In this embodiment of the invention, by further taking into account the eye positions P_eL and P_eR of the observer and the intrinsic conversion relation H of the augmented reality display device 10, the conversion relation F can be determined more accurately and more easily. When the observer performs visual calibration of the augmented reality display device 10, neither complicated manual calibration nor an auxiliary calibration plate is needed, which makes the operation more convenient.
Optionally, the processor 104 may be further configured to select a plurality of calibration points in the single frame image;
m from multiple index pointscamAnd MdisAnd determining the conversion relation through a data fitting mode.
Generally, a plurality of calibration points are selected for calibration, so that the accuracy of visual calibration can be improved, a more accurate conversion relation F is obtained, and the user experience in the actual use process is improved.
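A minimal sketch of this data-fitting step, assuming an affine approximation of F recovered by linear least squares from the matched (M_cam, M_dis) pairs; the affine form and the helper name are assumptions, as the patent only requires that F be determined by some form of data fitting.

```python
import numpy as np

def fit_F(M_cam_pts, M_dis_pts):
    """Fit an affine approximation of F from N matched points (N x 2 arrays)
    and return it as a callable M_dis = F(M_cam)."""
    A = np.hstack([np.asarray(M_cam_pts, float),
                   np.ones((len(M_cam_pts), 1))])               # rows are [u, v, 1]
    X, *_ = np.linalg.lstsq(A, np.asarray(M_dis_pts, float), rcond=None)
    return lambda m: np.append(np.asarray(m, float), 1.0) @ X   # 3-vector @ (3x2) params
```

For instance, F = fit_F(cam_points, dis_points) followed by F((320, 240)) returns the display coordinate predicted for camera pixel (320, 240) under this assumed model.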
The embodiment of the invention discloses an augmented reality display visual calibration method, as shown in FIG. 5, comprising the following steps:
S501, acquiring a single-frame image of an actual scene at a specific moment;
S502, selecting a calibration point in the single-frame image;
S503, displaying the single-frame image and the calibration point;
S504, when the displayed calibration point coincides with the corresponding point in the actual scene, acquiring the position M_cam of the calibration point in the camera and the position M_dis of the calibration point in the display;
S505, determining, according to M_cam and M_dis, the conversion relation between the image acquired by the camera and the image displayed by the display.
In S501, the specific time may be a time selected by the observer, may be a time automatically determined according to a plurality of rules, or may be any time, which is not limited in the embodiment of the present invention.
Illustratively, as before, let the conversion relation be F, i.e. M_dis = F(M_cam); reference may be made to the above definition, which is not repeated here.
When the observer changes the posture of the glasses, each calibration point in the single-frame picture can be made to coincide with its counterpart in the real world. Data are recorded at the moment of coincidence, including M_cam, M_dis and related quantities.
Further, the conversion relation F between the image acquired by the camera and the image displayed by the display may be determined from M_cam and M_dis, for example as sketched below.
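Putting S501 through S505 together, a calibration routine could be structured as sketched below. Every interface name (capture, show, wait_for_coincidence, point_in_camera, point_in_display) is a hypothetical placeholder for the device's actual API; the flow, not the names, mirrors the method.

```python
def calibrate(camera, display, sensor, select_points, fit_F):
    """Sketch of the visual-calibration flow S501-S505 (hypothetical interfaces)."""
    frame = camera.capture()                       # S501: single frame of the actual scene
    points = select_points(frame)                  # S502: choose calibration points
    display.show(frame, points)                    # S503: display the frame and the points
    pairs = []
    for p in points:                               # S504: observer aligns each point
        sensor.wait_for_coincidence(p)
        pairs.append((sensor.point_in_camera(p), sensor.point_in_display(p)))
    M_cam, M_dis = zip(*pairs)
    return fit_F(M_cam, M_dis)                     # S505: determine F from the recorded data
```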
Optionally, before S504, the method may further include:
S506, acquiring the position P_cam and attitude R_cam of the camera;
S505 may include:
S5051, determining the conversion relation F between the image acquired by the camera and the image displayed by the display comprises:
determining the conversion relation F according to M_cam, M_dis, P_cam and R_cam.
In S5051, the conversion relationship F may be determined according to the following equation:
M_dis = G(M_cam, (P_cam, R_cam))
the meanings of the parameters in the formula are described above, and are not repeated here.
Optionally, S506 may further include:
S5061, acquiring the position P_glass and attitude R_glass of the augmented reality glasses;
S5062, determining the position P_cam and attitude R_cam of the camera according to the positional relation between the augmented reality glasses and the camera.
Generally, the relative positional relation between the augmented reality glasses and the camera is fixed. Based on this, after the position P_glass and attitude R_glass of the augmented reality glasses have been obtained, the position P_cam and attitude R_cam of the camera can be determined from that relative positional relation.
Optionally, before S504, the method may further include:
S507, acquiring the position P_eL of the observer's left eye and the position P_eR of the right eye;
S505 may include:
S5052, determining the conversion relation F between the image acquired by the camera and the image displayed by the display comprises:
determining the conversion relation F according to M_cam, M_dis, P_cam, R_cam, P_eL and P_eR.
In S5052, the conversion relationship F may be determined according to the following equation:
H(M_dis, P_eL, P_eR) = G(M_cam, (P_cam, R_cam))
the meanings of the parameters in the formula are described above, and are not repeated here.
Optionally, S502 may further include:
S5021, selecting a plurality of calibration points in the single-frame image;
S505 may further include:
S5053, determining the conversion relation F from the M_cam and M_dis of the plurality of calibration points by means of data fitting.
From multiple groups of M_cam and M_dis data, the conversion relation F may be determined by fitting calculations or similar methods.
Generally, the accuracy of the fitting result depends on the dispersion and the number of the calibration points: the more dispersed and the more numerous the calibration points, and the more calibration points there are at different depths, the higher the accuracy.
The technical solutions disclosed in the embodiments of the invention are applicable to visual calibration in three-dimensional space, cover a wider range of application scenarios, and yield a more accurate visual calibration result.
It is to be understood that the present invention is not limited to the procedures and structures described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. An augmented reality display device comprising augmented reality glasses, a camera, a display, a processor and a position sensor located in the augmented reality glasses, wherein,
the camera is used for acquiring a single-frame image of an actual scene at a specific moment;
the processor is used for selecting a calibration point in the single-frame image;
the display is used for displaying the single-frame image and the calibration point;
when the displayed calibration point coincides with the corresponding point in the actual scene, the position sensor is used for acquiring the position M_cam of the calibration point in the image acquired by the camera and the position M_dis of the calibration point in the image displayed on the display;
the processor is further configured to determine, according to the M_cam and M_dis, the conversion relation between the image acquired by the camera and the image displayed by the display.
2. The augmented reality display device of claim 1,
the position sensor is also used for acquiring the position P_cam and attitude R_cam of the camera;
the processor is further configured to determine the conversion relation according to the M_cam, M_dis, P_cam and R_cam.
3. The augmented reality display device of claim 2,
the position sensor is also used for acquiring the position P_glass and attitude R_glass of the augmented reality glasses;
the processor is further configured to determine the position P_cam and attitude R_cam of the camera according to the positional relation between the augmented reality glasses and the camera.
4. The augmented reality display device of claim 3,
the position sensor is also used for acquiring the position P_eL of the observer's left eye and the position P_eR of the right eye;
the processor is further configured to determine the conversion relation according to the M_cam, M_dis, P_cam, R_cam, P_eL and P_eR.
5. The augmented reality display device of any one of claims 1-4,
the processor is further configured to select a plurality of calibration points in the single frame image;
m according to the plurality of index pointscamAnd MdisAnd determining the conversion relation through a data fitting mode.
6. An augmented reality display visual calibration method, comprising:
acquiring a single-frame image of an actual scene at a specific moment by using a camera;
selecting a calibration point in the single-frame image;
displaying the single-frame image and the calibration point with a display;
when the displayed calibration point coincides with the corresponding point in the actual scene, acquiring the position M_cam of the calibration point in the image acquired by the camera and the position M_dis of the calibration point in the image displayed on the display;
determining, according to the M_cam and M_dis, a conversion relation between the image acquired by the camera and the image displayed by the display.
7. The method of claim 6, further comprising:
acquiring the position P_cam and attitude R_cam of the camera;
wherein determining the conversion relation between the image acquired by the camera and the image displayed by the display comprises:
determining the conversion relation according to the M_cam, M_dis, P_cam and R_cam.
8. The method according to claim 7, wherein acquiring the position P_cam and attitude R_cam of the camera comprises:
acquiring the position P_glass and attitude R_glass of the augmented reality glasses;
determining the position P_cam and attitude R_cam of the camera according to the positional relation between the augmented reality glasses and the camera.
9. The method of claim 8, further comprising:
acquiring the position P_eL of the observer's left eye and the position P_eR of the right eye;
wherein determining the conversion relation between the image acquired by the camera and the image displayed by the display comprises:
determining the conversion relation according to the M_cam, M_dis, P_cam, R_cam, P_eL and P_eR.
10. The method of any of claims 6-9, further comprising
Selecting a plurality of calibration points in the single-frame image;
m according to the plurality of index pointscamAnd MdisAnd determining the conversion relation through a data fitting mode.
CN201811184383.7A 2018-10-11 2018-10-11 Augmented reality display device and vision calibration method Active CN109348209B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811184383.7A CN109348209B (en) 2018-10-11 2018-10-11 Augmented reality display device and vision calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811184383.7A CN109348209B (en) 2018-10-11 2018-10-11 Augmented reality display device and vision calibration method

Publications (2)

Publication Number Publication Date
CN109348209A CN109348209A (en) 2019-02-15
CN109348209B (en) 2021-03-16

Family

ID=65309533

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811184383.7A Active CN109348209B (en) 2018-10-11 2018-10-11 Augmented reality display device and vision calibration method

Country Status (1)

Country Link
CN (1) CN109348209B (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9508146B2 (en) * 2012-10-31 2016-11-29 The Boeing Company Automated frame of reference calibration for augmented reality
CN103226817A (en) * 2013-04-12 2013-07-31 武汉大学 Superficial venous image augmented reality method and device based on perspective projection
JP6225538B2 (en) * 2013-07-24 2017-11-08 富士通株式会社 Information processing apparatus, system, information providing method, and information providing program
JP6299234B2 (en) * 2014-01-23 2018-03-28 富士通株式会社 Display control method, information processing apparatus, and display control program
CN106791784B (en) * 2016-12-26 2019-06-25 深圳增强现实技术有限公司 A kind of the augmented reality display methods and device of actual situation coincidence
JP6426772B2 (en) * 2017-02-07 2018-11-21 ファナック株式会社 Coordinate information conversion apparatus and coordinate information conversion program
CN108227929B (en) * 2018-01-15 2020-12-11 廖卫东 Augmented reality lofting system based on BIM technology and implementation method

Also Published As

Publication number Publication date
CN109348209A (en) 2019-02-15

Similar Documents

Publication Publication Date Title
KR101761751B1 (en) Hmd calibration with direct geometric modeling
JP6609929B2 (en) Depth-parallax calibration of binocular optical augmented reality system
JP4137078B2 (en) Mixed reality information generating apparatus and method
CN104205175B (en) Information processor, information processing system and information processing method
CN103815866B (en) Visual performance inspection method and visual performance inspection control device
KR101816041B1 (en) See-through smart glasses and see-through method thereof
JP5560858B2 (en) Correction value calculation apparatus, correction value calculation method, and correction value calculation program
US10623721B2 (en) Methods and systems for multiple access to a single hardware data stream
WO2016115872A1 (en) Binocular ar head-mounted display device and information display method thereof
CN106461983B (en) Method for determining at least one parameter of personal visual behaviour
US20240046432A1 (en) Compensation for deformation in head mounted display systems
EP2919093A1 (en) Method, system, and computer for identifying object in augmented reality
CN110708384B (en) Interaction method, system and storage medium of AR-based remote assistance system
US10235806B2 (en) Depth and chroma information based coalescence of real world and virtual world images
WO2015181827A1 (en) Method and system for image georegistration
CN111007939B (en) Virtual reality system space positioning method based on depth perception
CN111033573B (en) Information processing apparatus, information processing system, image processing method, and storage medium
JP2020526735A (en) Pupil distance measurement method, wearable eye device and storage medium
JP2017213191A (en) Sight line detection device, sight line detection method and sight line detection program
US9918066B2 (en) Methods and systems for producing a magnified 3D image
Moser et al. Evaluation of user-centric optical see-through head-mounted display calibration using a leap motion controller
JP2007034628A (en) Method and system for image processing
JP2006329684A (en) Image measuring instrument and method
JP2019045989A (en) Information processing apparatus, information processing method and computer program
CN111291746A (en) Image processing system and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant