WO2021148993A2 - Wearable device for the use of augmented reality
Wearable device for the use of augmented reality
- Publication number
- WO2021148993A2 (PCT/IB2021/050483)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- projection
- centre
- respect
- virtual image
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0129—Head-up displays characterised by optical features comprising devices for correcting parallax
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
Abstract
A system (10) for the fruition of augmented reality by a user comprising a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising a frame (110) arranged to allow the user to wear the wearable device (100), said frame (110) defining a centre of projection Cp. The system (10) also comprises at least one see-through display (120), at least partially transparent, comprising a micro display (121) arranged to emit the virtual image, an ocular lens (122) having a focal distance ƒ1, and an optical combiner (123) arranged to project the virtual image emitted by the micro display (121) in front of at least one eye of the user, said virtual image being focused by the user in a plane Πv at a distance V with respect to the centre of projection Cp. The system (10) then comprises a tracking device arranged to carry out a localization of the characteristic points pij and of the wearable device (100), said characteristic points pij arranged on characteristic planes Πi located at distances di with respect to the centre of projection Cp. The system also comprises a control unit arranged to receive the localization of the characteristic points pij and of the wearable device (100) and to operate said or each see-through display (120) for consistently overlapping the virtual image to the image of the real scenario.
Description
TITLE
WEARABLE DEVICE FOR THE USE OF AUGMENTED REALITY
DESCRIPTION
Field of the invention
The present invention relates to the field of augmented reality.
In particular, the invention relates to a wearable "optical see-through" viewer that allows an improved overlap of virtual content on a real scenario.
Description of the prior art
As is well known, optical see-through (OST) wearable systems allow the user to observe the world with their own eyes through a semi-transparent display on which virtual images are reproduced. In particular, the view is augmented by reproducing the virtual content on a two-dimensional micro display and projecting it, by means of a beam combiner, for example an appropriate optical guide, onto a semi-transparent projection surface at a comfortable sight distance (Holland and Fuchs 2000, Benton 2001).
A limitation of these viewers is the intrinsic difficulty of providing perfect alignment between the real-world view and the virtual images projected on the semi-transparent display.
In particular, these systems are very sensitive to changes in focus by the user's eye and to changes in the relative position between the eye and the viewer.
The first aspect implies that, to date, these systems are effective only if the virtual contents consist of simplified graphics, indications and/or text, which require neither a precise location in the real environment nor a focus consistent with it.
The second aspect instead involves the so-called parallax error: when the relative position between the eye and the viewer varies, the virtual image undergoes a translation that differs from that of the real image.
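To quantify this error (a back-of-the-envelope estimate, not taken from the patent text): for a small lateral displacement Δ of the eye, a point at distance d shifts apparently by about Δ/d radians, so a virtual point rendered in a plane at distance V and the real point at distance d that it should overlap drift apart by approximately

$$\varepsilon \approx \Delta\left(\frac{1}{V}-\frac{1}{d}\right),$$

which vanishes only when the virtual image plane coincides with the plane of the real point. This is the geometric fact the embodiments described below exploit.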
A first solution consists in tracking the eye ("eye-tracking") through the use of a camera pointed towards it. However, these systems usually require complex calibrations and adaptations to the user's ocular geometry, making this solution impractical for non-expert users.
Document US7809160B2 tries to solve this problem, proposing an apparatus and a method for detecting the gaze of the eyes that do not require camera calibration and specific measurements of the user's face geometry, thanks to an auto-detection system of the distance between the eyes.
However, even this system, although more effective, requires a dedicated camera inside the viewer, with a consequent increase in production cost and in the complexity of the device.
Furthermore, eye tracking determines the centre of rotation of the eye, which does not coincide with the centre of projection, so the aforementioned parallax error is only partially resolved.
Furthermore, the correct overlap of the virtual image on the image of the real scenario is today obtained by calibrating, either manually or by means of complex instrumentation and methods, the projective parameters of a rendering algorithm with respect to a specific position of the eye; these parameters are subsequently updated on the basis of the displacement of the eye with respect to said position.
Summary of the invention
It is therefore an object of the present invention to provide an "optical see-through" wearable device for the use of augmented reality that allows virtual contents to be overlaid while maintaining consistency with respect to the real-world view, without being affected by changes in position and focus of the user's eye.
It is also an object of the present invention to provide such a device that does not require the presence of eye-tracking devices or device calibrations to adapt it to the geometry of the user's face and to the relative pose between the face and the device.
These and other objects are achieved by a system for the fruition of augmented reality according to claims from 1 to 11 and 13.
According to another aspect of the invention, a method for calibrating a system for the fruition of augmented reality according to claim 12 is also claimed.
Brief description of the drawings
Further characteristics and/or advantages of the present invention will become clearer from the following description of an exemplary embodiment thereof, given by way of example and not of limitation, with reference to the attached drawings, in which:
— Fig. 1 shows a system for the fruition of augmented reality according to the present invention;
— Fig. 2 schematically shows the operating principle of the wearable device.
Description of a preferred exemplary embodiment
With reference to Fig. 1, the system 10 for the fruition of augmented reality by a user comprises a wearable device 100 comprising a frame 110 and at least one see-through display 120.
In particular, with reference also to Fig. 2, the wearable device 100 is adapted to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij.
The frame 110 allows the user to wear the wearable device 100 and defines a centre of projection Cp as the ideal position of the nodal point of the user's eye for a consistent overlap between the virtual image and the image of the real scenario.
The purpose of the present invention is to allow the correct overlap between the virtual image and the image of the real scenario even when, due to movements of the frame with respect to the user's eye, the nodal point of the eye is not perfectly superimposed on the centre of projection Cp.
The see-through display 120, at least partially transparent, comprises a micro display 121 arranged to emit the virtual image, an ocular lens 122 having a focal distance ƒ1, and an optical combiner 123 arranged to project the virtual image emitted by the micro display 121 in front of the eye of the user. In particular, the virtual image is focused by the user in a plane Πv at a distance V with respect to the centre of projection Cp .
The system 10 then comprises a tracking device and a control unit, not shown in the figure.
In particular, the tracking device carries out a localization of the characteristic points pij and of the wearable device 100 with respect to the reference system S, the characteristic points pij being arranged on characteristic planes Πi located at distances di with respect to the centre of projection Cp. The control unit is instead arranged to receive, from the tracking device, the localization of the characteristic points pij and of the wearable device 100 and to operate the see-through display 120 for consistently overlapping the virtual image to the image of the real scenario at the centre of projection Cp.
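As an illustration of the control unit's task, the following is a minimal sketch in Python of rendering anchored at Cp; the pinhole model, the function name and the parameters are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def project_to_display(points_S, T_frame_from_S, cp_in_frame, f_px, c_px):
    """Project characteristic points p_ij onto the see-through display,
    modelling the display as an ideal pinhole located at the centre of
    projection Cp (illustrative sketch, not the patent's implementation).

    points_S       : (N, 3) points localized by the tracker in system S
    T_frame_from_S : (4, 4) rigid transform from S to the frame 110
    cp_in_frame    : (3,)   position of Cp in the frame's coordinates
    f_px, c_px     : focal length (pixels) and (2,) principal point
    """
    # Bring the tracked points into the frame's coordinate system
    pts_h = np.hstack([points_S, np.ones((len(points_S), 1))])
    pts_frame = (T_frame_from_S @ pts_h.T).T[:, :3]

    # Express them relative to the centre of projection Cp
    pts_cp = pts_frame - cp_in_frame

    # Pinhole projection: drawing virtual content at these pixels makes it
    # overlap the real points when the eye's nodal point sits at Cp
    uv = f_px * pts_cp[:, :2] / pts_cp[:, 2:3] + c_px
    return uv
```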
In particular, the wearable device 100 also comprises at least one positive lens 130 having a scale factor A and a focal distance ƒ2. Such positive lens 130 is arranged, on the side opposite to the eyes of the user, in a plane ΠR at a distance R with respect to the centre of projection Cp.
This way, each characteristic point pij located in a characteristic plane Πi at a given distance with respect to the centre of projection Cp is focused in the plane Πv together with the virtual image, whatever the position of the eye of the user with respect to the centre of projection Cp.
This way, once the distance V has been defined, possibly also as an infinite value, and once the desired working distance has been defined, i.e. the distance from the centre of projection Cp of the characteristic plane on which lie the characteristic points pij of the objects with which the virtual image is to be made consistent, it is possible to define the focal distance ƒ2 in such a way that coherence is obtained between the virtual image and the objects at the working distance, even if the nodal point of the eye is displaced from the centre of projection Cp.
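The exact relation between ƒ2, V, R and the working distance is not legible in this text. A plausible reconstruction, assuming a standard thin-lens model (an assumption of this note, not a formula stated by the patent), is the following: a positive lens of focal distance ƒ2 placed at a distance R from Cp images a real point at distance d̄ from Cp into the virtual image plane Πv when

$$\frac{1}{\bar{d}-R}-\frac{1}{V-R}=\frac{1}{f_2},\qquad\text{i.e.}\qquad \bar{d}=R+\frac{f_2\,(V-R)}{f_2+V-R}.$$

For V tending to infinity this reduces to d̄ = R + ƒ2, consistent with the arrangement described below in which points at a distance ƒ2 from the positive lens are focused at infinity.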
Vice-versa, having defined the distance V and fixed the focal distance ƒ2, the virtual image is coherent with respect to the characteristic points pij arranged in the characteristic plane located at the corresponding distance with respect to the centre of projection Cp, even if the nodal point of the eye is displaced from the centre of projection Cp.
In the first case, it is possible to suitably vary the focal distance ƒ2 so as to obtain a coherent virtual image at the desired distance d. This can be done by choosing a positive lens 130 with a suitable focal distance ƒ2, or by using a lens 130 having a variable focal distance ƒ2, such as a liquid glycerol-based lens.
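Continuing with the thin-lens reconstruction sketched above (still an assumption, not the patent's stated formula), the focal distance to set on such a variable lens for a desired working distance d could be computed as follows:

```python
def required_focal_distance(d, V, R):
    """Focal distance f2 (same units as d, V, R, all measured from Cp)
    that conjugates the working plane at distance d with the virtual
    image plane at distance V, per the hedged thin-lens relation above.
    Pass V = float('inf') for a virtual image focused at infinity."""
    if V == float('inf'):
        return d - R                      # working plane at the lens focal plane
    return (d - R) * (V - R) / (V - d)    # inverts 1/(d-R) - 1/(V-R) = 1/f2
```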
In particular, the ocular lens 122 can be arranged substantially at a distance equal to its focal distance ƒ1 with respect to the micro display 121, in such a way that the user focuses the virtual image at infinity and that the characteristic points pij of the real scenario located at a distance equal to the focal distance ƒ2 with respect to the positive lens 130 are also focused at infinity with respect to the user.
In an alternative embodiment of the invention, the wearable device 100 does not comprise the positive lens 130 and the control unit is arranged to:
— receive a value of a desired working distance, i.e. of a distance from the centre of projection Cp of a characteristic plane on which lie the characteristic points pij with which the virtual image is to be made consistent;
— modify the distance V of the plane Πv with respect to the centre of projection Cp in such a way that the virtual image is brought to the working distance, i.e. d = V.
The invention also provides a calibration step of the system 10 for the fruition of augmented reality arranged to identify the position of the centre of projection Cp with respect to the reference system S.
In particular, the method comprises the steps of:
— arranging a calibration camera in order to have a centre of projection Cc known with respect to the reference system S integral to the frame 110, said calibration camera being pointed towards the see-through display 120;
— projecting by the see-through display 120 an image comprising a plurality of characteristic calibration points;
— acquiring by means of the calibration camera the image projected by the see-through display 120 and identifying in the acquired image the characteristic calibration points;
— comparing the image acquired by the calibration camera with the image projected by the display 120 for determining a transformation matrix which transforms the plurality of characteristic points of the projected image into the plurality of characteristic points identified in the acquired image;
— starting from the transformation matrix, computing a three-dimensional translation vector arranged to define a spatial distance between the centre of projection Cc and the centre of projection Cp, with subsequent spatial identification of the centre of projection Cp with respect to the reference system S.
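A minimal sketch of these steps with OpenCV is given below, under the simplifying assumptions that the calibration points lie on a single plane at a known distance and that the first homography decomposition solution is the physical one; all names and parameters here are illustrative, not the patent's method:

```python
import numpy as np
import cv2

def estimate_cp(projected_pts, acquired_pts, K, plane_dist, Cc):
    """Estimate the centre of projection Cp from a calibration camera with
    known centre Cc (illustrative sketch under the assumptions above).

    projected_pts : (N, 2) float32 calibration points as drawn by the display
    acquired_pts  : (N, 2) float32 same points detected in the camera image
    K             : (3, 3) intrinsic matrix of the calibration camera
    plane_dist    : assumed distance of the calibration-point plane (metres)
    Cc            : (3,)  known centre of projection of the camera in S
    """
    # Transformation matrix mapping projected points onto acquired points
    H, _ = cv2.findHomography(projected_pts, acquired_pts, cv2.RANSAC)

    # Decompose the homography into candidate (R, t) between the two views;
    # selecting the physical solution is omitted here for brevity
    _, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    t = translations[0].ravel() * plane_dist  # t is returned scaled by plane depth

    # Translating the known camera centre by the recovered offset locates Cp
    return np.asarray(Cc, dtype=float) + t
```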
In substance, the comparison between the image as it is projected by the display 120 and as it is acquired by the calibration camera, placed in a different position with respect to the projector of the display 120, allows the change of observation point between the two images to be identified, allowing the centre of projection Cp to be calculated and the virtual image to be consistently overlapped to the image of the real scenario at a distance d = V, even if the nodal point of the eye is displaced from the centre of projection Cp.
In the prior art such calibration is carried out manually, requiring an expert operator to perform an appropriate calibration on each device and for each user. The present invention instead allows a totally automatic calibration by the system.
The foregoing description of some specific exemplary embodiments will so fully reveal the invention from the conceptual point of view that others, by applying current knowledge, will be able to modify and/or adapt the specific exemplary embodiments to various applications without further research and without departing from the invention; accordingly, such adaptations and modifications will have to be considered as equivalent to the specific embodiments. The means and the materials to realise the different functions described herein could have a different nature without, for this reason, departing from the field of the invention. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation.
Claims
1. A system (10) for the fruition of augmented reality by a user comprising:
— a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising:
— a frame (110) arranged to allow said user to wear said wearable device (100), said frame (110) defining a centre of projection Cp as ideal positioning of the nodal point of the eye of said user for having a consistent overlapping between said virtual image and said image of the real scenario, said centre of projection Cp having a known position with respect to a reference system S integral to said frame (110);
— at least one see-through display (120), at least partially transparent, comprising:
— a micro display (121) arranged to emit said virtual image;
— an ocular lens (122) having a focal distance ƒ1;
— an optical combiner (123) arranged to project said virtual image emitted by said micro display (121) in front of at least one eye of said user, said virtual image being focused by said user in a plane Πv at a distance V with respect to said centre of projection Cp;
— a tracking device arranged to carry out a localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S, said characteristic points pij arranged on characteristic planes Πi located at distances di with respect to said centre of projection Cp;
— a control unit arranged to receive, from said tracking device, said localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S and to operate said or each see-through display (120) for consistently overlapping said virtual image to said image of said real scenario at said centre of projection Cp;
said system (10) characterized in that said wearable device (100) also comprises at least one positive lens (130) having a scale factor A and a focal distance ƒ2 and arranged, on the opposite side with respect to said eyes of said user, in a plane ΠR at a distance R with respect to said centre of projection Cp, each characteristic point pij located in a characteristic plane Πi at a given distance with respect to said centre of projection Cp being focused on said plane Πv together with said virtual image whatever is the position of the eye of said user with respect to the centre of projection Cp.
2. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said ocular lens (122) is arranged substantially at a distance equal to said focal distance ƒ1 with respect to said micro display (121), in such a way that said user focuses said virtual image at infinity and that said characteristic points pij of said real scenario located at a distance equal to said focal distance ƒ2 with respect to said positive lens (130) are also focused at infinity with respect to said user.
3. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said tracking device comprises at least one external camera (115).
4. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said tracking device carries out said localization of said characteristic points pij by means of a technology selected from the group consisting of:
— optical tracking by monoscopic video camera;
— optical tracking by multiscopic video camera;
— optical tracking by mono- or multiscopic video camera with projection of light patterns;
— electromagnetic tracking;
— inertial tracking;
— time-of-flight tracking;
— a combination of the previous.
5. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said positive lens (130) has a suitably variable focal distance ƒ2.
6. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said positive lens (130) is a liquid lens based on glycerol.
7. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein an adjustment means is provided arranged to change the relative position between said positive lens (130) and said centre of projection Cp.
8. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said control unit is adapted to carry out a preliminary calibration comprising a step of compensating the scale factor A .
9. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said control unit is adapted to define a maximum error of overlapping of said virtual image to said image of said real scenario for different characteristic points pij.
10. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said control unit is adapted to carry out a manual calibration step that, by means of said tracking device, allows the position of an eye of said user with respect to said centre of projection Cp to be identified.
11. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein an eye-tracking system is also provided arranged to identify the movements of the eye of said user with respect to said frame (110).
12. A method for calibrating a system (10) for the fruition of augmented reality by a user, said system (10) comprising a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising:
— a frame (110) arranged to allow said user to wear said wearable device (100);
— at least one see-through display (120), at least partially transparent, comprising:
— a micro display (121) arranged to emit said virtual image;
— an ocular lens (122) having a focal distance ƒ1,
— an optical combiner (123) arranged to project said virtual image emitted by said micro display (121) in front of at least one eye of said user; said method arranged to determine the position of a centre of projection Cp with respect to a reference system S integral to said frame (110), said centre of projection Cp defined as ideal positioning of the nodal point of the eye of said user for having a consistent overlapping between said virtual image and said image of the real scenario, said virtual image being focused by said user in a plane Πv at a distance V with respect to said centre of projection Cp, said method comprising the steps of:
— arranging a calibration camera, in order to have a centre of projection Cc known with respect to said reference system S integral to said frame (110), said calibration camera being pointed towards said see-through display (120);
— projecting by said see-through display (120) an image comprising a plurality of characteristic calibration points;
— acquiring by said calibration camera said image projected by said see-through display (120) and identifying in said acquired image said characteristic calibration points;
— comparing said image acquired by said calibration camera with said image projected by said see-through display (120) for determining a transformation matrix which transforms said plurality of characteristic points of said projected image into said plurality of characteristic points identified in said acquired image;
— starting from said transformation matrix, computing a three-dimensional translation vector arranged to define a spatial distance between said centre of projection Cc and said centre of projection Cp, with subsequent spatial identification of said centre of projection Cp with respect to said reference system S.
13 . A system (10) for the fruition of augmented reality by a user comprising:
— a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising:
— a frame (110) arranged to allow said user to wear said wearable device (100), said frame (110)
defining a centre of projection Cp as ideal positioning of the nodal point of the eye of said user for having a consistent overlapping between said virtual image and said image of the real scenario, said centre of projection Cp having a known position with respect to a reference system S integral to said frame (110);
— at least one see-through display (120), at least partially transparent, comprising:
— a micro display (121) arranged to emit said virtual image;
— an ocular lens (122) having a focal distance ƒ1,
— an optical combiner (123) arranged to project said virtual image emitted by said micro display (121) in front of at least one eye of said user, said virtual image being focused by said user in a plane Πv at a distance V with respect to said centre of projection Cp;
— a tracking device arranged to carry out a localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S, said characteristic points pij arranged on characteristic planes Πi located at distances di with respect to said centre of
projection Cp;
— a control unit arranged to receive, from said tracking device, said localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S and to operate said or each see-through display (120) for consistently overlapping said virtual image to said image of said real scenario at said centre of projection Cp; said system (10) characterized in that said control unit is adapted to:
— receive a value of a desired working distance, i.e. of a distance from the centre of projection Cp of a characteristic plane on which lie the characteristic points pij with which the virtual image is to be made consistent;
— modify the distance V of said plane Πv with respect to said centre of projection Cp in such a way that said distance V equals said desired working distance.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IT102020000001246 | 2020-01-22 | ||
IT102020000001246A IT202000001246A1 (en) | 2020-01-22 | 2020-01-22 | Improved system for the use of augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2021148993A2 true WO2021148993A2 (en) | 2021-07-29 |
WO2021148993A3 WO2021148993A3 (en) | 2022-02-10 |
Family
ID=70295871
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2021/050483 WO2021148993A2 (en) | 2020-01-22 | 2021-01-22 | Wearable device for the use of augmented reality |
Country Status (2)
Country | Link |
---|---|
IT (1) | IT202000001246A1 (en) |
WO (1) | WO2021148993A2 (en) |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7809160B2 (en) | 2003-11-14 | 2010-10-05 | Queen's University At Kingston | Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections |
JP5226528B2 (en) * | 2005-11-21 | 2013-07-03 | マイクロビジョン,インク. | Display having an image guiding substrate |
US9292973B2 (en) * | 2010-11-08 | 2016-03-22 | Microsoft Technology Licensing, Llc | Automatic variable virtual focus for augmented reality displays |
CN102402005B (en) * | 2011-12-06 | 2015-11-25 | 北京理工大学 | Bifocal-surface monocular stereo helmet-mounted display device with free-form surfaces |
WO2018138714A1 (en) * | 2017-01-28 | 2018-08-02 | Lumus Ltd. | Augmented reality imaging system |
CN108398787B (en) * | 2018-03-20 | 2023-05-16 | 京东方科技集团股份有限公司 | Augmented reality display device, method and augmented reality glasses |
- 2020-01-22: IT IT102020000001246A patent/IT202000001246A1/en unknown
- 2021-01-22: WO PCT/IB2021/050483 patent/WO2021148993A2/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
IT202000001246A1 (en) | 2021-07-22 |
WO2021148993A3 (en) | 2022-02-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21708065; Country of ref document: EP; Kind code of ref document: A2 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 21708065; Country of ref document: EP; Kind code of ref document: A2 |