WO2017208148A1 - Wearable visor for augmented reality - Google Patents
- Publication number
- WO2017208148A1 (PCT application PCT/IB2017/053171)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- content
- light field
- augmented reality
- user
- projection
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Abstract
An augmented reality system (10) arranged to superimpose, in front of a user, a virtual light field of a 3D content on the real light field of a surrounding environment. The system (10) provides an augmented reality display (100) comprising a light field display (110) arranged to create the virtual light field of the 3D content, and a beam combiner (130) arranged to deviate the light rays emitted by the virtual light field of the 3D content and project them in front of the user, in order to superimpose the virtual light field of the 3D content on the real light field of the surrounding environment. The augmented reality display (100) is adapted to be worn by the user on his or her head. The augmented reality system (10) also comprises a tracking system (140) arranged to monitor in real time the position and orientation of a moving reference system S1 integral to the augmented reality display (100) with respect to a fixed reference system S2 integral to the surrounding environment. The system (10) further comprises a control unit (150) arranged to receive from the tracking system (140) information about the position and orientation of the moving reference system S1 and to determine the virtual light field of the 3D content coherent with the position and orientation of the moving reference system S1 with respect to the fixed reference system S2.
Description
TITLE
Wearable visor for augmented reality
DESCRIPTION
Field of the invention
The present invention relates to the field of augmented reality .
In particular, the invention relates to a wearable visor providing augmented reality by the "see-through" technique, synthesizing the virtual information by a "light field" display.
Description of the prior art
As is well known, the "see-through" technique provides that the user sees the scenario he faces directly with his eyes, or at most through transparent or semi-transparent means. The "see-through" techniques currently available can show virtual information added to the real scene by displaying it as a two-dimensional image on transparent or semi-transparent media in front of the eye (as in Google Glass), or possibly projected directly onto the retina.
However, it is very complex to overlap reality with a three-dimensional virtual image that is consistent both with the position of the user's eye with respect to the surrounding environment and with the focus of the user's own eyes.
One possibility that has been explored is to use a light field to generate the virtual information to be superimposed on the real one. As is well known, "light field displays" are displays capable of generating a light field that closely resembles that of a real scene, thus providing motion parallax and allowing a focus that varies, as in reality, with object distance. This technology is used, for example, for 3D screens and TVs that do not require special glasses.
However, to date the integration between the "see-through" technique and the "light field display" technique is still ineffective, since the 3D virtual image often has poor quality or is not positioned consistently with the relative position between the user and the surrounding environment. For this reason, the very few examples of combining these technologies make use of non-mobile viewers, for which easy position tracking is possible and therefore a better coherence of the virtual content with the surrounding environment.
However, there is a strong need to be able to use this technology on wearable visors that allow free movement of the user.
For example, US20120113092 describes an augmented reality system that allows a user to display a virtual object with the same focus with which real objects would be displayed. In particular, the focus changes according to the user's movement, simulating a high level of realism. However, this document does not in any way address the problem of the size of the device, imposed by the necessary distance between the user's point of view and the parallax plane. For this reason, in a practical embodiment, these dimensions could make the device very uncomfortable to use.
Summary of the invention
It is therefore a feature of the present invention to provide an augmented reality system that allows the superposition of a virtual light field of a 3D content on the real light field of a surrounding environment.
It is also a feature of the present invention to provide such a system that comprises a mobile display wearable by a user.
It is also a feature of the present invention to provide such a system that allows the creation of virtual content either from real content or by computer graphics.
It is still a feature of the present invention to provide such a system that allows a quality of the virtual 3D image higher than in the prior art.
These and other objects are achieved by an augmented reality system arranged to superimpose, in front of a user, a virtual light field of a 3D content on the real light field of a surrounding environment, said augmented reality system providing:
— an augmented reality display arranged to be worn by the user on his or her head, said augmented reality display comprising:
— a light field display arranged to create a virtual light field of the 3D content;
— a beam combiner arranged to deviate the light rays emitted by the virtual light field of the 3D content and project them in front of the user, in order to superimpose the virtual light field of the 3D content on the real light field of the surrounding environment;
— a tracking system arranged to monitor in real time the position and orientation of a moving reference system S1 integral to the augmented reality display with respect to a fixed reference system S2 integral to the surrounding environment;
— a control unit arranged to receive from the tracking system information about the position and orientation of the moving reference system S1 and to determine the virtual light field of the 3D content coherent with the position and orientation of the moving reference system S1 with respect to the fixed reference system S2;
whose main feature is that the light field display comprises :
— a 2D display panel arranged to show a 2D image consisting of an array of elemental images made of a plurality of pixels;
— a parallax panel located at a predetermined distance d from the 2D display panel and at a predetermined distance D from a point of user's observation A, said parallax panel comprising a plurality of centres of projection, said parallax panel arranged to force the light rays emitted by the pixels of the elemental images to pass through the centres of projection;
at least one pixel of the plurality containing chromatic information about color and brightness of a virtual light ray coming from the 3D content and passing through a centre of projection and the pixel, in such a way that a user, looking at the centres of projection, perceives a virtual light field of the 3D content,
and that each elemental image is a square having a side L_EI = p(1 + d/D), where p is the spacing between two adjacent centres of projection, said elemental image having a geometrical centre C lying on a straight line that passes through a corresponding centre of projection and the point of user's observation A, in such a way that the user, placing his eyes substantially at the point of user's observation A, can coherently perceive the virtual light field in relation to the surrounding environment.
The use of the light field for creating the virtual content makes it unnecessary to determine with extreme precision the relative position between the eye and the visor, since it ensures a correct alignment of the virtual information with the real surrounding environment whatever the movement of the eye. Therefore the tracking system must provide only the relative position between the display and the surrounding environment, so that the control unit can generate in real time a light field consistent with the user's position in relation to the surrounding environment.
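The coherence requirement above reduces to expressing the 3D content, defined in the fixed frame S2, in the moving display frame S1 reported by the tracking system. The following is a minimal sketch under the assumption that the pose of S1 in S2 is given as a rotation matrix and an origin; the function and variable names are hypothetical, not taken from the patent.

```python
import numpy as np

def world_to_display(point_w, R_1in2, t_1in2):
    """Transform a point from the fixed frame S2 (world) into the moving
    frame S1 (display), given the pose of S1 expressed in S2 by the
    tracking system: rotation matrix R_1in2 and origin t_1in2."""
    return R_1in2.T @ (np.asarray(point_w, float) - np.asarray(t_1in2, float))

# Example: the display is translated 1 unit along x and rotated 90 degrees
# about z with respect to the world frame.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
p_display = world_to_display([2.0, 0.0, 0.0], Rz90, [1.0, 0.0, 0.0])
```

Repeating this transform every frame, with the pose refreshed by the tracking system, is what keeps the rendered light field aligned with the real scene regardless of eye movement.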
The augmented reality system according to the present invention therefore provides the user with a much more realistic and versatile experience than the prior art, in addition to allowing great freedom of movement.
Furthermore, owing to the particular size of the edge of the elemental image, from the point of observation A the user sees, through the centres of projection, all the centres of the elemental images, obtaining a better image quality. It is thus also possible to reduce the distance D without reducing the distance d, as would instead be necessary in the prior art, where normally L_EI = p. This aspect brings the significant advantage of increasing the wearability of the device.
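The relation between elemental-image side, projection-centre pitch and the two distances can be checked numerically. The sketch below assumes the side L_EI = p(1 + d/D) and uses hypothetical example values in millimetres; note how the prior-art value L_EI = p is recovered as D grows large.

```python
def elemental_image_side(p, d, D):
    """Side L_EI of a square elemental image, given the spacing p between
    two adjacent centres of projection, the panel-to-parallax-panel
    distance d, and the observation distance D: L_EI = p * (1 + d / D).
    With this size, every elemental-image centre is visible through its
    own centre of projection from the observation point A."""
    return p * (1.0 + d / D)

# p = 1 mm, d = 5 mm, D = 50 mm: the elemental image must be 1.1 mm wide.
L = elemental_image_side(1.0, 5.0, 50.0)
```

Because L_EI shrinks toward p only as D increases, sizing the elemental images this way is precisely what permits a small D (a compact, wearable visor) without changing d.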
In particular, the 2D display panel and/or the parallax panel have a curved shape, and the centres of projection are located in such a way that a straight line that passes through the point of user's observation A and a centre of projection crosses the centre of the solid angle defined by the surface of the corresponding elemental image and by the point of user's observation A.
Advantageously, each centre of projection of the parallax panel comprises at least one pinhole that crosses the parallax panel.
Alternatively, each centre of projection of the parallax panel comprises at least one lens.
In particular, the tracking system may comprise:
— a GPS locator;
— an inertia sensor;
— an optical sensor;
— an electromagnetic sensor;
— a combination of the previous.
Advantageously, the control unit comprises a database containing a plurality of 2D acquired images of the 3D content, each 2D acquired image comprising a plurality of acquired pixels, each acquired pixel being associated with a set of parameters defining the direction of a light ray coming from the 3D content, said control unit arranged to access the database and to determine the virtual light field of the 3D content coherent with the information received from the tracking system relating to the position and orientation of the moving reference system S1 with respect to the fixed reference system S2.
In particular, the plurality of 2D acquired images is acquired by a camera equipped with an image sensor on which the light rays coming from the 3D content form a 2D image that is stored in the database, and the set of parameters comprises:
— parameters of position and orientation of the camera with respect to the 3D content;
— parameters defining the orientation of each light ray associated with one of the pixels with respect to the camera, said parameters being associated with the settings of the camera.
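The per-pixel ray parameters can be made concrete with a standard pinhole-camera model, where the intrinsics play the role of the "settings of the camera" and the extrinsics the role of its position and orientation. This is an illustrative sketch only, not the patent's specified parametrization; all names are hypothetical.

```python
import numpy as np

def pixel_ray(u, v, fx, fy, cx, cy, R_cam, t_cam):
    """Unit direction (in the world frame) and origin of the light ray
    associated with pixel (u, v), for a pinhole camera with focal lengths
    (fx, fy), principal point (cx, cy), rotation R_cam and centre t_cam."""
    # Ray direction in the camera frame, then rotated into the world frame.
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    d_world = R_cam @ d_cam
    return d_world / np.linalg.norm(d_world), np.asarray(t_cam, float)

# The principal-point pixel of an axis-aligned camera looks straight along +z.
direction, origin = pixel_ray(320, 240, 500.0, 500.0, 320.0, 240.0,
                              np.eye(3), [0.0, 0.0, 0.0])
```

Storing such a (direction, origin) pair with every acquired pixel is what lets the control unit later pick, for each display pixel, the stored ray closest to the virtual ray it must reproduce.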
Alternatively, the 3D content is a 3D virtual content created by computer graphics, and the control unit is adapted to determine the light field of the 3D virtual content coherent with the information received from the tracking system relating to the position and orientation of the moving reference system S1 with respect to the fixed reference system S2.
In particular, using the information provided by the tracking system, the control unit generates the 2D images of the 3D virtual content by means of a predetermined algorithm (for example ray tracing) or by simulating an array of virtual cameras aimed at such 3D virtual content, in a way coherent with the position of the user in space and with the position in which the virtual light field must be superimposed on the real light field. For both solutions it is essential to know the inner geometry of the display arranged to display the virtual light field, i.e. D, d and the spacing p.
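The virtual-camera-array alternative can be sketched as one virtual pinhole camera per projection centre, each filling its elemental image by sampling the scene along rays that pass through that centre. The scene function and all names below are hypothetical stand-ins, assuming the display geometry (d, elemental-image side) described above.

```python
import numpy as np

def render_elemental_image(centre, n_px, L_EI, d, shade):
    """Render one elemental image by simulating a virtual pinhole camera
    at a projection centre: each pixel takes the scene value `shade`
    along the ray from the pixel (on the display panel, distance d behind
    the parallax panel) through the projection centre.
    `shade` is a placeholder scene function of the unit ray direction."""
    img = np.zeros((n_px, n_px))
    px = L_EI / n_px  # pixel pitch inside the elemental image
    for i in range(n_px):
        for j in range(n_px):
            # Pixel position on the display panel, centred on `centre`.
            x = centre[0] + (j + 0.5) * px - L_EI / 2
            y = centre[1] + (i + 0.5) * px - L_EI / 2
            ray = np.array([centre[0] - x, centre[1] - y, d])
            img[i, j] = shade(ray / np.linalg.norm(ray))
    return img

# Toy scene: brightness proportional to how far the ray leans along +x.
img = render_elemental_image((0.0, 0.0), 8, 1.1, 5.0,
                             shade=lambda r: max(r[0], 0.0))
```

Replacing the toy `shade` with a ray tracer over the 3D virtual content, and repeating per projection centre with the pose from the tracking system, yields the full 2D image of elemental images.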
According to another aspect of the invention, a method for augmented reality, arranged to superimpose, in front of the user, a virtual light field of a 3D content on the real light field of a surrounding environment, comprises the steps of:
— track the position and orientation of a moving reference system S1 integral to an augmented reality display with respect to a fixed reference system S2 integral to the surrounding environment;
— show on a 2D display panel a 2D image consisting of an array of elemental images made of a plurality of pixels, at least one pixel of the plurality containing chromatic information of a virtual light ray coming from the 3D content and passing through a centre of projection and the pixel, at a distance d from the centre of the corresponding elemental image;
— dispose a parallax panel located at a predetermined distance d from the 2D display panel and at a predetermined distance D from a point of user's observation A, said parallax panel comprising a plurality of centres of projection, said parallax panel arranged to force the light rays emitted by the pixels of the elemental images to pass through the centres of projection, in such a way that a user, looking at the centres of projection, perceives a virtual light field of the 3D content;
— dispose a beam combiner arranged to deviate the light rays emitted by the virtual light field of the 3D content and project them in front of the user.
Advantageously, the 2D image is determined on the basis of a database containing a plurality of 2D acquired images of the 3D content, each 2D acquired image comprising a plurality of acquired pixels, each acquired pixel being associated with a set of parameters defining the direction of a light ray coming from the 3D content.
Alternatively, the 3D content is a 3D virtual content created by computer graphics, and the chromatic information associated with the pixels of the elemental images is created on the basis of the position and orientation of the moving reference system S1 with respect to the fixed reference system S2.
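Taken together, the method steps above form a simple per-frame pipeline: track the pose, compute the elemental images coherent with it, and show them on the panel (the parallax panel and beam combiner then act purely optically). A schematic sketch, with hypothetical callables standing in for the three subsystems:

```python
def augmented_reality_frame(track_pose, render_light_field, show_on_panel):
    """One frame of the method, as a minimal pipeline sketch. The three
    callables are hypothetical stand-ins for the tracking system, the
    control unit and the light field display, respectively."""
    pose_S1_in_S2 = track_pose()                          # track S1 w.r.t. S2
    elemental_images = render_light_field(pose_S1_in_S2)  # coherent 2D image
    show_on_panel(elemental_images)                       # drive the 2D panel
    return elemental_images

# Trivial usage example with placeholder implementations.
frames = []
out = augmented_reality_frame(lambda: "pose",
                              lambda p: [f"EI({p})"],
                              frames.append)
```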
Brief description of the drawings
Further characteristics and/or advantages of the present invention will become clearer from the following description of an exemplary embodiment thereof, exemplifying but not limitative, with reference to the attached drawings, in which:
— Fig. 1 shows the augmented reality system according to the present invention;
— Fig. 2 schematically shows the components of the augmented reality display;
— Fig. 3 schematically shows a possible exemplary embodiment of the light field display, indicating the characteristic quantities.
Description of a preferred exemplary embodiment
With reference to Fig. 1, the augmented reality system 10, according to the present invention, comprises an augmented reality display 100 arranged to be worn by a user on his or her head.
In particular, with reference also to Fig. 2, the augmented reality display 100 comprises a light field display 110, arranged to create a virtual light field of a 3D content, and a beam combiner 130, arranged to deviate the light rays emitted by the light field of the virtual 3D content and project them in front of the user, in order to superimpose the virtual light field of the 3D content (dashed line in Fig. 1) on the real light field of the surrounding environment.
Advantageously, a tracking system 140 is also provided, arranged to monitor in real time the position and orientation of a moving reference system S1 integral to the augmented reality display 100 with respect to a fixed reference system S2 integral to the surrounding environment. This information is given in real time to a control unit 150 that, on this basis, determines the most coherent virtual light field of the 3D content, i.e. the light field that best integrates with the surrounding environment from the point of view of the user.
In a preferred exemplary embodiment, the tracking system 140 is integrated in the augmented reality display 100 and comprises at least one camera for locating the objects of the surrounding environment and locating the user in relation to them.
In Fig. 3 an exemplary embodiment of the light field display 110 is schematically shown, comprising a 2D display panel 111, arranged to show a 2D image consisting of an array of elemental images 112 made of a plurality of pixels, and a parallax panel 116 located at a predetermined distance d from the 2D display panel 111 and at a predetermined distance D from a point of user's observation A.
In particular, the parallax panel 116 comprises a plurality of centres of projection 117, through which the light rays emitted by the pixels of the elemental images 112 are forced to pass in order to reach the point of user's observation A.
Advantageously, each elemental image 112 is a square having a side L_EI = p(1 + d/D), where p is the spacing between two adjacent centres of projection 117. Each square is also provided with a geometrical centre C lying on a straight line that passes through a corresponding centre of projection 117 and the point of user's observation A.
Although in Fig. 3 the point of user's observation A is located symmetrically with respect to the panels 111 and 116, it can also be located asymmetrically, simply by moving the centres of projection 117 or the geometric centres C so that the lines passing through them converge at the point A.
Furthermore, the point of user's observation A is to be considered as an ideal point where the user must place his eyes. However, owing to the three-dimensional light field created, the eyes can move slightly, remaining within a neighbourhood of a few mm around this point A, without losing consistency of the image with respect to the surrounding environment.
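The placement of the geometric centres C so that the lines through the projection centres converge at A, including the asymmetric case just described, can be sketched with plane geometry: extend the line from A through a projection centre by the panel separation d and take the intersection with the display-panel plane. Names and values below are illustrative.

```python
def elemental_centre(pinhole_xy, A_xy, d, D):
    """Geometric centre C of the elemental image behind a projection
    centre (parallax panel at z = 0, display panel at z = -d, observation
    point A at z = D): the line from A through the projection centre,
    extended to the display panel, hits C. Also valid when A is off-axis
    (the asymmetric case)."""
    s = d / D
    cx = pinhole_xy[0] * (1 + s) - s * A_xy[0]
    cy = pinhole_xy[1] * (1 + s) - s * A_xy[1]
    return (cx, cy)

# Symmetric case with pitch p = 1, d = 5, D = 50: adjacent centres come out
# spaced by p * (1 + d/D), matching the elemental-image side.
C0 = elemental_centre((0.0, 0.0), (0.0, 0.0), 5.0, 50.0)
C1 = elemental_centre((1.0, 0.0), (0.0, 0.0), 5.0, 50.0)
```

The spacing of the resulting centres equals p(1 + d/D), so the elemental images tile the display panel exactly, consistent with the elemental-image side given earlier.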
The foregoing description of some specific exemplary embodiments will so fully reveal the invention from the conceptual point of view that others, by applying current knowledge, will be able to modify and/or adapt the specific exemplary embodiments to various applications without further research and without departing from the invention; accordingly, such adaptations and modifications are to be considered equivalent to the specific embodiments. The means and the materials realising the different functions described herein could have a different nature without, for this reason, departing from the field of the invention. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation.
Claims
1. An augmented reality system (10) arranged to superimpose, in front of a user, a virtual light field of a 3D content on the real light field of a surrounding environment, said augmented reality system (10) providing:
— an augmented reality display (100) arranged to be worn by said user on his own head, said augmented reality display (100) comprising:
— a light field display (110) arranged to create a virtual light field of said 3D content;
— a beam combiner (130) arranged to deviate the light rays emitted by said virtual light field of said 3D content and project them in front of said user in order to superimpose said virtual light field of said 3D content on said real light field of said surrounding environment ;
— a tracking system (140) arranged to monitor in real time the position and orientation of a moving reference system S1 integral to said augmented reality display (100) with respect to a fixed reference system S2 integral to said surrounding environment;
— a control unit (150) arranged to receive from said tracking system (140) information about the position and orientation of said moving reference system S1 and to determine the virtual light field of said 3D content coherent with said position and orientation of said moving reference system S1 with respect to said fixed reference system S2;
said augmented reality system (10) characterized in that said light field display (110) comprises:
— a 2D display panel (111) arranged to show a 2D image consisting of an array of elemental images (112) made of a plurality of pixels;
— a parallax panel (116) located at a predetermined distance d from said 2D display panel (111) and at a predetermined distance D from a point of user's observation A, said parallax panel (116) comprising a plurality of centres of projection (117), said parallax panel (116) arranged to force the light rays emitted by said pixels of said elemental images (112) to pass through said centres of projection (117);
at least one pixel of said plurality containing chromatic information about color and brightness of a virtual light ray coming from said 3D content and passing through a centre of projection (117) and said pixel, in such a way that a user, looking at said centres of projection (117), perceives a virtual light field of said 3D content, and in that each elemental image (112) is a square having a side LEI = p(1 + d/D), where p is the spacing between two adjacent centres of projection (117), each elemental image (112) having a geometrical centre C lying on a straight line that passes through a corresponding centre of projection (117) and said point of user's observation A, in such a way that said user, placing his eyes substantially at said point of user's observation A, can coherently perceive said virtual light field in relation to said surrounding environment.
2. The augmented reality system (10), according to claim 1, wherein said 2D display panel (111) and/or said parallax panel (116) has a curved shape and said centres of projection (117) are located in such a way that a straight line that passes through said point of user's observation A and a centre of projection (117) crosses the centre of the solid angle defined by the surface of the corresponding elemental image (112) and by said point of user's observation A.
3. The augmented reality system (10), according to claim 1, wherein each centre of projection (117) of said parallax panel (116) comprises at least one pinhole that crosses said parallax panel (116).
4. The augmented reality system (10), according to claim 1, wherein each centre of projection (117) of said parallax panel (116) comprises at least one lens.
5. The augmented reality system (10), according to claim 1, wherein said control unit (150) comprises a database containing a plurality of 2D acquired images of said 3D content, each 2D acquired image comprising a plurality of acquired pixels, each of said acquired pixels being associated with a set of parameters defining the direction of a light ray coming from said 3D content, said control unit (150) arranged to access said database and to determine the virtual light field of said 3D content coherent with said information received from said tracking system (140) about the position and orientation of said moving reference system S1 with respect to said fixed reference system S2.
6. The augmented reality system (10), according to claim 5, wherein said plurality of 2D acquired images is acquired by a camera providing an image sensor where light rays coming from said 3D content form a 2D image stored in said database, and where said set of parameters comprises:
— parameters of position and orientation of said camera with respect to said 3D content;
— parameters defining the orientation of each light ray associated with one of said pixels with respect to said camera, said parameters being associated with the settings of said camera.
7. The augmented reality system (10), according to claim 1, wherein said 3D content is a 3D virtual content created by computer graphics and said control unit (150) is adapted to determine the light field of said 3D virtual content coherent with said information received from said tracking system (140) about the position and orientation of said moving reference system S1 with respect to said fixed reference system S2.
8. A method for augmented reality arranged to superimpose, in front of the user, a virtual light field of a 3D content on the real light field of a surrounding environment, said method characterized in that it comprises the steps of:
— tracking the position and orientation of a moving reference system S1 integral to an augmented reality display (100) with respect to a fixed reference system S2 integral to said surrounding environment;
— showing on a 2D display panel (111) a 2D image consisting of an array of elemental images (112) made of a plurality of pixels, at least one pixel of said plurality containing chromatic information of a virtual light ray coming from said 3D content and passing through a centre of projection (117) and said pixel, at a distance d from the centre of the corresponding elemental image (112);
— disposing a parallax panel (116) located at a predetermined distance d from said 2D display panel (111) and at a predetermined distance D from a point of user's observation A, said parallax panel (116) comprising a plurality of centres of projection (117), said parallax panel (116) arranged to force the light rays emitted by said pixels of said elemental images (112) to pass through said centres of projection (117), in such a way that a user, looking at said centres of projection (117), perceives a virtual light field of said 3D content;
— disposing a beam combiner (130) arranged to deviate the light rays emitted by said virtual light field of said 3D content and project them in front of a user.
9. The method for augmented reality, according to claim 8, wherein said 2D image is determined on the basis of a database containing a plurality of 2D acquired images of said 3D content, each 2D acquired image comprising a plurality of acquired pixels, each of said acquired pixels being associated with a set of parameters defining the direction of a light ray coming from said 3D content.
10. The method for augmented reality, according to claim 8, wherein said 3D content is a 3D virtual content created by computer graphics and the chromatic information associated with said pixels of said elemental images (112) is created on the basis of the position and orientation of said moving reference system S1 with respect to said fixed reference system S2.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ITUA2016A003946A ITUA20163946A1 (en) | 2016-05-30 | 2016-05-30 | Wearable viewer for augmented reality |
IT102016000055855 | 2016-05-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017208148A1 true WO2017208148A1 (en) | 2017-12-07 |
Family
ID=56990876
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2017/053171 WO2017208148A1 (en) | 2016-05-30 | 2017-05-30 | Wearable visor for augmented reality |
Country Status (2)
Country | Link |
---|---|
IT (1) | ITUA20163946A1 (en) |
WO (1) | WO2017208148A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011059297A (en) * | 2009-09-09 | 2011-03-24 | Seiko Epson Corp | Parallax barrier |
US20110075257A1 (en) * | 2009-09-14 | 2011-03-31 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | 3-Dimensional electro-optical see-through displays |
US20120113092A1 (en) * | 2010-11-08 | 2012-05-10 | Avi Bar-Zeev | Automatic variable virtual focus for augmented reality displays |
US20140168034A1 (en) * | 2012-07-02 | 2014-06-19 | Nvidia Corporation | Near-eye parallax barrier displays |
US20160026253A1 (en) * | 2014-03-11 | 2016-01-28 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108375840A (en) * | 2018-02-23 | 2018-08-07 | 苏州耐德佳天成光电科技有限公司 | Light field display unit based on small array image source and the nearly eye display device of three-dimensional using it |
CN108375840B (en) * | 2018-02-23 | 2021-07-27 | 北京耐德佳显示技术有限公司 | Light field display unit based on small array image source and three-dimensional near-to-eye display device using light field display unit |
US11373400B1 (en) | 2019-03-18 | 2022-06-28 | Express Scripts Strategic Development, Inc. | Methods and systems for image processing to present data in augmented reality |
US11727683B2 (en) | 2019-03-18 | 2023-08-15 | Express Scripts Strategic Development, Inc. | Methods and systems for image processing to present data in augmented reality |
Also Published As
Publication number | Publication date |
---|---|
ITUA20163946A1 (en) | 2017-11-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17745864 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 17745864 Country of ref document: EP Kind code of ref document: A1 |