CN110709898A - Video see-through display system - Google Patents

Video see-through display system

Info

Publication number
CN110709898A
Authority
CN
China
Prior art keywords
user
eye
optical
camera unit
unit
Prior art date
Legal status
Pending
Application number
CN201880014802.4A
Other languages
Chinese (zh)
Inventor
B·格林伯格 (B. Greenberg)
Current Assignee
Avic Vision Co Ltd
Eyeway Vision Ltd
Original Assignee
Avic Vision Co Ltd
Priority date
2017-03-01
Filing date
2018-02-28
Publication date
2020-01-17
Application filed by Avic Vision Co Ltd
Publication of CN110709898A

Classifications

    • G06T19/00 Manipulating 3D models or images for computer graphics (G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL)
    • G06T19/006 Mixed reality
    • G02B27/01 Head-up displays (G PHYSICS; G02 OPTICS; G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS; G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00)
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G02B2027/0145 Head-up displays characterised by optical features creating an intermediate image
    • G02B2027/0174 Head mounted characterised by optical features, holographic
    • G02B2027/0178 Head mounted, eyeglass type
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed, slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Diffracting Gratings Or Hologram Optical Elements (AREA)

Abstract

A display system having a video see-through eye display unit is disclosed. The eye display unit includes: at least one camera unit; at least one image forming module; and an optical deflection module comprising at least one double-sided light-reflecting optical element that at least partially reflects light from both of its sides. The camera unit is configured to collect light arriving from a region of interest of a scene along a first optical path that intersects the optical deflection module, and to generate image data indicative of the region of interest. The image forming module is configured to receive image data indicative of an image to be projected onto a user's eye, and to generate and project the received image to propagate along a second optical path that intersects the optical deflection module. The double-sided light-reflecting optical element of the optical deflection module is arranged in the display system in front of the eye, at a position where the first and second optical paths intersect, and is oriented to define the first optical path (between the camera unit and the scene) and the second optical path (between the image forming module and the eye) such that the viewpoint of the camera unit is substantially similar to the line of sight of the eye.

Description

Video see-through display system
Technical Field
The present invention relates to the configuration of head-mounted display systems, and in particular to head-mounted display systems having see-through and video see-through configurations.
Background
A video see-through augmented reality system, unlike a virtual reality system, displays data based on the appearance of the actual scene in front of the user. Typically, an input image stream of the scene is captured, and an augmentation data layer is added over the captured image data and displayed to the user. This allows an information layer to be provided on top of the real-world image, games and other experiences to be integrated into the surrounding environment, and other real-world-based visualizations.
Such augmented reality systems are typically based on one of two main integration schemes. In the first, the user views the actual scene through a partially transparent display/window onto which the augmentation data is displayed. In the alternative, the actual scene is acquired with a video capture device (camera) and displayed to the user on a display device together with the augmentation data. Typically, augmented reality systems of the latter type are configured as handheld systems, such as tablets or smartphone-like devices, which use a camera unit to provide data about the actual scene and are moved by the user to change the point of view. Some augmented reality systems are configured to be wearable and include one or two mounted cameras. In conventional wearable systems, the one or two cameras are typically mounted with a line of sight different from that of the user's eyes, displaced horizontally, vertically, and/or longitudinally (along the Z-axis extending between the user and the scene). Other configurations provide a "single view" solution with a single camera for video capture, which on the one hand reduces the number of pixels to be processed, and on the other hand provides a limited or non-stereoscopic view to the user.
Conventional head-mounted systems configured to provide augmented reality typically provide relatively simple augmentation data displayed on at least partially transparent elements, allowing the user to view the actual scene through them. For example, U.S. patent No. US 9195067 describes an electronic device that includes a display and a band configured to be worn on a user's head. The band defines a display end to which the display is secured and extends from the display end to a free end. The band is adjustable such that it can be configured by the user to contact the user's head at a first position proximate the temple, a second position along a portion of the user's ear proximate the temple, and a third position along the back of the user's head. The display end may be positioned away from the head such that the display element is suspended in front of an eye adjacent the temple. The band is also configured to maintain this configuration. The device further comprises an image generator within the band for generating an image presentable to the user on the display.
Disclosure of Invention
There is a need in the art for a new system configuration that can provide a user with a display of visual augmentation data combined with a view of the actual scene. The present technique provides a display system, typically configured as a head-mounted system, having one or two eye display units for providing display data to a user. The system is typically configured to provide a virtual or augmented reality experience to the user by displaying image data over a relatively large field of view, incorporating, in substantially real time, actual visual data from the scene in front of the user into the displayed data.
Each eye display unit is associated with at least one camera unit and comprises at least one image forming module (e.g. a display and/or projection unit) and an optical deflection module. The optical deflection module is typically located in front of the respective eye of the user (e.g. in front of a predefined eyebox position of the system) and is configured to direct light arriving from the scene in front of the user to be collected by the at least one associated camera unit, and to direct light from the at least one image forming module to the user's eye. This configuration allows the camera unit to be positioned such that its field of view is substantially similar to the field of view of the user's eye, thereby preventing the various disorientation problems associated with displacement of the camera's point of view relative to the user's eye.
More specifically, according to the present technique, the input image data is collected by the camera unit from a viewpoint substantially similar to the viewpoint the user's eye would have if the system were not worn. Typically, the collected image data is sent to a control unit for processing and for generating display data, which is provided to the user by the image forming module. The control unit thus operates on input image data corresponding to the user's expected line of sight. In the input image data, the positions of objects and the user's viewpoint relative to the displayed data are generally similar to the positions and viewpoint the user would expect in the real world. This gives the user a better ability to orient in space, as head movement causes the collected (and therefore displayed) image to change in the way the user naturally expects.
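By way of illustration only, the per-frame flow described above can be sketched as a simple compositing step. This is not code disclosed in the patent; the array shapes and the render_augmentation helper are hypothetical placeholders:

```python
import numpy as np

def render_augmentation(frame: np.ndarray, overlay: np.ndarray,
                        alpha: np.ndarray) -> np.ndarray:
    """Composite an augmentation layer over the captured scene frame.
    alpha is a per-pixel opacity map in [0, 1] for the overlay layer."""
    out = alpha[..., None] * overlay + (1.0 - alpha[..., None]) * frame
    return out.astype(frame.dtype)

# Dummy per-eye capture (H x W x 3) standing in for the camera unit's frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
overlay = np.full_like(frame, 255)            # white augmentation layer
alpha = np.zeros(frame.shape[:2])
alpha[200:280, 280:360] = 0.7                 # augment a small region only
display_data = render_augmentation(frame, overlay, alpha)
```

Because the capture viewpoint coincides with the eye's viewpoint, the composited display data can be handed to the image forming module without any viewpoint re-mapping.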
In addition to improving the user's orientation in space, this system configuration may also enhance the user's ability to interact with real objects in augmented reality. More specifically, the control unit may be operable to add image data layers in dependence on certain objects located in the actual scene; since the objects in the input image data appear substantially as the user expects to see them, the additional data layers can behave in a more physically realistic manner. This is particularly useful for video see-through systems, where the user can still view real objects and interact with them at their actual locations while using a head-mounted display system. This is in contrast to conventional systems, where there is a difference between the viewpoint of the camera collecting the image data for augmentation and the position of the user's eyes (corresponding to the user's expected viewpoint).
It should be noted that the system configurations and techniques of the present invention, providing enhanced user orientation in real and augmented environments, are particularly important for applications requiring the user to interact with real objects while experiencing augmentation of the actual scene. In such applications, the user interacts with objects located within reach (e.g., between a few centimeters and 1 or 2 meters from the user), where the user's spatial orientation determines the accuracy of hand movements.
To this end, the present technique utilizes an optical deflection module positioned in front of the user's eye when the system is in use. The optical deflection module is positioned such that light from the scene that would otherwise reach the user's eye (if the user were not using the system) is deflected along the camera axis and directed towards the at least one associated camera unit. In addition, the optical deflection module directs light arriving from the at least one image forming unit, located at a respective position, towards the user's eye.
Thus, according to one broad aspect, the invention provides a system comprising an eye display unit comprising:
at least one camera unit configured to collect light arriving along a first optical path from a region of interest of a scene and to generate image data indicative thereof;
at least one image forming module configured to receive image data and project an image of the image data along a second optical path toward a user's eye; and
an optical deflection module comprising at least one bi-facial light-reflecting optical element configured to at least partially reflect light arriving from both sides thereof, and located in front of the eye where the first and second optical paths intersect, and oriented to define the second optical path between an image forming unit and a user's eye, and the first optical path between the at least one camera unit and a scene; such that the point of view of the at least one camera unit is substantially similar to the line of sight of the user's eyes.
According to some embodiments, the at least one camera unit may be positioned along the first optical path at an optical plane corresponding to an optical plane of the user's eye relative to the scene without the optical deflection module. Additionally or alternatively, the at least one camera may be positioned along the first optical path and configured to provide a line of sight substantially similar to a line of sight of a respective user eye.
According to some embodiments, the at least one image forming unit may comprise an eye projection unit configured to project structured light indicative of one or more images onto respective eyes of a user.
The at least one image forming unit may include an image display unit configured to provide display image data.
According to some embodiments, the at least one double-sided light-reflecting optical element of the optical deflection module may be configured as a double-sided mirror.
According to some embodiments, the system may include first and second eye display units corresponding to right and left eyes of a user.
Typically, the eye display unit may further comprise a control unit configured and operable for receiving image data collected by the at least one camera unit and generating corresponding display image data and sending the display image data to the at least one image forming unit for providing a corresponding image to a user.
According to some embodiments, the optical deflection module may be configured to direct the input light such that the optical position and line of sight of the at least one camera unit corresponds to the optical position and line of sight of the respective eye of the user.
The optical deflection unit may be configured to locate the at least one camera unit at an eye equivalent position.
According to some embodiments, the at least one display unit may be configured to provide seamless image display.
According to some embodiments, the optical deflection module may comprise at least one reflective surface configured to provide a selected optical manipulation of light reflected therefrom.
According to some embodiments, the at least one reflective surface may be configured with at least one of: selected surface curvatures, diffraction gratings, and holographic elements.
According to some embodiments, the optical deflection module is configured and arranged in the system such that a portion along which light from the scene of the first optical path is deflected towards the camera unit is co-aligned with a portion along which light projected by the image forming unit of the second optical path is deflected by the optical deflection module for propagation to the eye; such that the image portion of the scene captured by the camera unit and projected by the image forming unit to the eye is projected in spatial registration with respect to the external scene.
Drawings
In order to better understand the subject matter disclosed herein and to illustrate how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates the display system configuration and operating principles of the present invention;
FIGS. 2A and 2B illustrate user interaction with a virtual object in augmented reality, where the system utilizes a camera unit with a displaced line of sight (FIG. 2A) and a common line of sight (FIG. 2B) according to the present invention;
FIG. 3 illustrates user interaction with a physical object in combination with an auxiliary data layer provided by an augmented reality system, exemplifying the use of a video see-through system, according to some embodiments of the invention;
FIG. 4 illustrates an eye display unit according to some embodiments of the invention;
FIG. 5 illustrates an eye display unit utilizing an eye projection unit according to some embodiments of the invention; and
fig. 6 illustrates another configuration of an eye display unit according to some embodiments of the invention.
Detailed Description
As described above, head mounted virtual and/or augmented reality systems provide users with an active experience and enable them to play roles in the virtual or augmented world. While in a full virtual experience, interaction with real world objects is limited and may often be used as a metaphor for something else in the virtual world, augmented reality experiences rely on interaction with real scenes and objects related to the user's location and surroundings.
To allow the user to actively interact with actual objects while using a head-mounted augmented reality system, the user's spatial orientation needs to be matched to the data displayed by the system. However, since such systems typically utilize input image data of the surrounding environment collected by one or more camera units, and sometimes provide the user with display data obtained by the one or more camera units, the position and viewpoint of the camera units may greatly affect the user's orientation in space. Most notably, the locations and distances of objects may appear different from their actual locations and distances.
To this end, the present invention provides a display system, typically configured as a head-mounted display system, configured to collect input image data of a scene from a viewpoint substantially similar to the viewpoint of the user's eyes when the system is not in use. Referring to fig. 1, fig. 1 schematically shows a display system 1000, which generally includes two eye display units 100a and 100b, and is configured to provide display image data corresponding to the user's expected line of sight and viewpoint. Each of the eye display units 100a and 100b comprises an image forming module 130 configured for forming image data that can be collected by the user's eyes 10a and 10b in the form of an image stream, and is associated with at least one camera unit 120 configured for collecting input image data. Further, according to the invention, each eye display unit 100a or 100b comprises an optical deflection module 140 configured to direct at least part of the light from the scene in front of the user towards the corresponding camera unit 120, and to direct light from the image forming module 130 towards the corresponding eye 10a or 10b of the user. The optical deflection module 140 may generally comprise a flat plate configured to be at least partially reflective on both of its sides. Additionally or alternatively, one or more surfaces of the optical deflection module 140 may be curved and/or utilize diffractive and/or holographic surface regions. Accordingly, the optical deflection module 140 may be configured to provide optical manipulation improving image formation by the image forming module 130 or image acquisition by the camera unit 120. As illustrated in the figure, the optical deflection module 140 is configured to deflect the optical axes (OA1c and OA2c) of the respective camera units 120 onto first optical paths aligned with the main optical paths OA1s and OA2s of the user's eyes 10a and 10b. In general, the optical deflection modules 140 may each include at least one double-sided light-reflecting optical element. In addition, the optical deflection module deflects the light (OA1d and OA2d) generated by the image forming module 130 along a second optical path aligned with the optical axes (OA1v and OA2v) of the user's eyes. The operation of the optical deflection module effectively couples the image forming module 130 to the corresponding camera unit 120, so that the image displayed to the user is substantially the same as the user would see without the system, thereby providing a seamless display of the scene.
More specifically, the angular orientation of the optical deflection module 140 and the position of the camera unit 120 (including any optical elements located in the optical path of the light collected by the camera unit 120) place the camera unit on an eye-equivalent optical plane. The camera unit thus has a line of sight substantially similar to the one it would have if it were located at the eye's position and directed forward.
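The eye-equivalent placement can be checked with elementary mirror geometry: reflecting the camera's optical center and axis across the deflection plane should reproduce the eye's position and gaze. The following is a minimal numerical sketch under assumed dimensions (a flat plate at 45 degrees, 3 cm in front of the eye); none of these values are taken from the patent:

```python
import numpy as np

def reflect_point(p, q, n):
    """Mirror image of point p across the plane through q with unit normal n."""
    return p - 2.0 * np.dot(p - q, n) * n

def reflect_dir(v, n):
    """Mirror image of direction v across a plane with unit normal n."""
    return v - 2.0 * np.dot(v, n) * n

d = 0.03                                      # plate 3 cm in front of the eye
eye = np.array([0.0, 0.0, 0.0])               # eye center, gazing along +z
gaze = np.array([0.0, 0.0, 1.0])
q = np.array([0.0, 0.0, d])                   # point on the 45-degree plate
n = np.array([0.0, 1.0, -1.0]) / np.sqrt(2.0)

cam_pos = np.array([0.0, -d, d])              # camera below the plate
cam_axis = np.array([0.0, 1.0, 0.0])          # camera looks up at the plate

# The camera's mirror image coincides with the eye position and gaze,
# i.e. the camera sits on an eye-equivalent optical plane.
assert np.allclose(reflect_point(cam_pos, q, n), eye)
assert np.allclose(reflect_dir(cam_axis, n), gaze)
```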
Typically, the system 1000 further comprises, or is connectable to, a control unit 500 configured for receiving input image data from the camera units 120, processing the input data, and generating output data for display by the image forming modules 130. The control unit 500 may also be configured to generate audible data (e.g., sound effects) to be provided to the user through respective headphones or speakers, and may be connected to a communication network or to various other devices as the case may be. In general, the control unit 500 may use the input image data to determine a three-dimensional model of the surrounding environment and position virtual objects according to the determined three-dimensional model. It should be noted that when input image data is collected from a viewpoint substantially similar to the user's viewpoint, the registration of virtual objects in augmented reality generally provides a reliable experience: objects are located at correct and believable locations and distances, resolving various failures in which virtual objects appear to float or to be located within other real objects.
It should be noted that according to some embodiments, each eye display unit 100 may comprise a respective camera unit. Alternatively, both eye display units may be associated with a common camera unit that uses one portion of its field of view for capturing image data corresponding to the user's right eye and another portion for capturing image data corresponding to the user's left eye, as sketched below. For simplicity, the present technique is described herein using one camera unit per eye display unit. Furthermore, owing to the symmetry between the right and left eyes, the invention is described herein with respect to one eye. In some configurations, the system may include only a single eye display unit, while the user's other eye freely views the surrounding environment without any interference.
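For the shared-camera variant, the field-of-view split can be illustrated as a simple crop of one wide frame into two per-eye sub-views; the function below is a hypothetical sketch, not an interface defined by the patent:

```python
import numpy as np

def split_shared_frame(frame: np.ndarray, overlap_px: int = 0):
    """Split one wide frame from a common camera unit into left-eye and
    right-eye sub-views, optionally sharing a central overlap region."""
    h, w = frame.shape[:2]
    half = w // 2
    left_view = frame[:, : half + overlap_px]
    right_view = frame[:, half - overlap_px :]
    return left_view, right_view

wide = np.zeros((480, 1280, 3), dtype=np.uint8)   # dummy shared capture
left, right = split_shared_frame(wide, overlap_px=64)
print(left.shape, right.shape)                    # (480, 704, 3) (480, 704, 3)
```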
Referring to fig. 2A and 2B, these illustrate user interaction with a virtual object in an augmented reality environment, where input image data of the surrounding environment (the actual scene/reality) is provided by a camera unit displaced with respect to the user's line of sight (fig. 2A) and by a camera unit having a common line of sight with the user (fig. 2B). Fig. 2A shows a user wearing an augmented reality unit 900 in the form of glasses with a display. The augmented reality unit 900 further comprises a camera unit 120 configured for collecting image data of the surroundings, used to determine the position of a displayed virtual object (e.g. object OBJ) relative to the actual scene around the user. In general, a processing/control utility (not specifically shown here) of the system 900 uses the input image data to determine the locations of displayed objects. When the line of sight of the camera unit 120 is displaced with respect to the line of sight of the user's eye 10, the displayed position of an object may be shifted with respect to the actual scene. Such displacement can disorient the user when attempting to interact with physical and virtual objects, particularly when a virtual object is presented at a location similar to that of a selected physical object.
Fig. 2B is a diagram illustrating an augmented reality system 1000 configured in accordance with the techniques of the present invention. More specifically, the system includes a camera unit 120 configured to have a line of sight substantially similar to the line of sight of the user's eye 10. This configuration provides a processing/control utility (not specifically shown) with input image data as seen from the user's viewpoint. Thus, the control utility may determine the location of displayed virtual objects (e.g., object OBJ) relative to the physical scene in a manner that simplifies user orientation in space and interaction with these objects.
Typically, for objects relatively far from the user, small line-of-sight displacements are negligible. However, since many augmented reality applications are intended to facilitate user interaction with displayed objects, objects may be displayed so as to appear at distances of a few centimeters to about 1.5 meters from the user, i.e., within reach of the hand. At such relatively short distances, even a few centimeters of line-of-sight displacement may result in a meaningful offset, and in user disorientation when attempting to interact with the displayed virtual object.
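The scale of the effect is easy to quantify: for a camera displaced laterally by b from the eye, an object at distance D is seen along a line of sight deviating from the eye's by roughly arctan(b/D). A short sketch with illustrative numbers (a 3 cm offset, typical of conventional head-mounted cameras; none of these values come from the patent):

```python
import numpy as np

def parallax_error_deg(offset_m: float, distance_m: float) -> float:
    """Angular error between a laterally displaced camera's line of sight
    and the eye's line of sight to an object at the given distance."""
    return np.degrees(np.arctan2(offset_m, distance_m))

offset = 0.03  # camera displaced 3 cm from the eye (illustrative value)
for d in (0.3, 0.5, 1.5, 5.0):
    print(f"object at {d:4.1f} m -> {parallax_error_deg(offset, d):.2f} deg error")
# object at  0.3 m -> 5.71 deg error
# object at  0.5 m -> 3.43 deg error
# object at  1.5 m -> 1.15 deg error
# object at  5.0 m -> 0.34 deg error
```

The error grows rapidly as objects approach arm's length, which is why an eye-aligned viewpoint matters most for near-field interaction.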
An exemplary use of the video see-through configuration is shown in fig. 3. As shown in fig. 3, according to some embodiments of the invention, a user may receive visual guidance and/or assembly instructions while performing an actual physical task with the augmented reality system 1000. In this example, the user attempts to assemble two objects OBJ1 and OBJ2, while the system 1000 provides an additional layer of instruction data VS visible around the objects, giving instructions on how to perform the task. It should be noted that the positions of the actual objects OBJ1 and OBJ2 displayed to the user should correspond to the objects' actual positions in space (also relative to the user's hands), and the instructional (augmentation) data VS should be visible at corresponding positions in order to be meaningful to the user. More specifically, if the augmentation data VS in this example appears next to OBJ1, the task may be understood as unscrewing OBJ1 from OBJ2, whereas if the augmentation data VS appears next to OBJ2, the task may be understood as the opposite (screwing OBJ1 and OBJ2 together).
Referring to fig. 4, an eye display unit 100 according to some embodiments of the invention is schematically shown. The eye display unit 100 includes an image forming unit 130, a camera unit 120, and an optical deflection module 140. The optical deflection module 140 may be a plate that is at least partially reflective on both sides (e.g., a double-sided mirror), configured to deflect light from the scene in front of the user towards the camera unit 120 and to deflect light from the image forming unit 130 towards the user's eye 10. The optical deflection module 140 typically has two at least partially reflective surfaces, namely a surface 141 facing the surroundings and the camera unit 120, and a surface 142 facing the eye and the image forming unit 130; the optical deflection module 140 is positioned at an angle so as to reflect light from the surroundings towards the camera 120 and from the image forming unit 130 towards the user's eye 10.
More specifically, the optical deflection module 140 and the camera unit 120 are positioned and oriented such that, for a direct forward gaze, the surface 141 deflects the optical axis OAc of the camera unit along a first axis OAs aligned with the user's intended line of sight. To provide the required field of view, one or more optical elements (two such elements, 122 and 124, are illustrated in the figure) are provided in the optical path of the light collected by the camera unit 120. These optical elements may include one or more lenses, apertures, or any other elements configured to improve image acquisition and provide the desired field of view. The optical elements may be positioned along the optical path OAc between the camera unit 120 and the optical deflection module 140, and along the optical path OAs between the optical deflection module 140 and the scene in front of the user. In some configurations, the eye display unit 100 may utilize optical elements located only between the camera unit 120 and the optical deflection module 140, as described further below. The optical elements 122 and/or 124 may be configured, individually or together, to provide a desired optical manipulation of the collected input light. For example, the optical elements may be configured as a zoom lens, a telephoto lens system, an eyepiece unit, or the like. Additionally, as described above, the surface 141 of the optical deflection module 140 may itself be configured to apply the required optical manipulation, using an appropriate curvature, diffractive or holographic elements, etc., in addition to or instead of the optical elements 122 and 124.
The surface 142 of the optical deflection module 140 generally deflects the light from the image forming unit 130 toward the user's eye 10. As shown, the image forming unit 130 is positioned such that its optical axis OAd is deflected along the second optical path so as to align with the optical axis OAv of the user's eye. The image forming unit 130 may be any type of image forming unit capable of generating an output image, formed by light, that is perceivable by the user. For example, the image forming unit 130 may include: a display unit having a display panel configured to form image data by selectively turning a plurality of pixels on and off; or an eye projection unit configured to project the selected image data directly onto the user's eye or onto an intermediate image plane.
In addition, the second optical path may also include one or more optical elements configured to image the displayed data to a user. Two such optical elements are illustrated as 132 and 134 respectively located between the image forming unit 130 and the optical deflection module 140 and between the optical deflection module 140 and the user's eye 10. The optical elements may be used to influence the propagation of light from the image forming unit 130 towards the user's eye 10 to provide a desired focusing distance. Further, as described above, the surface 142 of the optical deflection module 140 may also be configured to apply desired optical manipulations with appropriate curvatures, diffractive or holographic elements, etc., in addition to or in place of the optical elements 132 and 134.
Thus, according to various embodiments of the invention, the optical deflection module 140 (with its deflection surfaces 141 and 142) is configured and arranged in the system 100 such that the nominal optical path OAs (part of the first optical path) is co-aligned (coaxial) with the nominal optical path OAv (part of the second optical path). The optical deflection module 140 deflects light arriving from the scene along OAs (e.g., at surface 141) for capture by the camera unit 120, and deflects light projected by the image forming unit 130 (e.g., at the opposing surface 142) along OAv for propagation to the eye 10. As a result, an image of the scene, or a portion thereof, captured by the camera unit 120 and projected to the eye by the image forming unit 130 is perceived in spatial registration with the external scene.
It should also be noted that the optical deflection module 140 may be configured to be only partially reflective, allowing light from the scene to propagate along the optical paths OAs and OAv toward the user's eye 10. In such a configuration, the optical elements 124 and 134 may be configured as two parts of an eyepiece through which the user can view the surroundings, or may be omitted. In such a configuration, the scene may be directly visible to the user without processing by the control unit, while an additional image layer may be provided by the image forming unit 130 on top of the actually visible scene.
Referring to fig. 5, an eye display unit 100 according to some embodiments of the invention is shown. In this example, the optical deflection module 140 is configured as a double-sided mirror that reflects substantially 100% (typically about 95%-99%) of the light impinging on it from either side. The image forming unit 130 is configured as an eye projection unit including a first relay module 136, and the optical element 134 is configured as a second relay module. The eye projection unit may comprise an eye tracker and may be capable of changing the angular orientation of the image projection in accordance with the movement of the user's eye.
Further, in the configuration illustrated in fig. 5, the camera unit may include optical elements 122 and 124 for providing a desired field of view. In general, the camera unit may be configured with any selected optical elements to achieve a desired, and possibly variable, field of view, while providing a line of sight corresponding to that of the user's eye 10. It should be noted that the camera unit 120 may, where applicable, change its line of sight by means of a deflecting optical element, according to eye-orientation data provided by the eye tracker of the projection unit.
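One possible gaze-contingent steering scheme, which the patent does not spell out, is sketched below. The half-angle gain follows from the fact that rotating a planar mirror by theta deflects the reflected beam by 2*theta; all names and values are hypothetical:

```python
def deflector_command(gaze_yaw_deg: float, gaze_pitch_deg: float,
                      gain: float = 0.5) -> tuple:
    """Map eye-tracker gaze angles to steering angles for a deflecting
    mirror: rotating a planar mirror by theta deflects the reflected
    line of sight by 2*theta, hence the half-angle gain."""
    return gain * gaze_yaw_deg, gain * gaze_pitch_deg

# Eye tracker reports the eye rotated 10 deg right and 4 deg up (dummy data).
mirror_yaw, mirror_pitch = deflector_command(10.0, 4.0)
print(mirror_yaw, mirror_pitch)   # 5.0 2.0
```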
Fig. 6 illustrates an additional configuration of the eye display unit 100 according to some embodiments of the invention. In this configuration, the camera unit 120 is provided with an optical element 122 located in the optical path of the collected light, between the optical deflection module 140 and the camera unit 120. An additional optical element 126, located between the optical element 122 and the camera unit, may also be used. Details of the image forming unit 130 are omitted from the figure, as the image forming module may be configured according to any of the configurations described above.
Generally, according to the various embodiments of the present invention described above, the input image data provided by the at least one camera unit 120 of each eye display unit 100 constitutes stereoscopic input image data. A control unit (500 in fig. 1), connectable to or forming part of the display system, may use this input data to determine a three-dimensional mapping of the surroundings, as described above. This enables the system to provide reliable augmentation data based on the input image data of the scene and to improve the user's orientation in the augmented reality environment. Furthermore, such a three-dimensional map of the scene may be used to provide an image display having a focal distance corresponding to the actual distance of the object, in addition to the viewpoint difference between the images provided to the user's two eyes, thereby providing the user with a complete three-dimensional experience.
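As a sketch of how such a three-dimensional mapping could feed the focal-distance control, the standard rectified-stereo relation Z = f*B/d applies, with the baseline B approximately equal to the interpupillary distance, since both cameras sit at eye-equivalent positions. All numbers below are illustrative assumptions, not values from the patent:

```python
def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_m: float) -> float:
    """Standard stereo relation Z = f * B / d for a rectified image pair."""
    return focal_px * baseline_m / disparity_px

# Eye-equivalent cameras: baseline ~ interpupillary distance (illustrative).
baseline = 0.063          # 63 mm
focal = 800.0             # focal length expressed in pixels
for disp in (100.0, 40.0, 10.0):
    z = depth_from_disparity(disp, focal, baseline)
    print(f"disparity {disp:5.1f} px -> object at {z:.2f} m; "
          f"set display focus to {z:.2f} m")
# disparity 100.0 px -> object at 0.50 m; set display focus to 0.50 m
# disparity  40.0 px -> object at 1.26 m; set display focus to 1.26 m
# disparity  10.0 px -> object at 5.04 m; set display focus to 5.04 m
```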
Accordingly, the present invention provides a display system, generally configured as a head-mounted display system, that utilizes an optical deflection module to direct input light for acquisition by one or more camera units, providing image acquisition from a viewpoint substantially similar to that of the user's eye. This configuration provides a reliable augmented reality experience and improves the user's orientation in the virtual or augmented reality provided by the system.

Claims (13)

1. A system comprising an eye display unit, the eye display unit comprising:
at least one camera unit configured for collecting light arriving along a first optical path from a region of interest of a scene and generating image data indicative thereof;
at least one image forming module configured to receive image data and project an image of the image data along a second optical path toward a user's eye; and
an optical deflection module comprising at least one bi-facial light-reflecting optical element configured to at least partially reflect light from both sides thereof and located in front of the eye where the first and second optical paths intersect and oriented to define the second optical path between the image forming module and the user's eye and the first optical path between the at least one camera unit and the scene; such that the point of view of the at least one camera unit is substantially similar to the line of sight of the user's eyes.
2. The system of claim 1, wherein the at least one camera unit is positioned along the first optical path at an optical plane corresponding to an optical plane of the user's eye relative to the scene without the optical deflection module.
3. The system of claim 1 or 2, wherein the at least one camera is positioned along the first optical path to provide a line of sight substantially similar to a line of sight of a respective user eye.
4. The system of any one of claims 1 to 3, wherein the at least one image forming module comprises an eye projection unit configured to project structured light indicative of one or more images onto respective eyes of a user.
5. The system of any of claims 1 to 4, wherein the at least one image forming module comprises an image display unit configured to provide display image data.
6. The system of any one of claims 1 to 5, wherein the at least one double-sided light-reflecting optical element of the optical deflection module is configured as a double-sided mirror.
7. The system of any one of claims 1 to 6, comprising first and second eye display units corresponding to right and left eyes of a user.
8. The system of any one of claims 1 to 7, wherein the eye display unit further comprises a control unit configured and operable for receiving image data collected by the at least one camera unit and generating corresponding display image data and sending the display image data to the at least one image forming module to provide corresponding images to the user.
9. The system of any one of claims 1 to 8, wherein the optical deflection module is configured to direct input light such that an optical position and line of sight of the at least one camera unit corresponds to an optical position and line of sight of a respective eye of the user.
10. The system of any one of claims 1 to 9, wherein the optical deflection unit locates the at least one camera unit at an eye equivalent location.
11. The system of any one of claims 1 to 10, wherein the at least one display unit is configured to provide seamless image display.
12. The system of any one of claims 1 to 11, wherein the optical deflection module comprises at least one reflective surface configured to provide a selected optical manipulation of light reflected therefrom.
13. The system of claim 12, wherein the at least one reflective surface is configured with at least one of: selected surface curvatures, diffraction gratings, and holographic elements.
CN201880014802.4A 2017-03-01 2018-02-28 Video see-through display system Pending CN110709898A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
IL250869 2017-03-01
IL250869A IL250869B (en) 2017-03-01 2017-03-01 Display system with video see-through
PCT/IL2018/050222 WO2018158765A1 (en) 2017-03-01 2018-02-28 Display system with video see-through

Publications (1)

Publication Number Publication Date
CN110709898A true CN110709898A (en) 2020-01-17

Family

ID=58669502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880014802.4A Pending CN110709898A (en) 2017-03-01 2018-02-28 Video see-through display system

Country Status (12)

Country Link
US (1) US10890771B2 (en)
EP (1) EP3590098A4 (en)
JP (1) JP2020511687A (en)
KR (1) KR20190141656A (en)
CN (1) CN110709898A (en)
AU (1) AU2018226651A1 (en)
CA (1) CA3055143A1 (en)
IL (1) IL250869B (en)
RU (1) RU2019129442A (en)
SG (1) SG11201907794YA (en)
TW (1) TW201843494A (en)
WO (1) WO2018158765A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111818265A (en) * 2020-07-16 2020-10-23 北京字节跳动网络技术有限公司 Interaction method and device based on augmented reality model, electronic equipment and medium
CN112731645A (en) * 2021-01-12 2021-04-30 塔普翊海(上海)智能科技有限公司 Augmented reality telescope system

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11195336B2 (en) 2018-06-08 2021-12-07 Vulcan Inc. Framework for augmented reality applications
US10996831B2 (en) 2018-06-29 2021-05-04 Vulcan Inc. Augmented reality cursors
US11899214B1 (en) 2018-09-18 2024-02-13 Apple Inc. Head-mounted device with virtually shifted component locations using a double-folded light path
US11029805B2 (en) * 2019-07-10 2021-06-08 Magic Leap, Inc. Real-time preview of connectable objects in a physically-modeled virtual space
US11372478B2 (en) * 2020-01-14 2022-06-28 Htc Corporation Head mounted display
CN114637118A (en) * 2022-04-01 2022-06-17 业成科技(成都)有限公司 Head-up display and operation method thereof

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140300632A1 (en) * 2013-04-07 2014-10-09 Laor Consulting Llc Augmented reality apparatus

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2310114A1 (en) * 1998-02-02 1999-08-02 Steve Mann Wearable camera system with viewfinder means
EP1064783B1 (en) * 1998-03-25 2003-06-04 W. Stephen G. Mann Wearable camera system with viewfinder means
US20020030637A1 (en) * 1998-10-29 2002-03-14 Mann W. Stephen G. Aremac-based means and apparatus for interaction with computer, or one or more other people, through a camera
JP3604990B2 (en) * 2000-03-03 2004-12-22 キヤノン株式会社 Image observation system
US7639208B1 (en) 2004-05-21 2009-12-29 University Of Central Florida Research Foundation, Inc. Compact optical see-through head-mounted display with occlusion support
KR100542370B1 (en) 2004-07-30 2006-01-11 한양대학교 산학협력단 Vision-based augmented reality system using invisible marker
JP2007058194A (en) * 2005-07-26 2007-03-08 Tohoku Univ High-reflectance visible-light reflector member, liquid-crystal display backlight unit employing the same and manufacturing method of the high-reflectance visible-light reflector member
US7522344B1 (en) * 2005-12-14 2009-04-21 University Of Central Florida Research Foundation, Inc. Projection-based head-mounted display with eye-tracking capabilities
JP2012155122A (en) * 2011-01-26 2012-08-16 Kyocera Optec Co Ltd Optical unit and infrared imaging sensor
US9195067B1 (en) 2012-09-28 2015-11-24 Google Inc. Wearable device with input and output structures
CN106255917B (en) * 2014-04-30 2018-12-07 夏普株式会社 Reflective projection type display device
US20170010467A1 (en) 2015-07-08 2017-01-12 Castar, Inc. Hmpd with near eye projection
DE102015118520A1 (en) * 2015-10-29 2017-05-04 Krauss-Maffei Wegmann Gmbh & Co. Kg Head-mounted display device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140300632A1 (en) * 2013-04-07 2014-10-09 Laor Consulting Llc Augmented reality apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111818265A (en) * 2020-07-16 2020-10-23 北京字节跳动网络技术有限公司 Interaction method and device based on augmented reality model, electronic equipment and medium
CN111818265B (en) * 2020-07-16 2022-03-04 北京字节跳动网络技术有限公司 Interaction method and device based on augmented reality model, electronic equipment and medium
CN112731645A (en) * 2021-01-12 2021-04-30 塔普翊海(上海)智能科技有限公司 Augmented reality telescope system

Also Published As

Publication number Publication date
IL250869A0 (en) 2017-04-30
IL250869B (en) 2021-06-30
AU2018226651A1 (en) 2019-10-17
KR20190141656A (en) 2019-12-24
CA3055143A1 (en) 2018-09-07
JP2020511687A (en) 2020-04-16
TW201843494A (en) 2018-12-16
EP3590098A1 (en) 2020-01-08
EP3590098A4 (en) 2021-01-20
RU2019129442A (en) 2021-04-01
SG11201907794YA (en) 2019-09-27
WO2018158765A1 (en) 2018-09-07
US20200012107A1 (en) 2020-01-09
US10890771B2 (en) 2021-01-12

Similar Documents

Publication Publication Date Title
US10890771B2 (en) Display system with video see-through
JP6433914B2 (en) Autostereoscopic augmented reality display
US6078427A (en) Smooth transition device for area of interest head-mounted display
KR101845350B1 (en) Head-mounted display device, control method of head-mounted display device, and display system
CN103917913B (en) Head mounted display, the method controlling optical system and computer-readable medium
US6222675B1 (en) Area of interest head-mounted display using low resolution, wide angle; high resolution, narrow angle; and see-through views
CN110275297B (en) Head-mounted display device, display control method, and recording medium
KR20180062946A (en) Display apparatus and method of displaying using focus and context displays
EP3714318B1 (en) Position tracking system for head-mounted displays that includes sensor integrated circuits
US11178380B2 (en) Converting a monocular camera into a binocular stereo camera
EP1276333A2 (en) Adaptive autostereoscopic display system
CN106707508A (en) Cap type virtual reality display image system
CN107076984A (en) Virtual image maker
US11036050B2 (en) Wearable apparatus and unmanned aerial vehicle system
US10609364B2 (en) Pupil swim corrected lens for head mounted display
JP2016536635A (en) System and method for reconfigurable projection augmented reality / virtual reality appliance
CN112655202B (en) Reduced bandwidth stereoscopic distortion correction for fisheye lenses of head-mounted displays
JP2022552586A (en) Multipass scanner for near-eye display
JP4270347B2 (en) Distance calculator
KR101941880B1 (en) Focus Free Type Display Apparatus
US20220350149A1 (en) Waveguide configurations in a head-mounted display (hmd) for improved field of view (fov)
CN117957479A (en) Compact imaging optics with distortion compensation and image sharpness enhancement using spatially positioned freeform optics
WO2020137088A1 (en) Head-mounted display, display method, and display system
WO2022064564A1 (en) Head-mounted display
JP2949116B1 (en) Display device using reflection optical system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40021806)
WD01 Invention patent application deemed withdrawn after publication (Application publication date: 20200117)