WO2005043218A1 - Image display device - Google Patents

Image display device

Info

Publication number
WO2005043218A1
WO2005043218A1 PCT/JP2004/016025 JP2004016025W
Authority
WO
WIPO (PCT)
Prior art keywords
image
observer
specific object
display device
unit
Prior art date
Application number
PCT/JP2004/016025
Other languages
English (en)
Japanese (ja)
Inventor
Shoji Yamada
Yasufumi Mase
Original Assignee
Brother Kogyo Kabushiki Kaisha
Priority date
Filing date
Publication date
Priority claimed from JP2004068889A (JP4599858B2)
Application filed by Brother Kogyo Kabushiki Kaisha
Publication of WO2005043218A1
Priority to US11/413,046 (US7825996B2)

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B2027/0178 Eyeglass type
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/74 Projection arrangements for image reproduction, e.g. using eidophor
    • H04N5/7475 Constructional details of television projection apparatus
    • H04N5/7491 Constructional details of television projection apparatus of head mounted projectors

Definitions

  • The present invention relates to an image display device that allows an observer to visually recognize a display target as a virtual image by projecting light onto the observer's retina, and in particular to a technology for controlling the display of the image in consideration of the observer's motion.
  • An image display device that projects light onto an observer's retina so that the observer views a display object as a virtual image is already known (see, for example, Japanese Patent Application Laid-Open No. 8-21975).
  • This type of image display device generally includes (a) an emission unit that emits light (for example, a light source), (b) a modulation unit that modulates the light (for example, an intensity modulator or a wavefront modulator), (c) a display unit that displays a display target (content or object) as a virtual image in an image display area, and (d) a control unit that controls the emission unit and the modulation unit so that the image is displayed in the image display area.
  • One type of this image display device is a head-mounted type in which the display unit is used while mounted on the observer's head; one prior-art example is disclosed in Japanese Patent Application Laid-Open No. Hei 8-21975.
  • With such a device, the image is observed at the same position within the image display area (fixed to the observer's head) regardless of the observer's movement. That is, the observer observes the image in a state where it is fixed to the display unit, in other words, where the display position of the image is fixed in the observer's field of view.
  • When the image display device is a see-through type, the observer can superimpose the image on the real outside world and observe both together.
  • However, because this image display device is designed so that the image is observed while fixed in the observer's field of view, the image remains fixed there despite movements of the observer (for example, body movement or head movement). An observer who expects an image to be fixed in the external world therefore perceives as unnatural an image displayed immovably in the field of view rather than in the external world.
  • In view of this, the present inventor studied the above head-mounted image display device with respect to the relationship between the observer's movement and the display of the image, and realized that the display of the image can be controlled by relatively detecting the observer's motion with reference to the observer's position and referring to the detected motion.
  • An object of the present invention is therefore to provide, for an image display device that projects light onto the retina of an observer so that a display target is visually recognized as a virtual image, a new technology for controlling the display of the image in consideration of the observer's motion.
  • (1) An image display device including:
  • an emission unit that emits light;
  • a modulation unit that modulates the light;
  • a display unit, mounted on the observer's head, that emits the modulated light through an emission port toward the observer's retina to display an image representing the display target as a virtual image in an image display area; and
  • a control unit that controls the emission unit and the modulation unit so that the image is displayed in the image display area, relatively detects the observer's motion with reference to the observer's position, and performs, in accordance with the detected motion, at least one of turning the display of the image on or off and changing the display position of the image.
  • While using an image display device that projects light onto the retina so that a display target is viewed as a virtual image, the observer may wish to temporarily stop displaying the image, to prevent the displayed image from obstructing the view, or to move the display position so that it keeps a desired relationship with the observer's movement.
  • To satisfy such wishes, the image displayed in the image display area can be temporarily erased, or the displayed image can be made to appear as if it were fixed to a part of the observer's body (for example, the waist). Furthermore, the position at which the image is normally displayed in the image display area (for example, the center) can be changed to a retracted position (for example, a position close to the edge of the image display area).
  • For this purpose, the image display device could be provided with an operation unit through which the observer inputs a command to temporarily stop (turn off) the display of the image (or a command to restart (turn on) the display of a temporarily stopped image), or a command to change the display position of the image from the normal position to the retracted position (or a command to return the temporarily changed display position to the normal position).
  • With such an operation unit, however, the observer is forced to use a hand to operate it, and must therefore temporarily suspend any other manual work in order to control the image display. Moreover, when the position of the operation unit changes, the target position to which the observer must move the hand changes accordingly; each time the observer attempts to operate the unit, he or she must first search for its position visually or tactilely and move the hand there, which is troublesome.
  • In the image display device described in this section, by contrast, the observer's motion is relatively detected with reference to the observer's position, and display control, which is at least one of turning the image display on or off and changing its display position, is performed in accordance with the detected motion.
  • With this image display device, although the observer's wishes regarding image display (for example, turning the display on or off and changing the display position) must be detected, it is not indispensable to provide an operation unit for that purpose, nor is it essential for the observer to use a hand to input such a request.
  • The image display device of this section can therefore be implemented in such a manner that the observer can input a request for image display without interrupting work performed in parallel while using the device.
  • Furthermore, the motion required of the observer in order to input a request does not depend on the position of the image display device (for example, the position of an operation unit). The observer therefore need not search for a specific target object of an operation unit (for example, a button, switch, dial, or touch panel) in order to input the desired image display.
  • In this image display device, the observer's motion referred to for recognizing the observer's wish regarding image display is detected with reference to the observer's position (a reference part of the observer). The motion detected to recognize that wish therefore does not depend on the observer's position in absolute space, that is, on the observer's position relative to the outside world.
  • With this image display device, the hardware for detecting the observer's motion can accordingly be configured in a self-contained manner on the observer, and installing detection hardware in the outside world can be omitted. This contributes to simplifying the hardware configuration for detecting the observer's motion.
  • the "modulation unit” determines the intensity of each light beam component and each light beam component or At least the intensity of the wavefront curvature of the combined light beam can be modulated. It is not essential to configure to modulate the wavefront curvature.
  • An example of the "emitting unit” in this section is a laser light source.
  • An example of the laser light source is a light source having a function of modulating the intensity of emitted light, such as a semiconductor laser. This light source has both a function as an “emission unit” and a function as a “modulation unit”.
  • "Change of display position" in this section means, for example, a change of the display position of the image within the image display area.
  • (2) The image display device according to (1), further including a detection unit that detects the position of a detected part of the observer relative to a reference part selected in advance to set a reference position, wherein the control unit performs the display control based on the detected relative position.
  • The "reference position" in this section can be set, for example, as a position on a certain part of the human body, or as a position separated from that part but geometrically fixed to it.
  • (3) The image display device according to (2), wherein the detection unit includes a first part and a second part that are relatively displaceable with respect to each other, the first part being attached to the detected part and moving integrally with it, the second part being attached to the reference part and moving integrally with it, and the detection unit further including detection means for detecting the relative position between the first part and the second part.
  • In this device, since the detector for detecting the observer's motion is attached to the observer, no detector installed in the real world (for example, a sensor installed on a fixed part such as a floor or a wall) is required.
  • The "detection means" in this section can detect the relative position between the first part and the second part using, for example, at least one of an electric field, a magnetic field, an electromagnetic wave, light, and a sound wave.
  • (4) The image display device according to (3), wherein one of the first part and the second part is a signal generation unit that generates a detection signal for detecting the relative position, and the other is a reception unit that receives the generated detection signal propagating through space.
  • In this image display device, the detection unit uses a signal propagating through space to detect the relative position between the first part, which moves integrally with the detected part of the human body, and the second part, which moves integrally with the reference part. For the detection, therefore, the first part and the second part need not be connected to each other by a tangible signal transmission medium such as a link mechanism or a cable.
  • As a result, the range over which the relative position between the first part and the second part can change is not restricted for the purpose of detecting the observer's motion, and thus the range of the observer's movement is not restricted either.
  • the "detection signal" in this section is, for example, an electric field, a magnetic field, an electromagnetic wave, light, a sound wave, or the like.
  • (5) The image display device according to any one of (1) to (4), wherein the control unit performs the display control based on the detected relative position when the observer's motion satisfies a set condition.
  • An example of the "setting condition" in this section is a condition relating to the orientation of the head of the observer, and the head is oriented in a direction in which the observer faces the front (for example, the observer's head is shifted). Is looking sideways or down).
  • Another example is a condition relating to the movement of the observer's hand, such as the hand being lifted to a certain position.
  • Yet another example is a motion away from the normal direction, for example, a rotation of the eyeball or a rotation of the head.
  • The set condition may include a first set condition and a second set condition.
  • When the observer's motion satisfies the first set condition, at least one of switching the display of the image from the on state to the off state to temporarily stop the image display, and moving the display position of the image from the normal position to the retracted position, is performed.
  • When the observer's motion satisfies the second set condition, at least one of switching the display from the off state to the on state to resume the image display, and returning the display position from the retracted position to the normal position, is performed. It is thus possible, for example, to display the image as if its position were fixed to a part of the observer's body (for example, the waist). A sketch of this two-condition behavior follows.
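
A minimal sketch of the two set conditions described above, assuming a head-pitch signal measured relative to the waist; the threshold values, names, and hysteresis gap are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch: toggle the display off/retracted when the observer
# looks down past one threshold (first set condition) and back on/normal
# when the head returns near level (second set condition).

LOOK_DOWN_PITCH = 30.0   # degrees; first set condition threshold (assumed)
LOOK_LEVEL_PITCH = 10.0  # degrees; second set condition threshold (assumed)

class DisplayController:
    def __init__(self):
        self.display_on = True
        self.position = "normal"   # "normal" or "retracted"

    def update(self, head_pitch_deg: float) -> None:
        if self.display_on and head_pitch_deg > LOOK_DOWN_PITCH:
            # First set condition satisfied: stop display, retract position.
            self.display_on = False
            self.position = "retracted"
        elif not self.display_on and head_pitch_deg < LOOK_LEVEL_PITCH:
            # Second set condition satisfied: resume display at normal position.
            self.display_on = True
            self.position = "normal"
```

Using two different thresholds gives hysteresis, so small head movements near the boundary do not make the display flicker on and off.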
  • The image display device according to (2), wherein the detected part is set on at least the head among the observer's head, arms, and feet, and the reference part is set on the observer's waist.
  • The parts to which the observer can impart relatively large positional changes include, for example, the head, the arms, and the feet.
  • the waist is a part whose position is relatively stable despite changes in posture.
  • The head, moreover, is the part that moves together with the observer's eyes.
  • In this image display device, therefore, the detected part is set on at least the head among the observer's head, arms, and feet, and the reference part is set on the observer's waist.
  • The image display device according to any one of (2) to (6), wherein the control unit controls the emission unit and the modulation unit such that the observer observes the image at a position determined with reference to the reference part.
  • When an image is displayed based on the position of the display unit of the image display device, the position of the image relative to the display unit on the observer's head does not change whatever motion the observer performs; the observer therefore observes the image at the same position in the image display area (fixed to the observer's head) regardless of the observer's movement.
  • When the image display device is a see-through type, that is, a format that allows the observer to superimpose the image on the real outside world and observe both together, the observer may perform a specific task while intensively observing a specific object in the outside world with the display unit attached to the head.
  • In the course of such work, it is conceivable to display related information as an image in the image display area so that the observer can visually refer to information related to the work.
  • In that case, however, the display image is superimposed on the target object, and the object to be observed may be at least partially obstructed by the display image.
  • The observer may therefore wish to observe the above-mentioned related information with the gaze deviated from the specific target, for example by taking a certain posture.
  • One example of such a posture is when the observer bows deeply to look at his or her own waist.
  • Another example is when the observer turns the head to look obliquely forward or almost directly to the side.
  • Yet another example is when the observer bends an arm so that the part of the arm between the wrist and the elbow is in front of the upper body and almost horizontal, and tilts the head to look directly at that part.
  • In this image display device, the image is not displayed based on the position of the display unit of the image display device; instead, the image is displayed based on a reference part of the observer's human body.
  • As a result, while the image display is in the on state, the image display position changes with a change in the relative position between the detected part (for example, the head) and the reference part (for example, the waist or an arm).
  • In general, determining the display position of the image according to the observer's motion and determining the on/off state of the image display according to the observer's motion are independent of each other; no coordination between them is required.
  • For example, the device can be implemented such that, even after the display position of the image has been determined according to the observer's motion, the image is actually displayed in the image display area only after a further indication of the observer's intention (for example, operation of an operation unit such as a switch by the observer). In this case, the observer's motion that determines the display position of the image and the observer's indication of intention that finally permits the image display are independent of each other.
  • (8) The image display device according to any one of (1) to (7), wherein, in order to display the image in the image display area, the control unit uses (a) a definition coordinate system used to define the display target and (b) a display coordinate system, fixed to the display unit, that defines the image to be displayed in the image display area; the control unit further detects the relative relationship between the position and orientation of the observer's head and a reference coordinate system predetermined to be associated with the human body, converts the definition data defining the display target on the definition coordinate system into display data for displaying the image on the display coordinate system so as to reflect the detected relative relationship, and controls the emission unit and the modulation unit based on the converted display data.
  • In this image display device, the definition data defining the display target on the definition coordinate system are thus converted into display data for displaying the image on the display coordinate system so as to reflect the relative relationship between the position and orientation of the observer's head and the reference coordinate system associated with the human body, and the emission unit and the modulation unit are controlled based on the converted display data.
  • As a result, the position at which the image is displayed in the image display area is determined according to motion related to the observer's head. Since the determined position conceptually includes positions outside the image display area, whether the image is displayed in the image display area at all is also decided according to that motion, as the sketch below illustrates.
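
The conversion chain can be pictured with homogeneous transforms. The sketch below is a minimal illustration, assuming 4x4 matrices and hypothetical names (the patent does not prescribe this representation): a point defined in the reference (for example, waist) frame is mapped into the display coordinate system through the detected head pose.

```python
import numpy as np

def transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def to_display(point_def: np.ndarray,
               T_head_from_ref: np.ndarray,      # detected head pose vs. reference frame
               T_display_from_head: np.ndarray   # fixed mounting of display on head
               ) -> np.ndarray:
    """Map a display-target point from the definition (here: reference)
    coordinate system into the display coordinate system."""
    p = np.append(point_def, 1.0)                # homogeneous coordinates
    return (T_display_from_head @ T_head_from_ref @ p)[:3]
```

A converted point whose coordinates fall outside the bounds of the image display area is simply not drawn, which is how the same conversion also decides whether the image appears at all.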
  • The image display device according to (8), wherein the control unit converts the display target data into display data defined on the display coordinate system while keeping the definition coordinate system fixed to the display coordinate system.
  • This image display device can thereby carry out, instead of the observer reference display mode described later (in which the image is displayed at a position determined based on the observer's position, by converting the definition data so as to reflect the relative relationship between the position and orientation of the observer's head and the reference coordinate system associated with the human body), the display unit reference display mode described later, in which the image is displayed at a position determined based on the position of the display unit of the image display device.
  • When the image display device of this section executes the display unit reference display mode, no substantial coordinate conversion acts on the definition data defining the display target on the definition coordinate system; the definition data are effectively mapped onto the display coordinate system as they are, and the display data for displaying the image on the display coordinate system are obtained as data equal to the definition data.
  • The image display device according to (8), wherein the control unit converts the display target data defined on the definition coordinate system into display data defined on the display coordinate system while keeping the definition coordinate system fixed to the reference coordinate system. According to this image display device, since the definition coordinate system and the reference coordinate system are equivalent, the image is displayed as if it were fixed to the reference coordinate system.
  • In this case, the original data expressed in the definition coordinate system are converted into display data for displaying the image on the display coordinate system based on the relative relationship between the position and orientation of the observer's head and the reference coordinate system associated with the human body, whereby the display target data, which are the definition data defined on the definition coordinate system, are mapped onto the display coordinate system.
  • The waist is a part whose position is stable irrespective of changes in the posture of the human body. When the position and orientation of the head must be detected relatively with reference to the human body, it is therefore desirable to perform the detection with reference to the position of the waist. For example, when an image is displayed in association with the observer's waist, whether the image can be observed by the observer is determined based on the relative relationship between the waist and the head, and if it can, the position at which the image is observed is determined in the same way.
  • In that case, both the reference coordinate system and the definition coordinate system should be fixed to the waist; the reference coordinate system then also serves as the definition coordinate system.
  • The image display device according to (8), wherein the control unit fixes the definition coordinate system to a third coordinate system that is different from both the reference coordinate system and the display coordinate system and is associated with some part of the observer's human body, and converts the display target data defined on the definition coordinate system into display data defined on the display coordinate system.
  • For example, the definition coordinate system can be fixed to the arm while the reference coordinate system is fixed to the waist. This is an example in which the reference coordinate system is a coordinate system different from both the definition coordinate system and the display coordinate system.
  • The image display device according to (8) or (11), wherein the reference coordinate system is specified by any of: (a) a combination of position information of at least one point associated with the human body and at least two pieces of orientation information associated with that point; (b) a combination of position information of at least two points associated with the human body and at least one piece of orientation information associated with at least one of those points; and (c) a set of position information of at least three points associated with the human body.
  • the "point associated with the human body” in this section does not only mean a point located on the human body but also an object having a generally fixed positional relationship with the human body (for example, an object mounted on the human body). It may mean a point on the aid.
  • The object can be, for example, an object held and used by the hand of the human body, or an object used while attached to a part of the human body other than the hand. It may be configured as a single object with which the plurality of points are commonly associated, or as a plurality of independent objects with which the points are individually associated, one or more per object.
  • The image display device wherein the reference coordinate system is specified by position information of three points associated with a single part of the human body.
  • the "three points" in this section are, for example, three points that are generally arranged at equal intervals on the outer circumference of a cross section when a human body is cut along a generally horizontal plane.
  • The image display device wherein the reference coordinate system is specified by position information of two points associated with a single part of the human body and the direction of gravity acting on one point associated with that part.
  • the "two points” in this section are, for example, two points associated with the waist of the human body, which are respectively associated with the left end point and the right end point of the waist, and the left arm of the human body. Two points associated with the region between the wrist and the elbow joint, either of the right arm or two points associated with the shoulder of the human body It is associated with the left end point and the right end point, respectively.
  • The image display device according to any one of (2) to (14), wherein the control unit selects one of a plurality of display modes, including an observer reference display mode in which the image is displayed at a position determined based on the reference part so that the observer observes it there, and a display unit reference display mode in which the image is displayed at a position determined based on the position of the display unit so that the observer observes it there, and displays the image according to the selected display mode.
  • In this image display device, the mode for displaying the image is selected from a plurality of display modes including the observer reference display mode and the display unit reference display mode. Therefore, the display mode can be changed in accordance with a request from the observer or a command from another device, which makes it easy to improve the versatility of the image display device.
  • The control unit in this section can select the display mode, for example, in accordance with a command from the observer or with a command signal from another device.
  • the "plural types of display modes" in this section further include a real world reference display mode in which the image is displayed at a position determined with reference to the position of the real world so that the observer observes the image. May be included.
  • the "observer command" in this section means, for example, that the observer operates a specific switch.
  • Specific movements of the observer are, for example, head movements, limb movements, eye blinks, and pupil movements.
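
The selection among display modes might be organized as below; the mode names follow the text, while the enum and the switching logic are assumptions for illustration:

```python
from enum import Enum, auto

class DisplayMode(Enum):
    OBSERVER_REFERENCE = auto()      # position determined from the reference part
    DISPLAY_UNIT_REFERENCE = auto()  # position determined from the display unit
    REAL_WORLD_REFERENCE = auto()    # position determined from the real world

def select_mode(observer_command, device_command, current):
    """Select the display mode from an observer command or a command signal
    from another device, keeping the current mode otherwise (sketch)."""
    for cmd in (observer_command, device_command):
        if cmd in DisplayMode.__members__:
            return DisplayMode[cmd]
    return current

# e.g. select_mode("OBSERVER_REFERENCE", None, DisplayMode.DISPLAY_UNIT_REFERENCE)
```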
  • The image display device according to any one of (1) to (16), which is of a see-through type that enables the observer to superimpose the image representing the display target on the real outside world and observe both together.
  • In this image display device, the display of the image to the observer is controlled in accordance with the observer's motion. By adopting the see-through type, therefore, in an environment where the image is superimposed on the real world and observed together with it, it is easy to reduce the possibility that the observer's movement makes the image feel unnatural in relation to the real world.
  • Moreover, the object that the observer preferentially observes (that is, the object being particularly gazed at) can be switched as appropriate between the image and a specific object in the real world according to the observer's motion. This makes it easy to improve the usability of the image display device.
  • The image display device according to any one of (1) to (17), which is of a retinal scanning type that displays an image to the observer by two-dimensionally scanning a light beam on the observer's retina, wherein the modulation unit includes a wavefront curvature modulation unit that modulates the wavefront radius of curvature of the light beam entering the observer's pupil from the image display area, for each frame of the image to be displayed or for each partial area into which the frame is divided.
  • In this image display device, the radius of curvature of the wavefront of the light forming the image projected on the observer's retina is modulated for each frame of the image or for each partial area into which the frame is divided.
  • Each "partial area" in this section is, for example, each pixel constituting the frame, or a group of mutually adjacent pixels.
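
Because wavefront curvature in diopters is the reciprocal of the perceived viewing distance in meters, per-area modulation can be sketched as follows (the frame layout and names are illustrative assumptions):

```python
# Illustrative sketch: derive a wavefront-curvature command for each
# partial area (here, each pixel) of a frame from its depth in meters.

def curvature_commands(depth_map_m):
    """depth_map_m: 2D list of per-pixel depths in meters (all > 0);
    returns the matching per-pixel curvatures in diopters."""
    return [[1.0 / d for d in row] for row in depth_map_m]

# Example: a 2x2 frame whose left column appears at 0.5 m (2.0 D)
# and whose right column appears at 2 m (0.5 D).
frame = [[0.5, 2.0],
         [0.5, 2.0]]
print(curvature_commands(frame))  # [[2.0, 0.5], [2.0, 0.5]]
```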
  • An image display device wherein the image includes a plurality of objects to be displayed at once, the plurality of objects being displayed three-dimensionally based on a plurality of pieces of image information having depth position information for specifying the depth position of each object and plane position information for specifying the position of each object on a plane at that depth position, the image display device further including:
  • a selection unit that selects any of the plurality of objects as a specific object; and
  • an image processing unit that performs image processing for changing the display mode of the image based on the depth position information corresponding to the specific object selected by the selection unit.
  • In this image display device, the display mode of the image can be changed based on the depth position of the specific object, so the image can be displayed in various display modes based on that depth position. For example, the observer can then see the specific object clearly.
  • When the selection unit includes a specific object detection unit that detects the object of interest of the observer, image processing can be performed based on the depth position of the detected specific object, so that, for example, the specific object can be seen clearly by the observer.
  • In one form, the specific object detection unit includes:
  • a gaze direction detection unit that detects the gaze direction of the observer; and
  • a determination unit that searches the plurality of pieces of image information for the image information corresponding to the detected gaze direction and determines the specific object based on the retrieved image information.
  • With this image display device, the specific object at which the observer is looking can be detected by detecting the direction of the observer's line of sight.
  • In another form, the specific object detection unit includes:
  • a gaze direction detection unit that detects the gaze directions of both the observer's left and right eyes; and
  • a calculation unit that calculates the position of the observer's gaze point based on the detected gaze directions of the left and right eyes and the distance between the eyes, and detects the specific object based on the calculation result.
  • With this configuration, the specific object of interest to the observer can be detected more accurately, because the distance between the eyes is considered in addition to the directions of the observer's lines of sight. A sketch of the calculation follows.
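
A minimal sketch of the calculation unit, under the simplifying assumption of a 2D top-down view with both eyes on the x-axis: each eye contributes a gaze ray, and the intersection of the two rays gives the gaze point, whose position then identifies the specific object.

```python
import math

def gaze_point(ipd_m: float, angle_left: float, angle_right: float):
    """Intersect the two gaze rays (illustrative 2D sketch).

    ipd_m: distance between the left and right eyes in meters.
    angle_left, angle_right: gaze angles in radians measured from straight
    ahead, positive toward the nose (convergence).
    """
    # Eyes sit at (-ipd/2, 0) and (+ipd/2, 0); the rays meet at depth z
    # where the horizontal offsets match: z * (tan(aL) + tan(aR)) = ipd.
    denom = math.tan(angle_left) + math.tan(angle_right)
    if denom <= 0:
        return None  # parallel or diverging rays: no fixation point
    z = ipd_m / denom
    x = -ipd_m / 2 + z * math.tan(angle_left)
    return x, z

# Eyes 64 mm apart, each converged 2 degrees inward -> fixation ~0.92 m ahead.
print(gaze_point(0.064, math.radians(2), math.radians(2)))
```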
  • In one form, the image display device includes a depth position detection unit that detects the depth position of each object based on the depth position information, and the image processing unit performs image processing that displays in mutually different display modes at least the specific object (together with other objects displayed at the same depth position as the specific object) and non-specific objects displayed at depth positions different from that of the specific object.
  • A non-specific object displayed at a depth position different from that of the specific object can thus be displayed in a mode different from the specific object, so that, for example, the specific object can be seen clearly by the observer.
  • In one form, the image processing unit includes means for changing the luminance of the specific object.
  • The luminance of the specific object can then be changed based on its depth position, so that, for example, the observer can see the specific object clearly.
  • In another form, the image processing unit includes means for sharpening the outline of the specific object.
  • With this image display device, the sharpness of the outline of the specific object can be changed based on its depth position, so that, for example, the observer can see the specific object clearly.
  • The outline can be provided along the outer shape of the specific object, which likewise helps the observer to see it clearly.
  • In another form, the image processing unit includes means for displaying, transparently or translucently, a non-specific object located in front of the specific object among the plurality of objects, based on the detection result of the depth position detection unit.
  • Since the non-specific object located in front of the specific object is displayed transparently or translucently, the specific object can be seen clearly without being obstructed by the non-specific object in front of it. The sketch below combines these depth-based adjustments.
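
Pulling the preceding items together, the image processing could walk over the objects and adjust each one's display mode from its depth relative to the specific object; the object structure and the adjustment factors below are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    depth_m: float      # from the depth position information
    luminance: float    # display luminance, 0..1
    alpha: float = 1.0  # 1.0 opaque .. 0.0 fully transparent

def process(objects: list, specific: Obj) -> None:
    """Sketch of the depth-based image processing described above."""
    for obj in objects:
        if obj is specific:
            obj.luminance = min(1.0, obj.luminance * 1.5)  # emphasize it
        elif obj.depth_m < specific.depth_m:
            obj.alpha = 0.3       # occluder in front: show it translucently
        elif obj.depth_m > specific.depth_m:
            obj.luminance *= 0.5  # farther non-specific objects: dim them
        # objects at the same depth as the specific object are left as-is
```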
  • An image display device for displaying an image, wherein the image to be displayed includes a plurality of objects to be displayed at once, the plurality of objects being displayed three-dimensionally based on a plurality of pieces of image information having depth position information for specifying the depth position of each object and plane position information for specifying the position of each object on a plane at that depth position, the image display device including:
  • a selection unit that selects any of the plurality of objects as a specific object; and
  • an image processing unit that performs image processing for changing the display mode of the image based on the depth position information corresponding to the specific object selected by the selection unit.
  • a retinal scanning display that directly displays an image on a human retina is already known.
  • As shown in FIG. 26, the retinal scanning display 400 modulates light in accordance with image information and causes the modulated light H to be incident on the projection glasses 402 while scanning it, so that the image is displayed by the light reflected from the light reflecting portion 404 of the glasses 402.
  • Specifically, the modulated light H reflected by the projection glasses 402 passes through the pupil M2, surrounded by the iris M1 of the observer's eye M, and is focused by the lens M3. By scanning the focused light on the retina M4, an image is displayed on the observer's retina M4.
  • In this retinal scanning display, the angle of diffusion of the modulated light H, shown in FIG. 26, can be changed to each of the angles indicated by dashed lines.
  • By changing the diffusion angle, the observer recognizes the virtual images (images) as existing at the positions P1, P2, and P3, respectively. According to this type of retinal scanning display, therefore, an image can be displayed stereoscopically by changing the diffusion angle of the modulated light H.
  • an image can be displayed in three dimensions.
  • In the real world, when the observer focuses the eye on a distant object and observes it, the foreground is perceived as blurred because the eye is not focused on the foreground. Conversely, when the eye is focused on a near object, a distant object is perceived as blurred because the eye is not focused on the distance.
  • In view of this, an object of the image display device is to display an image having objects in three dimensions while allowing the observer to view the image clearly regardless of the depth positions of the objects.
  • In this image display device, the display mode of the image can be changed based on the depth position of the specific object, so the image can be displayed in various display modes based on that depth position; for example, clear visual observation of the image by the observer becomes possible.
  • The image display device according to (29), wherein the image processing unit performs image processing for displaying the specific object selected by the selection unit and non-specific objects other than the specific object in different display modes.
  • Since the specific object can be displayed in a display mode different from that of the non-specific objects, the degree of freedom in the display mode of the specific object is improved.
  • The selection unit may be configured to select the object of interest of the observer from among the plurality of objects.
  • With this image display device, the specific object that the observer is paying attention to can be detected and image processing can be performed based on its depth position, so that, for example, the specific object can be seen clearly by the observer.
  • In one form, the specific object detection unit includes:
  • a gaze direction detection unit that detects the gaze direction of the observer; and
  • a determination unit that searches the plurality of pieces of image information for the image information corresponding to the detected gaze direction and determines the specific object based on the retrieved image information.
  • With this image display device, the specific object at which the observer is looking can be detected by detecting the direction of the observer's line of sight.
  • In another form, the specific object detection unit includes:
  • a gaze direction detection unit that detects the gaze directions of both the observer's left and right eyes; and
  • a calculation unit that calculates the position of the observer's gaze point based on the detected gaze directions of the left and right eyes and the distance between the eyes, and detects the specific object based on the calculation result.
  • The specific object of interest to the observer can thus be detected more accurately by considering the distance between the eyes in addition to the directions of the observer's lines of sight.
  • The image display device according to any of (29) to (33), including a depth position detection unit that detects the depth position of each object based on the depth position information, wherein the image processing unit performs image processing that displays in mutually different display modes at least the specific object (together with other objects displayed at the same depth position as the specific object) and non-specific objects displayed at depth positions different from that of the specific object. According to this image display device, a non-specific object displayed at a depth position different from that of the specific object can be displayed in a mode different from the specific object, so that, for example, the specific object can be seen clearly by the observer.
  • In one form, the image processing unit includes means for changing the luminance of the specific object.
  • The luminance of the specific object can then be changed based on its depth position, so that, for example, the observer can see the specific object clearly.
  • In another form, the image processing unit includes means for sharpening the outline of the specific object.
  • With this image display device, the sharpness of the outline of the specific object can be changed based on its depth position, so that, for example, the observer can see the specific object clearly.
  • The image display device according to any of the above items, wherein the image processing unit includes means for displaying, transparently or translucently, a non-specific object located in front of the specific object among the plurality of objects, based on the detection result of the depth position detection unit.
  • Since the non-specific object located in front of the specific object is displayed transparently or translucently, the observer can see the specific object clearly without being obstructed by the non-specific object in front of it.
  • The image display device according to any of (31) to (37), wherein the image processing unit includes means for changing the display mode of non-specific objects lying within a range that forms a predetermined viewing angle with the specific object among the plurality of objects. According to this image display device, the display mode of non-specific objects within that range can be changed, for example by hiding them, so that the specific object can be seen clearly by the observer.
  • The image display device according to any of the above items, wherein the image processing unit includes a display object position changing unit that changes the displayed position of at least one of the specific object and the non-specific objects other than the specific object among the plurality of objects.
  • The relative depth position between the specific object and a non-specific object can thus be changed arbitrarily, so that, for example, the observer can see the specific object clearly.
  • The image display device according to (40), including:
  • a light beam output unit that outputs a light beam in accordance with the plane position information; and
  • a wavefront curvature modulation unit that modulates the wavefront curvature of the light beam output from the light beam output unit in accordance with the depth position information,
  • wherein the display object position changing unit controls the wavefront curvature modulation unit. By controlling the wavefront curvature modulation unit, an arbitrary object can be displayed at an arbitrary depth position, so that, for example, the specific object can be seen clearly by the observer.
  • The image display device according to (40) or (41), including a depth position detection unit that detects the depth position of each object based on the depth position information, wherein the display object position changing unit includes means for displaying a non-specific object located at a depth position farther than that of the specific object so that it approaches the vicinity of the specific object, based on the detection result of the depth position detection unit.
  • Since a non-specific object (for example, a background image) farther than the specific object can be displayed close to it, the observer can see the specific object and the non-specific object in the same diopter (focus) state, without blur.
  • The image display device according to (40) or (41), including a depth position detection unit that detects the depth position of each object based on the depth position information, wherein the display object position changing unit includes means for displaying a non-specific object located at a depth position farther than that of the specific object even farther from it, based on the detection result of the depth position detection unit.
  • Since a non-specific object farther than the specific object can be displayed even farther away, the observer can see the specific object clearly in a state where the perspective is emphasized. Both repositioning variants are sketched below.
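
Both repositioning variants amount to rewriting the depth that the wavefront curvature modulation unit realizes for each object. A sketch under assumed names, reusing the Obj records from the earlier sketch:

```python
def reposition(objects, specific, mode="approach", margin_m=0.05, factor=2.0):
    """Change the displayed depth of non-specific objects lying farther
    than the specific object (illustrative sketch).

    mode="approach": bring them just behind the specific object, so both
    are seen in nearly the same diopter (focus) state, without blur.
    mode="recede":   push them farther away to emphasize perspective.
    """
    for obj in objects:
        if obj is not specific and obj.depth_m > specific.depth_m:
            if mode == "approach":
                obj.depth_m = specific.depth_m + margin_m
            else:
                obj.depth_m *= factor
    # The new depths are then realized by the wavefront curvature
    # modulation unit, e.g. curvature in diopters = 1.0 / obj.depth_m.
```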
  • In one form, the image display device includes:
  • a light beam output unit that displays one pixel with one light beam;
  • an overlapping object detection unit that detects, from among the plurality of objects, a non-specific object to be displayed in front of and overlapping the specific object; and
  • an overlapping object display unit that, when the overlapping object detection unit detects such a non-specific object, displays at least the portion of the non-specific object that should be displayed overlapping the specific object so that the specific object is not hidden by the non-specific object.
  • With this image display device, when the specific object would normally be displayed behind a non-specific object in front of it, at least the portion of the non-specific object that overlaps the specific object is displayed transparently. The observer can therefore recognize the specific object completely through the overlapping portion with the non-specific object.
  • The image display device as described above, wherein the overlapping object display unit includes a second display unit that translucently displays at least the portion of the non-specific object that should be displayed overlapping the specific object.
  • With this image display device, when the specific object would normally be displayed behind a non-specific object in front of it, at least the portion of the non-specific object that overlaps the specific object is displayed translucently. The observer can therefore recognize the specific object through the partially transmissive overlapping portion with the non-specific object.
  • The image display device further including a mixing ratio changing unit, wherein the second display unit displays the portion of the non-specific object that should be displayed overlapping the specific object in a mixed color in which the color of the non-specific object and the color of the specific object are mixed at a certain ratio, and the mixing ratio changing unit changes the ratio of each color in the mixed color. A sketch of this blending follows.
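
The translucent overlap with a mixing ratio is ordinary linear color blending; the sketch below (illustrative only, RGB components in 0..1) shows where the parameter of the mixing ratio changing unit enters:

```python
def mix_colors(specific_rgb, nonspecific_rgb, ratio):
    """Blend the overlapping portion. ratio is the share of the
    non-specific object's color: 0.0 leaves the specific object fully
    visible, 1.0 hides it completely (illustrative sketch)."""
    return tuple(ratio * n + (1.0 - ratio) * s
                 for s, n in zip(specific_rgb, nonspecific_rgb))

# Lowering the ratio makes the specific object show through more strongly.
print(mix_colors((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.25))  # mostly red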
  • The image display device according to any of the above items, further including a scanning unit that scans the light beam on the retina in accordance with the plurality of pieces of image information.
  • an image can be displayed directly on the retina by scanning the light beam on the retina.
  • The image display device according to any one of the above items beginning with (29), further including an operation unit operated to selectively disable the operation of the image processing unit.
  • the operation of the image processing unit can be selectively disabled according to the user's preference. Therefore, usability of the image display device is improved.
  • FIG. 1 is a block diagram conceptually showing an image display device according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram conceptually showing a configuration of a display optical system 10 in FIG. 1.
  • FIG. 3 is a perspective view showing an example in which the head mounting part 40 and the waist mounting part 50 in FIG. 1 are mounted on the head and the waist of the observer, respectively.
  • FIG. 4 is a block diagram showing, extracted from the image display device shown in FIG. 1, several elements for relatively detecting the position and orientation of the observer's head with reference to the waist.
  • FIG. 5 is a flowchart conceptually showing a position detection program executed by computer 220 in FIG.
  • FIG. 6 is a perspective view for explaining a display coordinate system and a reference coordinate system used in the image display device shown in FIG. 1.
  • FIG. 7 is a flowchart conceptually showing a display data creation program executed by a computer in display data conversion circuit 70 in FIG. 1.
  • FIG. 8 is a block diagram conceptually showing an image display device according to a second embodiment of the present invention.
  • FIG. 9 is a flowchart conceptually showing an image processing program executed by computer 66 in FIG.
  • FIG. 10 is a block diagram and an optical path diagram for describing a configuration and operation of a retinal scanning display 200 according to a third embodiment of the present invention.
  • FIG. 11 is a diagram for explaining a configuration of modulated light output section 204 in FIG.
  • FIG. 12 is a diagram for explaining a configuration of a scanning unit 206 and a wavefront curvature modulation unit 208 in FIG.
  • FIG. 13 is a diagram for explaining the principle of the wavefront curvature modulation unit 208 in FIG. 12 modulating the wavefront curvature.
  • FIG. 14 is a flowchart conceptually showing an image processing program executed by computer 220 in FIG.
  • FIG. 15 is a perspective view showing an example of a display image used for specifically describing the image processing program shown in FIG.
  • FIG. 16 is a flowchart conceptually showing an image processing program executed by computer 220 in retinal scanning display 200 according to a fourth embodiment of the present invention.
  • FIG. 17 is a flowchart conceptually showing an image processing program executed by computer 220 in retinal scanning display 200 according to the fifth embodiment of the present invention.
  • FIG. 18 is a flowchart conceptually showing an image processing program executed by computer 220 in retinal scanning display 200 according to the sixth embodiment of the present invention.
  • FIG. 19 is a flowchart conceptually showing an image processing program executed by computer 220 in retinal scanning display 200 according to a seventh embodiment of the present invention.
  • FIG. 20 is a flowchart conceptually showing an image processing program executed by computer 220 in retinal scanning display 200 according to the eighth embodiment of the present invention.
  • FIG. 21 is a flowchart conceptually showing an image processing program executed by computer 220 in retinal scanning display 200 according to a ninth embodiment of the present invention.
  • FIG. 22 is a plan view for explaining execution contents of S702 in FIG. 21.
  • FIG. 23 is a front view showing an example of a display image used for specifically explaining the image processing program shown in FIG. 21.
  • FIG. 24 is a flowchart conceptually showing an image processing program executed by computer 220 in retinal scanning display 200 according to the tenth embodiment of the present invention.
  • FIG. 25 is a front view showing an example of a display image used for specifically explaining the image processing program shown in FIG. 24.
  • FIG. 26 is a perspective view for explaining a conventional example of a retinal scanning display.
  • FIG. 1 is a block diagram conceptually showing an image display device according to the first embodiment of the present invention.
  • This image display device is a retinal scanning display that allows the observer to visually recognize a display object as a virtual image by projecting a light beam onto the retina of the observer's eye while scanning it.
  • The image display device is provided with a display optical system that scans a light beam to display an image and directs the scanned light beam toward the observer's eye.
  • The scanned light beam reaches the retina via the pupil of the observer's eye, so that the observer perceives the image.
  • As shown in FIG. 2, the display optical system 10 includes a light source 12 that generates and emits a light beam, an intensity modulator 14, a wavefront modulator 16, a scanner 18, and a guiding part 20, which are contained in a common housing 22.
• One example of the light source 12 is a laser that generates a laser beam, for example a semiconductor laser.
  • the intensity modulator 14 modulates the intensity of a light beam incident from the light source 12, and one example of the intensity modulator 14 performs intensity modulation using an acousto-optic effect.
  • the wavefront modulator 16 modulates the wavefront curvature of the light beam incident from the intensity modulator 14.
  • An example of the wavefront modulator 16 performs wavefront modulation using a condenser lens or a reflection mirror arranged so that the position or the curvature is variable on the optical axis of the light beam.
  • the scanner 18 two-dimensionally scans the light beam incident from the wavefront modulator 16.
• One example of the scanner 18 is a combination of a micromirror for horizontal scanning, which utilizes resonance of a vibrating body excited by a vibrator, and a galvano mirror for vertical scanning.
  • the guiding section 20 is a section that guides the light flux emitted from the scanner 18 onto the retina of the observer.
  • One example of the guiding section 20 is a relay optical system, and another example is an optical system through which the light beam simply passes.
• The luminous flux emitted from the guiding section 20 reaches the observer's retina through a light exit 24 formed in the housing 22.
  • the emission port 24 is filled with a transparent material to prevent foreign matter from entering the inside of the housing 22.
  • the display optical system 10 is used by being mounted on the head of the observer. That is, this image display device is a head mounted type.
• The light source 12 constitutes an example of the “emission section” in the above item (1), while the intensity modulator 14, the wavefront modulator 16, the scanner 18, the guide section 20, and the emission port 24 together constitute an example of the “display section” in the same item.
• When the light source 12 has its own intensity modulation function, as a semiconductor laser does, it is not essential to provide the intensity modulator 14 independently of the light source 12.
• Likewise, when modulation of the wavefront curvature is not required, the wavefront modulator 16 can be omitted.
• In the present embodiment, the light source 12, the intensity modulator 14, the wavefront modulator 16, and the scanner 18 are all mounted on the observer's head.
• However, the present invention can also be implemented in a mode in which the light source 12 and the intensity modulator 14 are attached to a part of the observer other than the head, for example the waist, and the light beam emitted from the intensity modulator 14 is transmitted to the head mounting unit 40 via a flexible optical transmission medium (for example, an optical fiber).
• As shown in the figure, the display optical system 10 in the present embodiment is housed in a common housing 36 together with the interface unit 30, the magnetic field receiving unit 32, and the signal processing circuit 34.
  • the display optical system 10, the interface unit 30, the magnetic field receiving unit 32, and the signal processing circuit 34 cooperate with each other to form the head mounted unit 40.
  • FIG. 3 is a perspective view showing an example of a mode in which the head mounting portion 40 is mounted on the observer's head.
  • the head mounting part 40 is mounted on the head using the belt 42 so that the mounting position of the head mounting part 40 does not shift.
  • the display optical system 10 and the magnetic field receiving unit 32 are connected to the interface unit 30 in parallel with each other.
• The magnetic field receiving unit 32 is a sensor that detects the strength of a magnetic field.
  • the magnetic field receiving unit 32 is connected to the interface unit 30 via the signal processing circuit 34.
• An example of the signal processing circuit 34 is a converter circuit that converts the magnetic field strength signal, which is output from the magnetic field receiving unit 32 as an analog signal representing the strength of the received magnetic field, into a digital signal (for example, magnetic field strength data).
  • the image display device further includes a waist attachment section 50.
  • the waist mounting part 50 is used by being mounted on the waist of the observer.
  • FIG. 3 is a perspective view showing an example of a mode in which the waist attachment section 50 is attached to the waist of the observer.
  • the waist attachment portion 50 is attached to the waist using the belt 52 so that the attachment position of the waist attachment portion 50 is not shifted.
• The waist attachment section 50 is configured to include a computer 66 having a CPU 60, a RAM/ROM 62, and a bus 64 connecting them.
• Various programs including a position detection program described later are stored in advance in the ROM section of the RAM/ROM 62.
• To the bus 64, a magnetic field generation unit 68, a display data conversion circuit 70, an operation unit 72, and an image information memory 74 are further connected via corresponding signal processing circuits 78, 80, 82, and 84, respectively.
• One example of each of the signal processing circuits 78, 80, 82, and 84 is a converter that converts the signal format between an analog signal and a digital signal, and another example is a buffer that temporarily stores a signal or data.
  • the waist attachment section 50 further includes an interface section 90.
  • the interface section 90 is electrically connected to the interface section 30 of the head mounted section 40 via a flexible signal transmission medium 94 (see FIG. 3).
• The interface unit 90 is connected in parallel with the bus 64 and the display data conversion circuit 70.
• The bus 64 of the waist-mounted section 50 is connected to the magnetic field receiving section 32 through the interface section 90, the signal transmission medium 94, the interface section 30, and the signal processing circuit 34 in that order, so that the computer 66 can take in the magnetic field intensity signal from the magnetic field receiving section 32.
  • the position detection program is executed by the computer 66 in response to the received magnetic field intensity signal, whereby the position and orientation of the head are detected with reference to the waist.
  • the magnetic field generation unit 68 is provided in the waist attachment unit 50 in order to detect the relative relationship of the head to the waist, in cooperation with the magnetic field reception unit 32 in the head attachment unit 40.
• The magnetic field generation unit 68, the magnetic field reception unit 32, and the part of the computer 66 that executes the position detection program cooperate with each other to constitute a relative relationship detection unit 98 that relatively detects the position and orientation of the head (more precisely, of the head-mounted part 40) with reference to the waist (more precisely, the position of the waist-mounted part 50).
• This relative relationship detector 98 detects the position and orientation relative to the waist using a method identical or similar to the magnetic tracking method employed by magnetic trackers already known as position trackers or position/orientation sensors.
• The magnetic field generating unit 68 includes a generating composite coil in which three mutually orthogonal generating coils (X, Y, Z) are combined.
  • the magnetic field receiving unit 32 includes a receiving composite coil in which three mutually orthogonal receiving coils (U, V, W) are combined.
  • the composite coil for generation is incorporated in the waist attachment unit 50.
  • the installation direction of the composite coil for generation is defined such that, for example, the positive direction of the Y axis is in front of the observer, and the positive direction of the Z axis is vertically upward.
  • Such a definition can also be applied to the receiving composite coil in the magnetic field receiving unit 32.
• The above-mentioned three generating coils can be wound integrally and installed at a single point, but they can also be arranged apart from one another on a common plane crossing the waist mounting part 50. For example, two of the three generating coils can be arranged at the left and right ends of the waist, respectively, and the remaining one at the center between them. The same arrangements can be applied to the three receiving coils in the magnetic field receiving unit 32.
• In the present embodiment, the magnetic field from each of the three generating coils belonging to the generating composite coil is received by the three receiving coils belonging to the receiving composite coil, so that a total of nine magnetic field strength data are output.
  • the position detection program is executed by the computer 66 in order to relatively detect the position and orientation of the head with reference to the waist.
  • FIG. 5 conceptually shows a flowchart of the position detection program. This position detection program is repeatedly executed after the power of the computer 66 is turned on.
• In each execution of the position detection program, first, in step S1 (hereinafter simply referred to as “S1”; the same applies to other steps), a signal instructing the magnetic field generating unit 68 to generate a magnetic field is output. In response, the magnetic field generating unit 68 generates a magnetic field from the corresponding generating coil. The magnetic field generating unit 68 generates the magnetic fields from the three generating coils in, for example, a time-division manner.
• Next, in S2, a magnetic field strength signal is fetched from the magnetic field receiving unit 32.
  • the magnetic field strength signal is a signal representing the strength of the magnetic field received by each receiving coil, and is taken in from the magnetic field receiving unit 32 in association with each receiving coil.
• Subsequently, in S3, the position and orientation of the head-mounted unit 40 are relatively detected with reference to the waist-mounted unit 50 based on the captured magnetic field intensity signal.
• The position and orientation of the head-mounted unit 40 are detected based on, for example, the position of one point representing the above-described generating coil (its X-direction, Y-direction, and Z-direction positions) and the detection results for at least two directions associated with it (that is, one piece of position information and at least two pieces of direction information).
• The detection result is stored in the RAM/ROM 62.
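• As a rough illustration of this flow only, the following sketch (Python; the `generator` and `receiver` driver objects are hypothetical, and the pose solver assumes a simplified far-field dipole model, whereas real magnetic trackers use calibrated, typically iterative solutions) shows how the nine field-strength samples can be collected in time division and turned into a range and an orientation:

```python
import numpy as np

def measure_coupling_matrix(generator, receiver):
    """S1/S2: energize the three generating coils (X, Y, Z) in time
    division and read the three receiving coils (U, V, W) each time,
    yielding the nine magnetic field strength data mentioned above."""
    S = np.zeros((3, 3))
    for i in range(3):                  # one generating coil at a time
        generator.energize(i)           # S1: generate a magnetic field
        for j in range(3):
            S[i, j] = receiver.read(j)  # S2: fetch the field strengths
        generator.deenergize(i)
    return S

def estimate_pose(S, k=1.0):
    """S3 (sketch): under a far-field dipole model the overall signal
    magnitude falls off with the cube of the range, and the rotational
    factor of the coupling matrix carries the relative orientation."""
    r = (k / np.linalg.norm(S)) ** (1.0 / 3.0)  # range from magnitude
    U, _, Vt = np.linalg.svd(S)                 # nearest rotation matrix
    if np.linalg.det(U @ Vt) < 0:
        U[:, -1] *= -1                          # keep a proper rotation
    return r, U @ Vt                            # range and 3x3 orientation
```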
• In the present embodiment, the magnetic field receiving unit 32, the magnetic field generating unit 68, and the part of the computer 66 that executes the position detection program cooperate with each other to constitute an example of the “detection section” in the above item (2); the head constitutes an example of the “detected portion” in the same item, and the waist constitutes an example of the “reference portion” in the same item.
  • the magnetic field receiving unit 32 constitutes an example of the "first portion” in the above item (3)
  • the magnetic field generating unit 68 constitutes an example of the "second portion” in the same item.
  • the part of the computer 66 that executes the position detection program constitutes an example of the “detection means” in the same section.
  • the magnetic field generator 68 constitutes an example of the “signal generator” in the above item (4), and the magnetic field receiver 32 constitutes an example of the “receiver” in the same item.
• The magnetic field generated in space by the magnetic field generator 68 constitutes an example of the “detection signal” in the same item.
• In the present embodiment, three types of coordinate systems are used to display a display target (content) as an image to an observer: the definition coordinate system used to define the display target, the display coordinate system used to define the image displayed in the image display area, and the reference coordinate system used to detect the position and orientation of the head (more precisely, the position and orientation of the head-mounted unit 40).
  • the display coordinate system is fixed to the head-mounted unit 40, while the reference coordinate system is fixed to the waist-mounted unit 50.
• The position and orientation of the head are detected with reference to the waist-mounted part 50, and an image is displayed in the image display area with reference to the waist-mounted part 50.
  • the position and orientation of head-mounted portion 40 are detected based on waist-mounted portion 50, and the display position of the image is determined based on waist-mounted portion 50 based on the detection result.
• For example, the observer can observe the image only when the head is tilted so as to gaze at his/her waist, and in that case the display position of the image with respect to the display coordinate system changes according to the relative relationship between the waist-mounted part 50 and the head-mounted part 40.
• The reference coordinate system is the coordinate system used to detect the position and orientation of the head-mounted unit 40 with reference to the waist-mounted unit 50. At the same time, the reference coordinate system is also the coordinate system used to determine whether or not an image is to be displayed in the image display area with reference to the waist-mounted unit 50 (that is, whether the image display is turned on or off) and, if it is to be displayed, to determine its display position.
  • the display coordinate system is a coordinate system used for displaying an image in the image display area based on the head mounted part 40.
  • the reference coordinate system is an XYZ coordinate system fixed to the waist attachment 50.
• The position of this reference coordinate system is uniquely determined for the current observer by the position and orientation of the generating composite coils X, Y, and Z in the magnetic field generator 68.
  • the display coordinate system is an xyz coordinate system fixed to the head-mounted unit 40.
  • the position of the display coordinate system is uniquely determined for the current observer by the position and orientation of the receiving composite coils U, V, W in the magnetic field receiving unit 32.
  • the reference coordinate system is specified by the position and orientation of the generating composite coils X, Y, and Z built in the waist attachment unit 50.
  • the display coordinate system is specified by the position and orientation of the receiving composite coils U, V, and W incorporated in the head-mounted unit 40.
• However, it is also possible to carry out the invention in a manner in which the reference coordinate system is specified by two points representing the waist-mounted portion 50, that is, the arrangement positions of two generating coils, and by the direction of gravity acting on a third point that may or may not coincide with either of those two points.
• Similarly, the display coordinate system can be specified by two points representing the head-mounted unit 40, that is, the arrangement positions of two receiving coils, and by the direction of gravity acting on a third point that does not coincide with either of them.
• The direction of gravity acting on a specific point can be detected using a gravity sensor that directly detects gravity, or, for example, using a combination of a spherical conductor, a rotating body having an inner peripheral surface on which the conductor rolls, and an arrangement in which the rotating body moves integrally with the detected object whose gravity action direction is to be detected.
• The rotating body has a plurality of contact points arranged on its inner peripheral surface in a state in which they are insulated from each other. At the location of the conductor, at least two of the plurality of contact points are conducted to each other by the conductor. Which of the plurality of contact points are simultaneously in contact with the conductor can be specified by detecting the electrical resistance, capacitance, etc. of every pair consisting of two of the plurality of contact points. If the position of the conductor on the inner peripheral surface of the rotating body is identified in this manner, the direction of gravity with respect to the rotating body and the detected object can be known, on condition that the position of the center of the rotating body is known (for example, does not change).
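• A minimal sketch of this contact-scanning idea (Python; `contacts` and `measure_resistance` are hypothetical stand-ins for the hardware described above):

```python
import itertools

def conductor_contacts(contacts, measure_resistance, threshold=10.0):
    """Test every pair of insulated contact points for conduction; the
    pairs with low resistance are the ones currently bridged by the
    rolling spherical conductor."""
    bridged = set()
    for a, b in itertools.combinations(contacts, 2):
        if measure_resistance(a, b) < threshold:
            bridged.update((a, b))
    return bridged

def gravity_direction(contacts, bridged, center):
    """The conductor settles at the lowest point of the inner surface, so
    the direction from the (known, fixed) center of the rotating body to
    the bridged contacts approximates the direction of gravity."""
    if not bridged:
        return None
    pts = [contacts[c] for c in bridged]
    spot = [sum(v) / len(pts) for v in zip(*pts)]   # mean contact position
    return tuple(s - c for s, c in zip(spot, center))
```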
  • two modes are provided for displaying an image to an observer. That is, a waist-mounted part reference display mode in which an image is displayed so that an observer observes the image at a position determined with reference to the waist-mounted part 50, and a position determined based on the head-mounted part 40. And a head-mounted part reference display mode for displaying the image so that the observer can observe the image.
• The waist-mounted part reference display mode is a mode in which the waist of the observer is selected as a reference part and an image is displayed based on that reference part.
• The waist is an example of the “reference part” in the above item (2) or (15).
• The waist-mounted part reference display mode is an example of the “observer standard display mode” in item (15), and the head-mounted part reference display mode is an example of the “display part reference display mode” in the same item.
• The waist-mounted part reference display mode can also serve as an example of a real-world reference display mode.
• An operation unit 72 is provided for the observer to select between these display modes.
  • definition data (display target data) that defines a display target is stored in the image information memory 74.
  • the definition data is created on the coordinate system for definition so as to express the display object.
• In the head-mounted part reference display mode, the definition data is effectively mapped to the display coordinate system without any substantial coordinate conversion being performed on it.
• As a result, the display data for displaying an image on the display coordinate system is obtained as data equal to the definition data. That is, in the head-mounted part reference display mode, the display coordinate system and the definition coordinate system coincide with each other.
• In the waist-mounted part reference display mode, by contrast, the definition data that defines the display target in the definition coordinate system is first mapped, again without substantial coordinate conversion, to the reference coordinate system. Thereafter, the definition data is converted into display data for displaying an image on the display coordinate system based on the relative relationship between the position and orientation of the observer's head and the reference coordinate system, whereby the definition data is mapped to the display coordinate system.
  • the definition coordinate system is not originally associated with the display coordinate system or the reference coordinate system.
• In the head-mounted part reference display mode, the definition coordinate system is fixed to the display coordinate system, whereby the image is displayed as if it were fixed to the head-mounted part 40.
• In the waist-mounted part reference display mode, the definition coordinate system is fixed to the reference coordinate system, whereby the image is displayed as if it were fixed to the waist-mounted part 50.
• The manner in which the display data is created (converted) from the definition data has been described above.
  • This process is executed by the display data conversion circuit 70.
• This display data conversion circuit 70 is mainly composed of a computer (not shown) separate from the computer 66, and is configured to execute a display data creation program conceptually represented by the flowchart in FIG. 7. However, it is not essential to configure the display data conversion circuit 70 using a computer.
  • the display data conversion circuit 70 can be configured using a DSP.
  • the display data creation program is repeatedly executed. In each execution of the display data creation program, first, in S31 of FIG. 7, image information is taken in from the image information memory 74 as definition data. Next, in S32, the selection display mode currently selected by the observer among the head-mounted part reference display mode and the waist-mounted part reference display mode is identified based on the output signal of the operation unit 72.
• If the head-mounted part reference display mode is currently selected, the determination in S33 is NO, and the process proceeds to S34.
• In S34, the definition coordinate system is fixed to the display coordinate system, whereby the fetched definition data is output to the display optical system 10 as display data without undergoing coordinate conversion. That is, in the head-mounted part reference display mode, the display coordinate system and the definition coordinate system are equivalent.
• The display data includes a target value of light intensity (luminance) and a target value of depth (wavefront curvature) for each image display, for each frame of an image, for each field of one frame, or for each pixel.
  • a portion of the display data representing the target intensity is finally supplied to the intensity modulator 14 so that the intensity of the light beam output from the intensity modulator 14 becomes equal to the target intensity.
  • a portion of the display data representing the target depth is finally supplied to the wavefront modulator 16 so that the wavefront curvature of the light beam output from the wavefront modulator 16 becomes equal to the target wavefront curvature.
• If the waist-mounted part reference display mode is currently selected, the determination in S33 is YES, and the process shifts to S35.
• In S35, the position and orientation of the head-mounted unit 40 are relatively detected with reference to the waist-mounted unit 50; specifically, the latest detection result is read from the RAM/ROM 62.
  • the definition coordinate system is fixed to the reference coordinate system, whereby the fetched definition data is fixed to the reference coordinate system. That is, in the waist-mounted portion reference display mode, the reference coordinate system and the definition coordinate system are equivalent.
• Next, the definition data thus fixed to the reference coordinate system is mapped to the display coordinate system so as to reflect the relative detection result of the position and orientation of the head-mounted unit 40.
• This mapping is performed so that, as the relative detection result of the position and orientation of the head-mounted unit 40 changes, the display position of the image moves within the field of view of the observer while the relative positional relationship between the image represented by the definition data and the reference coordinate system is maintained. That is, the displayed image is visually recognized as if it were fixed to the reference coordinate system.
• The definition data on which such mapping has been performed is then output to the display optical system 10 as display data.
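• The branch between the two display modes described above can be summarized by the following sketch (Python with NumPy; the names and the 4x4-matrix representation of the detected pose are illustrative assumptions, not the patent's):

```python
import numpy as np

def make_display_data(definition_pts, mode, T_head_from_waist=None):
    """definition_pts: Nx4 homogeneous points of the display target in the
    definition coordinate system. mode: 'head' or 'waist'.
    T_head_from_waist: 4x4 pose of the head-mounted unit 40 relative to
    the waist-mounted unit 50, as produced by the position detection
    program."""
    if mode == 'head':
        # Head-mounted part reference display mode (S34): definition and
        # display coordinates coincide; no substantial conversion.
        return definition_pts
    # Waist-mounted part reference display mode (S35 onward): definition
    # coordinates are fixed to the reference coordinate system, then
    # mapped into display (head) coordinates using the detected pose, so
    # the image appears fixed to the waist-mounted unit 50.
    T_display_from_reference = np.linalg.inv(T_head_from_waist)
    return definition_pts @ T_display_from_reference.T
```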
• In the present embodiment, the waist attachment portion 50 constitutes an example of the “control section” in the above item (1).
• Next, a second embodiment of the present invention will be described. In this embodiment, image processing for controlling the display mode of objects is performed.
• The present embodiment differs from the first embodiment in that such image processing, which the first embodiment does not perform, is performed, while the other elements are common to the first embodiment.
• Therefore, in describing the present embodiment, the elements different from those of the first embodiment will be described in detail, while the common elements will be referred to using the same reference numerals or names and detailed description thereof will be omitted.
  • FIG. 8 is a conceptual block diagram of the image display device according to the present embodiment.
• This image display device is different from the image display device according to the first embodiment in that a gaze direction detection unit 180 has been added.
  • the line-of-sight direction detection unit 180 is mounted on the head-mounted unit 40 in order to detect the line-of-sight directions of both eyes of the observer.
  • the line-of-sight direction detection unit 180 detects the line-of-sight directions of both eyes of the observer in accordance with a known principle.
• This gaze direction detection unit 180 can be designed, for example, such that, with an infrared camera whose distance to both eyes is known, as disclosed in Japanese Patent Application Laid-Open No. 7-167618 and Japanese Patent Publication No. 61-59132, the eyes are photographed by the infrared camera and the gaze direction is detected based on the relative positional relationship between the photographed corneal reflection image of each eye and the center of the pupil. Other conventional examples of detecting the line of sight are disclosed in Japanese Patent Application Laid-Open Nos. 2004-255074 and 5-199996.
• In the present embodiment, a plurality of pieces of image information are stored in the image information memory 74 in association with each object, including depth position information that specifies the depth position at which each object is to be displayed and plane position information that specifies the position at which each object is to be displayed on a plane at that depth position.
  • an object means, for example, an image representing a specific object, and a region where the image is displayed is defined as an object region.
• The plane position information includes the X coordinate value and Y coordinate value described later, and the depth position information includes the Z coordinate value described later.
• In the present embodiment, the plane position information and the depth position information are stored in advance for all the pixels in the object area in association with a specific object.
• From these, the plane position and the depth position of each pixel can be obtained, and the object itself can also be identified.
• When the entire specific object is displayed at the same depth position, the depth position information can instead be stored in advance in association with the specific object as a whole, so that a single depth position specifies the object. Further, depending on the shape of the object, it is possible to omit storing the plane position information in advance for at least some of the plurality of pixels belonging to the object.
  • each pixel on the display screen is defined by an XYZ orthogonal coordinate system, for example, as shown in FIG.
  • the depth position of each pixel on the display screen is defined by the Z coordinate value
• The position of each pixel on a plane orthogonal to the depth direction is defined by the X coordinate value and the Y coordinate value.
  • the depth position information includes information representing the Z coordinate value for each pixel.
  • the depth position information can include information representing the Z coordinate value for each object.
  • the plane position information includes information indicating a combination of an X coordinate value and a Y coordinate value for each pixel. Further, the plane position information also includes information for distinguishing an area where an object is present from a non-existent area on a plane having the same depth position as each object.
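• As an illustration only, the stored geometric information could be organized along these lines (Python; the class and field names are hypothetical):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Pixel:
    x: float  # plane position information: X coordinate value
    y: float  # plane position information: Y coordinate value
    z: float  # depth position information: Z coordinate value

@dataclass
class DisplayObject:
    name: str
    pixels: List[Pixel] = field(default_factory=list)  # the object area
    # If the whole object lies at a single depth, a per-object Z value
    # could replace the per-pixel z, as the text notes.
```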
  • the wavefront modulator 16 in Fig. 2 is operated based on the above-described depth position information.
• The principle of modulating the wavefront curvature by the wavefront modulator 16 will be specifically described later with reference to FIG. 13.
• On the assumption that objects do not overlap along the same line of sight, the wavefront modulator 16 can be realized by including only one modulation element that modulates the wavefront curvature and by modulating the wavefront curvature at high speed with that element.
• In contrast, when objects can overlap along the same line of sight, the wavefront modulator 16 needs to include a plurality of modulation elements capable of modulating the wavefront curvature independently of each other, and to operate these modulation elements in parallel so as to achieve a different wavefront curvature for each object.
  • FIG. 9 conceptually shows the image processing program in a flowchart.
  • This image processing program is repeatedly executed by the computer 66. At the time of each execution, first, in S51 of FIG. 9, the gaze direction detection unit 180 detects the gaze direction of the observer.
• Next, the intersection between the plane having the same depth position as each object and the detected line-of-sight direction is determined, and it is determined whether or not the determined intersection exists within the area of that object. If it exists within the area of the object, that object is determined to be the object that the observer is currently paying attention to.
• Next, the luminance information corresponding to the specific object is changed, and the intensity modulator 14 of the display optical system 10 is operated based on the changed luminance information, whereby the brightness of the specific object is increased from its normal value.
• In the present embodiment, the luminance of the entire specific object is increased from the normal value.
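• The intersection test described above admits a compact sketch (Python; the object records with a `z` depth and a `contains(x, y)` area test are hypothetical):

```python
def attended_object(eye_pos, gaze_dir, objects):
    """Intersect the gaze ray with the constant-Z plane of each object and
    test whether the intersection falls inside the object's area; a hit
    means the observer is currently paying attention to that object."""
    ex, ey, ez = eye_pos
    dx, dy, dz = gaze_dir
    for obj in objects:
        if dz == 0:
            continue                       # gaze parallel to the plane
        t = (obj.z - ez) / dz
        if t <= 0:
            continue                       # plane lies behind the eye
        x, y = ex + t * dx, ey + t * dy    # intersection with the plane
        if obj.contains(x, y):             # inside the object area?
            return obj
    return None
```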
• As described above, according to the present embodiment, the appearance of the display image and its movement after appearance are performed automatically rather than in response to a specific action by the observer.
• That is, the object that the observer is paying attention to among the displayed objects is automatically selected as a specific object, and image processing for displaying the selected specific object relatively clearly is automatically started.
• The intention of the observer is thus automatically reflected in the image display processing without the observer having to perform any operation, and as a result an image display device with improved usability is provided.
• In the present embodiment, the part of the computer 66 that executes the image processing program constitutes an example of the “image processing unit” in the above item (20).
  • the part of the computer 66 that executes S51 in FIG. 9 and the line-of-sight direction detecting unit 180 cooperate with each other to form an example of the “line-of-sight direction detecting unit” in the above paragraph (22).
• In the present embodiment, the brightness of the specific object is increased from the normal value so that the specific object can be clearly viewed by the observer.
• However, the present invention can also be implemented in a mode in which the brightness of the specific object is only relatively increased, likewise allowing the observer to view the specific object clearly and naturally.
• In the present embodiment, a specific object is selected from the relationship between the detected line-of-sight directions, that is, the line-of-sight directions of the left eye and the right eye, and the above-described geometric information.
• The present invention can also be implemented such that the gaze position is calculated from the detected line-of-sight directions, and a specific object is selected from the relationship between that calculation result and the above-described geometric information.
• In the present embodiment, the brightness of the specific object is increased from the normal value so that the observer can see the specific object more clearly than the other objects.
• However, the present invention can also be implemented in other modes that enable the observer to clearly view the specific object, for example a mode in which the observer can clearly view the specific object without being obstructed by a non-specific object.
• FIG. 10 is a block diagram conceptually showing a configuration of a retinal scanning display (hereinafter simply referred to as “RSD”) 200 according to the present embodiment, together with an optical path diagram showing the optical path of light emitted from the RSD 200 to the eye M of the observer.
• The RSD 200 is an example of an image display device, and projects an image directly onto the retina M4 of the eye M of the observer by scanning image light representing the image to be displayed on the retina M4.
• The RSD 200 can simultaneously display a plurality of objects in the same display image. If the depth positions at which the plurality of objects should be displayed differ from one another, the wavefront curvature of the image light for each object is controlled so that the difference in depth positions is perceived by the observer.
• Among the plurality of objects, the one on which the observer's attention is focused is selected as the specific object.
  • image processing for changing the display mode of an image including the plurality of objects is performed so that the selected specific object can be clearly seen by an observer.
  • This image processing is specifically executed to display the selected specific object and a non-specific object other than the specific object among the plurality of objects in different display modes.
• In the present embodiment, the brightness of the specific object is increased from the normal value, so that the observer can clearly view the specific object.
• The RSD 200 is an optical device that scans the modulated light H, modulated according to image information G, directly onto the retina M4 of the eye M, thereby projecting an image directly onto the retina M4.
  • the RSD 200 includes an external memory 202, a modulated light output unit 204, a scanning unit 206, a wavefront curvature modulation unit 208, a line-of-sight direction detection unit 210, and a control unit 212.
  • the external memory 202 is an electronic component that previously stores image information G necessary for displaying an image to be displayed.
  • the RSD 200 displays an image based on the image information G stored in the external memory 202 in advance.
• The image information G includes (a) a plurality of pieces of geometric information, namely plane position information specifying the two-dimensional position of the image to be displayed and depth position information specifying the depth position of the image to be displayed, (b) color information specifying the color of the image to be displayed, and (c) luminance information specifying the luminance of the image to be displayed.
  • the control unit 212 controls the modulated light output unit 204 based on the color information and the luminance information, controls the scanning unit 206 based on the synchronization signal, and controls the wavefront curvature modulation unit 208 based on the depth position information. .
  • the modulated light output unit 204 is an optical device that modulates light based on the above-described color information and luminance information and outputs the modulated light as modulated light H.
• The modulated light H output from the modulated light output unit 204 undergoes wavefront curvature modulation by the wavefront curvature modulation unit 208, deflection by the scanning unit 206, and reflection by the light reflection unit 214, and then enters the pupil M2 of the observer's eye M, which is surrounded by the iris M1.
• The incident modulated light H is focused by the crystalline lens M3 and reaches the retina M4.
  • the scanning unit 206 is an optical device that projects an image on the retina M4 by scanning the modulated light H on the retina M4 based on the synchronization signal described above.
• The wavefront curvature modulation unit 208 is an optical device that changes the angle of diffusion of the modulated light H reflected by the light reflection unit 214 and incident on the eye M, that is, changes the wavefront curvature of the modulated light H, thereby changing the depth position at which the observer perceives the image.
• The operation of the wavefront curvature modulation unit 208 is controlled by the control unit 212 based on the above-described depth position information.
• For example, the wavefront curvature modulation unit 208 can display an image such that the observer perceives it at a position P1 close to the observer, as indicated by the dashed line in FIG. 10.
• The wavefront curvature modulation unit 208 can also display the image so that the observer perceives it at a position P2 farther from the observer, as indicated by the two-dot chain line in FIG. 10, or at a distant position P3, as indicated by the broken line in FIG. 10.
• The line-of-sight direction detection unit 210 is a device that detects the line-of-sight direction of the observer. By using the gaze direction of the observer detected by the gaze direction detection unit 210, the part of the displayed image that the observer is paying attention to, that is, the specific image, and the parts that the observer is not paying attention to, that is, the non-specific images, can be distinguished from each other. Since the present display image includes a plurality of objects, the specific image is referred to below as a specific object, and a non-specific image as a non-specific object.
  • the line-of-sight direction detection unit 210 can be configured in the same manner as the line-of-sight direction detection unit 180 described above.
  • the line-of-sight direction detection unit 210 uses, for example, an infrared camera whose distance to both eyes is known, images the eyes of both eyes with the infrared camera, and calculates the corneal reflection image of the captured eyes and the pupil center. It can be designed to detect the direction of the observer's line of sight based on the relative positional relationship between the two.
  • control unit 212 is mainly configured by a computer 220.
  • Computer 220 is configured to include a CPU 222, a ROM 224, and a RAM 226, as is well known.
  • the control unit 212 executes a predetermined operation by executing various programs stored in the ROM 224 in advance by the CPU 222.
  • control unit 212 controls the modulated light output unit 204 based on the color information and the luminance information, controls the scanning unit 206 based on the synchronization signal, and modulates the wavefront curvature based on the depth position information.
  • Each of the units 208 is controlled.
• The control unit 212 not only realizes the normal image display function, which is the original function of the RSD 200, but also executes processing for changing the brightness of the specific object so that the observer can clearly view it.
  • the control unit 212 executes image processing for changing luminance information corresponding to each object based on a depth position at which each object is to be displayed.
• The control unit 212 makes changes to the image information G stored in advance in the external memory 202 in accordance with the content of the image processing to be executed, and stores the changed image information G in the RAM 226.
  • the control unit 212 controls the modulated light output unit 204, the scanning unit 206, and the wavefront curvature modulation unit 208 based on the luminance information, the plane position information, and the depth position information stored in the RAM 226.
  • the modulated light output section 204 includes an image signal processing circuit 230, and further includes a red light source 232, a green light source 234, and a blue light source 236 as light sources.
• The modulated light output unit 204 further includes a red light source driver 240, a green light source driver 242, and a blue light source driver 244 as light source drivers.
• The modulated light output unit 204 further includes collimator lenses 246, 248, and 250.
• Based on the color information and the luminance information output from the control unit 212, the image signal processing circuit 230 outputs an intensity modulation signal for each color to the red light source driver 240, the green light source driver 242, and the blue light source driver 244, which drive the red light source 232, the green light source 234, and the blue light source 236, respectively.
• When the above-described image processing is not performed on the color information and the luminance information, the image signal processing circuit 230 outputs the intensity modulation signals to the red light source driver 240, the green light source driver 242, and the blue light source driver 244 based on the original color information and luminance information.
• When the color information has been changed, on the other hand, the image signal processing circuit 230 changes the intensity modulation signals in accordance with the changed color information.
• As a result, the color of the displayed image can be changed arbitrarily, and the image can be displayed transparently by not outputting any color.
• Similarly, when the luminance information has been changed, the image signal processing circuit 230 changes the intensity modulation signals in accordance with the changed luminance information, thereby controlling the red light source driver 240, the green light source driver 242, and the blue light source driver 244.
• As a result, the luminance of the image can be changed arbitrarily by increasing or decreasing the luminous intensity of the corresponding one of the red light source 232, the green light source 234, and the blue light source 236.
  • the wavefront curvature modulation unit 208 includes a convex lens 260, a position variable mirror 262, a mirror driving unit 264, and a half mirror 266.
  • the modulated light H emitted from the above-described modulated light output unit 204 is transmitted to the half mirror 266 via the optical fiber 270.
  • the half mirror 266 is an entrance of the modulated light H to the wavefront curvature modulator 208.
  • the position variable mirror 262 is provided on the optical axis of the convex lens 260.
  • the position variable mirror 262 is provided so as to be movable between a focal position f of the convex lens 260 and a position a approaching the convex lens 260 from the focal position f.
  • the position b is an intermediate position between the position a and the focal position f.
  • FIG. 13 shows the position variable mirror 262 in a state where its reflecting surface is located at the position a.
• In this state, the position variable mirror 262 is located inside the focal point f of the convex lens 260. Therefore, as shown by the dashed line in FIG. 13, the modulated light H emitted from the convex lens 260 side as parallel light toward the position variable mirror 262 is reflected by the position variable mirror 262 and converted into diffused light. When the modulated light H converted into diffused light enters the eye M of the observer, the observer perceives the image at the position P1 in FIG. 10.
  • the mirror driving unit 264 shown in Fig. 13 is formed using, for example, a piezoelectric element.
  • the above-described position variable mirror 262 can be attached to a surface intersecting the electric field application direction among a plurality of surfaces of the piezoelectric element.
• By applying an electric field to the piezoelectric element, the position variable mirror 262 can be moved away from or toward the convex lens 260, and as a result the position variable mirror 262 can be moved to any of the above-mentioned positions a, b, and f.
  • the position of the position variable mirror 262 is controlled based on depth position information. Therefore, by changing the depth position information, the depth position of the image to be displayed can be changed to, for example, an arbitrary position between the position P1 and the position P3 shown in FIG.
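• Under a simplified thin-lens, double-pass model (an assumption of this sketch, not a statement of the patent's exact optics), the relation between the mirror offset from the focal position and the perceived image distance can be illustrated as follows:

```python
def perceived_distance(f_lens, offset):
    """Parallel light passes the convex lens 260, reflects off the position
    variable mirror 262 placed `offset` inside the focal position f, and
    passes the lens again; the emerging beam diverges as if from a virtual
    source whose distance sets the perceived depth. offset = 0 (mirror at
    f) gives collimated output, i.e. an image at the distant position P3."""
    if offset == 0:
        return float('inf')
    # The reflected beam focuses (f - 2*offset) from the lens; imaging that
    # point source back through the lens gives a virtual source at:
    return f_lens * (f_lens - 2 * offset) / (2 * offset)

# Example with f = 20 mm: pushing the mirror from f toward the lens pulls
# the perceived image in from infinity (P3) toward the observer (P1).
for d_mm in (0.0, 0.05, 0.2, 0.8):
    print(d_mm, perceived_distance(20.0, d_mm))
```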
• The scanning unit 206 includes a horizontal scanning mirror 280, relay lenses (for example, convex lenses) 282 and 284, a vertical scanning mirror 286, and relay lenses (for example, convex lenses) 288 and 290.
  • the horizontal scanning mirror 280 is provided rotatable around the rotation axis L1.
  • the horizontal scanning mirror 280 reflects the modulated light H emitted from the half mirror 266 in a direction corresponding to the rotation position of the horizontal scanning mirror 280.
  • the rotation of the horizontal scanning mirror 280 is controlled based on the above-mentioned synchronization signal.
  • the vertical scanning mirror 286 is provided so as to be swingable around the rotation axis L2.
  • the rotation of the vertical scanning mirror 286 is controlled based on the above-mentioned synchronization signal.
  • the relay lenses 282 and 284 transmit the modulated light H reflected from the horizontal scanning mirror 280 to the vertical scanning mirror 286.
• The relay lenses 288 and 290 transmit the modulated light H reflected from the vertical scanning mirror 286 to the retina M4 via the pupil M2 and the crystalline lens M3 in that order.
  • FIG. 14 conceptually shows the image processing program in a flowchart. Hereinafter, steps common to the image processing program according to the second embodiment will be briefly described.
  • This image processing program is repeatedly executed by the computer 220.
• First, the gaze direction detecting unit 210 detects the gaze direction of the observer.
• Next, from the relationship between the detected gaze direction and the geometric information of each object represented by the plurality of pieces of image information stored in the external memory 202 for displaying the plurality of objects in the current display image, the object among the plurality of objects that the observer is currently paying attention to is selected as the specific object.
• Next, the corresponding one of the red light source driver 240, the green light source driver 242, and the blue light source driver 244 is controlled in relation to the display of the specific object.
• As a result, the luminous intensity of the corresponding one of the red light source 232, the green light source 234, and the blue light source 236 is increased from the normal value, and consequently the luminance of the specific object is increased from the normal value. In the present embodiment, the luminance of the entire specific object is increased from the normal value.
• FIG. 15 shows an example in which an image of a water tank containing a ray A, a sea bream B, and a coral C, located in that order from nearest the observer, is displayed stereoscopically.
• The image of ray A, the image of sea bream B, and the image of coral C are examples of objects to be displayed simultaneously. It is assumed that the observer pays attention to the image of sea bream B while these objects are displayed simultaneously.
• When the image processing program is executed by the computer 220, first, in S101, the gaze direction of the observer is detected by the gaze direction detection unit 210. Next, in S102, the object located in the detected line-of-sight direction among the plurality of objects, in this case the image of sea bream B, is selected as the specific object that the observer is paying attention to.
• In the present embodiment, the luminance of the specific object is switched between two levels depending on whether or not the specific object is close to the observer.
• However, the present invention can be implemented in a mode in which the luminance of the specific object is changed in more stages, or in a mode in which the luminance of the specific object is changed continuously as it approaches the observer. Furthermore, it is possible to carry out the present invention in such a manner that the luminance of the specific object increases as it moves away from the observer; when this mode is adopted, the observer can clearly see the specific object even though it is located far away.
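• The luminance policies just discussed can be illustrated by a small sketch (Python; all threshold values and gain formulas are illustrative assumptions):

```python
def specific_object_gain(z_object, z_near=300.0, mode='two-stage'):
    """Return the factor by which the specific object's normal luminance
    is multiplied, given its display depth z_object (e.g. in mm)."""
    if mode == 'two-stage':
        # Two levels, depending on whether the object is near the observer.
        return 1.5 if z_object <= z_near else 1.2
    if mode == 'continuous-near':
        # Continuously brighter as the object approaches the observer.
        return 1.0 + z_near / max(z_object, z_near)
    if mode == 'far-boost':
        # Brighter as the object recedes, so it is still seen clearly.
        return 1.0 + z_object / 1000.0
    return 1.0
```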
• In the present embodiment, the part of the computer 220 that executes S101 and S102 in FIG. 14 and the line-of-sight direction detection unit 210 cooperate with each other to constitute an example of the “image processing unit” in the above item (29) and an example of the “image processing unit” in the above item (30).
• The part of the computer 220 that executes S101 in FIG. 14 and the gaze direction detection unit 210 cooperate with each other to constitute an example of the “gaze direction detection unit” in the above item (32).
• The part of the computer 220 that executes S104 and S105 in FIG. 14 constitutes an example of the “image processing unit” in the above item (35).
• Next, a fourth embodiment of the present invention will be described. The present embodiment differs from the second or third embodiment only in the elements related to image processing for displaying a specific object with relative emphasis over non-specific objects, and is common in the other elements. Therefore, only the different elements will be described in detail, while the common elements will be referred to using the same reference numerals or names and detailed description thereof will be omitted.
• In the second and third embodiments, the specific object is highlighted by increasing the luminance of the specific object itself, whereas in the present embodiment the specific object is highlighted by making its outline clearer than usual, that is, brighter.
• First, in S201, the gaze direction of the observer is detected in the same manner as in S101.
• Next, in S202, based on the detected line-of-sight direction, the object that the observer is paying attention to among the plurality of simultaneously displayed objects is selected as the specific object, in the same manner as in S102.
• Thereafter, the modulated light output unit 204 is controlled based on the changed luminance information.
• As a result, the brightness of the outline of the specific object is locally increased from the normal value, and the outline is displayed in an emphasized manner.
• Specifically, the luminance information stored in advance in the external memory 202 is changed so as to increase only the brightness of the contour of the image of sea bream B, which is the specific object.
• The modulated light output unit 204 then increases the luminance of the plurality of pixels forming the outline of the image of sea bream B based on the changed luminance information.
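• A minimal sketch of such contour emphasis (Python with NumPy; it assumes the object area is given as a boolean mask over a float luminance buffer, which is an illustrative representation):

```python
import numpy as np

def highlight_contour(luminance, mask, gain=1.5):
    """luminance: 2D float array of pixel luminances; mask: boolean array
    marking the specific object's area (here, the image of sea bream B).
    Contour pixels are object pixels with at least one non-object
    4-neighbor; only their luminance is raised above the normal value."""
    p = np.pad(mask, 1, constant_values=False)
    interior = (p[:-2, 1:-1] & p[2:, 1:-1] &
                p[1:-1, :-2] & p[1:-1, 2:])
    contour = mask & ~interior          # boundary pixels of the object
    out = luminance.copy()
    out[contour] *= gain                # local increase, outline emphasized
    return out
```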
• In the present embodiment, the part of the computer 220 that executes S204 and S205 in FIG. 16 constitutes an example of the “image processing unit” in the above item (36).
• Next, a fifth embodiment of the present invention will be described. The present embodiment differs from the second or third embodiment only in the elements related to image processing for displaying a specific object with relative emphasis over non-specific objects, and is common in the other elements. Therefore, only the different elements will be described in detail, while the common elements will be referred to using the same reference numerals or names and detailed description thereof will be omitted.
• In the second and third embodiments, the specific object is highlighted by increasing the luminance of the specific object itself, whereas in the present embodiment the specific object is highlighted by increasing the luminance of the pixels surrounding its outline.
• After the gaze direction is detected and the specific object is selected, the depth position of the selected specific object is obtained in the same manner as in S103.
• Next, in S304, among the pixels virtually located on the plane having the same depth position as the specific object, a plurality of pixels that are located outside the outline of the specific object and separated from that outline by a set distance are selected as peripheral pixels.
  • the corresponding luminance information is further changed such that the luminance of the selected plurality of peripheral pixels increases from the normal value.
  • the modulated light output unit 204 is controlled based on the changed luminance information.
  • the brightness of the peripheral pixels of the outline of the specific object is increased from the normal value, and the peripheral pixels are displayed in an emphasized manner.
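• A sketch of this peripheral-pixel selection (Python with NumPy; counting the set distance in 4-neighbor dilation steps is an illustrative simplification):

```python
import numpy as np

def peripheral_ring(mask, distance=3):
    """Select the pixels lying outside the specific object and a set
    distance (in dilation steps) from its outline, on the plane that
    shares the object's depth (S304)."""
    def dilate(m):
        p = np.pad(m, 1, constant_values=False)
        return m | p[:-2, 1:-1] | p[2:, 1:-1] | p[1:-1, :-2] | p[1:-1, 2:]
    grown = mask
    for _ in range(distance - 1):
        grown = dilate(grown)
    ring = dilate(grown) & ~grown    # pixels reached exactly at `distance`
    return ring & ~mask              # peripheral pixels, outside the object

# S305 then raises the luminance of these peripheral pixels above normal.
```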
• In the present embodiment, the part of the computer 220 that executes S304 and S305 in FIG. 17 constitutes an example of the “image processing unit” in the above item (37).
• Next, a sixth embodiment of the present invention will be described. The present embodiment differs from the second or third embodiment only in the elements related to image processing for displaying a specific object with relative emphasis over non-specific objects, and is common in the other elements. Therefore, only the different elements will be described in detail, while the common elements will be referred to using the same reference numerals or names and detailed description thereof will be omitted.
• In the second and third embodiments, the specific object is highlighted by increasing the luminance of the specific object itself.
• In the present embodiment, by contrast, when there is a non-specific object to be displayed in front of the specific object among the plurality of objects to be displayed simultaneously, that non-specific object is displayed transparently. Thus, even though the non-specific object is originally present in front of the specific object, the observer can clearly see the specific object without being disturbed by the non-specific object.
• In the present embodiment, the ROM 224 stores an image processing program conceptually represented by the flowchart in FIG. 18.
• Detailed description of the steps common to the image processing program shown in FIG. 14 will be omitted by referring to the corresponding step numbers.
• First, a Z coordinate value representing the depth position of the selected specific object is obtained in the same manner as in S103.
• Next, a Z coordinate value representing the depth position at which each of the other, non-specific objects displayed together with the specific object is to be displayed is obtained.
• Then, based on a comparison between the Z coordinate value of each non-specific object and the Z coordinate value of the specific object, it is determined whether that non-specific object should originally be displayed in front of the specific object.
• If so, the determination in S405 is YES, and in S406 the color information corresponding to the non-specific object to be displayed in front of the specific object is changed so that that non-specific object is displayed transparently. This completes one execution of the image processing program.
• In the example of FIG. 15, the Z coordinate value of each of the image of ray A and the image of coral C, which are non-specific objects, is obtained.
• Each Z coordinate value is compared with the Z coordinate value of the image of sea bream B, the specific object, to determine whether the corresponding object is a non-specific object that should be displayed in front of the image of sea bream B.
• Since the image of ray A is a non-specific object that should be displayed in front of the image of sea bream B, the determination result of S405 is YES, and in S406 the corresponding color information is changed so that the image of ray A is displayed transparently. On the other hand, since the image of coral C is a non-specific object that should be displayed behind the image of sea bream B, the determination result of S405 for it is NO, and in S407 the image of coral C is displayed with no color change.
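• The S405 to S407 decision can be illustrated with a small sketch (Python; the `Obj` record and the convention that a smaller Z value means nearer the observer are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    z: float                  # depth position; smaller = nearer the observer
    transparent: bool = False

def apply_occlusion_rule(objects, specific):
    """Each non-specific object whose Z coordinate puts it in front of the
    specific object is displayed transparently (S405/S406); objects behind
    it keep their color (S407)."""
    for obj in objects:
        if obj is not specific:
            obj.transparent = obj.z < specific.z

# FIG. 15 example: ray A in front of sea bream B, coral C behind it.
ray_a, bream_b, coral_c = Obj('A', 1.0), Obj('B', 2.0), Obj('C', 3.0)
apply_occlusion_rule([ray_a, bream_b, coral_c], specific=bream_b)
print(ray_a.transparent, coral_c.transparent)   # True False
```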
• In the present embodiment, the part of the computer 220 that executes S404 to S406 in FIG. 18 constitutes an example of the “image processing unit” in the above item (38).
• The invention can also be implemented in embodiments in which the foreground object to be displayed in front of other objects is displayed translucently instead of transparently.
• In that case, a mixture, at a predetermined ratio, of the color represented by the original color information stored in advance in the external memory 202 in association with the foreground object and the color represented by the original color information stored in advance in the external memory 202 in association with the object behind it can be applied to the inside of the contour of the foreground object.
• The predetermined ratio can be set, for example, such that the ratio between the original color of the front object and the original color of the back object is 9:1.
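• A minimal sketch of the 9:1 translucent mixture (Python; RGB triples in the 0..1 range are an illustrative representation of the stored color information):

```python
def translucent_color(front_rgb, back_rgb, front_ratio=0.9):
    """Mix the foreground object's original color with the color of the
    object behind it; front_ratio=0.9 corresponds to the 9:1 example."""
    return tuple(front_ratio * f + (1.0 - front_ratio) * b
                 for f, b in zip(front_rgb, back_rgb))

# e.g. a red foreground over a blue object behind it:
print(translucent_color((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # (0.9, 0.0, 0.1)
```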
• Next, a seventh embodiment of the present invention will be described. The present embodiment differs from the sixth embodiment only in the elements related to image processing for displaying a specific object with relative emphasis over non-specific objects, and is common in the other elements. Only the different elements will be described in detail, while the common elements will be referred to using the same symbols or names and detailed description will be omitted.
• In the sixth embodiment, the non-specific object to be displayed in front of the specific object is displayed transparently, while the non-specific object to be displayed behind it is displayed without a color change.
• In the present embodiment, the non-specific object to be displayed behind the specific object is displayed at a depth position closer to the specific object than its original depth position.
  • In the present embodiment, the ROM 224 stores an image processing program conceptually represented by a flowchart in FIG. 19. Hereinafter, detailed description of the steps common to the image processing program shown in FIG. 16 will be omitted by referring to the corresponding step numbers.
  • It is then determined whether each non-specific object should originally be displayed in front of the specific object.
  • If the determination in S505 is YES, then in S506 the depth position information corresponding to the non-specific object to be displayed in front of the specific object is left unchanged, so that that non-specific object is displayed without any change in its depth position. This completes one execution of the image processing program.
  • If, on the other hand, the determination in S505 is NO, then in S507 the depth position information corresponding to the non-specific object to be displayed farther than the specific object is changed so that that non-specific object is located closer to the specific object than its original depth position. This completes one execution of the image processing program.
  • When the image processing program is executed for the example shown in FIG. 15, since the image of ray A is a non-specific object that should be displayed in front of the image of snapper B, the determination result of S505 is YES, and in S506 the depth of the image of ray A is not changed; that is, it is displayed at its original depth position.
  • On the other hand, since the image of coral C is a non-specific object that should be displayed behind the image of snapper B, the determination result of S505 is NO, and the process proceeds to S507, where the depth position information corresponding to the image of coral C is changed.
  • Thereafter, the wavefront curvature modulation unit 208 modulates the wavefront curvature of the image light for displaying the image of coral C based on the changed depth position information. As a result, the image of coral C is displayed at a depth position closer to the image of snapper B than its original depth position.
  • In the present embodiment, the part of the computer 220 that executes S504 to S507 in FIG. 19 constitutes an example of the "display object position changing unit" in the above section (40).
  • the modulated light output unit 204 constitutes an example of the “light flux output unit” in the above item (41), and the wavefront curvature modulation unit 208 constitutes an example of the “display object position changing unit” in the item (41).
  • the part of the computer 220 that executes S503 and S504 in FIG. 19 constitutes an example of the “depth position detecting unit” in the above section (42), and the part that executes S505 and S507 in the figure constitutes an example of the “display object position changing unit” in the same section.
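  • The branch structure of S505 to S507 can be sketched as follows. This is a hedged illustration under assumptions not stated in the patent: depth is a single number with larger values meaning farther away, and the amount by which a back object is pulled toward the specific object is a hypothetical fixed `offset`.

```python
# Sketch of S505-S507: leave occluding objects untouched, pull back objects closer.
def pull_back_objects_closer(specific_z, objects, offset=0.5):
    """objects: list of dicts with a 'z' depth (assumed: larger z = farther away)."""
    for obj in objects:
        if obj["z"] > specific_z:                          # behind the specific object
            obj["z"] = max(specific_z, obj["z"] - offset)  # S507: move it closer
        # objects in front keep their original depth position (S506)

scene = [{"name": "ray A", "z": 1.0}, {"name": "coral C", "z": 3.0}]
pull_back_objects_closer(2.0, scene)
print(scene)  # coral C moves to z = 2.5; ray A is unchanged
```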
  • The present embodiment differs from the seventh embodiment only in the elements related to image processing for displaying a specific object relatively emphasized compared to a non-specific object, and is common in the other elements. Therefore, only the differing elements will be described in detail; common elements will be referred to using the same symbols or names, and detailed description thereof will be omitted.
  • In the seventh embodiment, a non-specific object that should be displayed farther than the specific object is displayed at a depth position closer to the specific object than its original depth position. In the present embodiment, by contrast, such a non-specific object is displayed at a depth position farther from the specific object than its original depth position.
  • In the present embodiment, the ROM 224 stores an image processing program conceptually represented by a flowchart in FIG. 20. Hereinafter, detailed description of the steps common to the image processing program shown in FIG. 19 will be omitted by referring to the corresponding step numbers.
  • It is then determined whether each non-specific object should originally be displayed in front of the specific object.
  • If the determination in S605 is YES, then in S606 the depth position information corresponding to the non-specific object to be displayed in front of the specific object is left unchanged, so that that non-specific object is displayed without any change in its depth position. This completes one execution of the image processing program.
  • If, on the other hand, the non-specific object should not be displayed in front of the specific object, that is, should be displayed farther than the specific object, the determination in S605 is NO, and in S607 the depth position information corresponding to that non-specific object is changed so that it is displayed at a depth position farther from the specific object than its original depth position. This completes one execution of the image processing program.
  • When the image processing program is executed for the example shown in FIG. 15, in S602 the image of snapper B is selected as the specific object. Subsequently, in S603, based on the depth position information corresponding to the image of snapper B, a Z coordinate value indicating the depth position at which the image of snapper B is to be displayed is obtained.
  • Thereafter, the depth position information corresponding to the image of coral C is changed in order to display the image of coral C farther away from the image of snapper B.
  • Subsequently, the wavefront curvature modulation unit 208 modulates the wavefront curvature of the image light for displaying the image of coral C based on the changed depth position information. As a result, the image of coral C is displayed at a depth position farther away from the image of snapper B than its original depth position.
  • In the present embodiment, the part of the computer 220 that executes S604 to S607 in FIG. 20 constitutes an example of the "display object position changing unit" in the above section (40), and also an example of the "display object position changing unit" in the above item (41). Further, the part of the computer 220 that executes S603 and S604 in FIG. 20 constitutes an example of the "depth position detecting unit" in the above section (43), and the part that executes S605 and S607 in the figure constitutes an example of the "display object position changing unit" in the same section.
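  • Relative to the seventh embodiment, only the direction of the depth change differs. A sketch under the same assumptions as before (larger z = farther away; the `offset` value is hypothetical):

```python
# Sketch of S605-S607: push objects behind the specific object farther away.
def push_back_objects_farther(specific_z, objects, offset=0.5):
    for obj in objects:
        if obj["z"] > specific_z:  # should be displayed farther than the specific object
            obj["z"] += offset     # S607: display it even farther away

scene = [{"name": "coral C", "z": 3.0}]
push_back_objects_farther(2.0, scene)
print(scene)  # coral C moves to z = 3.5
```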
  • The present embodiment differs from the second or third embodiment only in the elements related to image processing for displaying a specific object relatively emphasized compared to a non-specific object, and is common in the other elements. Therefore, only the differing elements will be described in detail; common elements will be referred to by using the same reference numerals or names, and detailed description thereof will be omitted.
  • In those embodiments, it is not determined whether or not the specific object has a portion to be displayed overlapping with a non-specific object located in front of it; in the present embodiment, that determination is performed. Further, in the present embodiment, the portion of the specific object that should be displayed so as to overlap with the non-specific object is displayed translucently.
  • Specifically, the color of the portion of the specific object that should be displayed overlapping the non-specific object is changed from the original color of the specific object to a mixed color in which the original color of the specific object and the original color of the non-specific object are mixed at a predetermined ratio.
  • In the present embodiment, the ROM 224 stores an image processing program conceptually represented by a flowchart in FIG. 21. Hereinafter, detailed description of the steps common to the image processing program shown in FIG. 14 will be omitted by referring to the corresponding step numbers.
  • The object on which the measured gaze point is located is selected as the specific object, based on the aforementioned geometric information corresponding to the plurality of objects.
  • FIG. 22 shows, in plan view, an image of ray A and an image of snapper B in relation to the example shown in FIG. 15; the image of ray A and the image of snapper B are also shown in a front view. Part of the image of ray A overlaps with the image of snapper B along the line of sight of the observer, and this is shown in plan view in FIG. 22. FIG. 22 further shows that the observer's gaze point is not on the image of ray A but on the image of snapper B.
  • the image of snapper B is selected as the specific object.
  • In the present embodiment, a specific object is automatically selected by triangulation of the gaze point. However, the present invention can also be implemented in such a manner that a specific object is manually selected in response to an operation by the observer, for example, the observer clicking a mouse on the specific object displayed on another screen (for example, a monitor screen).
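  • One way the automatic selection could be realized is sketched below. It assumes, purely for illustration, that each object's geometric information is reduced to an axis-aligned bounding box and that the specific object is the one whose box contains the triangulated gaze point; the patent only states that the selection is based on the objects' geometric information, so the box test and all names are hypothetical.

```python
# Hypothetical gaze-point selection: pick the object whose bounding box
# contains the measured gaze point (x, y, z).
def select_specific_object(gaze_point, objects):
    """objects: list of (name, (min_xyz, max_xyz)) bounding boxes."""
    for name, (lo, hi) in objects:
        if all(l <= p <= h for p, l, h in zip(gaze_point, lo, hi)):
            return name
    return None  # the gaze point lies on none of the objects

objects = [
    ("ray A",     ((0.0, 0.0, 0.5), (2.0, 2.0, 1.5))),
    ("snapper B", ((1.0, 0.0, 1.5), (3.0, 2.0, 2.5))),
]
print(select_specific_object((2.0, 1.0, 2.0), objects))  # -> "snapper B"
```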
  • Based on the geometric quantity of the specific object, represented by the depth position information and the plane position information corresponding to the selected specific object, on the geometric quantity of the non-specific object, represented by the depth position information and the plane position information corresponding to the non-specific object, and on the detected line of sight, the overlapping portion between the specific object and the non-specific object located in front of it is geometrically detected.
  • the color information corresponding to the portion of the specific object that should be displayed overlapping with the non-specific object is changed so as to represent the determined mixed color.
  • As a result, the portion of the specific object that should be displayed so as to overlap the non-specific object is displayed with its color changed from the original color to the mixed color.
  • The control unit 212 controls the wavefront curvature modulation unit 208 so that the image of snapper B is displayed so as to exist at the original depth position stored in advance in the external memory 202.
  • The mixing ratio described above can be set arbitrarily. For example, it is possible to set the mixing ratio of the original color of the specific object and the original color of the non-specific object located in front of the specific object to 9:1.
  • In the figure, the image of ray A, which is a non-specific object, is indicated by a dashed line, and the image of snapper B, which is the specific object, is indicated by a solid line.
  • In the present embodiment, the modulated light output unit 204 constitutes an example of the "light beam output unit" in the above item (44); the line-of-sight direction detection unit 210 and the part of the computer 220 that executes S701 and S703 in FIG. 21 cooperate with each other to constitute an example of the "overlapping object detection unit" in the same item; and the part of the computer 220 that executes S705 in FIG. 21 constitutes an example of the "overlapping object display section" in the same item, an example of the "second display section" in the above section (46), and an example of the "mixing ratio changing section" in the above section (47).
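  • The overlap detection and partial color change can be pictured with a simple sketch. Here the projected silhouettes of the objects on the view plane are modeled as rectangles and the blend reuses the 9:1 ratio; both simplifications, and all names, are assumptions made for illustration, whereas the actual device derives the overlap geometrically from the depth position information, the plane position information, and the detected line of sight.

```python
# Hypothetical overlap test on the view plane plus the 9:1 color blend.
def overlap(rect_a, rect_b):
    """Each rect is (x0, y0, x1, y1) in view-plane coordinates; returns the
    intersection rectangle, or None when the silhouettes do not overlap."""
    x0, y0 = max(rect_a[0], rect_b[0]), max(rect_a[1], rect_b[1])
    x1, y1 = min(rect_a[2], rect_b[2]), min(rect_a[3], rect_b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

def blend(specific_rgb, front_rgb, ratio=0.9):
    return tuple(round(ratio * s + (1.0 - ratio) * f)
                 for s, f in zip(specific_rgb, front_rgb))

region = overlap((1, 0, 3, 2), (0, 0, 2, 2))  # snapper B vs. ray A silhouettes
if region is not None:
    # only the overlapped portion of the specific object gets the mixed color
    print(region, blend((200, 200, 200), (0, 0, 0)))  # -> (1, 0, 2, 2) (180, 180, 180)
```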
  • The present embodiment differs from the second or third embodiment only in the elements related to image processing for displaying a specific object relatively emphasized compared to a non-specific object, and is common in the other elements. Therefore, only the differing elements will be described in detail; common elements will be referred to by using the same reference numerals or names, and detailed description thereof will be omitted.
  • When a non-specific object coexists with the specific object within a range forming a predetermined viewing angle, that non-specific object is not displayed transparently in the second and third embodiments; in the present embodiment, however, it is displayed transparently.
  • In the present embodiment, the ROM 224 stores an image processing program conceptually represented by a flowchart in FIG. 24. Hereinafter, detailed description of the steps common to the image processing program shown in FIG. 14 will be omitted by referring to the corresponding step numbers.
  • A non-specific object existing within a viewing angle range that forms a predetermined angle α with the detected viewing direction is selected from among the plurality of objects excluding the specific object.
  • The above-mentioned predetermined angle α can be set, for example, based on the angle of view necessary for the observer to observe the entirety of the specific object; for instance, it is possible to set the predetermined angle α to an integral multiple of that required angle of view. Alternatively, the predetermined angle α can be set to an arbitrary value according to a specification by the observer.
  • The viewing angle range is defined as a range that forms the predetermined angle α with respect to the viewing direction P of the observer. Here, the predetermined angle α is set to a value (for example, 10 degrees) obtained by multiplying the angle of view (for example, 5 degrees) necessary to observe the entire image of snapper B by an integer.
  • In this example, the image of ray A is selected as a non-specific object within the viewing angle range.
  • In the present embodiment, the part of the computer 220 that executes S803 and S804 in FIG. 24 constitutes an example of the "image processing unit" in the above item (39).
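  • The angular test of S803 can be sketched with elementary vector arithmetic: a non-specific object falls within the viewing angle range when the angle between the detected viewing direction P and the direction from the eye to the object does not exceed the predetermined angle α. The vector formulation below is an assumption about how the test could be realized, not the patent's stated method.

```python
import math

def within_viewing_angle(gaze_dir, obj_pos, eye_pos, alpha_deg):
    """True when the object lies within alpha_deg of the viewing direction."""
    to_obj = [o - e for o, e in zip(obj_pos, eye_pos)]
    dot = sum(g * t for g, t in zip(gaze_dir, to_obj))
    norm = math.hypot(*gaze_dir) * math.hypot(*to_obj)
    if norm == 0.0:
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= alpha_deg

# alpha = 10 degrees, i.e. twice the 5-degree angle of view of snapper B's image
print(within_viewing_angle((0, 0, 1), (0.2, 0.0, 2.0), (0, 0, 0), 10.0))  # True
```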

Abstract

The invention relates to an image display device that allows a user to view a virtual image of the object to be displayed by projecting light onto the user's retina. The display of an image to the user by the image display device is controlled according to the user's motion. The display device comprises (a) a head-mounted unit (40) worn around the user's head and designed to display a virtual image of the display object in an image display area by modulating emitted light and directing the modulated light onto the user's retina through an exit aperture, and (b) a waist-mounted unit (50) that controls the light-emitting section and the modulation section so as to display an image in the image display area. The waist-mounted unit (50) detects the user's motion (for example, the motion of the user's head) relative to a reference position on the user (for example, the position of the user's waist) and accordingly switches the image display on/off or changes the display position.
PCT/JP2004/016025 2003-10-30 2004-10-28 Dispositif d'affichage d'image WO2005043218A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/413,046 US7825996B2 (en) 2003-10-30 2006-04-28 Apparatus and method for virtual retinal display capable of controlling presentation of images to viewer in response to viewer's motion

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2003370842 2003-10-30
JP2003-370842 2003-10-30
JP2004-068889 2004-03-11
JP2004068889A JP4599858B2 (ja) 2004-03-11 2004-03-11 画像表示装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/413,046 Continuation-In-Part US7825996B2 (en) 2003-10-30 2006-04-28 Apparatus and method for virtual retinal display capable of controlling presentation of images to viewer in response to viewer's motion

Publications (1)

Publication Number Publication Date
WO2005043218A1 true WO2005043218A1 (fr) 2005-05-12

Family

ID=34554753

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/016025 WO2005043218A1 (fr) 2003-10-30 2004-10-28 Dispositif d'affichage d'image

Country Status (2)

Country Link
US (1) US7825996B2 (fr)
WO (1) WO2005043218A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8061845B2 (en) 2005-12-19 2011-11-22 Brother Kogyo Kabushiki Kaisha Image display system and image display method
JP2016004402A (ja) * 2014-06-17 2016-01-12 コニカミノルタ株式会社 透過型hmdを有する情報表示システム及び表示制御プログラム
JP2016173693A (ja) * 2015-03-17 2016-09-29 セイコーエプソン株式会社 頭部装着型表示装置、頭部装着型表示装置の制御方法、および、コンピュータープログラム
JP2017117175A (ja) * 2015-12-24 2017-06-29 セイコーエプソン株式会社 頭部装着型表示装置、頭部装着型表示装置の制御方法、および、コンピュータープログラム
US10175484B2 (en) 2015-03-17 2019-01-08 Seiko Epson Corporation Head-mounted display device, control method for head-mounted display device, and computer program

Families Citing this family (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007272067A (ja) * 2006-03-31 2007-10-18 Brother Ind Ltd 画像表示装置
US20080055194A1 (en) * 2006-08-31 2008-03-06 Motorola, Inc. Method and system for context based user interface information presentation and positioning
FR2909189B1 (fr) * 2006-11-23 2009-01-30 Essilor Int Agencement d'affichage opto-electronique
WO2008067487A2 (fr) * 2006-11-29 2008-06-05 Erik Bakke Système et procédé permettant de commander une présentation affichée, telle qu'une présentation sexuellement explicite
WO2008145169A1 (fr) * 2007-05-31 2008-12-04 Siemens Aktiengesellschaft Dispositif mobile et procédé pour un affichage rétinien virtuel
US9479274B2 (en) 2007-08-24 2016-10-25 Invention Science Fund I, Llc System individualizing a content presentation
US9647780B2 (en) * 2007-08-24 2017-05-09 Invention Science Fund I, Llc Individualizing a content presentation
US8907887B2 (en) * 2008-05-19 2014-12-09 Honeywell International Inc. Methods and systems for operating avionic systems based on user gestures
US9596453B2 (en) * 2010-06-14 2017-03-14 Lg Electronics Inc. Electronic device and control method thereof
KR20120005328A (ko) * 2010-07-08 2012-01-16 삼성전자주식회사 입체 안경 및 이를 포함하는 디스플레이장치
KR101431366B1 (ko) * 2010-07-20 2014-08-19 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 증강 현실 근접 감지
KR20120062170A (ko) * 2010-12-06 2012-06-14 삼성전자주식회사 가상 모니터 제어장치 및 그 제어방법
JP5698529B2 (ja) * 2010-12-29 2015-04-08 任天堂株式会社 表示制御プログラム、表示制御装置、表示制御システム、および表示制御方法
EP2499964B1 (fr) * 2011-03-18 2015-04-15 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Dispositif et système de mesure optique
JP5821464B2 (ja) * 2011-09-22 2015-11-24 セイコーエプソン株式会社 頭部装着型表示装置
US8990682B1 (en) 2011-10-05 2015-03-24 Google Inc. Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display
US9081177B2 (en) 2011-10-07 2015-07-14 Google Inc. Wearable computer with nearby object response
US9547406B1 (en) 2011-10-31 2017-01-17 Google Inc. Velocity-based triggering
US8866852B2 (en) * 2011-11-28 2014-10-21 Google Inc. Method and system for input detection
US9001005B2 (en) 2012-02-29 2015-04-07 Recon Instruments Inc. Modular heads-up display systems
US9069166B2 (en) 2012-02-29 2015-06-30 Recon Instruments Inc. Gaze detecting heads-up display systems
US8947322B1 (en) 2012-03-19 2015-02-03 Google Inc. Context detection and context-based user-interface population
US9096920B1 (en) * 2012-03-22 2015-08-04 Google Inc. User interface method
JP5843340B2 (ja) * 2012-07-27 2016-01-13 Necソリューションイノベータ株式会社 3次元環境共有システム及び3次元環境共有方法
WO2014035118A1 (fr) * 2012-08-31 2014-03-06 Lg Electronics Inc. Visiocasque et procédé de commande de dispositif numérique l'utilisant
KR101958778B1 (ko) 2012-08-31 2019-03-15 엘지전자 주식회사 헤드 마운트 디스플레이 및 이를 이용한 디지털 디바이스 제어 방법
JP2014153645A (ja) 2013-02-13 2014-08-25 Seiko Epson Corp 画像表示装置および画像表示装置の表示制御方法
US9041741B2 (en) * 2013-03-14 2015-05-26 Qualcomm Incorporated User interface for a head mounted display
US9380295B2 (en) * 2013-04-21 2016-06-28 Zspace, Inc. Non-linear navigation of a three dimensional stereoscopic display
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
KR20150141461A (ko) 2014-06-10 2015-12-18 엘지전자 주식회사 헤드 마운티드 디스플레이 및 그 제어 방법
US9992842B2 (en) 2014-12-01 2018-06-05 Industrial Technology Research Institute Illumination system and method for developing target visual perception of an object
US9674920B2 (en) * 2014-12-01 2017-06-06 Industrial Technology Research Institute Illumination system and method for developing target visual perception of an object
EP3245553A4 (fr) * 2015-01-13 2018-04-04 Ricoh Company, Ltd. Appareil de visiocasque et procédé d'affichage
US10156721B2 (en) * 2015-03-09 2018-12-18 Microsoft Technology Licensing, Llc User-based context sensitive hologram reaction
WO2016154711A1 (fr) * 2015-03-31 2016-10-06 Cae Inc. Identification multifactorielle de position des yeux dans un système d'affichage
US20160292919A1 (en) 2015-03-31 2016-10-06 Cae Inc. Modular Infrastructure For An Interactive Computer Program
US9754506B2 (en) 2015-03-31 2017-09-05 Cae Inc. Interactive computer program with virtualized participant
WO2016168788A2 (fr) 2015-04-17 2016-10-20 Tulip Interfaces, Inc. Passerelle de communication conteneurisée
JP6518582B2 (ja) * 2015-12-21 2019-05-22 株式会社ソニー・インタラクティブエンタテインメント 情報処理装置および操作受付方法
GB2548346B (en) * 2016-03-11 2020-11-18 Sony Interactive Entertainment Europe Ltd Image processing method and apparatus
KR20230098916A (ko) * 2016-04-21 2023-07-04 매직 립, 인코포레이티드 시야 주위의 시각적 아우라
US10466474B2 (en) * 2016-08-04 2019-11-05 International Business Machines Corporation Facilitation of communication using shared visual cue
US10489951B2 (en) 2017-09-29 2019-11-26 Qualcomm Incorporated Display of a live scene and auxiliary object
JP6993722B2 (ja) * 2017-10-05 2022-01-14 株式会社Qdレーザ 視覚検査装置
EP3911992A4 (fr) 2019-04-11 2022-03-23 Samsung Electronics Co., Ltd. Dispositif de visiocasque et son procédé de fonctionnement
KR20200120466A (ko) * 2019-04-11 2020-10-21 삼성전자주식회사 헤드 마운트 디스플레이 장치 및 그 동작방법
US11860246B1 (en) * 2019-05-24 2024-01-02 Apple Inc. Short-range position tracking using stationary magnetic field gradient
US11612316B2 (en) * 2019-06-20 2023-03-28 Awss Zidan Medical system and method operable to control sensor-based wearable devices for examining eyes
DE102020126953B3 (de) * 2020-10-14 2021-12-30 Bayerische Motoren Werke Aktiengesellschaft System und Verfahren zum Erfassen einer räumlichen Orientierung einer tragbaren Vorrichtung

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0396913A (ja) * 1989-09-08 1991-04-22 Brother Ind Ltd 画像表示装置
JPH06289318A (ja) * 1993-04-01 1994-10-18 Seiko Epson Corp 頭部装着型表示装置
JPH0795498A (ja) * 1993-09-24 1995-04-07 Sony Corp 眼鏡型ディスプレイ
JPH07261112A (ja) * 1994-03-22 1995-10-13 Hitachi Ltd 頭部搭載型ディスプレイ装置
JPH07303225A (ja) * 1994-05-09 1995-11-14 Olympus Optical Co Ltd 頭部装着型映像表示装置
JPH08160349A (ja) * 1994-12-09 1996-06-21 Sega Enterp Ltd 頭部搭載型映像表示システム、および、頭部搭載型映像表示装置
JPH08328512A (ja) * 1995-05-26 1996-12-13 Canon Inc 頭部装着型表示装置
JPH1195155A (ja) * 1997-09-18 1999-04-09 Nec Software Kyushu Ltd ヘッドアップディスプレイ装置

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6159132A (ja) 1984-08-31 1986-03-26 Matsushita Electric Ind Co Ltd 排煙装置
US4849692A (en) * 1986-10-09 1989-07-18 Ascension Technology Corporation Device for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
GB8701288D0 (en) * 1987-01-21 1987-02-25 Waldern J D Perception of computer-generated imagery
JP3289953B2 (ja) * 1991-05-31 2002-06-10 キヤノン株式会社 視線方向検出装置
US5369415A (en) 1992-06-29 1994-11-29 Motorola, Inc. Direct retinal scan display with planar imager
US5579026A (en) * 1993-05-14 1996-11-26 Olympus Optical Co., Ltd. Image display apparatus of head mounted type
JP3676391B2 (ja) 1994-04-27 2005-07-27 オリンパス株式会社 頭部装着式映像表示装置
JPH07167618A (ja) 1993-12-14 1995-07-04 Nissan Motor Co Ltd 運転者位置認識装置
JPH0821975A (ja) * 1994-07-06 1996-01-23 Olympus Optical Co Ltd 頭部装着型映像表示システム
US6388638B2 (en) * 1994-10-28 2002-05-14 Canon Kabushiki Kaisha Display apparatus and its control method
JP3727961B2 (ja) 1994-10-28 2005-12-21 キヤノン株式会社 頭部装着型表示装置
TW275590B (en) * 1994-12-09 1996-05-11 Sega Enterprises Kk Head mounted display and system for use therefor
JPH08220470A (ja) 1995-02-20 1996-08-30 Fujitsu General Ltd ヘッドマウントディスプレイ装置
JP2887104B2 (ja) 1996-04-12 1999-04-26 オリンパス光学工業株式会社 頭部装着型映像表示装置
JPH1093889A (ja) 1996-09-13 1998-04-10 Minolta Co Ltd 頭部載置型映像表示装置
DE19802220A1 (de) 1998-01-22 1999-07-29 Bosch Gmbh Robert Anzeigevorrichtung
US6120461A (en) * 1999-08-09 2000-09-19 The United States Of America As Represented By The Secretary Of The Army Apparatus for tracking the human eye with a retinal scanning display, and method thereof
JP4140399B2 (ja) 2003-02-27 2008-08-27 トヨタ自動車株式会社 視線方向検出装置および視線方向検出方法

Also Published As

Publication number Publication date
US20060197832A1 (en) 2006-09-07
US7825996B2 (en) 2010-11-02

Similar Documents

Publication Publication Date Title
WO2005043218A1 (fr) Dispositif d'affichage d'image
EP3330771B1 (fr) Afficheur et procédé d'affichage à l'aide d'un foyer et affichages de contexte
JP6353105B2 (ja) 表示システムおよび方法
US11188149B2 (en) Image display device using retinal scanning display unit and method thereof
US10645374B2 (en) Head-mounted display device and display control method for head-mounted display device
JP2020515895A (ja) 操作可能な中心窩ディスプレイ
US20160284129A1 (en) Display, control method of display, and program
EP2006827A9 (fr) Dispositif d'affichage d'image
JPH0749744A (ja) 頭部搭載型表示入力装置
JP2000013818A (ja) 立体表示装置及び立体表示方法
JPH10239634A (ja) 立体映像表示装置
JP6707823B2 (ja) 表示装置、表示装置の制御方法、及び、プログラム
US20180324332A1 (en) Display apparatus and method using image renderers and optical combiners
JP2016024751A (ja) 画像表示装置
JP2015060071A (ja) 画像表示装置、画像表示方法、および画像表示プログラム
CN109997067B (zh) 使用便携式电子设备的显示装置和方法
CN112306229A (zh) 电子装置、控制方法和计算机可读介质
JP2016197830A (ja) 表示装置、表示装置の制御方法、および、プログラム
JP2018054976A (ja) 頭部搭載型表示装置及び頭部搭載型表示装置の表示制御方法
JP4839598B2 (ja) 画像表示装置
JP2016090853A (ja) 表示装置、表示装置の制御方法、及び、プログラム
JP2019066564A (ja) 表示装置、表示制御方法、及びプログラム
JP4599858B2 (ja) 画像表示装置
JPH0756517A (ja) 眼鏡型画像表示装置
JPH09127455A (ja) 表示装置

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 11413046

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 11413046

Country of ref document: US

122 Ep: pct application non-entry in european phase