JP3717653B2 - Head mounted image display device - Google Patents

Head mounted image display device

Publication number: JP3717653B2
Authority: JP (Japan)
Prior art keywords: distance, object, real object, means, virtual object
Legal status: Expired - Fee Related
Application number: JP00851698A
Other languages: Japanese (ja)
Other versions: JPH11202256A (en)
Inventor: 克之 大村
Original Assignee: 株式会社リコー (Ricoh Co., Ltd.)
Application filed by 株式会社リコー (Ricoh Co., Ltd.)
Priority to JP00851698A
Publication of JPH11202256A
Application granted
Publication of JP3717653B2


Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a head-mounted image display device with which an observer can simultaneously observe a real object and a virtual object within the same field of view of a display unit mounted on the observer's head.
[0002]
[Prior art]
For example, in training for equipment operation, images of the names of the parts of a device and of the operating procedure are presented as a virtual image, superimposed on the actual device that the trainee is watching, so that the trainee can learn how to operate the equipment more easily.
In such training, an image display device that presents the virtual image on a display unit mounted on the person's head is used. As such an image display device, the one disclosed in Japanese Patent Laid-Open No. 7-333550 is known.
To observe a real device and a virtual image at the same time, the virtual image must be displayed so that the observer can fuse it with the real device. If the distance to the displayed virtual image differs greatly from the distance to the real device, focusing on one of them blurs the other, so the two cannot be seen clearly and simultaneously, and a heavy burden is placed on the observer's eyes.
[0003]
In the invention disclosed in the above publication, when the real device and the virtual image do not match, the observer has the annoyance of having to adjust his or her posture and the position of the display device.
[0004]
[Problems to be solved by the invention]
The problem addressed by the present invention is, in a head-mounted image display device, to automatically display the observation object to be observed by the observer and the virtual image at substantially the same distance from the observer, and thereby make it easy to observe the observation object and the virtual image simultaneously.
The “observation object” may be an actual object itself or an image obtained by capturing the actual object.
[0005]
[Means for Solving the Problems]
The head-mounted image display device according to claims 1 to 4 is a see-through head-mounted image display device that allows a real object in the external field of view being observed by an observer and a virtual object, displayed by display means that displays a desired image independently to the left and right eyes, to be observed simultaneously, with the virtual object superimposed on the external field of view.
That is, the “real object” is something the observer can observe directly with the naked eye, and it is observed through the “display means mounted on the observer's head” (that is, see-through).
In contrast, a “virtual object” does not exist in the space where the real object exists. A virtual object is an image displayed by the display means independently to the left and right eyes. The virtual object may be any desired image and, because it is displayed independently to both eyes, it can be a “three-dimensional stereoscopic image”.
[0006]
The head-mounted image display device according to claim 1 has distance detection means and convergence control means.
The “distance detection means” is means for detecting, in real time, the distance from the observer to the real object being observed by the observer, and is a distance sensor.
The “convergence control means” is means for controlling, in real time, the convergence of the display means for a desired virtual object in accordance with the distance detected by the distance detection means. That is, according to the real object distance D detected by the distance detection means, the convergence control means controls the display means in real time so that the convergence angle θ with respect to the virtual object displayed by the display means lies within ±40 arcminutes of the convergence angle θ0 with respect to the real object.
[0007]
Referring to FIG. 12, (a) and (b) are diagrams for explaining simultaneous observation of a real object and a virtual object by the display means.
Reference numeral 2 indicates the “observer's eyes”, reference numeral 1 a pair of “display devices”, and reference numeral 7 a pair of “lenses”. As shown in (a), one display device 1 and one lens 7 are arranged for each of the observer's eyes. In practice, as shown in (b), a half mirror 3 is provided in front of the observer's eyes 2 at an angle of 45 degrees to the observation direction, and the lenses 7 are arranged above the half mirror 3. The observer can therefore observe the “actual outside world” through the half mirror 3 with both eyes 2, and the field of view the observer can observe through the half mirror 3 is the “external field of view”. If the observation object 4 exists in the real space that can be observed as the external field of view, the observer observes the observation object 4 as the “real object” in the external field of view.
Each of the pair of display devices 1 is, for example, a liquid crystal display device (hereinafter, liquid crystal panel), and can display a desired image independently to each of the observer's eyes. When a right-eye image IR is displayed on the right-eye display device, the observer observes the image IR through the right-eye lens. Similarly, when a left-eye image IL is displayed on the left-eye display device, the observer observes the image IL through the left-eye lens. What the observer actually sees are the “virtual images” of the images IR and IL enlarged by the pair of lenses 7.
The images IR and IL are originally “planar images”, but for convenience of illustration they are drawn as three-dimensional shapes in FIGS. 12(a) and 12(b). The virtual images of the images IR and IL observed by the observer's left and right eyes are fused and observed as a virtual object 5 at the position where the line of sight of the right eye and the line of sight of the left eye intersect. When the images IR and IL for the left and right eyes form a “stereoscopic image”, the virtual object 5 observed by the observer is a three-dimensional stereoscopic image. The distance d to the virtual object as recognized by the observer is hereinafter called the “virtual object distance”.
The angle θ between the observer's right-eye line of sight and left-eye line of sight shown in FIG. 12(a) is called the “convergence angle” with respect to the virtual object 5. Likewise, when the observer views the real object 4, the angle θ0 at which the lines of sight of the left and right eyes intersect is called the convergence angle with respect to the real object 4. The distance D between the observer and the real object 4 is called the “real object distance”. As is clear from FIG. 12(a), when the convergence angles θ and θ0 are sufficiently close to each other, the virtual object distance and the real object distance are close to each other, and both objects are observed at substantially the same distance from the observer.
When two objects are observed simultaneously with both eyes, the “difference between the convergence angles with respect to the two objects” must be no greater than Panum's fusional limit for the objects to be observed “fused”. Panum's fusional limit has been determined experimentally to be about 40 arcminutes (see, for example, IEICE Technical Report EID87-42, pp. 19-26).
Accordingly, in FIG. 12(a), for the observer to “fuse and simultaneously observe the real object 4 and the virtual object 5”, the difference between the convergence angles θ and θ0 must be 40 arcminutes or less.
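As a minimal illustration of this criterion (not part of the patent text), the following sketch simply checks whether two convergence angles differ by no more than Panum's fusional limit:

```python
# Minimal sketch: fusion check based on Panum's fusional limit of 40 arcminutes.
PANUM_LIMIT_DEG = 40.0 / 60.0  # 40 arcminutes expressed in degrees

def can_fuse(theta_deg: float, theta0_deg: float) -> bool:
    """True if the convergence angles of the virtual and real object differ
    by no more than 40 arcminutes."""
    return abs(theta_deg - theta0_deg) <= PANUM_LIMIT_DEG
```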
[0008]
The horizontal axis of FIG. 12(c) indicates the distance D between the observer and the real object 4 (the real object distance) in mm, and the vertical axis indicates the distance d between the observer and the virtual object 5 (the virtual object distance) in mm. Curve A in this figure shows the virtual object distance obtained when, at an arbitrary real object distance on the horizontal axis, the convergence angle with respect to the virtual object is “θ0 − 40 arcminutes”, and curve B shows the virtual object distance obtained when that convergence angle is “θ0 + 40 arcminutes”. That is, while the real object is being observed, the virtual object must be displayed at a virtual object distance within the region enclosed by curves A and B, according to the real object distance, so that the virtual object can be fused with the real object and observed simultaneously. For example, when the real object distance is 1000 mm, the virtual object distance of a virtual object that can be fused with the real object must lie in the region indicated by reference numeral C in FIG. 12(c), that is, a region of roughly 700 to 1600 mm.
Since the head-mounted image display device of claims 1 to 4 is of the “see-through type”, the real object distance is the “distance between the observer and the real object”. On the other hand, as is apparent from FIG. 12(a), the virtual object distance changes as the convergence angle θ changes, and the convergence angle θ can be changed by changing the separation X′ of the “left-eye and right-eye images IR and IL” displayed on the pair of display devices 1.
If the distance between the eyes 2 and the display devices 1 is neglected relative to the real object distance D, and the separation of the two eyes 2 is De, then “De/2 = D·tan(θ0/2)”. Since De is a constant, once the real object distance D is known, θ0 can be obtained by solving the equation “De/D = 2·tan(θ0/2)” for θ0.
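The relation just given also lets one compute the band between curves A and B numerically. The sketch below (not from the patent) assumes a nominal eye separation De of 63 mm; the resulting band depends on that choice and will not necessarily reproduce the 700-1600 mm figure read off FIG. 12(c):

```python
import math

ARCMIN = math.pi / (180.0 * 60.0)  # one arcminute in radians

def convergence_angle(D_mm: float, De_mm: float = 63.0) -> float:
    """Convergence angle theta0 (radians) for a real object at distance D,
    from De/2 = D * tan(theta0 / 2)."""
    return 2.0 * math.atan(De_mm / (2.0 * D_mm))

def distance_for_angle(theta: float, De_mm: float = 63.0) -> float:
    """Inverse relation: object distance (mm) for a given convergence angle."""
    return De_mm / (2.0 * math.tan(theta / 2.0))

def fusable_band(D_mm: float, De_mm: float = 63.0):
    """Range of virtual object distances whose convergence angle stays
    within theta0 +/- 40 arcminutes (curves B and A of FIG. 12(c))."""
    theta0 = convergence_angle(D_mm, De_mm)
    d_near = distance_for_angle(theta0 + 40.0 * ARCMIN, De_mm)  # curve B side
    d_far = distance_for_angle(theta0 - 40.0 * ARCMIN, De_mm)   # curve A side
    return d_near, d_far

print(fusable_band(1000.0))  # band of fusable distances for a 1 m real object
```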
[0009]
On the other hand, the convergence angle θ with respect to the virtual object is uniquely determined by the separation X′ of the left-eye and right-eye images IR and IL. Therefore, if the relationship between θ and X′ is determined experimentally in advance, the value of X′ for which the convergence angle θ of the virtual object falls within ±40 arcminutes of the convergence angle θ0 of the real object can be determined from θ0.
In the invention of claim 1, the convergence control means “controls, in real time, the convergence of the display means for a desired virtual object according to the distance detected by the distance detection means”. In terms of the above explanation, it determines, from the real object distance D detected by the distance detection means, a separation X′ for which the convergence angle θ of the virtual object is within θ0 ± 40 arcminutes, and on the basis of this determination it controls the separation and display positions of the images IR and IL in real time.
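The patent leaves the θ-X′ relationship to prior experimental calibration. One straightforward way to realize such a calibration in software is a lookup table with linear interpolation; the pairs below are purely illustrative placeholders, not values from the patent:

```python
import bisect

# Hypothetical calibration: convergence angle (degrees) -> image separation X' (mm).
# Real values would be measured for a specific display and lens geometry.
CALIBRATION = [(1.0, 4.0), (2.0, 2.5), (3.0, 1.2), (4.0, 0.0)]

def separation_for_angle(theta_deg: float) -> float:
    """Linearly interpolate the calibrated theta -> X' mapping."""
    angles = [a for a, _ in CALIBRATION]
    i = bisect.bisect_left(angles, theta_deg)
    if i == 0:
        return CALIBRATION[0][1]
    if i == len(CALIBRATION):
        return CALIBRATION[-1][1]
    (a0, x0), (a1, x1) = CALIBRATION[i - 1], CALIBRATION[i]
    return x0 + (theta_deg - a0) / (a1 - a0) * (x1 - x0)
```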
[0010]
As the “distance detection means” in the head-mounted image display device according to claim 1, any suitable means conventionally known for real-time distance detection may be used, for example an acoustic system that directs an ultrasonic pulse train at the real object and measures the return time of the pulse reflected by the real object, or an optical system of the kind known in connection with the autofocus devices of cameras and the like.
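For the acoustic variant, the distance follows directly from the round-trip delay of the ultrasonic pulse. A minimal sketch, assuming a speed of sound in air of roughly 340 m/s:

```python
SPEED_OF_SOUND_MM_PER_S = 340_000.0  # approximate speed of sound in air

def distance_from_delay(delay_s: float) -> float:
    """Real object distance in mm from the round-trip delay of an ultrasonic
    pulse; the pulse travels out and back, hence the division by two."""
    return SPEED_OF_SOUND_MM_PER_S * delay_s / 2.0
```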
[0011]
As the distance detection means, a configuration is conceivable that has line-of-sight detection means for detecting the lines of sight of both of the observer's eyes, line-of-sight convergence point coordinate calculation means for computing the convergence point coordinates of the lines of sight detected by the line-of-sight detection means, and distance calculation means for computing the distance between the convergence point coordinates calculated by the line-of-sight convergence point coordinate calculation means and the observer as the distance between the real object and the observer (the real object distance); in the invention of claim 1, however, a distance sensor is used.
The “convergence point coordinates” are the coordinates of the position where the lines of sight of both eyes substantially intersect.
[0012]
As described above, in the invention of claim 1, the convergence of the display means is controlled according to the real object distance, and the virtual object is displayed so that the difference between the virtual object distance and the real object distance is within Panum's fusional limit; the display position of the virtual object within the external field of view is also determined by the convergence control means.
The display position of the virtual object in the external field of view may be “a predetermined position within the field of view (for example, the upper-left corner of the external field of view) at a distance substantially equal to that of the real object (within Panum's fusional limit)” (claim 2), “a position close to the real object (for example, immediately to the left of the real object), at substantially the same distance as the real object and different from the convergence point coordinates” (claim 3), or “the position of the convergence point coordinates (the position overlapping the real object)” (claim 4).
[0013]
Hereinafter, as a reference example, a case will be described in which the lines of sight are detected by line-of-sight detection means and the convergence point coordinates of the detected lines of sight are computed by line-of-sight convergence point coordinate calculation means. The convergence point coordinates are the coordinates of the position where the left and right lines of sight intersect; however, it is rare for the two lines of sight to intersect exactly at one point, and in practice the detected lines of sight often fail to intersect because of measurement errors in the line-of-sight detection and the physiological condition of the observer.
In such a case, the question is what should be taken as the “convergence point coordinates”.
In such a case, the line-of-sight convergence point coordinate calculation means may compute, as the convergence point coordinates, “the midpoint position of the two lines at the closest approach of the spatial straight line extending the line-of-sight vector of the observer's right eye and the spatial straight line extending the line-of-sight vector of the observer's left eye”. This midpoint position can be computed as “the midpoint of the shortest line segment connecting a point on the spatial straight line extending the right-eye line-of-sight vector and a point on the spatial straight line extending the left-eye line-of-sight vector”.
In such a case as well, the display position of the virtual object can be as in claims 2 to 4 above.
In the above reference example, the lines of sight of both eyes are detected and the convergence point coordinates are computed. Even when a human eye is gazing at a single point in space, the eyeball itself makes involuntary fine movements irrespective of the gaze point.
For this reason, the convergence point coordinates detected from the lines of sight of both eyes fluctuate slightly under the influence of these fine eyeball movements. In particular, when the observer is gazing at something close to the eyes, the fine eyeball movements have a large effect on the computation of the line-of-sight convergence point, and the computed convergence point coordinates may vary greatly regardless of the gaze point.
When the display position of the virtual object is fixed at a predetermined position in the external field of view as in the invention of claim 2, it suffices to “compute the distance between the observer and the corresponding coordinate, on the coordinate axis corresponding to the front of the head in the coordinate system fixed to the observer's head, of the convergence point coordinates (including the above “midpoint”) calculated by the line-of-sight convergence point coordinate calculation means”.
[0014]
The head-mounted image display device according to claims 5 to 8 is “a head-mounted image display device that has imaging means for imaging an object to be imaged and display means for displaying a desired image independently to the left and right eyes, displays the image captured by the imaging means on the display means as a real object, displays a desired virtual object on the display means together with the real object, and allows both objects to be observed simultaneously”.
This head-mounted image display device can be used, for example, as follows when learning how to handle a dangerous article. The dangerous article is placed away from the learner; the learner, in a safe place, observes the image captured by the imaging means as a real object on the display means, and a virtual object (for example, the name of each part or text describing its handling) is displayed together with it. The learner (observer) can then learn the handling while observing both objects with a sense of reality, as if handling the dangerous article directly.
Therefore, the head-mounted image display device of claims 5 to 8 need not be of the see-through type. If the “visual space that can be observed through the display device” is regarded as the “external space” for the observer, the real object and the virtual object are both images displayed in that external space; the real object need not actually exist there.
The head-mounted image display device according to claim 5 has imaging distance detection means and convergence control means.
The “imaging distance detection means” is means for detecting the distance between the imaging means and the object to be imaged (the imaging distance), and comprises a distance sensor.
The “convergence control means” is means for controlling the convergence of the display means for a desired virtual object in accordance with the distance detected by the imaging distance detection means.
Since the real object displayed on the display means is an image (which can be a three-dimensional image) of the object captured by the imaging means, the real object distance when it is displayed on the display means is the distance between the imaging means and the object to be imaged.
Therefore, by having the convergence control means perform convergence control so that the difference between the virtual object distance and the real object distance is within the fusional limit, the virtual object can be displayed so that the real object and the virtual object are observed fused.
[0015]
In the invention described in claim 5, the imaging distance detection means comprises a distance sensor, and the known acoustic or optical distance detection means described above can be used as appropriate. As a reference example, the imaging means may have “a pair of video cameras arranged parallel to each other in the horizontal direction at a separation equal to the separation of the observer's eyes”, and the imaging distance detection means may have “line-of-sight detection means for detecting the lines of sight of the observer's eyes”, “line-of-sight convergence point coordinate calculation means for computing the convergence point coordinates of the lines of sight detected by the line-of-sight detection means”, and “perceptual distance calculation means for computing the distance between the convergence point coordinates calculated by the line-of-sight convergence point coordinate calculation means and the observer as the perceptual distance between the observer and the real object being watched”; in this case the imaging distance is detected as this perceptual distance.
[0016]
In the head-mounted image display device according to claim 5, the convergence control means can control the convergence of the display means so that the virtual object is displayed at a predetermined position in the field of view, at substantially the same distance as the real object (claim 6). The convergence control means can also control the convergence of the display means so that the virtual object is displayed close to the real object, at substantially the same distance as the real object and at a position different from the convergence point coordinates (the position that overlaps the real object) (claim 7). Furthermore, the convergence control means can control the convergence of the display means so that the virtual object is displayed at the position that overlaps the real object (claim 8).
In the above reference example, the line-of-sight convergence point coordinate calculation means can determine the convergence point coordinates by “computing, as the convergence point coordinates, the midpoint position of the two lines at the closest approach of the spatial straight line extending the line-of-sight vector of the observer's right eye and the spatial straight line extending the line-of-sight vector of the observer's left eye”.
Also, in the reference example, the distance calculation means of the distance detection means can “compute the distance between the observer and the corresponding coordinate, on the coordinate axis corresponding to the front of the head in the coordinate system fixed to the observer's head, of the convergence point coordinates calculated by the line-of-sight convergence point coordinate calculation means”.
[0017]
In any of the inventions according to claims 1 to 8, the virtual object may be “anything desired”, that is, any image to be displayed as a virtual object. Possible types of virtual object image include a text object, such as a description of the real object, and a graphic object, such as a pointer or an auxiliary line used to indicate an operation on the real object.
[0018]
In the above description, the “corresponding coordinate, on the coordinate axis corresponding to the front of the head in the coordinate system fixed to the observer's head, of the convergence point coordinates calculated by the line-of-sight convergence point coordinate calculation means” means either the coordinate of the projection point of the tip of the vector that points from the origin of the coordinate system to the convergence point coordinates when that vector is projected onto the coordinate axis corresponding to the front of the head, or the coordinate of the vector tip when that vector is rotated about the origin so as to coincide with that coordinate axis.
[0019]
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 is a view for explaining one embodiment of the invention described in claim 1. To avoid complication, the same reference symbols are used in FIG. 1 and the subsequent drawings. In FIG. 1, (a) shows a state in which the observer 8 observes the real object 4 and the virtual object 5 via the head mount 10 of the head-mounted image display device. The upper part of (b) depicts this observation state in the manner of FIG. 12(b).
In FIG. 1, as in FIG. 12, reference numeral 2 indicates the “two eyes (separated by the distance De) of the observer 8”, reference numeral 1 a pair of “display devices”, and reference numeral 7 a pair of “lenses”; one display device 1 and one lens 7 are provided for each of the observer's eyes. As shown in (a), the half mirror 3 is provided in front of the observer 8's eyes 2 at 45 degrees to the observation direction, and the pair of display devices 1 and the pair of lenses 7 are arranged above the half mirror 3. The observer 8 can therefore observe the real object 4 in the “external field of view” through the half mirror 3 with both eyes, and at the same time can observe a desired image 30 displayed on each display device 1 (a liquid crystal panel) as the virtual object 5.
A distance sensor, indicated by reference numeral 20, is attached to the head of the observer 8 so as to be integrated with the head mount 10, as shown in (a); it transmits an ultrasonic pulse toward the real object 4 and receives the reflected pulse. By adjusting the geometric positional relationship between the head mount 10 and the distance sensor 20, the pulse is emitted toward, and received from, “the center of the external field of view of the observer 8”. Reference numeral 9 denotes the casing of the head mount 10.
The distance sensor 20 sends the required time from pulse transmission to reflection pulse reception as “delay time information” to the distance calculation device 11 in real time. The distance calculation device 11 calculates the distance between the observer 8 and the real object 4, that is, the real object distance: D in real time, based on the input delay time information.
That is, the distance sensor 20 and the distance calculation device 11 constitute “distance detection means”.
[0020]
The distance calculation device 11 outputs the calculated real object distance D to the convergence calculation devices (left) 12 and (right) 13 in real time. Based on the input real object distance D, the convergence calculation devices 12 and 13 compute the convergence angle θ0 of the real object and a convergence angle θ for the virtual object 5 to be displayed that lies within θ0 ± 40 arcminutes, and send the result to the virtual object image generation devices (left) 14 and (right) 15.
These virtual object image generation devices 14 and 15 display the left-eye and right-eye images 30 corresponding to the virtual object 5 on the respective liquid crystal panels of the display device 1 so as to realize that convergence angle. At this time, the convergence angle θ shown in the upper part of FIG. 1(b) is substantially equal to the convergence angle θ0, and the virtual object distance d is substantially equal to the real object distance D. Therefore, the observer 8 can “see the real object 4 and the virtual object 5 fused, simultaneously, in the field of view”.
Accordingly, the convergence calculation device (left) 12 and the convergence calculation device (right) 13 constitute the “convergence control means”, and the virtual object image generation device (left) 14, the virtual object image generation device (right) 15 and the head mount 10 constitute the “display means”.
In other words, the embodiment shown in FIG. 1 is a see-through head-mounted image display device that allows the real object 4 in the external field of view observed by the observer 8 and the virtual object 5, displayed superimposed on it by the display means 10, 14 and 15 that display the desired image 30 independently to the left and right eyes, to be observed simultaneously; it has distance detection means 11, 20 for detecting in real time the distance D of the real object 4 being observed by the observer 8 from the observer, and convergence control means 12, 13 for controlling, in real time, the convergence of the display means for the desired virtual object 5 according to the distance D detected by the distance detection means (claim 1).
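Schematically, the devices 11, 12/13 and 14/15 form a small real-time loop from measured delay time to image parameters. The following sketch is only an illustrative rendering of that structure, with assumed constants and a caller-supplied θ-to-X′ calibration; it is not the patent's implementation:

```python
import math

def control_step(delay_s: float, angle_to_separation, De_mm: float = 63.0):
    """One loop iteration of the FIG. 1 pipeline (illustrative):
    pulse delay -> real object distance D -> convergence angle theta0
    -> on-panel image separation X' for the virtual object."""
    D = 340_000.0 * delay_s / 2.0                 # distance calculation device 11
    theta0 = 2.0 * math.atan(De_mm / (2.0 * D))   # convergence calculation devices 12/13
    x_prime = angle_to_separation(theta0)         # calibrated theta -> X' mapping
    return D, theta0, x_prime                     # passed to image generators 14/15
```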
[0021]
In the above description it was explained that the virtual object distance at which the virtual object 5 is to be displayed should be substantially the same as the real object distance, but there remains a degree of freedom in displaying the virtual object 5, namely the position at which it is displayed. The display position of the virtual object is basically arbitrary as long as the condition that “the observation of the real object is not hindered and the virtual object can be observed well” is satisfied, and it can be set appropriately by the program (software) that controls the convergence control means.
Hereinafter, three examples of the “display position” of the virtual object 5 in the embodiment of FIG. 1 will be described with reference to FIGS. 2 and 3. For concreteness, it is assumed, as shown in FIG. 3, that the real object 4 is a “flower” and that the virtual object 5 is an “additional information window” displaying the information “This is a flower”.
Referring to FIG. 2 (in which the lenses 7 are omitted to avoid complicating the drawing), the virtual object 5 is displayed by the convergence control means 12 and 13 at a “position to the left of the real object 4” with a convergence angle θ substantially equal to the convergence angle θ0 of the real object 4. If θ = θ0, the two eyes 2 and “the center of the real object 4 and the center of the virtual object 5” lie on the same circumference; since the separation De of the two eyes 2 is small compared with the real object distance D, wherever in the “external field of view” the virtual object displayed with convergence angle θ (≈ θ0) appears, its virtual object distance d is substantially equal to the real object distance D and it can be fused with the real object.
[0022]
The convergence angle of the virtual object 5 is determined by the separation of the left-eye and right-eye images 30 displayed individually on the display panels, and the display position of the virtual object 5 depends on the display position of each image 30 on the display device 1. By specifying “the image display position for one eye (determined by the program that governs the display on the display means) and the display separation of the images for the two eyes (determined by the convergence control means)”, the virtual object can be displayed at a desired display position at a distance substantially equal to the real object distance.
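Read as code, this paragraph amounts to drawing the two images at positions offset by half the separation X′ on either side of a chosen anchor point. The sketch below is a simple interpretation, not the patent's method; panel coordinates are assumed to share the units of X′, and the sign convention (left-eye image shifted right, right-eye image shifted left, i.e. crossed disparity) is an assumption that depends on the optics:

```python
def image_positions(anchor_x: float, anchor_y: float, x_prime: float):
    """Return (left-eye, right-eye) image positions for a virtual object
    anchored at (anchor_x, anchor_y) with horizontal separation X'."""
    left_eye_pos = (anchor_x + x_prime / 2.0, anchor_y)   # assumed crossed disparity
    right_eye_pos = (anchor_x - x_prime / 2.0, anchor_y)
    return left_eye_pos, right_eye_pos
```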
  FIG. 3A shows a case where the virtual object 5 is displayed on the “upper left corner in the visual field 30A”, and FIG. 3B shows a case where the virtual object 5 is displayed on the “center portion of the visual field 30A”.
In the embodiment shown in FIG. 1, by adjusting the geometric positional relationship between the head mount 10 and the distance sensor 20, the pulses are emitted toward, and received from, the center of the external field of view of the observer 8. This presupposes that the observer 8 “observes so that the real object 4 to be observed is located at the center of the external field of view”. Since the real object 4 is therefore always at the center of the external field of view, that is, at a fixed position in the field of view, in the case of FIG. 2 described above (the virtual object 5 displayed to the left of the real object 4), in the case of FIG. 3(a) (the virtual object 5 displayed in the upper-left corner of the field of view 30A) and in the case of FIG. 3(b) (the virtual object 5 displayed so as to overlap the real object 4 in the center of the field of view 30A), the virtual object whose convergence is controlled by the convergence control means is “displayed at a predetermined position within the field of view, at a distance substantially equal to that of the real object” (claim 2).
From another point of view, the display in the case of FIG. 2 is also a display method that “displays the virtual object 5 close to the real object 4, at substantially the same distance as the real object 4 and at a position (to its left) different from the convergence point coordinates (the position of the real object 4)” (claim 3), and the display of FIG. 3(b) is a display method that “displays the virtual object 5 at the position of the convergence point coordinates” (claim 4).
Of course, not only in the case of FIG. 2 but also in the cases of FIGS. 3(a) and 3(b), the convergence angle of the virtual object 5 is substantially equal to the convergence angle θ0 of the real object 4.
[0023]
FIG. 4 shows a reference form of the head-mounted image display device.
This reference form differs from the embodiment of FIG. 1 in the following respects. First, as shown in FIG. 4(a), the casing 9′ of the head mount 10′ contains, in addition to the display devices 1, the pair of lenses 7 and the half mirror 3, a half mirror 3′, a light source 41 and a photographing camera 40. Second, the output of the photographing camera 40 is input to the line-of-sight calculation devices (right) 43 and (left) 42, and the outputs of these line-of-sight calculation devices 42 and 43 are input to the virtual object convergence control device 46 via the line-of-sight convergence point calculation device 44 and the real object distance calculation device 45; the virtual object image 30 is then generated by the video signal generation devices (left) 14′ and (right) 15′.
  Both the light source 41 and the imaging camera 40 are independently provided for both eyes of the observer 8. The half mirror 3 ′ is provided in parallel with the half mirror 3. Accordingly, the observer 8 can observe the real object 4 in the external field through the two half mirrors 3 and 3 '.
As the light source 41, for example, an LED that emits light in the infrared region is used so as not to hinder the observation by the observer 8. Infrared light from the light source 41 for the left (right) eye illuminates the left (right) eye of the observer 8 via the half mirrors 3′ and 3. Reflected light from the left (right) eye passes through the half mirror 3, is reflected by the half mirror 3′ and enters the camera 40 for the left (right) eye, and each camera 40 thus photographs the left (right) eye. The photographing camera 40 is, for example, a “CCD camera”. The data of the left (right) eye image from the photographing camera 40 is input to the line-of-sight calculation devices 42 and 43. The line-of-sight calculation device (left) 42 computes the line of sight of the left eye as “left-eye straight-line data” in a coordinate system fixed to the head mount 10′, based on the data of the left-eye image. Similarly, the line-of-sight calculation device (right) 43 computes the line of sight of the right eye as “right-eye straight-line data” in the same coordinate system, based on the data of the right-eye image.
[0024]
The left- and right-eye straight-line data obtained in this way are sent to the line-of-sight convergence point calculation device 44, which computes the “convergence point coordinates” as the intersection of the input pair of straight lines. When the left and right lines of sight have no actual intersection (that is, when the simultaneous equations of the straight lines represented by the left- and right-eye straight-line data have no solution), the midpoint of the two straight lines is determined as the convergence point coordinates by the method described later.
When the convergence point coordinates have been determined in this way, the real object distance calculation device 45 computes the real object distance D based on the convergence point coordinates. Once the real object distance D is determined, the virtual object convergence control device 46 determines the convergence angle of the virtual object 5 so that the virtual object 5 is displayed at substantially the same distance as the real object 4, and gives the result to the video signal generation devices (left) 14′ and (right) 15′. These video signal generation devices 14′ and 15′ display the right-eye image and the left-eye image of the image 30 corresponding to the virtual object 5 independently on the display devices 1 according to the determined convergence angle. The virtual object 5 is thus displayed at substantially the real object distance D, and the observer can fuse the real object 4 and the virtual object 5 and observe them simultaneously.
Since the series of steps from determining the convergence point coordinates, through calculating the convergence angle, to displaying the virtual object is performed in “real time”, the virtual object 5 is always displayed at the same distance as the real object 4. The display position of the virtual object 5 is determined by the software that controls the convergence control means.
[0025]
As is clear from the above description, the photographing cameras 40, the light sources 41, the half mirrors 3′, the line-of-sight calculation device (left) 42, the line-of-sight calculation device (right) 43, the line-of-sight convergence point calculation device 44 and the real object distance calculation device 45 constitute the “distance detection means”, and the virtual object convergence control device 46 constitutes the “convergence control means”.
The photographing cameras 40, the light sources 41, the half mirrors 3′, the line-of-sight calculation device (left) 42 and the line-of-sight calculation device (right) 43 constitute the “line-of-sight detection means”, the line-of-sight convergence point calculation device 44 constitutes the “line-of-sight convergence point coordinate calculation means”, and the real object distance calculation device 45 constitutes the “distance calculation means”.
That is, the reference form of FIG. 4 is a see-through head-mounted image display device that allows the real object 4 in the external field of view observed by the observer 8 and the virtual object 5 displayed by the display means 10′, 14′ and 15′, which display the desired image 30 independently to the left and right eyes, to be observed simultaneously. It has distance detection means for detecting in real time the distance D of the real object 4 observed by the observer 8 from the observer, and convergence control means 46 for controlling, in real time, the convergence of the display means 10′ for the desired virtual object 5 according to the distance D detected by the distance detection means. The “distance detection means” has the line-of-sight detection means 40, 41, 3′, 42, 43 for detecting the lines of sight of both of the observer 8's eyes, the line-of-sight convergence point coordinate calculation means 44 for computing the convergence point coordinates of the lines of sight detected by the line-of-sight detection means, and the distance calculation means 45 for computing the distance between the convergence point coordinates calculated by the line-of-sight convergence point coordinate calculation means 44 and the observer 8 as the distance between the observer and the real object 4 being watched.
[0026]
In the embodiment of FIG. 1 described above, the distance detection means “performs distance detection with whatever lies at the center of the observer's field of view as the real object”, so the position the real object should occupy in the field of view was limited to the “center of the field of view”, and the observer always had to place the real object at the center of the field of view.
In the reference form of FIG. 4, by contrast, the position of the real object can be determined as the “convergence point coordinates of the lines of sight of the observer's eyes”, so an observation object in the outside world can be designated as the real object simply by “gazing” at it, without positioning it at the center of the field of view.
For example, as shown in FIG. 5, when there are two observation objects 4 and 4′ (both “flowers”) in the external field of view (the square outer frame) 30A of the display means, in the embodiment of FIG. 1 whichever of the observation objects 4 and 4′ is positioned at the center of the field of view 30A becomes the real object, whereas in the reference form of FIG. 4 whichever of the observation objects 4 and 4′ the observer gazes at becomes the real object. When the observation objects 4 and 4′ shown in FIG. 5 are at different distances from the observer (both eyes 2), as shown in FIG. 6, let the convergence angle when the observer “gazes at the observation object 4 as the real object” be θ0; if the virtual object 5 (assumed, as in FIG. 5, to be an “additional information window” reading “This is a flower”) is displayed at this time, its convergence angle θ is made substantially equal to θ0.
Similarly, if the convergence angle when the observation object 4′ is observed as the real object is θ0′, the convergence angle θ′ when the virtual object 5′ is displayed at that time is made substantially equal to θ0′.
FIG. 5 shows the display state when the display position of the virtual object serving as the additional information window is set (by the software that controls the convergence control means) to the “upper-left corner of the field of view”, a predetermined position in the field of view. In FIG. 5(a) the observer is gazing at the gaze portion 33 of the real object 4, and the virtual object 5 relating to the real object 4 is displayed at the upper-left corner of the field of view 30A at a virtual object distance substantially equal to the real object distance of the real object 4. In (b) the observer is gazing at the gaze portion 34 of the real object 4′, and the virtual object 5′ relating to the real object 4′ is displayed at the upper-left corner of the field of view 30A at a virtual object distance substantially equal to the real object distance of the real object 4′. In FIGS. 3 and 5, the position indicated by reference numeral 31 is the “virtual object display reference point” in the field of view 30A; in the example of FIG. 5 it is a “fixed point in the field of view”.
[0027]
As another method of displaying the virtual object, it may be displayed at the “position immediately to the left” of the corresponding real object, as shown in FIGS. 7(a) and 7(b). In FIG. 7(a) the observer is gazing at the gaze portion 33 of the real object 4, and the corresponding virtual object 5 is displayed in the “portion to the left of the real object 4”. Similarly, in (b) the observer is gazing at the gaze portion 34 of the real object 4′, and the corresponding virtual object 5 is displayed in the “portion to the left of the real object 4′”. In this case the reference position 31 of the virtual object display is determined relative to the gaze position on the real object. That is, the virtual object display method of FIG. 7 is a method of “displaying the virtual object close to the real object, at substantially the same distance as the real object and at a position different from the convergence point coordinates”. Since the centers of the gaze portions 33 and 34 in FIG. 7 are also the “convergence point coordinates during the observer's observation of the real object”, the virtual object may instead be displayed at the position of the convergence point coordinates.
[0028]
In the above description, the display position of the virtual object is determined by software; this software is, of course, the software that controls the “convergence control means”.
[0029]
Here, the calculation for obtaining the “convergence point coordinates” of the lines of sight of the observer's two eyes will be described.
Since the calculation involves vectors, the vector notation is defined as follows: a letter enclosed in square brackets, for example [A], denotes the vector A. In the drawings, ordinary vector notation is used.
In FIG. 8, the line-of-sight vector from the left eye EL is [VL] and the line-of-sight vector from the right eye ER is [VR]; both are “unit vectors”. In FIG. 8, l and m denote the straight lines containing the line-of-sight vectors [VL] and [VR], respectively. Taking an appropriate reference position (for example, the midpoint of the two eyes EL and ER) as origin, let [EL] and [ER] be the position vectors of the left eye EL and the right eye ER, and let [p] and [q] be position vectors of points on the straight lines l and m, respectively. Using parameters t and s, these can be expressed as
[p] = [EL] + t[VL]   (1)
[q] = [ER] + s[VR]   (2)
[0030]
The condition that the vector [q] − [p] connecting the tips of the vectors [p] and [q] is orthogonal to both straight lines l and m is that the scalar products of [q] − [p] with [VL] and with [VR] are zero, that is,
([q] − [p]) · [VL] = 0   (3)
([q] − [p]) · [VR] = 0   (4)
Substituting equations (1) and (2) into these gives
t|[VL]|² − s[VR]·[VL] = ([ER] − [EL])·[VL]   (5)
t[VL]·[VR] − s|[VR]|² = ([ER] − [EL])·[VR]   (6)
Solving equations (5) and (6) as simultaneous equations in s and t, let the solutions be s0 and t0.
[0031]
Let (T) be the 2 × 1 “column matrix” with elements T11 = t0 and T21 = s0, let (S) be the 2 × 1 column matrix with elements S11 = ([ER] − [EL])·[VL] and S21 = ([ER] − [EL])·[VR], and let (U) be the 2 × 2 matrix with elements U11 = |[VL]|², U12 = −[VR]·[VL], U21 = [VL]·[VR], U22 = −|[VR]|², with inverse matrix (U)⁻¹. These satisfy the matrix equation
(T) = (U)⁻¹(S)   (7)
so t0 and s0 can be obtained by evaluating the right-hand side of equation (7). As shown in FIG. 8, let [CL] and [CR] be the position vectors of the points CL and CR on the straight lines l and m that satisfy equations (5) and (6). Then
[CL] = [EL] + t0[VL]   (8)
[CR] = [ER] + s0[VR]   (9)
and the position vector [C] of the midpoint C of CL and CR is
[C] = ([CL] + [CR]) / 2
    = ([EL] + t0[VL] + [ER] + s0[VR]) / 2   (10)
In this way the coordinates of the midpoint C can be computed as the convergence point coordinates, and from them the distance between the observer's position (the reference position of the calculation) and the convergence point coordinates can be obtained.
In the reference form of FIG. 4, the above calculation is performed by the line-of-sight convergence point calculation device 44, and the computation of the “real object distance” based on the calculated convergence point coordinates is performed by the real object distance calculation device 45.
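The calculation in equations (1)-(10) can be written compactly as a small vector routine. The sketch below follows the same steps (solve equations (5)-(6) for t0 and s0, then take the midpoint of CL and CR); it is only an illustration of the mathematics, using NumPy, and not the device's implementation:

```python
import numpy as np

def convergence_point(EL, ER, VL, VR):
    """Midpoint of the closest points of the two gaze lines
    p = EL + t*VL and q = ER + s*VR (VL, VR roughly unit vectors)."""
    EL, ER = np.asarray(EL, float), np.asarray(ER, float)
    VL, VR = np.asarray(VL, float), np.asarray(VR, float)
    U = np.array([[VL @ VL, -(VR @ VL)],
                  [VL @ VR, -(VR @ VR)]])   # coefficient matrix of (5)-(6)
    S = np.array([(ER - EL) @ VL,
                  (ER - EL) @ VR])          # right-hand sides of (5)-(6)
    t0, s0 = np.linalg.solve(U, S)          # equation (7)
    CL = EL + t0 * VL                       # equation (8)
    CR = ER + s0 * VR                       # equation (9)
    return (CL + CR) / 2.0                  # equation (10)

# Example: eyes 63 mm apart, both gazing at a point roughly 1 m straight ahead.
C = convergence_point(EL=[-31.5, 0.0, 0.0], ER=[31.5, 0.0, 0.0],
                      VL=[0.0315, 0.0, 0.9995], VR=[-0.0315, 0.0, 0.9995])
```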
[0032]
As described above, even when the observer is gazing at a fixed point in the external field of view, the eyeball may make involuntary fine movements regardless of the gaze point. When such fine eyeball movements exist, the gaze point (real object position) calculated as the convergence point coordinates from the lines of sight of both eyes by the method described above will fluctuate. In particular, when the observed real object is relatively close to the observer, the fluctuation of the convergence point coordinates caused by the fine eyeball movements is large.
[0033]
In such a case, if the virtual object display method is the “method of displaying the virtual object at a position close to the real object” shown in FIG. 7, the display position of the virtual object will also fluctuate as the convergence point coordinates fluctuate. In such a case, a display method that always displays the virtual object at a fixed position in the field of view, as described with reference to FIG. 6, is therefore preferable.
The remaining problem is how to determine the real object distance from the fluctuating convergence point coordinates. It is reasonable to compute, as the real object distance, the distance between the observer and the corresponding coordinate, on the coordinate axis corresponding to the front of the head in the coordinate system fixed to the observer's head, of the convergence point coordinates computed by the line-of-sight convergence point coordinate calculation means (the line-of-sight convergence point calculation device 44 in FIG. 4).
Of course, this distance calculation is performed by the distance calculation means (the real object distance calculation device 45 in FIG. 4).
FIGS. 9(a) and 9(b) show how the above-mentioned “corresponding coordinate, on the coordinate axis corresponding to the front of the head in the coordinate system fixed to the observer's head, of the convergence point coordinates” is defined.
[0034]
Referring to FIG. 9A, the XYZ coordinates represent a “coordinate system fixed to the observer's head”. The X axis is a direction connecting both eyes, and the left eye EL and the right eye ER are symmetrical with respect to the origin with respect to the origin: 0. The Y axis is a direction in which the upper side is positive for the observer, and the Z axis is a “coordinate axis corresponding to the front of the head”, which is the front for the observer.
In FIG. 9(a), the point C is the “convergence point coordinates determined on the basis of the lines of sight”. Even if, for example, the real object lies in the Z-axis direction and the observer is gazing at it, the convergence point coordinates may deviate from the Z axis as shown in FIG. 9(a). In this case the real object distance used as the reference for displaying the virtual object is determined as follows. The position vector of the convergence point C with respect to the origin O is orthogonally projected onto the Z axis; the projection Ci of the tip of that vector is taken as the “corresponding coordinate on the Z axis” of the convergence point C, and the distance between the origin O and the point Ci is taken as the real object distance D. The virtual object is then displayed at a fixed position in the field of view (for example, the upper-left corner of the field of view, as in FIG. 5) at a virtual object distance D.
[0035]
9B, the XYZ coordinates are “a coordinate system fixed to the observer's head”, the X axis is “the direction connecting both eyes”, and the Y axis is “the direction in which the upper direction for the observer is positive”. The Z axis is the “coordinate axis corresponding to the front of the head”.
In FIG. 9(b), the point C is likewise the “convergence point coordinates determined on the basis of the lines of sight”. In the case of FIG. 9(b), the real object distance is determined as follows. As the “corresponding coordinate on the Z axis” of the convergence point C, the position vector of C with respect to the origin O is rotated about the origin until it lies on the Z axis; the point Ci at which the tip of the rotated vector falls is taken as the corresponding coordinate, and the distance between the origin O and the point Ci is taken as the real object distance D. The virtual object is then displayed at a fixed position in the field of view (for example, the upper-left corner of the field of view, as in FIG. 5) at a virtual object distance D.
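Both constructions reduce to simple vector operations on the convergence point C expressed in the head-fixed XYZ coordinate system: the orthographic projection of FIG. 9(a) keeps only the Z component of C, while the rotation of FIG. 9(b) maps C onto the Z axis at a distance equal to the length of its position vector. A small sketch, assuming C is given as (x, y, z); it is not taken from the patent:

```python
import numpy as np

def real_object_distance_projection(C) -> float:
    """FIG. 9(a): orthographic projection of C onto the Z axis; the real
    object distance D is the Z coordinate of the projected point Ci."""
    return float(np.asarray(C, dtype=float)[2])

def real_object_distance_rotation(C) -> float:
    """FIG. 9(b): rotation of C about the origin onto the Z axis; the real
    object distance D is then the length of C's position vector."""
    return float(np.linalg.norm(np.asarray(C, dtype=float)))
```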
[0036]
FIG. 10 shows an embodiment of the head-mounted image display device according to claim 5. The portion indicated by (a) at the top of the figure shows a state in which the observer 8 is observing the “virtual object 5 and real object 4A” displayed on the head mount 10A of the head-mounted image display device.
[0037]
As shown in FIG. 10(b), the head mount 10A has a pair of display devices 1 and a pair of lenses 7, one display device 1 and one lens 7 being arranged for each of the observer's eyes. As shown in (a), the pair of display devices 1 are arranged in front of the observer 8's eyes 2 via the pair of lenses 7, and the desired image displayed on each display device (liquid crystal panel) can be observed.
The display means mounted on the head mount 10A is not “see-through type”.
FIG. 10(c) shows the two video cameras 21 and 22 that form part of the “imaging means” and the distance sensor 20 that forms part of the “imaging distance detection means”. The distance sensor 20 is the same as the one described with reference to the embodiment of FIG. 1; it transmits an ultrasonic pulse toward the imaging object 4 and receives the reflected pulse. By adjusting the geometric positional relationship between the video cameras 21, 22 and the distance sensor 20, the pulse is emitted toward, and received from, “the central portion of the imaging area common to the video cameras 21 and 22”. As shown in FIG. 10(d), the distance sensor 20 sends the time required from pulse transmission to reception of the reflected pulse to the distance calculation device 11 in real time as “delay time information”. Based on the input delay time information, the distance calculation device 11 calculates the distance between the video cameras 21, 22 and the imaging object 4, that is, the imaging distance D, in real time. The distance sensor 20 and the distance calculation device 11 thus constitute the “imaging distance detection means”.
The outputs of the video cameras 21 and 22, a camera image (left) 21′ and a camera image (right) 22′, are displayed on the left and right liquid crystal panels of the display devices 1 via the video signal synthesis devices (left) 17 and (right) 18, respectively. The left and right images displayed in this way are presented as “images of the imaging object 4”, and the observer 8 observes this image 4A as the “real object”. Since the imaging object 4 is located at the imaging distance D from the video cameras 21, 22, and the display reproduces this, the real object distance of the real object 4A is perceived by the observer 8 as being equal to the imaging distance D. That is, for the observer 8 the real object distance is D.
[0038]
Meanwhile, the distance calculation device 11 outputs the calculated imaging distance D to the convergence calculation device 12′ in real time. Based on the input imaging distance D, the convergence calculation device 12′ computes the convergence angle θ0 of the real object and a convergence angle for the virtual object 5 to be displayed that lies within θ0 ± 40 arcminutes, and sends the result to the virtual object image generation devices (left) 14 and (right) 15. The left-eye and right-eye images of the image 30 forming the virtual object 5 are then displayed on the respective liquid crystal panels of the display devices 1, via the video signal synthesis devices (left) 17 and (right) 18, so as to realize this convergence angle. At this time, the convergence angle θ shown in FIG. 10(b) is substantially equal to the convergence angle θ0, and the virtual object distance d is substantially equal to the real object distance D. Therefore, the observer 8 can “see the real object 4A and the virtual object 5 fused, simultaneously, in the field of view”.
[0039]
Accordingly, the convergence calculation device 12′ constitutes the “convergence control means”, and the virtual object image generation devices (left) 14 and (right) 15, the video signal synthesis devices (left) 17 and (right) 18 and the head mount 10A constitute the “display means”.
That is, the embodiment shown in FIG. 10 is a “head-mounted image display device that has imaging means 21 and 22 for imaging the imaging object 4 and display means 14, 15, 17 and 18 for displaying desired images independently to the left and right eyes, displays the image captured by the imaging means 21 and 22 on the display means as the real object 4A, and displays a desired virtual object 5 on the display means together with the real object 4A so that both objects can be observed simultaneously”; it has imaging distance detection means 20, 11 for detecting in real time the distance between the imaging means and the imaging object, and convergence control means 12′ for controlling, in real time, the convergence of the display means for the desired virtual object 5 according to the distance detected by the imaging distance detection means (claim 5).
The real object 4A for the observer is the image of the imaging object 4 captured by the imaging means and displayed on the display devices of the display means, and is therefore always located at the “center of the field of view”. Accordingly, as the display position of the virtual object 5, the convergence of the display means may be controlled, as in the cases shown in FIGS. 2 and 3 for the embodiment of FIG. 1, so that the virtual object is “displayed at a predetermined position in the field of view at a perceptual distance substantially equal to that of the real object” (claim 6). The virtual object may also be displayed close to the real object, at a perceptual distance substantially equal to that of the gazed real object and at a position different from the convergence point coordinates (the display position of the real object) (claim 7), or it may be “displayed at the position of the convergence point coordinates” (claim 8). The positional relationship between the real object and the virtual object in these cases is the same as shown in, for example, FIGS. 3(a), 2 and 3(b), respectively.
[0040]
FIG. 11 shows another reference form of the head-mounted image display device. The part shown in (a) depicts the state in which the observer 8 is observing the virtual object 5 and the real object 4A displayed on the head mount 10A′ of the head-mounted image display device.
[0041]
As shown in FIG. 5B, the head mount 10A′ has a pair of display devices 1 and a pair of lenses 7, one display device 1 and one lens 7 being arranged as a system for each eye of the observer. Moreover, as shown in (a), a half mirror 3′ is provided in front of both eyes 2 of the observer 8, and the observer 8 can observe the images displayed on the pair of display devices 1 through the half mirror 3′ and the pair of lenses 7. As shown in FIG. 4A, a light source 41 and a photographing camera 40 are provided above the half mirror 3′, as in the embodiment of FIG. 4.
The half mirror 3′, the light source 41, and the photographing camera 40 are the same as those described with reference to the embodiment of FIG. 4, and one system is provided for each of the left and right eyes of the observer 8. As in the embodiment of FIG. 10, the display means provided in the head mount 10A′ is not a "see-through type".
On the other hand, as shown in FIG. 11C, two video cameras 21 and 22 that form a part of the imaging means capture the imaging object 4.
The video cameras 21 and 22 are disposed in parallel to each other in the horizontal direction at a distance equal to the distance between both eyes of the observer.
The outputs of the video cameras 21 and 22, namely the camera image (left) 21′ and the camera image (right) 22′, are displayed on the left and right liquid crystal panels of the display device 1 via the video signal synthesis device (left) 17 and the same (right) 18, respectively. The left and right images displayed in this way are presented as an image of the imaging object 4, and the observer 8 observes this image 4A as a "real object".
[0042]
Since the video cameras 21 and 22 are "arranged in parallel with each other in the horizontal direction at a distance equal to the distance between both eyes of the observer", the real object 4A displayed on the display device 1 appears as if the observer 8 were viewing the imaging object 4 directly with both eyes from the positions of the video cameras 21 and 22. Since the imaging object 4 is located at the imaging distance: D from the video cameras 21 and 22, the real object distance of the real object 4A is perceived by the observer 8 as being equal to the imaging distance: D. That is, for the observer 8 the real object distance is "D".
[0043]
Data of the left-eye and right-eye images from the photographing camera 40 shown in FIG. 11A are input to the line-of-sight calculation devices 42 and 43, respectively. The line-of-sight calculation device (left) 42 calculates the line of sight of the left eye as "left-eye straight line data" in a coordinate system fixed to the head mount 10A′, based on the data of the left-eye image. Similarly, the line-of-sight calculation device (right) 43 calculates the line of sight of the right eye as "right-eye straight line data" in the same coordinate system, based on the data of the right-eye image. The left/right eye straight line data obtained in this way are sent to the line-of-sight convergence point calculation device 44, which obtains the "convergence point coordinates" as the intersection of the input pair of straight lines, using the calculation described above with reference to FIG. 8. Once the convergence point coordinates are determined in this way, the real object distance calculation device 45 calculates the real object distance: D based on the convergence point coordinates.
When the real object distance: D is determined, the virtual object convergence control device 46 determines the convergence angle of the virtual object so that the virtual object is displayed at substantially the same distance as the real object. That is, the convergence angle of the real object: θ0 is obtained according to the real object distance: D, and the convergence angle of the virtual object 5 to be displayed is determined so as to lie within θ0 ± 40 minutes of arc. The result is input to the virtual object image generation device (left) 14 and the same (right) 15, and the outputs of these devices 14 and 15 are input to the video signal synthesis device (left) 17 and the same (right) 18, respectively. The real object image and the virtual object image synthesized for each eye by these video signal synthesis devices are displayed on the display devices 1 independently as the right-eye image and the left-eye image, the virtual object image being displayed according to the determined convergence angle. Thereby, the virtual object 5 is displayed at a distance substantially equal to the real object distance: D, and the observer can fuse the real object 4A and the virtual object 5 and observe them simultaneously. Since the series of procedures from the determination of the convergence point coordinates through the calculation of the convergence angle to the display of the virtual object is performed in "real time", the virtual object is always displayed at the same distance as the real object. The display position of the virtual object 5 is determined by the software that controls the convergence control means.
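As a sketch only of this real-time sequence, the chain from the two detected gaze lines to the depth at which the virtual object is drawn might look as follows; closest_point_midpoint is the construction sketched under paragraph [0046] below, and all names are illustrative stand-ins for the devices 44 to 46 rather than an implementation from the patent:

    import math

    def virtual_object_depth(gaze_line_left, gaze_line_right, ipd_m=0.065):
        # gaze_line_* are (origin, direction) lines in head-fixed coordinates
        # (x right, y up, z forward); non-parallel gaze lines are assumed.
        c = closest_point_midpoint(gaze_line_left, gaze_line_right)   # device 44
        real_d = max(c[2], 1e-3)                    # device 45: distance along the forward axis
        theta0 = 2.0 * math.atan2(ipd_m / 2.0, real_d)   # convergence angle of the real object
        theta = theta0                              # device 46: kept within theta0 +/- 40 arcmin
        return (ipd_m / 2.0) / math.tan(theta / 2.0)     # depth at which to draw the virtual object

Calling such a routine once per video frame keeps the virtual object at substantially the same perceived distance as the real object, as described above.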
[0044]
As described above, the photographing camera 40, the light source 41, the half mirror 3′, the line-of-sight calculation device (left) 42, the line-of-sight calculation device (right) 43, the line-of-sight convergence point calculation device 44, and the real object distance calculation device 45 constitute the "distance detection means", and the virtual object convergence control device 46 constitutes the "convergence control means".
The photographing camera 40, the light source 41, the half mirror 3′, the line-of-sight calculation device (left) 42 and the line-of-sight calculation device (right) 43 constitute the "line-of-sight detection means", the line-of-sight convergence point calculation device 44 constitutes the "line-of-sight convergence point coordinate calculation means", and the real object distance calculation device 45 constitutes the "perceptual distance calculation means".
The virtual object image generation device (left) 14, the virtual object image generation device (right) 15, the video signal synthesis device (left) 17, the video signal synthesis device (right) 18, and the head mount 10A′ constitute the "display means".
That is, the embodiment shown in FIG. 11 is a "head-mounted image display device that has imaging means for imaging the imaging object 4 and display means 10A′, 14, 15, 17 and 18 for displaying desired images independently on the left and right eyes, displays the image captured by the imaging means on the display means as the real object 4A, and displays a desired virtual object 5 on the display means together with the real object 4A so that both objects can be observed simultaneously"; it has imaging distance detection means for detecting the distance between the imaging means and the imaging object in real time, and convergence control means 46 for controlling the convergence of the display means in real time with respect to the desired virtual object 5 in accordance with the distance detected by the imaging distance detection means.
The "imaging means" includes a pair of video cameras 21 and 22 arranged in parallel to each other at a distance equal to the distance between both eyes of the observer 8. The "imaging distance detection means" has line-of-sight detection means 3′, 40, 41, 42 and 43 for detecting the lines of sight of both eyes of the observer 8, line-of-sight convergence point coordinate calculation means 44 for calculating the convergence point coordinates of the lines of sight detected by the line-of-sight detection means, and perceptual distance calculation means 45 for calculating the distance between the convergence point coordinates calculated by the line-of-sight convergence point coordinate calculation means and the observer as the perceptual distance between the real object 4A at which the observer 8 is gazing and the observer.
[0045]
In this reference form, since the real object distance: D is obtained by calculation from the convergence point coordinates of the detected lines of sight, the display position of the virtual object can be handled in the same way as in the earlier embodiments: the convergence control means may "control the convergence of the display means so that the virtual object is displayed at a predetermined position in the field of view (for example, the upper left corner of the field of view) at a perceptual distance substantially equal to that of the gaze portion of the real object"; the virtual object may be displayed in the vicinity of the real object, at a perceptual distance substantially equal to that of the gaze position of the real object and at a position different from the convergence point coordinates (the display position of the real object); or the virtual object may be "displayed at the position of the convergence point coordinates".
[0046]
As described above, the convergence point coordinates can be calculated by "taking as the convergence point coordinates the midpoint of the shortest segment between the spatial straight line obtained by extending the observer's right-eye gaze vector and the spatial straight line obtained by extending the left-eye gaze vector, at the points where the two straight lines come closest to each other".
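A minimal sketch of that construction, using the standard closest-approach algebra for two lines each given as an (origin, direction) pair of 3-tuples in head-fixed coordinates (the patent states only the result, so the particular algebra and the function name are assumptions):

    def closest_point_midpoint(line1, line2, eps=1e-9):
        # Midpoint of the shortest segment between two (possibly skew) gaze lines;
        # returns None for (near-)parallel lines, which have no convergence point.
        (p1, d1), (p2, d2) = line1, line2

        def dot(u, v):
            return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

        w = (p1[0] - p2[0], p1[1] - p2[1], p1[2] - p2[2])
        a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
        d, e = dot(d1, w), dot(d2, w)
        denom = a * c - b * b
        if abs(denom) < eps:
            return None
        t = (b * e - c * d) / denom          # parameter along line1
        s = (a * e - b * d) / denom          # parameter along line2
        q1 = [p1[i] + t * d1[i] for i in range(3)]
        q2 = [p2[i] + s * d2[i] for i in range(3)]
        return tuple((q1[i] + q2[i]) / 2.0 for i in range(3))

For the head-fixed convention assumed in the sketches above (z pointing forward from the midpoint between the eyes), the real object distance: D of the following paragraph is then simply the z component of the returned point, or its Euclidean distance from the origin.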
[0047]
Further, when the display position of the virtual object is set at a predetermined position in the field of view, the real object distance can be determined, as described with reference to FIG. 9, from the convergence point coordinates calculated by the line-of-sight convergence point coordinate calculation device 44: in the coordinate system (X, Y, Z) fixed to the head of the observer 8, with the Z axis corresponding to the front of the head, the Z coordinate of the convergence point Ci gives the distance between Ci and the observer 8, that is, D.
[0048]
[Effect of the invention]
As described above, according to the present invention, a novel head-mounted image display device can be realized. The head-mounted image display device of the present invention can display a virtual object at a position where it can reliably be fused with the real object observed within the visual field of the display means.
The head-mounted image display device according to claims 1 to 4 is a see-through type, and can display a virtual object at a position where it can always be fused with a real object observed through the display means.
The head-mounted image display device according to claims 5 to 8 uses the image of the imaging object captured by the imaging means as the real object, and can display a virtual object at a position where it can always be fused with that real object.
[Brief description of the drawings]
FIG. 1 is a view for explaining one embodiment of the invention according to claim 1;
FIG. 2 is a diagram for explaining an example of a display position of a virtual object in the embodiment.
FIG. 3 is a diagram for explaining two examples of display positions of virtual objects in the embodiment.
FIG. 4 is a diagram for explaining a reference form.
FIG. 5 is a diagram for explaining the display position of the virtual object in the reference form of FIG. 4.
FIG. 6 is a diagram for explaining the display of the virtual object in FIG. 5.
FIG. 7 is a diagram for explaining another example of the display position of the virtual object in the same reference form.
FIG. 8 is a diagram for explaining the calculation of the convergence point coordinates.
FIG. 9 is a diagram for explaining the calculation of the real object distance.
FIG. 10 is a diagram for explaining one embodiment of the invention according to claim 5.
FIG. 11 is a diagram for explaining another embodiment.
FIG. 12 is a diagram for explaining a head-mounted image display device and its problems.
[Explanation of symbols]
  1 Display device
  3 Half mirror
  7 Lens
  4 Real objects
  5 Virtual objects
  10 Head mount of display means
  20 Distance sensor

Claims (8)

  1. A see-through head-mounted image display device that enables simultaneous observation of a real object in the external field of view being observed by an observer and a virtual object displayed by display means for displaying desired images independently on both the left and right eyes, the device comprising:
    Distance detection means for detecting in real time the distance of the real object being observed by the observer from the observer;
    Convergence control means for calculating the convergence angle: θ0 for the real object according to the real object distance: D detected by the distance detection means, and for controlling said display means in real time so that the convergence angle: θ for the virtual object displayed by the display means is within θ0 ± 40 minutes of arc, and
    The head-mounted image display device, wherein the distance detection means includes a distance sensor.
  2. The head-mounted image display device according to claim 1,
    A head-mounted image display device, wherein the convergence control means controls the display means so that the virtual object is displayed at a predetermined position in the field of view at a distance substantially equal to that of the real object.
  3. The head-mounted image display device according to claim 1,
      A head-mounted image display device, wherein the convergence control means controls the display means so that the virtual object is displayed in proximity to the real object, at substantially the same distance as the real object and at a position different from the position where it overlaps the real object.
  4. The head-mounted image display device according to claim 1,
      A head-mounted image display device, wherein the convergence control means controls the display means so as to display the virtual object at a position overlapping the real object.
  5. A head-mounted image display device that has imaging means for imaging an imaging object and display means for displaying desired images independently on both the left and right eyes, displays the image captured by the imaging means on the display means as a real object, and displays a virtual object on the display means together with the real object so that both objects can be observed simultaneously, the device comprising:
      Imaging distance detection means for detecting in real time the distance between the imaging means and the imaging object;
      Convergence control means for calculating the convergence angle: θ0 with respect to the real object according to the real object distance: D detected by the imaging distance detection means, and for controlling the display means in real time so that the convergence angle: θ with respect to the virtual object displayed by the display means is within θ0 ± 40 minutes of arc,
      A head-mounted image display device characterized in that the imaging distance detecting means includes a distance sensor.
  6. The head-mounted image display device according to claim 5,
    A head-mounted image display device, wherein the convergence control means controls the display means so that the virtual object is displayed at a predetermined position in the field of view at a distance substantially equal to that of the real object.
  7. The head-mounted image display device according to claim 5,
      A head-mounted image display device, wherein the convergence control means controls the display means so that the virtual object is displayed in proximity to the real object, at substantially the same distance as the real object and at a position different from the position where it overlaps the real object.
  8. The head-mounted image display device according to claim 5,
      A head-mounted image display device, wherein the convergence control means controls the display means so as to display the virtual object at a position overlapping the real object.
JP00851698A 1998-01-20 1998-01-20 Head mounted image display device Expired - Fee Related JP3717653B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP00851698A JP3717653B2 (en) 1998-01-20 1998-01-20 Head mounted image display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP00851698A JP3717653B2 (en) 1998-01-20 1998-01-20 Head mounted image display device

Publications (2)

Publication Number Publication Date
JPH11202256A JPH11202256A (en) 1999-07-30
JP3717653B2 true JP3717653B2 (en) 2005-11-16

Family

ID=11695316

Family Applications (1)

Application Number Title Priority Date Filing Date
JP00851698A Expired - Fee Related JP3717653B2 (en) 1998-01-20 1998-01-20 Head mounted image display device

Country Status (1)

Country Link
JP (1) JP3717653B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014156033A1 (en) 2013-03-26 2014-10-02 Seiko Epson Corporation Head-mounted display device, control method of head-mounted display device, and display system

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4875843B2 (en) * 2004-10-20 2012-02-15 オリンパス株式会社 Information terminal equipment
JP2010139589A (en) * 2008-12-10 2010-06-24 Konica Minolta Opto Inc Video image display apparatus and head mounted display
JP5494153B2 (en) * 2010-04-08 2014-05-14 ソニー株式会社 Image display method for head mounted display
JP5651386B2 (en) * 2010-06-23 2015-01-14 ソフトバンクモバイル株式会社 Eyeglass type display device
JP5434848B2 (en) * 2010-08-18 2014-03-05 ソニー株式会社 Display device
US9497501B2 (en) 2011-12-06 2016-11-15 Microsoft Technology Licensing, Llc Augmented reality virtual monitor
US20160054795A1 (en) * 2013-05-29 2016-02-25 Mitsubishi Electric Corporation Information display device
JP6232763B2 (en) * 2013-06-12 2017-11-22 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
JP6369005B2 (en) * 2013-10-25 2018-08-08 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
JP5751315B2 (en) * 2013-11-20 2015-07-22 ソニー株式会社 Image display method for head mounted display
JP6252849B2 (en) 2014-02-07 2017-12-27 ソニー株式会社 Imaging apparatus and method
JP6394108B2 (en) * 2014-06-24 2018-09-26 セイコーエプソン株式会社 Head-mounted display device, control method therefor, and computer program
JP6394174B2 (en) * 2014-08-19 2018-09-26 セイコーエプソン株式会社 Head-mounted display device, image display system, method for controlling head-mounted display device, and computer program
US20170302904A1 (en) * 2014-09-30 2017-10-19 Mirama Service Inc. Input/output device, input/output program, and input/output method
JP6446465B2 (en) * 2014-09-30 2018-12-26 ミラマサービスインク I / O device, I / O program, and I / O method
JP5983703B2 (en) * 2014-10-29 2016-09-06 ソニー株式会社 Image display method for head mounted display
KR101932368B1 (en) 2015-01-13 2019-03-20 가부시키가이샤 리코 Head-mounted display apparatus, and display method
WO2016115874A1 (en) * 2015-01-21 2016-07-28 成都理想境界科技有限公司 Binocular ar head-mounted device capable of automatically adjusting depth of field and depth of field adjusting method
CN105866949B (en) * 2015-01-21 2018-08-17 成都理想境界科技有限公司 The binocular AR helmets and depth of field adjusting method of the depth of field can be automatically adjusted
CN105866948A (en) * 2015-01-21 2016-08-17 成都理想境界科技有限公司 Method of adjusting virtual image projection distance and angle on binocular head-mounted device
US10162412B2 (en) * 2015-03-27 2018-12-25 Seiko Epson Corporation Display, control method of display, and program
JP2016186561A (en) * 2015-03-27 2016-10-27 セイコーエプソン株式会社 Display device, control method for display device, and program
JP2018121132A (en) * 2017-01-23 2018-08-02 ティフォン インコーポレーテッドTyffon Inc. Image providing system, image providing method, and image providing program
JP2018169428A (en) 2017-03-29 2018-11-01 セイコーエプソン株式会社 Image display device

Also Published As

Publication number Publication date
JPH11202256A (en) 1999-07-30

Similar Documents

Publication Publication Date Title
Wann et al. Natural problems for stereoscopic depth perception in virtual environments
TWI534476B (en) Head-mounted display
US5867308A (en) Microscope, in particular for surgical operations
US7428001B2 (en) Materials and methods for simulating focal shifts in viewers using large depth of focus displays
CN102665589B (en) For the patient-side surgeon interface of remote-operated micro-wound surgical operation apparatus
US5742331A (en) Three-dimensional image display apparatus
JP4245750B2 (en) Stereoscopic observation device
JP5887026B2 (en) Head mounted system and method for computing and rendering a stream of digital images using the head mounted system
DE69923480T2 (en) Personal display device with viewing
US4884219A (en) Method and apparatus for the perception of computer-generated imagery
KR101541803B1 (en) Image Recognition Apparatus, Operation Determining Method, and Program
US9191658B2 (en) Head-mounted display and position gap adjustment method
US8159526B2 (en) Stereoscopic image display system
US20110234584A1 (en) Head-mounted display device
US9138135B2 (en) System, a method and a computer program for inspection of a three-dimensional environment by a user
US5644324A (en) Apparatus and method for presenting successive images
US8657444B2 (en) Visual function testing device
US5861936A (en) Regulating focus in accordance with relationship of features of a person&#39;s eyes
US20040263613A1 (en) Stereo-observation system
US20030112508A1 (en) Method and system for controlling space magnification for stereoscopic images
Rolland et al. Towards quantifying depth and size perception in virtual environments
US4559555A (en) Stereoscopic remote viewing system
EP0641132B1 (en) Stereoscopic image pickup apparatus
JP3089306B2 (en) Stereoscopic imaging and display device
Holloway Registration error analysis for augmented reality

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20041222

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20050426

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20050627

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20050830

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20050831

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20080909

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090909

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100909

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110909

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120909

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130909

Year of fee payment: 8

LAPS Cancellation because of no payment of annual fees