WO2018147084A1 - Line-of-sight information sharing method and line-of-sight information sharing system - Google Patents


Publication number
WO2018147084A1
Authority
WO
WIPO (PCT)
Prior art keywords: line, sight, terminal, user, information
Application number
PCT/JP2018/002142
Other languages: French (fr), Japanese (ja)
Inventor
笠井 一郎 (Ichiro Kasai)
平岡 三郎 (Saburo Hiraoka)
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Publication of WO2018147084A1


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02 Viewing or reading apparatus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units

Definitions

  • The present invention relates to a line-of-sight information sharing method and a line-of-sight information sharing system for sharing line-of-sight information of a user between a plurality of information input/output terminals connected via a communication line.
  • Conventionally, a head-mounted see-through information display device called a see-through HMD (Head-Mounted Display) has been known.
  • This see-through HMD is a device that can provide desired information to the wearer by adding that information as video while the wearer observes the outside world, and it is roughly classified into a video see-through type and an optical see-through type.
  • In the video see-through type, a camera that captures the wearer's forward view is added to a shielded (closed) display device; a desired image is superimposed on the image of the wearer's forward view, and the combined image is presented for the wearer to view.
  • The optical see-through type uses a combiner that superimposes light from the outside world and image light and guides the light to the wearer's pupil, thereby allowing the wearer to view the outside world together with the video.
  • An example of supporting the wearer's work using an HMD is proposed in, for example, Patent Document 1.
  • In Patent Document 1, an operator and an instructor each wear a video see-through HMD; the outside world in front of the operator is imaged by a camera, and the image is transmitted to and shared with the instructor at a remote location. The instructor then adds additional information to the displayed video, so that instructions are conveyed to the operator and collaborative work is performed.
  • a hand-held type (handy type)
  • In the system of Patent Document 2, the target position indicated by an instruction word is obtained, and the line-of-sight position of the speaker (the person in the passenger seat) is presented to the driver.
  • Thereby, the driver can accurately know the intention of the speaker.
  • In addition, neither the driver nor the speaker needs any manual operation for communication, so hands-free operation is ensured.
  • Patent Document 1: JP 2006-209664 A (see claim 1, paragraphs [0007], [0019], [0102], FIGS. 1 and 2, etc.)
  • Patent Document 2: JP 2015-41197 A (see claim 1, paragraph [0010], FIG. 7, etc.)
  • However, for the following two reasons, it is difficult to apply the system of Patent Document 2 to the purpose of performing collaborative work using a plurality of information input/output terminals including an HMD.
  • The system of Patent Document 2 presents information on the line of sight of a speaker with respect to an object that both the nearby speaker and the driver can view directly. For this reason, the system of Patent Document 2 cannot be used in situations where both parties cannot view the object simultaneously and directly. For example, when one of the two people performing collaborative work (for example, the instructor) is in a remote place, the instructor cannot directly see the same object as the worker. Even if the instructor and the worker are adjacent, if the work space is small and the worker is close to the object, the object is hidden by the worker and cannot be viewed directly by the instructor. In these cases the system of Patent Document 2 cannot be used, so accurate communication based on the line of sight cannot be performed among users of a plurality of information input/output terminals.
  • The present invention has been made to solve the above problems, and its purpose is to provide a line-of-sight information sharing method and a line-of-sight information sharing system that enable accurate communication based on the line of sight even in situations where some user cannot view the object directly during collaborative work using a plurality of information input/output terminals including an HMD, and that allow such communication by a simple method without requiring complicated processing.
  • A line-of-sight information sharing method according to one aspect of the present invention is a method for sharing line-of-sight information of a user between a plurality of information input/output terminals connected via a communication line. Each of the plurality of information input/output terminals includes a display device that presents video to a user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for inputting and outputting information to and from the other terminals. At least one of the plurality of information input/output terminals is a head-mounted terminal that is mounted on a user's head and displays the video as a virtual image so as to be visible to the user; in addition to the display device, the head-mounted terminal includes an imaging device whose relative position is fixed with respect to the display device and which images the outside world in front of the user.
  • With the head-mounted terminal as a first terminal and another terminal as a second terminal, the line-of-sight information sharing method includes a first step of outputting the captured-image information acquired by the imaging device of the first terminal to the second terminal via the communication unit of the first terminal, and a second step in which the second terminal, based on the input captured-image information, displays the captured image so as to be visible to the user of the second terminal.
  • A line-of-sight information sharing system according to another aspect of the present invention is a system for sharing line-of-sight information of a user among a plurality of information input/output terminals connected via a communication line.
  • Each of the plurality of information input/output terminals includes a display device that presents video to the user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for inputting and outputting information to and from the other terminals.
  • At least one of the plurality of information input/output terminals is a head-mounted terminal that is mounted on a user's head and displays the video as a virtual image so as to be visible to the user, and further includes an imaging device whose relative position is fixed with respect to the display device and which images the outside world in front of the user.
  • With the head-mounted terminal as a first terminal and another terminal as a second terminal: the first terminal outputs the information of the captured image acquired by its imaging device to the second terminal via its communication unit; the display device of the second terminal, based on the input captured-image information, displays the captured image so as to be visible to the user of the second terminal; the line-of-sight detection device of the second terminal then detects the line-of-sight direction of the user of the second terminal with respect to the displayed captured image, and line-of-sight information on that direction is output to the first terminal via the communication unit of the second terminal; and the display device of the first terminal, based on the input line-of-sight information, displays the line-of-sight position of the user of the second terminal at a corresponding position within the video display area of the first terminal.
  • With the above configuration, in collaborative work using a plurality of information input/output terminals including the first terminal (an HMD), accurate communication based on the line of sight is possible even in situations where some user cannot view the object directly, and such communication can be performed by a simple method without requiring complicated processing such as conversion of position coordinates.
  • It is an explanatory diagram showing an image of the field of view of an instruction supervisor.
  • FIG. 10 is an explanatory diagram schematically showing a state in which an instruction supervisor observes images sent and displayed from two workers in a line-of-sight information sharing system according to still another embodiment of the present invention.
  • It is an explanatory diagram showing an image of one worker's field of view when markers indicating the line-of-sight positions of two workers and an instruction supervisor are displayed.
  • FIG. 1 is an explanatory diagram illustrating a schematic configuration of a line-of-sight information sharing system 100 according to the present embodiment.
  • The line-of-sight information sharing system 100 is configured by connecting a plurality of information input/output terminals 200 via a communication line 300 so that they can communicate with each other (so that information can be shared).
  • the communication line 300 is realized by a wireless communication environment such as Wi-Fi (registered trademark), but may be a wired communication line using a cable such as an optical fiber.
  • the plurality of information input / output terminals 200 can input / output (transmit / receive) information to / from each other.
  • Examples of the information to be input and output include information on captured images (image data) acquired by each information input/output terminal 200, and information on the line-of-sight direction and line-of-sight position of the user of each information input/output terminal 200 (hereinafter also simply referred to as line-of-sight information).
  • By inputting and outputting this line-of-sight information, the user's line-of-sight information is shared among the plurality of information input/output terminals 200, and communication based on the user's line of sight can be performed, as described later.
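As a concrete illustration, the two kinds of shared information described above could be represented as simple message structures. The field names and types below are hypothetical choices for the sketch, not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical sketch of the two kinds of information the terminals 200
# exchange over the communication line 300: captured-image information
# and line-of-sight ("gaze") information.

@dataclass
class CapturedImageInfo:
    sender_id: str   # terminal that captured the frame
    width: int       # frame size in pixels
    height: int
    pixels: bytes    # raw or encoded image data

@dataclass
class GazeInfo:
    sender_id: str   # terminal whose user's gaze was detected
    x: float         # gaze position, normalized to [0, 1] of the
    y: float         # displayed captured image

frame = CapturedImageInfo("HMD201A", 1280, 720, b"\x00" * 10)
gaze = GazeInfo("HMD201B", 0.62, 0.41)
```

Because the gaze position is expressed relative to the shared frame, the receiving terminal needs no further geometric information to interpret it.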
  • At least one of the plurality of information input/output terminals 200 is a head-mounted display (HMD) that is mounted on the user's head and displays video as a virtual image so as to be visible to the user. Details of the HMD will be described below.
  • FIG. 2 is a front view illustrating a schematic configuration of the HMD 201 which is an example of the information input / output terminal 200 of the present embodiment.
  • the HMD 201 includes an image display device 202 that presents an image to the user and a support member 203.
  • the support member 203 is a member that supports the video display device 202 in front of an observer (user of the HMD 201) (for example, in front of the right eye ER).
  • The support member 203 corresponds to the frame and temples of a pair of glasses, and includes a support unit 203a and a nose pad 203b that is attached to the support unit 203a and abuts the user's nose when the HMD is worn.
  • the HMD 201 may further include a right-eye lens and a left-eye lens, and these may be supported by the support member 203. Further, the HMD 201 may have two video display devices 202 and support each video display device 202 in front of the right and left eyes of the user by the support member 203. Furthermore, the HMD 201 may have a position adjustment mechanism that adjusts the position (for example, each position in the left-right direction and the up-down direction) and the attachment angle (tilt angle) of the video display device 202.
  • the video display device 202 is an optical see-through display that allows a user to directly observe the outside world together with video. In other words, the video display device 202 guides light from the outside world to the user's pupil, thereby allowing the user to observe the outside world and displaying the video as a virtual image in a part of the user's field of view. It is a system. Hereinafter, the optical configuration of the video display device 202 will be described.
  • FIG. 3 is a cross-sectional view showing an optical configuration of the image display device 202.
  • the video display device 202 includes an illumination optical system 2, a polarizing plate 3, a polarization beam splitter (PBS) 4, a display element 5, and an eyepiece optical system 6.
  • The upper ends of the illumination optical system 2, the polarizing plate 3, the PBS 4, the display element 5, and the eyepiece optical system 6 are located inside the housing 202a shown in FIG. 2.
  • each direction is defined as follows.
  • a direction perpendicular to the optical axis plane of the HOE (Holographic Optical Element) 23 of the eyepiece optical system 6 is defined as an X direction.
  • the optical axis plane of the HOE 23 refers to a plane including incident light and reflected light when a light beam that coincides with the optical axis enters the HOE 23.
  • the direction perpendicular to the X direction in the plane perpendicular to the surface normal at the intersection of each optical member with the optical axis is defined as the Y direction.
  • a direction perpendicular to the X direction and the Y direction is taken as a Z direction.
  • A cross section that includes the normal of the display element 5 and the normals of two parallel surfaces 21b and 21c (described later) of the eyepiece optical system 6, and that passes through the center of the display surface of the display element 5, is defined as the YZ cross section.
  • the illumination optical system 2 illuminates the display element 5 and has a light source 11, an illumination mirror 12, and a diffusion plate 13.
  • the light source 11 is composed of RGB integrated LEDs that emit light corresponding to each color of R (red), G (green), and B (blue). A plurality of light emission points (each light emission point of RGB) are arranged in a substantially straight line in the horizontal direction (X direction).
  • The wavelengths of the light emitted from the light source 11, expressed as the peak wavelength of light intensity ± the half-value wavelength width, are, for example, 462±12 nm (B light), 525±17 nm (G light), and 635±11 nm (R light).
  • the light source 11 may be a laser light source.
  • the illumination mirror 12 reflects light (illumination light) emitted from the light source 11 toward the diffuser plate 13 and bends the illumination light so that the optical pupil P and the light source 11 are substantially conjugate with respect to the Y direction. It is an optical element.
  • The diffusion plate 13 is a unidirectional diffusion plate that diffuses incident light (for example, by 40°) in the X direction, in which the plurality of light emitting points of the light source 11 are arranged, and does not diffuse incident light in the Y direction.
  • the diffusion plate 13 is held on the surface of the polarizing plate 3.
  • the polarizing plate 3 transmits light having a predetermined polarization direction out of light incident through the diffusion plate 13 and guides it to the PBS 4.
  • The PBS 4 is a flat-plate-shaped polarization separating element: it reflects the light transmitted through the polarizing plate 3 toward the reflective display element 5, while transmitting, out of the light reflected by the display element 5, the light corresponding to an ON image signal (light whose polarization direction is orthogonal to that of the light transmitted through the polarizing plate 3). The PBS 4 is attached to the light incident surface 21a of the eyepiece prism 21 (described later) of the eyepiece optical system 6.
  • the display element 5 is a display element that modulates the light from the illumination optical system 2 and displays an image.
  • the display element 5 is composed of a reflective liquid crystal display element.
  • The display element 5 may have a color filter, or may be driven in a time-division manner so that an RGB image corresponding to each emission color is displayed in synchronization with the time-division emission of the light source 11 for each of R, G, and B.
  • the display element 5 is arranged so that light incident from the PBS 4 substantially vertically is reflected almost vertically and directed toward the PBS 4. This facilitates optical design that increases the resolution as compared with a configuration in which light is incident on the reflective display element at a large incident angle.
  • the display surface of the display element 5 is rectangular, and is arranged so that the longitudinal direction of the display surface is the X direction and the short direction is the Y direction.
  • the eyepiece optical system 6 is an optical system for guiding the image light from the display element 5 to the user's pupil (optical pupil P), and has non-axisymmetric (non-rotationally symmetric) positive optical power.
  • the eyepiece optical system 6 includes an eyepiece prism 21, a deflection prism 22, and a HOE 23.
  • The eyepiece prism 21 guides the image light incident from the display element 5 via the PBS 4 by internal reflection, and also transmits light from the outside world (external light).
  • The eyepiece prism 21 is shaped like a roughly parallel plate that is thick at its upper end and becomes thinner toward its wedge-shaped lower end.
  • Of the eyepiece prism 21, the surface to which the PBS 4 is attached is the light incident surface 21a on which the image light from the display element 5 is incident, and the two mutually facing surfaces 21b and 21c, which are positioned substantially parallel to the optical pupil P, are total reflection surfaces that guide the image light by total reflection.
  • The surface 21b on the optical pupil P side also serves as the exit surface for the image light diffracted and reflected by the HOE 23.
  • the eyepiece prism 21 is joined to the deflection prism 22 with an adhesive so as to sandwich the HOE 23 arranged at the lower end thereof.
  • Of the surfaces of the eyepiece prism 21 that transmit image light, those other than the surface 21d in contact with the HOE 23 (the light incident surface 21a and the surface 21b) are flat.
  • the surface 21d with which the HOE 23 comes into contact may be a flat surface, a curved surface, or a surface combining a flat surface and a curved surface.
  • the deflection prism 22 is bonded to the eyepiece prism 21 via the HOE 23 to form a substantially parallel plate.
  • Refraction occurring when external light passes through the wedge-shaped lower end of the eyepiece prism 21 is canceled by the deflection prism 22, which prevents distortion in the image of the outside world observed through them.
  • the HOE 23 is a volume phase type reflective holographic optical element that is provided in contact with the eyepiece prism 21 and diffracts and reflects the image light guided inside the eyepiece prism 21.
  • The HOE 23 diffracts and reflects light in three wavelength ranges, for example 465±5 nm (B light), 521±5 nm (G light), and 634±5 nm (R light), expressed as the peak wavelength of diffraction efficiency ± the half width of the diffraction efficiency. That is, the RGB diffraction wavelengths of the HOE 23 substantially correspond to the wavelengths of the RGB image light (the emission wavelengths of the light source 11).
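The correspondence between the HOE diffraction bands and the light-source emission bands can be checked numerically from the example figures quoted above; this is only an arithmetic illustration of those values:

```python
# Check numerically that each HOE 23 diffraction band (peak ± half-width)
# falls within the corresponding emission band of the light source 11.
# All values in nm, taken from the example figures in the text.
source = {"B": (462, 12), "G": (525, 17), "R": (635, 11)}  # light source 11
hoe    = {"B": (465, 5),  "G": (521, 5),  "R": (634, 5)}   # HOE 23

def band(peak, half_width):
    return (peak - half_width, peak + half_width)

for color in "BGR":
    s_lo, s_hi = band(*source[color])
    h_lo, h_hi = band(*hoe[color])
    overlap = max(0, min(s_hi, h_hi) - max(s_lo, h_lo))
    print(color, (s_lo, s_hi), (h_lo, h_hi), "overlap:", overlap, "nm")
```

With these figures, each 10 nm diffraction band lies entirely within the corresponding emission band, which is what "substantially corresponds" implies.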
  • In the video display device 202, the light emitted from the light source 11 of the illumination optical system 2 is reflected by the illumination mirror 12 and diffused only in the X direction by the diffusion plate 13, after which only the light having a predetermined polarization direction is transmitted through the polarizing plate 3. The light transmitted through the polarizing plate 3 is reflected by the PBS 4 and enters the display element 5.
  • In the display element 5, the incident light is modulated in accordance with an image signal.
  • Since the image light corresponding to an ON image signal is converted by the display element 5 into light whose polarization direction is orthogonal to that of the incident light and then emitted, this light passes through the PBS 4 and enters the eyepiece prism 21 from the light incident surface 21a.
  • Since the image light corresponding to an OFF image signal is emitted by the display element 5 without a change in polarization direction, it is blocked by the PBS 4 and does not enter the eyepiece prism 21.
  • The incident image light is totally reflected once by each of the two opposing surfaces 21c and 21b of the eyepiece prism 21, then enters the HOE 23, where it is diffracted and reflected, exits from the surface 21b, and reaches the optical pupil P. Therefore, at the position of the optical pupil P, the user can observe the image displayed on the display element 5 as a virtual image.
  • the eyepiece prism 21, the deflecting prism 22, and the HOE 23 transmit almost all of the external light, so that the user can observe the outside world see-through. Therefore, the virtual image of the image displayed on the display element 5 is observed while overlapping a part of the outside world in the user's field of view (view).
  • In the present embodiment, a reflective display element is used as the display element 5, but a transmissive liquid crystal display element may be used instead, with the video display device 202 configured with a corresponding optical design.
  • the eyepiece optical system 6 may be configured integrally with the right-eye lens or the left-eye lens of the HMD 201, or may be configured separately.
  • FIG. 4 is a block diagram illustrating a detailed configuration of the HMD 201.
  • In addition to the video display device 202, the HMD 201 further includes an imaging device 31, a line-of-sight detection device 32, an image processing unit 33, a communication unit 34, a storage unit 35, a voice input device 36, a voice output device 37, and a control unit 38.
  • the control unit 38 is composed of, for example, a central processing unit (CPU) and controls the operation of each unit of the HMD 201 including the video display device 202. Therefore, the control of the control unit 38 includes, for example, switching on / off of the light source 11 of the video display device 202 and control regarding display of information on the display element 5.
  • the imaging device 31 is composed of, for example, a camera (digital camera, digital video camera) capable of shooting a moving image, and images the outside world ahead of the user's line of sight.
  • The imaging device 31 is provided on the outer surface of the housing 202a illustrated in FIG. 2, but may instead be provided inside the housing 202a. In the latter case, an opening penetrating the housing 202a or a transparent window is provided in front of the imaging device 31 so that the imaging device 31 can image the scene ahead.
  • the imaging center Co (intersection between the imaging optical axis and the imaging element) of the imaging device 31 is located almost immediately above the observation center Vo (pupil center) of the video display device 202.
  • the relative position of the imaging device 31 with respect to the video display device 202 is fixed.
  • The imaging optical axis of the imaging device 31 substantially coincides with the user's viewing direction when observing at infinity (see the imaging optical axis A1 and the viewing direction (observation central axis) A2 in FIG. 27).
  • the imaging device 31 can image a user's normal field of view (front outside).
  • the gaze detection device 32 is a sensor that detects the user's observation gaze (gaze direction).
  • the line-of-sight detection device 32 includes an infrared LED and a camera, and is provided at the left and right edges of the eyepiece optical system 6, for example, as shown in FIG.
  • The line-of-sight detection device 32 directly detects the user's line-of-sight direction by irradiating a safe light ray (for example, infrared light) from the infrared LED toward the user's pupil or retina and measuring the reflected light with the camera.
  • In this method, the position where the infrared light is reflected on the user's cornea (the corneal reflection position) is used as a reference point, and the user's line-of-sight direction is detected by using the camera (which receives the infrared light) to detect the position of the pupil relative to this corneal reflection. For example, if the pupil of the right eye is located to the right of the corneal reflection, it can be detected that the user is looking to the right (the user's line of sight is directed to the right), and if the pupil is located to the left of the corneal reflection, that the user is looking to the left (the user's line of sight is directed to the left).
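The pupil-versus-corneal-reflection logic described above can be sketched as a simple classifier. The coordinates, threshold, and sign conventions below are hypothetical; actual signs depend on the camera's orientation and mirroring:

```python
# Minimal sketch of the corneal-reflection method described above: the
# corneal reflection (glint) of the infrared LED serves as a reference
# point, and the pupil position relative to it indicates gaze direction.
# Coordinates are in camera-image pixels; the dead-zone threshold is a
# hypothetical calibration value.

def gaze_direction(pupil_x, pupil_y, glint_x, glint_y, threshold=3.0):
    """Classify horizontal/vertical gaze from the pupil's offset vs. the glint."""
    dx = pupil_x - glint_x
    dy = pupil_y - glint_y
    horiz = "right" if dx > threshold else "left" if dx < -threshold else "center"
    vert = "down" if dy > threshold else "up" if dy < -threshold else "center"
    return horiz, vert

print(gaze_direction(105.0, 60.0, 100.0, 60.0))  # pupil right of glint
```

A real detector would map the offset to a continuous gaze angle via a per-user calibration rather than a coarse left/right decision, but the reference-point idea is the same.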
  • The number of line-of-sight detection devices 32 may be one or more, but providing a plurality of them is desirable because line-of-sight detection can then be based on multiple pieces of information, improving detection accuracy. For this reason, in the present embodiment, the line-of-sight detection devices 32 are provided at the left and right edges of the eyepiece optical system 6 (two places in total), as shown in FIG. 2.
  • Tobii Technology has proposed a spectacle-type line-of-sight detection device that detects the line of sight by imaging a pupil illuminated by a plurality of infrared light sources with a plurality of cameras, and the line-of-sight detection device 32 can have a similar configuration.
  • The image processing unit 33 is a processing unit that generates the video to be displayed on the display element 5 by image processing, and includes, for example, a dedicated arithmetic processing circuit such as an ASIC (Application-Specific Integrated Circuit).
  • The image processing performed by the image processing unit 33 also includes processing for displaying on the display element 5, individually or in combination, captured images input from other terminals and an index (described later) indicating a user's line-of-sight position.
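As a rough illustration of that compositing, the sketch below draws a square gaze marker onto a grayscale frame represented as nested lists. The marker shape, size, and pixel value are arbitrary choices for the sketch, not taken from the patent:

```python
# Sketch of the compositing the image processing unit 33 is described as
# performing: drawing an index (marker) for a gaze position onto a frame
# before it is shown on the display element 5.

def overlay_gaze_marker(frame, gaze_x, gaze_y, radius=2, value=255):
    """Return a copy of `frame` with a filled square marker at (gaze_x, gaze_y)."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]  # copy so the input frame is untouched
    for y in range(max(0, gaze_y - radius), min(h, gaze_y + radius + 1)):
        for x in range(max(0, gaze_x - radius), min(w, gaze_x + radius + 1)):
            out[y][x] = value
    return out

frame = [[0] * 8 for _ in range(6)]          # 8x6 black frame
marked = overlay_gaze_marker(frame, 4, 3, radius=1)
```

Clamping the marker bounds to the frame keeps the overlay valid even when the gaze point sits at an image edge.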
  • the communication unit 34 is an interface for inputting and outputting information between the HMD 201 and another information input / output terminal 200, and includes a transmission circuit, a reception circuit, an antenna, and the like.
  • the storage unit 35 is configured by a non-volatile memory such as a flash memory, for example, and stores various types of information (including captured image information (image data) and information indicating the line-of-sight position) output from other terminals.
  • the voice input device 36 is a voice information input device including, for example, a microphone.
  • the audio information input to the audio input device 36 is output to another terminal via the communication unit 34.
  • the audio output device 37 is an audio information output device including, for example, a speaker or an earphone. Audio information input from another terminal via the communication unit 34 is output from the audio output device 37.
  • the plurality of information input / output terminals 200 may include information input / output terminals other than the head-mounted type.
  • For example, the plurality of information input/output terminals 200 may include an information input/output terminal in which the above-described line-of-sight detection device 32 is combined with a stationary display device installed in a room (for example, a liquid crystal display device or an organic EL (Electro-Luminescence) display device).
  • FIG. 5 schematically shows how the worker PA and the instruction supervisor PB perform the collaborative work by wearing the HMDs 201A and 201B on their heads.
  • the worker PA and the instruction supervisor PB are users of the HMDs 201A and 201B, respectively.
  • the configurations of the HMDs 201A and 201B are exactly the same as the HMD 201 described above.
  • In the following description, to distinguish between the two, the letter "A" or "B" is appended to the reference numerals of their constituent elements, as in the video display devices 202A and 202B, for example.
  • The worker PA works on a device OB, which is an object placed on a table T.
  • The three sides around the table T (the back, left, and right sides as viewed from the worker PA) are surrounded by a wall W. It is assumed that, while the worker PA stands in front of the table T and works on the device OB in this situation, it is difficult for the instruction supervisor PB standing behind the worker PA to approach the device OB and view it directly.
  • the HMDs 201A and 201B are in a communication state (for example, in a state in which communication is possible using Wi-Fi (registered trademark)) so that information can be linked to each other, and can transmit and receive mutual information.
  • FIG. 6 is a flowchart showing an example of an operation flow in the line-of-sight information sharing system 100 of the present embodiment. Hereinafter, description will be given with reference to this flowchart.
  • First, the HMD 201A (first terminal) of the worker PA outputs the information of the captured image acquired by the imaging device 31, which serves as the line-of-sight camera of the HMD 201A, to the HMD 201B (second terminal) via the communication unit 34 of the HMD 201A (S1; first step).
  • FIG. 7 shows an image of the field of view of the worker PA.
  • a rectangular area IM (201A) indicated by a broken line indicates an imaging range of the imaging device 31 of the HMD 201A worn by the worker PA.
  • In this field of view, the worker PA can see the device OB, the worker PA's own hands (right hand RH(PA) and left hand LH(PA)), the tool held in the worker PA's hand (screwdriver DR(PA)), and so on.
  • the field of view in front of the worker PA is imaged by the imaging device 31 of the HMD 201A.
  • the HMD 201A outputs information (image data) of the captured image of the imaging device 31 viewed from the viewpoint of the worker PA to the HMD 201B.
  • the HMD 201B causes the video display device 202B to display the captured image so as to be visible to the instruction supervisor PB based on the captured image information input from the HMD 201A (S2; second step).
  • FIG. 8 shows a state in which the instruction supervisor PB observes the video (virtual image) V(202B) displayed by the video display device 202B of the instruction supervisor PB's HMD 201B. Since the instruction supervisor PB stands behind the worker PA, who is actually working while facing the device OB as shown in FIG. 5, it is difficult for the instruction supervisor PB to directly observe the actual work of the worker PA and the device OB, as described above. However, the instruction supervisor PB can confirm the actual work of the worker PA by observing the video V(202B) that was captured by the HMD 201A, transmitted to the HMD 201B, and displayed.
  • Next, the line-of-sight direction of the instruction supervisor PB with respect to the captured image (video V (202B)) displayed by the video display device 202B is detected by the line-of-sight detection device 32 of the HMD 201B (S3; third step).
  • The line-of-sight detection device 32 is attached to the video display device 202 (eyepiece optical system 6), and its relative position with respect to the video display device 202 is fixed. Therefore, the line-of-sight detection device 32 of the HMD 201B can detect which part of the video V (202B) displayed on the video display device 202B the instruction supervisor PB is viewing.
  • Next, the HMD 201B outputs line-of-sight information regarding the detected line-of-sight direction of the instruction supervisor PB to the HMD 201A via the communication unit 34 of the HMD 201B (S4; fourth step). Then, based on the line-of-sight information input from the HMD 201B, the HMD 201A causes the video display device 202A to display the position of the line of sight of the instruction supervisor PB at a corresponding position in the video display area so as to be visible to the worker PA (S5; fifth step).
  • Here, the relative position of the imaging device 31 with respect to the video display device 202 is fixed, and the captured image (video V (202B)) displayed on the HMD 201B of the instruction supervisor PB is originally the captured image acquired by the HMD 201A of the worker PA.
  • Therefore, the gaze point of the instruction supervisor PB with respect to the captured image displayed on the HMD 201B corresponds to a position, known relative to the imaging device 31, in the display area of the video display device 202A in the HMD 201A of the worker PA.
  • Thus, the HMD 201A of the worker PA can immediately display the position of the line of sight of the instruction supervisor PB at the predetermined position (the position corresponding to the gaze point of the instruction supervisor PB) in the display area of the video display device 202A.
  • FIG. 9 shows an image of the field of view of the worker PA when the worker PA is looking at the outside world through the HMD 201A.
  • By displaying the marker MB indicating the position of the line of sight of the instruction supervisor PB, the HMD 201A can present to the worker PA which part of the device OB the instruction supervisor PB is looking at.
  • The worker PA can perform the work of loosening the screw indicated by the marker MB by looking at the displayed marker MB (the position of the line of sight of the instruction supervisor PB).
  • As a result, the worker PA and the instruction supervisor PB can communicate based on the line of sight and perform the work accurately and efficiently.
  • The instruction supervisor PB may emit a voice such as "Loosen this screw!". Thereby, the worker PA can easily recognize that "this screw should be loosened" while looking at the marker MB displayed on the HMD 201A. That is, the instruction supervisor PB can give an accurate instruction to the worker PA by the line of sight and the voice, and the worker PA can accurately perform the work based on this instruction.
  • As described above, in the present embodiment, a captured image indicating the field of view of the HMD 201A of the worker PA is output to the HMD 201B of the instruction supervisor PB, the captured image is displayed on the HMD 201B, the line-of-sight direction of the instruction supervisor PB with respect to the captured image is detected, and the line-of-sight information is output to the HMD 201A. Accordingly, the HMD 201A can immediately display the position of the line of sight of the instruction supervisor PB with the marker MB, at a position in the display image of the HMD 201A corresponding to the position of the line of sight of the instruction supervisor PB with respect to the captured image.
  • Even when the instruction supervisor PB cannot directly see the device OB, the instruction supervisor PB can grasp the state of the device OB from the captured image (displayed video V (202B)) output from the HMD 201A of the worker PA. Therefore, by outputting the line-of-sight information of the instruction supervisor PB with respect to the captured image to the HMD 201A of the worker PA and displaying the position of the line of sight of the instruction supervisor PB on the HMD 201A, the worker PA can accurately perform the work on the device OB based on the displayed line-of-sight position. That is, even if the instruction supervisor PB cannot visually recognize the device OB, accurate communication and work based on the line of sight can be performed.
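The first-to-fifth steps (S1 to S5) described above can be sketched as a simple message exchange between two terminal objects. This is only an illustrative model, not the disclosed apparatus: the `Terminal` class, its method names, and the normalized gaze coordinates are assumptions introduced for explanation.

```python
class Terminal:
    """Illustrative stand-in for one information input/output terminal (HMD)."""

    def __init__(self, name):
        self.name = name
        self.peer = None
        self.displayed_image = None   # image received from the peer (S2)
        self.remote_gaze = None       # peer's gaze position to overlay (S5)

    def send_image(self, image):
        """S1: output the captured view image to the peer terminal."""
        self.peer.receive_image(image)

    def receive_image(self, image):
        """S2: display the peer's captured image on this terminal."""
        self.displayed_image = image

    def detect_and_send_gaze(self, gaze_xy):
        """S3 + S4: detect where this user looks on the displayed image
        and output that gaze position back to the peer."""
        self.peer.receive_gaze(gaze_xy)

    def receive_gaze(self, gaze_xy):
        """S5: mark the peer's gaze position in this terminal's display area."""
        self.remote_gaze = gaze_xy


# Wire up the worker terminal (201A) and supervisor terminal (201B).
hmd_a, hmd_b = Terminal("201A"), Terminal("201B")
hmd_a.peer, hmd_b.peer = hmd_b, hmd_a

hmd_a.send_image("view-of-device-OB")                 # S1
assert hmd_b.displayed_image == "view-of-device-OB"   # S2
hmd_b.detect_and_send_gaze((0.42, 0.37))              # S3 + S4
assert hmd_a.remote_gaze == (0.42, 0.37)              # S5: marker MB position
```

The correspondence between the gaze point and the display position relies on the fixed relation between the imaging device 31 and the video display device 202, as explained in the text.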
  • FIG. 10 is a flowchart showing another example of the operation flow in the line-of-sight information sharing system 100 of the present embodiment.
  • The HMD 201A may detect the line-of-sight direction of the worker PA with the line-of-sight detection device 32 of the HMD 201A (S6; sixth step).
  • In this case, the position of the line of sight of the worker PA based on the detection result in S6 may be displayed on the HMD 201A together with the position of the line of sight of the instruction supervisor PB.
  • FIG. 11 shows an image of the field of view of the worker PA when the line-of-sight positions of both the worker PA and the instruction supervisor PB are displayed on the HMD 201A.
  • In FIG. 11, the position of the line of sight of the worker PA is indicated by a marker MA, and the position of the line of sight of the instruction supervisor PB is indicated by a marker MB.
  • The shapes of the markers MA and MB are different from each other. More specifically, the marker MA indicating the line of sight of the wearer himself/herself (worker PA) is displayed as a rectangle, and the marker MB indicating the line of sight of the other person (instruction supervisor PB) is displayed as a circle.
  • However, the display method of the markers MA and MB is not limited to this.
  • For example, the colors may be different from each other, or the line types (solid line, broken line, etc.) may be different from each other.
  • Alternatively, the markers MA and MB may be distinguished from each other by displaying characters such as "instruction supervisor" and "worker" alongside the markers, or by combining any of shapes, line types, colors, and characters.
  • On the HMD 201A of the worker PA, the position of the worker PA's own line of sight (marker MA) is displayed together with the position of the line of sight of the instruction supervisor PB (marker MB).
  • Thereby, the worker PA can grasp both the part of the device OB that he/she is viewing (the part where the work is to be performed) and the part that the instruction supervisor PB is viewing (the part where the work is instructed), and can immediately recognize whether they match.
  • When they do not match, the worker PA can take appropriate measures such as confirming by voice. Therefore, the worker PA can communicate with the instruction supervisor PB more accurately.
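One way the match/mismatch judgment described above could be made automatic is a simple distance test between the two gaze positions. This is a minimal sketch under assumptions: the threshold value, the function name, and the use of normalized display coordinates are all illustrative, not part of the disclosure.

```python
import math

def gaze_match(own_gaze, remote_gaze, threshold=0.05):
    """Return True when the two gaze points (normalized display
    coordinates) fall within `threshold` of each other."""
    dx = own_gaze[0] - remote_gaze[0]
    dy = own_gaze[1] - remote_gaze[1]
    return math.hypot(dx, dy) <= threshold

# Match: the worker (marker MA) looks where the supervisor (marker MB) points.
print(gaze_match((0.41, 0.36), (0.42, 0.37)))  # True
# Mismatch: the worker would confirm by voice before proceeding.
print(gaze_match((0.10, 0.80), (0.42, 0.37)))  # False
```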
  • FIG. 12 is a flowchart showing still another example of the operation flow in the line-of-sight information sharing system 100 of the present embodiment.
  • The HMD 201A may output the line-of-sight information of the worker PA detected in step S6 to the HMD 201B of the instruction supervisor PB (S7; seventh step). Then, on the HMD 201B, the position of the line of sight of the worker PA acquired in step S7 and the position of the line of sight of the instruction supervisor PB based on the detection result in step S3 may be displayed together with the captured image output from the HMD 201A (S8; eighth step).
  • FIG. 13 shows an image of the field of view of the instruction supervisor PB when the line-of-sight positions of both the worker PA and the instruction supervisor PB are displayed on the HMD 201B.
  • In FIG. 13, the marker MB indicating the line of sight of the viewer himself/herself (instruction supervisor PB) is displayed as a rectangle,
  • and the marker MA indicating the line of sight of the other person (worker PA) is displayed as a circle.
  • In FIGS. 11 and 13, since the subject viewing the video differs between the worker PA and the instruction supervisor PB, the marker MA, for example, has a different shape in FIG. 11 and in FIG. 13 even though it indicates the line-of-sight position of the same worker PA.
  • However, FIG. 11 and FIG. 13 are common in that the marker corresponding to the subject viewing the video is displayed as a rectangle and the other party's marker is displayed as a circle.
  • On the HMD 201B, the position of the line of sight of the worker PA (marker MA) and the position of the line of sight of the instruction supervisor PB (marker MB) are displayed together with the captured image output from the HMD 201A of the worker PA.
  • Thereby, when the line-of-sight positions do not match, the instruction supervisor PB can take appropriate measures such as adding an instruction by voice. Therefore, the instruction supervisor PB can communicate with the worker PA more accurately.
  • In addition, the instruction supervisor PB can accurately give an instruction based on the line of sight to the worker PA while confirming, by looking at the marker MB, the position on the device OB at which he/she wants to designate the work.
  • On each of the HMDs 201A and 201B, the position of the line of sight of the worker PA and the position of the line of sight of the instruction supervisor PB are displayed as mutually different patterns (markers MA and MB).
  • Thereby, the worker PA and the instruction supervisor PB can easily distinguish and grasp each other's line-of-sight positions on the respective HMDs 201A and 201B, which enables quick communication.
  • The HMD 201B desirably outputs the line-of-sight information of the instruction supervisor PB to the HMD 201A of the worker PA only when the line of sight of the instruction supervisor PB is located within the display area of the captured image output from the HMD 201A (one information input/output terminal) to the HMD 201B (another information input/output terminal).
  • When the instruction supervisor PB is not looking at the captured image (video V (202B)) output from the HMD 201A and displayed on the HMD 201B, it can be considered that the instruction supervisor PB is not giving any instruction or advice regarding the device OB included in the captured image. In this case, communication by the line of sight is not required between the instruction supervisor PB and the worker PA.
  • If the line-of-sight detection of the instruction supervisor PB is always performed by the line-of-sight detection device 32, outputting line-of-sight information at times when communication is unnecessary (line-of-sight information obtained when looking outside the display area of the captured image) from the HMD 201B to the HMD 201A only results in needless communication and is wasteful.
  • By outputting the line-of-sight information of the instruction supervisor PB to the HMD 201A of the worker PA only when the above condition is satisfied, the HMD 201B can reduce the output of useless information to the HMD 201A and reduce the system load.
  • A system in which the HMD 201B outputs the line-of-sight information of the instruction supervisor PB to the HMD 201A only when the line of sight of the instruction supervisor PB is located within the display area of the captured image can be realized specifically by adopting the following configuration.
  • FIG. 14 is a block diagram showing another configuration of the HMD 201.
  • The HMD 201 further includes an input unit 39 in addition to the configuration shown in FIG.
  • The input unit 39 is a device for designating the timing for outputting line-of-sight information of the user of the HMD 201 (for example, the instruction supervisor PB in the above example).
  • Such an input unit 39 can be configured by, for example, a voice recognition device, a pupil observation device, or a small touch device.
  • The voice recognition device outputs a signal instructing the output of the line-of-sight information to the control unit 38 when it recognizes a specific voice such as "this" or "that".
  • The pupil observation device observes the user's pupil and, when a characteristic blink of the user is detected (for example, three consecutive blinks, a blink larger than normal, or a blink longer than usual), outputs a signal instructing the output of the line-of-sight information to the control unit 38.
  • The line-of-sight detection device 32 may have the function of the above-described pupil observation device.
  • The small touch device has, for example, a small ring-shaped member that can be attached to a finger of the user's hand, and outputs a signal instructing the output of the line-of-sight information to the control unit 38 when the ring-shaped member is touched.
  • The control unit 38 receives the instruction signal from the input unit 39 and controls the communication unit 34 so that the user's line-of-sight information detected by the line-of-sight detection device 32 is output from the communication unit 34 to the outside.
  • With this configuration, the HMD 201B can output the line-of-sight information to the HMD 201A at the timing designated by the input unit 39, that is, triggered by detection of a specific voice or the like. In other words, while observing the captured image, the instruction supervisor PB can cause the HMD 201B to output line-of-sight information to the HMD 201A by emitting a voice such as "this", making a specific blink, or touching the ring-shaped member.
  • As a result, the line-of-sight information of the instruction supervisor PB is output to the HMD 201A only when the line of sight of the instruction supervisor PB is located within the display area of the captured image. Further, the line-of-sight information output from the HMD 201B can reliably be treated as line-of-sight information obtained while the instruction supervisor PB is looking at the display area of the captured image, so that reliable line-of-sight communication can be realized. In addition, by using the input unit 39 together, intended designation of a position on the device OB becomes simple, and a further improvement in usability can be expected.
  • The ring-shaped member is attached to, for example, the little finger of the user's hand and can be touched with the thumb of the same hand. For this reason, the user's hands-free capability is not impaired merely by the user wearing the small touch device as the input unit 39. That is, even if a small touch device is used as the input unit 39, the hands-free property peculiar to the HMD 201 is ensured.
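The gating behavior described above, i.e. outputting line-of-sight information only when the gaze lies within the displayed captured image and the input unit 39 has been triggered (by a specific voice, blink, or ring touch), can be sketched as follows. The rectangle layout, coordinate convention, and all names are illustrative assumptions, not the disclosed implementation.

```python
def should_send_gaze(gaze_xy, image_rect, triggered):
    """Send gaze info only when the gaze lies inside the displayed
    captured image AND the user explicitly triggered output."""
    x, y = gaze_xy
    left, top, right, bottom = image_rect
    inside = left <= x <= right and top <= y <= bottom
    return inside and triggered

# Assume video V(202B) occupies a corner of the supervisor's display.
rect = (0.6, 0.6, 0.95, 0.95)
print(should_send_gaze((0.7, 0.8), rect, triggered=True))   # True: send
print(should_send_gaze((0.2, 0.3), rect, triggered=True))   # False: gaze outside image
print(should_send_gaze((0.7, 0.8), rect, triggered=False))  # False: no trigger event
```

The second case corresponds to the "unnecessary communication" the text seeks to avoid; the third shows the input unit 39 acting as an explicit gate.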
  • In the above description, the HMD 201B of the instruction supervisor PB is configured using the optical see-through video display device 202. However, since it is difficult for the instruction supervisor PB, who stands behind the worker PA, to directly see the device OB, the HMD 201B need not be optical see-through. Therefore, the HMD 201B may be configured using a video see-through type video display device 202 described later (see FIGS. 30 and 31).
  • In the present embodiment, the worker PA and the instruction supervisor PB are located at a short distance from each other as shown in FIG. 5, but the worker PA and the instruction supervisor PB may instead be located far apart.
  • Even in that case, by applying the system of this embodiment and receiving instructions based on the line-of-sight information from the instruction supervisor PB via the communication line 300 (see FIG. 1), the worker PA can obtain the same effect as in the present embodiment.
  • Between the HMDs 201A and 201B, audio information such as "this", "that", "Loosen this screw!", and "I understand!" may be input and output using the voice input device 36 and the voice output device 37 shown in FIG. 4.
  • Since communication using voice information can then be performed in addition to the line-of-sight information of the instruction supervisor PB, communication is further simplified.
  • Since the content of the line-of-sight information can be supplemented with audio information, the accuracy of communication is further improved.
  • FIG. 16 schematically shows a state where the worker PA and the worker PC perform collaborative work while wearing the HMDs 201A and 201C on their heads, respectively.
  • The worker PA and the worker PC are the users of the HMDs 201A and 201C, respectively.
  • The configurations of the HMDs 201A and 201C are exactly the same as that of the HMD 201 (see FIGS. 2 to 4) described in the first embodiment.
  • To distinguish the components of the HMDs 201A and 201C from each other, a symbol "A" or "C" is appended to the reference numerals of the constituent elements, as in the video display devices 202A and 202C, for example.
  • Assume that the device OB, which is the work object, is placed on a table T in the center of the work room, and that the worker PA and the worker PC, who are a plurality of workers, perform maintenance on the device OB.
  • The HMDs 201A and 201C are in a communication state (for example, a state where communication is possible using Wi-Fi (registered trademark)) so that information can be linked between them, and they can transmit and receive information to and from each other.
  • Steps S1 to S5 shown in FIG. 6 are performed between the HMD 201A of the worker PA and the HMD 201C of the worker PC. A more specific description is given below.
  • The HMD 201A, which is one information input/output terminal, outputs information on the captured image acquired by the imaging device 31 of the HMD 201A to the HMD 201C, which is another information input/output terminal.
  • Similarly, the HMD 201C outputs information on the captured image acquired by the imaging device 31 of the HMD 201C to the HMD 201A.
  • FIG. 17 shows an image of the field of view of the worker PA.
  • In FIG. 17, a rectangular area IM (201A) indicated by a broken line indicates the imaging range of the imaging device 31 of the HMD 201A worn by the worker PA.
  • In the field of view of the worker PA, the device OB, the worker PA's hands (right hand RH (PA), left hand LH (PA)), the tool held in the worker PA's hand (driver DR (PA)), and the worker PC's hands (right hand RH (PC), left hand LH (PC)) are visible, and this field of view of the worker PA is captured by the imaging device 31 of the HMD 201A.
  • The HMD 201A outputs information (image data) on the captured image of the imaging device 31, viewed from the viewpoint of the worker PA, to the HMD 201C via the communication unit 34 of the HMD 201A.
  • FIG. 18 shows an image of the field of view of the worker PC.
  • In FIG. 18, a rectangular area IM (201C) indicated by a broken line indicates the imaging range of the imaging device 31 of the HMD 201C worn by the worker PC.
  • In the field of view of the worker PC, the device OB, the worker PA's hands (right hand RH (PA), left hand LH (PA)), the tool held in the worker PA's hand (driver DR (PA)), and the worker PC's hands (right hand RH (PC), left hand LH (PC)) are visible, and this field of view of the worker PC is captured by the imaging device 31 of the HMD 201C.
  • The HMD 201C outputs information (image data) on the captured image of the imaging device 31, viewed from the viewpoint of the worker PC, to the HMD 201A via the communication unit 34 of the HMD 201C.
  • The HMD 201C causes the video display device 202C to display the captured image so as to be visible to the worker PC of the HMD 201C, based on the captured image information input from the HMD 201A.
  • Similarly, the HMD 201A causes the video display device 202A to display the captured image so as to be visible to the worker PA of the HMD 201A, based on the captured image information input from the HMD 201C.
  • FIG. 19 shows the field of view of the worker PA when an image (view image) captured and output by the HMD 201C of the worker PC is displayed as a video (virtual image) V (202A) on the video display device 202A of the HMD 201A of the worker PA.
  • FIG. 20 shows the field of view of the worker PC when an image (view image) captured and output by the HMD 201A of the worker PA is displayed as a video (virtual image) V (202C) on the video display device 202C of the HMD 201C of the worker PC.
  • In this way, the HMDs 201A and 201C mutually input and output the view images captured by their respective imaging devices 31 and display the input view images as videos (virtual images).
  • Thereby, the worker PA can observe the field-of-view image of the worker PC with the HMD 201A worn by the worker PA, and the worker PC can observe the field-of-view image of the worker PA with the HMD 201C worn by the worker PC.
  • The device OB, the hands of the worker PA, the driver DR (PA), and the hands of the worker PC are real objects in the actual field of view of the worker PA.
  • Similarly, the device OB, the hands of the worker PA, the driver DR (PA), and the hands of the worker PC are also real objects in the actual field of view of the worker PC.
  • The video V (202A) displayed on the HMD 201A is displayed in a part, other than the center, of the region that can be displayed as a virtual image by the video display device 202A (that is, on a part of the display surface of the display element 5 other than the center of the display surface).
  • Since the worker PA works on the device OB, the worker PA is considered to usually work with the device OB positioned at the center of the field of view. Therefore, it is possible to prevent the video V (202A) from being visually recognized as overlapping the device OB during the work of the worker PA and degrading the workability on the device OB (making the work difficult).
  • For the same reason, the video V (202C) displayed on the HMD 201C is also displayed in a part, other than the center, of the region that can be displayed as a virtual image by the video display device 202C.
  • Next, the HMD 201C detects the line-of-sight direction of the worker PC with respect to the captured image (video V (202C)) displayed by the video display device 202C, using the line-of-sight detection device 32 of the HMD 201C.
  • Similarly, the HMD 201A detects the line-of-sight direction of the worker PA with respect to the captured image (video V (202A)) displayed by the video display device 202A, using the line-of-sight detection device 32 of the HMD 201A.
  • The line-of-sight detection device 32 is attached to the video display device 202 (eyepiece optical system 6), and its relative position with respect to the video display device 202 is fixed. Therefore, the line-of-sight detection device 32 of the HMD 201C can detect which part of the video V (202C) displayed on the video display device 202C the worker PC is viewing. Similarly, the line-of-sight detection device 32 of the HMD 201A can detect which part of the video V (202A) displayed on the video display device 202A the worker PA is viewing.
  • Next, the HMD 201C outputs line-of-sight information regarding the detected line-of-sight direction of the worker PC to the HMD 201A via the communication unit 34 of the HMD 201C. Then, based on the line-of-sight information input from the HMD 201C, the HMD 201A causes the video display device 202A to display the position of the line of sight of the worker PC so as to be visible to the worker PA. Similarly, the HMD 201A outputs line-of-sight information regarding the detected line-of-sight direction of the worker PA to the HMD 201C via the communication unit 34 of the HMD 201A. Then, based on the line-of-sight information input from the HMD 201A, the HMD 201C causes the video display device 202C to display the position of the line of sight of the worker PA so as to be visible to the worker PC.
  • Here, the relative position of the imaging device 31 with respect to the video display device 202 is fixed, and the captured image (video V (202C)) displayed on the HMD 201C of the worker PC is originally the captured image acquired by the HMD 201A of the worker PA.
  • Therefore, the gaze point of the worker PC with respect to the captured image of the HMD 201A displayed on the HMD 201C corresponds to a position, whose relative position to the imaging device 31 in the HMD 201A of the worker PA is known, in the entire video display area of the video display device 202A (substantially equal to the field of view of the worker PA).
  • Thus, the HMD 201A of the worker PA can immediately display the position of the line of sight of the worker PC at the predetermined position (the position corresponding to the gaze point of the worker PC) in the display area of the video display device 202A.
  • Similarly, the relative position of the imaging device 31 with respect to the video display device 202 is fixed, and the captured image (video V (202A)) displayed on the HMD 201A of the worker PA is originally the captured image acquired by the HMD 201C of the worker PC.
  • Therefore, the gaze point of the worker PA (the point ahead of the line of sight of the worker PA) with respect to the captured image displayed on the HMD 201A corresponds to a position, whose relative position to the imaging device 31 in the HMD 201C of the worker PC is known, in the video display area of the video display device 202C.
  • Thus, the HMD 201C of the worker PC can immediately display the position of the line of sight of the worker PA at the predetermined position (the position corresponding to the gaze point of the worker PA) in the display area of the video display device 202C.
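The coordinate correspondence argued in the preceding paragraphs can be illustrated with a simple scale-and-offset mapping: because the relative position of the imaging device 31 and the video display device 202 is fixed, a gaze point on the received copy of the image maps to a point in the source terminal's display area through a fixed calibration. The affine form and the calibration values below are assumptions for illustration, not the disclosed method.

```python
def image_to_display(gaze_uv, calib):
    """Map a normalized gaze point (u, v) on the received image to
    (x, y) in the source terminal's display area."""
    u, v = gaze_uv
    sx, sy, ox, oy = calib  # scale and offset from camera-display calibration
    return (sx * u + ox, sy * v + oy)

# Identity-like calibration: camera field of view coincides with display area.
calib = (1.0, 1.0, 0.0, 0.0)
print(image_to_display((0.42, 0.37), calib))  # (0.42, 0.37)

# Camera sees a slightly wider field than the display area covers.
calib = (0.8, 0.8, 0.1, 0.1)
x, y = image_to_display((0.5, 0.5), calib)
print(round(x, 2), round(y, 2))
```

Because the calibration is fixed at manufacture (the imaging device 31 does not move relative to the video display device 202), the mapping needs no run-time registration, which is why the marker position can be displayed "immediately" as stated in the text.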
  • FIG. 21 shows an image of the field of view of the worker PA when the worker PA is looking at the outside world via the HMD 201A.
  • By displaying the marker MC indicating the position of the line of sight of the worker PC, the HMD 201A can present to the worker PA which part of the device OB the worker PC is looking at.
  • FIG. 22 shows an image of the field of view of the worker PC when the worker PC is looking at the outside world through the HMD 201C.
  • By displaying the marker MA indicating the position of the line of sight of the worker PA, the HMD 201C can present to the worker PC which part of the device OB the worker PA is looking at.
  • Since the worker PA and the worker PC are adjacent to each other (see FIG. 16), when the worker PC's hands are occupied, for example when the worker PC is supporting the housing of the device OB with both hands, the worker PC may emit a voice such as "Loosen this screw!" while gazing at the intended part of the device OB.
  • Thereby, the worker PA can easily recognize that "this screw should be loosened" while looking at the marker MC (see FIG. 21) displayed on the HMD 201A, and can perform the work accurately.
  • That is, the worker PC can accurately transmit the instruction to the worker PA by using the line of sight and the voice, without interrupting the manual work.
  • Next, a specific example of the collaborative work of the worker PA and the worker PC using the line of sight and voice will be described.
  • The numbers (1) to (4) given in the vicinity of the markers MA and MC indicate the display order of the markers.
  • As before, the marker indicating the wearer's own line-of-sight position displayed on his/her own terminal is represented by a rectangle, and the marker indicating the line-of-sight position of the other worker displayed on that terminal is represented by a circle.
  • First, the worker PC gives a voice instruction "Loosen this screw!" while gazing at the screw to be loosened in the video V (202C) displayed on the HMD 201C.
  • At this time, the line-of-sight direction of the worker PC with respect to the video V (202C) is detected by the line-of-sight detection device 32 of the HMD 201C.
  • The video display device 202C displays a marker MC indicating the line of sight of the worker PC together with the captured image of the HMD 201A in the video V (202C) (see (1) in FIG. 22). This clearly identifies which screw should be loosened.
  • The line-of-sight information indicating the line-of-sight position of the worker PC detected by the HMD 201C and the captured image information acquired by the HMD 201C are output to the HMD 201A.
  • On the HMD 201A, a marker MC indicating the line-of-sight position of the worker PC is displayed at a position corresponding to the gaze point of the worker PC with respect to the video V (202C) (see (2) in FIG. 21).
  • At this time, the captured image acquired by the HMD 201C is displayed as the video V (202A) by the video display device 202A of the HMD 201A.
  • The worker PA grasps the screw indicated by the worker PC (the screw to be loosened) by visually recognizing the marker MC displayed on his/her own HMD 201A, and performs the work of loosening the screw.
  • Next, the worker PA makes a voice request "Please support this cover!" while gazing at the part he/she wants supported in the video V (202A) displayed on the HMD 201A.
  • At this time, the line-of-sight direction of the worker PA with respect to the video V (202A) is detected by the line-of-sight detection device 32 of the HMD 201A.
  • The video display device 202A displays the marker MA indicating the line-of-sight position of the worker PA together with the captured image of the HMD 201C in the video V (202A) (see (3) in FIG. 21). Thereby, it is clearly identified which cover the worker PA wants supported.
  • The line-of-sight information indicating the line-of-sight position of the worker PA detected by the HMD 201A is output to the HMD 201C.
  • On the HMD 201C, a marker MA indicating the line-of-sight position of the worker PA is displayed at a position corresponding to the gaze point of the worker PA (see (4) in FIG. 22).
  • In this way, the steps of outputting the captured image information, displaying the captured image, detecting the line-of-sight direction with respect to the captured image, outputting the line-of-sight information, and displaying the position of the line of sight are performed bidirectionally between the HMDs 201A and 201C.
  • Thereby, the HMDs 201A and 201C can share the line-of-sight information of the worker PA and the worker PC with each other, and the two workers can perform the collaborative work.
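The bidirectional exchange above, including the numbered display order of the markers, can be modeled as two symmetric calls to one sharing routine. The `share_gaze` function and the event log are hypothetical, introduced only to mirror the sequence (1) to (4) described in the example.

```python
log = []  # records marker-display events in order

def share_gaze(sender, receiver, gaze_xy):
    # The sender's own marker (rectangle) appears on its own display first...
    log.append((sender, "own marker", gaze_xy))
    # ...then the gaze info is output, and the receiver overlays the other
    # party's marker (circle) at the corresponding position.
    log.append((receiver, "peer marker", gaze_xy))

share_gaze("HMD201C", "HMD201A", (0.42, 0.37))  # (1), (2): "Loosen this screw!"
share_gaze("HMD201A", "HMD201C", (0.60, 0.55))  # (3), (4): "Please support this cover!"

for entry in log:
    print(entry)
```

Because both terminals run the same routine in both roles, the sharing is symmetric, which is the point of the bidirectional flow described above.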
  • A state is schematically shown in which the worker PA and the worker PC wear the HMDs 201A and 201C on their heads, the distant instruction supervisor PD wears the HMD 201D, and the instruction supervisor PD observes the images sent from the HMDs 201A and 201C to the HMD 201D.
  • The worker PA and the worker PC are the users of the HMDs 201A and 201C, respectively, and the instruction supervisor PD is the user of the HMD 201D.
  • The HMD 201D is an information input/output terminal that can communicate with the HMD 201A and the HMD 201C via the communication line 300 (see FIG. 1).
  • The configurations of the HMDs 201A, 201C, and 201D are exactly the same as that of the HMD 201 described in Embodiment 1 (see FIGS. 2 to 4).
  • To distinguish the components included in the HMDs 201A, 201C, and 201D from each other, a symbol "A", "C", or "D" is appended to the reference numerals of the constituent elements, as in the video display devices 202A, 202C, and 202D, for example.
  • Input/output of information between the HMD 201A and the HMD 201C is the same as in the second embodiment.
  • Hereinafter, input/output of information between the HMD 201A and the HMD 201D and between the HMD 201C and the HMD 201D will be described.
  • The information on the captured images acquired by the HMDs 201A and 201C is output to the HMD 201D via the respective communication units 34.
  • On the HMD 201D, the captured images of the HMDs 201A and 201C are displayed as video V (202D)-A and video V (202D)-C, respectively.
  • When the instruction supervisor PD is observing the video V (202D)-A, the line-of-sight direction of the instruction supervisor PD at that time is detected by the line-of-sight detection device 32 of the HMD 201D.
  • The line-of-sight information regarding the detected line-of-sight direction is output from the HMD 201D to the HMD 201A via the communication unit 34.
  • Thereafter, on the HMD 201A, the position of the line of sight of the instruction supervisor PD is displayed at the predetermined position in the entire display image of the video display device 202A based on the line-of-sight information.
  • Similarly, when the instruction supervisor PD is observing the video V (202D)-C, the line-of-sight direction of the instruction supervisor PD at that time is detected by the line-of-sight detection device 32 of the HMD 201D. Then, the line-of-sight information regarding the detected line-of-sight direction is output from the HMD 201D to the HMD 201C via the communication unit 34. Thereafter, on the HMD 201C, the position of the line of sight of the instruction supervisor PD is displayed at the predetermined position in the entire display image of the video display device 202C based on the line-of-sight information.
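The routing rule just described for the three-terminal case (the HMD 201D outputs gaze information only to the terminal whose video the instruction supervisor PD is currently observing) can be sketched as a lookup over the display layout. The rectangle coordinates and all names are illustrative assumptions.

```python
def route_gaze(gaze_xy, video_rects):
    """Return the source terminal whose displayed video contains the
    gaze point, or None when the gaze falls outside every video."""
    x, y = gaze_xy
    for source, (left, top, right, bottom) in video_rects.items():
        if left <= x <= right and top <= y <= bottom:
            return source
    return None

# Assume video V(202D)-A on the left and video V(202D)-C on the right
# of the supervisor's display.
layout = {"HMD201A": (0.0, 0.2, 0.45, 0.8),
          "HMD201C": (0.55, 0.2, 1.0, 0.8)}

print(route_gaze((0.3, 0.5), layout))   # HMD201A: gaze on video V(202D)-A
print(route_gaze((0.7, 0.5), layout))   # HMD201C: gaze on video V(202D)-C
print(route_gaze((0.5, 0.05), layout))  # None: no line-of-sight info is output
```

The `None` case corresponds to the situation where line-of-sight communication is unnecessary, matching the gating behavior discussed in the first embodiment.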
  • FIG. 24 shows an image of the field of view of the worker PA when the worker PA is looking at the outside world through the HMD 201A.
  • markers indicating the line-of-sight positions of the worker PC and the instruction supervisor PD are displayed, so that which part of the device OB the worker PC and the instruction supervisor PD are looking at can be shown to the worker PA.
  • FIG. 25 shows an image of the field of view of the worker PC when the worker PC is looking at the outside world via the HMD 201C.
  • markers indicating the line-of-sight positions of the worker PA and the instruction supervisor PD are displayed, so that which part of the device OB the worker PA and the instruction supervisor PD are looking at can be shown to the worker PC.
  • accordingly, the HMDs 201A and 201C can display the line-of-sight positions of all three parties, the worker PA, the worker PC, and the instruction supervisor PD, so that the three parties can perform the work while communicating with one another.
  • specific examples of work by the worker PA, the worker PC, and the instruction supervisor PD will be described.
  • the numbers (1) to (10) in the vicinity of the markers MA, MC, and MD indicate the display order of the markers.
  • the marker indicating the line-of-sight position of a terminal's own user, displayed on that user's own terminal, is represented by a rectangle, and a marker indicating the line-of-sight position of another worker displayed on that terminal is represented by a circle.
  • a marker indicating the line-of-sight position of the instruction supervisor PD is represented by a triangle.
  • the information of the captured image acquired by the HMD 201A of the worker PA is output to the HMD 201C and the HMD 201D, and the captured image is displayed as the video V (202C) by the HMD 201C and as the video V (202D)-A by the HMD 201D (see FIGS. 25 and 23).
  • the worker PC gives a voice instruction “Loosen this screw!” While watching the screw to be loosened in the video V (202C) displayed on the HMD 201C.
  • the visual line direction of the worker PC with respect to the video V (202C) is detected by the visual line detection device 32 of the HMD 201C.
  • the video display device 202C displays a marker MC indicating the line of sight of the worker PC in the video V (202C) together with the captured image of the HMD 201A (see (1) in FIG. 25).
  • the line-of-sight information indicating the line-of-sight position of the worker PC detected by the HMD 201C and the captured image information acquired by the HMD 201C are output to the HMD 201A.
  • in the HMD 201A, a marker MC indicating the line-of-sight position of the worker PC is displayed at the position corresponding to the gaze point of the worker PC with respect to the video V (202C) (see (2) in FIG. 24).
  • the captured image acquired by the HMD 201C is displayed as the video V (202A) by the video display device 202A of the HMD 201A (see FIG. 24).
  • the line-of-sight information indicating the line-of-sight position of the worker PC detected by the HMD 201C is also output to the HMD 201D.
  • the marker MC indicating the line-of-sight position of the worker PC is displayed in the video V (202D)-A by the video display device 202D (see (3) in FIG. 23).
  • since the video V (202C) and the video V (202D)-A are both images captured by the HMD 201A, the marker MC indicating the line-of-sight position of the worker PC can easily be displayed in the video V (202D)-A at the position corresponding to the gaze point of the worker PC with respect to the video V (202C).
  • when the instruction supervisor PD looks at the marker MC in the video V (202D)-A and determines that the instruction of the worker PC is wrong, the instruction supervisor PD gives a voice instruction, "No, you should loosen this screw first!", while watching the screw that should actually be loosened in the video V (202D)-A.
  • the line-of-sight direction of the instruction supervisor PD with respect to the video V (202D) -A is detected by the line-of-sight detection device 32 of the HMD 201D.
  • the video display device 202D displays a marker MD indicating the line-of-sight position of the instruction supervisor PD in the video V (202D) -A (see (4) in FIG. 23).
  • the line-of-sight information regarding the line-of-sight direction of the instruction supervisor PD is output to the HMD 201A. Accordingly, in the HMD 201A, a marker MD indicating the line-of-sight position of the instruction supervisor PD is displayed at the position corresponding to the gaze point of the instruction supervisor PD with respect to the video V (202D)-A, within the entire video display area of the video display device 202A (corresponding to the field of view of the worker PA) (see (5) in FIG. 24). Whether the instruction supervisor PD is viewing the video V (202D)-A or the video V (202D)-C can be determined from the line-of-sight detection result of the line-of-sight detection device 32 of the HMD 201D. Therefore, when the instruction supervisor PD is watching the video V (202D)-A, the line-of-sight information of the instruction supervisor PD can be output to the HMD 201A, which is the provider of the video V (202D)-A.
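The terminal that should receive the supervisor's line-of-sight information is the one whose shared video the supervisor is currently looking at. A minimal Python sketch of that routing decision (the function name, the rectangle-based screen layout, and the pixel coordinates are illustrative assumptions, not the patent's implementation):

```python
def route_gaze_info(gaze_x, gaze_y, video_regions):
    """Return the ID of the terminal whose shared video contains the gaze point.

    video_regions maps a source terminal ID to the on-screen rectangle
    (x, y, width, height) where that terminal's captured image is displayed.
    Returns None when the supervisor is not looking at any shared video.
    """
    for terminal_id, (x, y, w, h) in video_regions.items():
        if x <= gaze_x < x + w and y <= gaze_y < y + h:
            return terminal_id
    return None
```

With two side-by-side videos, a gaze point inside the left rectangle would route the line-of-sight information to the terminal that provided the left video, matching the behavior described above.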
  • the worker PA performs the work of loosening the screw indicated by the instruction supervisor PD, not the screw indicated by the worker PC, by visually recognizing the marker MD displayed on the own HMD 201A. At this time, the worker PA makes a voice request “Please support this cover!” While observing the portion to be held in the video V (202A).
  • the visual line direction of the worker PA with respect to the video V (202A) is detected by the visual line detection device 32 of the HMD 201A. Then, the video display device 202A displays the marker MA indicating the line-of-sight position of the worker PA together with the captured image of the HMD 201C as the video V (202A) (see (6) in FIG. 24).
  • the line-of-sight information indicating the line-of-sight position of the worker PA detected by the HMD 201A is output to the HMD 201C, and in the HMD 201C, a marker MA indicating the line-of-sight position of the worker PA is displayed at the position corresponding to the gaze point of the worker PA with respect to the video V (202A) (see (7) in FIG. 25).
  • the line-of-sight information indicating the line-of-sight position of the worker PA detected by the HMD 201A is also output to the HMD 201D.
  • the marker MA indicating the line-of-sight position of the worker PA is displayed in the video V (202D)-C by the video display device 202D (see (8) in FIG. 23).
  • since the video V (202A) and the video V (202D)-C are both images captured by the HMD 201C, the marker MA indicating the line-of-sight position of the worker PA can easily be displayed in the video V (202D)-C at the position corresponding to the gaze point of the worker PA with respect to the video V (202A).
  • when the instruction supervisor PD looks at the marker MA in the video V (202D)-C and determines that the request of the worker PA is wrong, the instruction supervisor PD gives a voice instruction, "No, hold this cover, not that one!", while watching the cover that should actually be held in the video V (202D)-C.
  • the line-of-sight direction of the instruction supervisor PD with respect to the video V (202D) -C is detected by the line-of-sight detection device 32 of the HMD 201D.
  • the video display device 202D displays the marker MD indicating the line-of-sight position of the instruction supervisor PD in the video V (202D) -C (see (9) in FIG. 23).
  • the line-of-sight information regarding the line-of-sight direction of the instruction supervisor PD is output to the HMD 201C. Accordingly, in the HMD 201C, a marker MD indicating the line-of-sight position of the instruction supervisor PD is displayed at the position corresponding to the gaze point of the instruction supervisor PD with respect to the video V (202D)-C, within the entire video display area of the video display device 202C (corresponding to the field of view of the worker PC) (see (10) in FIG. 25). As a result, the worker PC can appropriately support the work of the worker PA by holding the cover designated by the marker MD while the worker PA loosens the screw.
  • FIG. 26 is an explanatory diagram showing another configuration of the information input / output terminal 200.
  • the information input / output terminal 200 may include at least a display device 200a and a line-of-sight detection device 32 fixed to the display device 200a.
  • the display device 200a is a direct view display (monitor) such as a television set installed in a room.
  • the line-of-sight detection device 32 detects the line-of-sight direction of the instruction supervisor PD who observes the video V (200) -A and the video V (200) -C displayed on the display device 200a.
  • the position of the line-of-sight detection device 32 relative to the display device 200a is fixed and known.
  • a line-of-sight detection device 32 having such a configuration has been proposed by, for example, Tobii Technology, and can be used here.
  • the video V (200) -A is a captured image of the HMD 201A acquired by the HMD 201A of the worker PA, input to the information input / output terminal 200, and displayed.
  • the video V (200) -C is a captured image of the HMD 201C that is acquired by the HMD 201C of the worker PC, input to the information input / output terminal 200, and displayed.
  • as long as the information input / output terminal 200 displays the images shared by the other users, detects the observation line of sight of the instruction supervisor PD with respect to the displayed images, and outputs the line-of-sight information to the other terminals, communication based on the line of sight is possible in the same manner as described above. Therefore, the information input / output terminal 200 of the instruction supervisor PD does not necessarily have to be an HMD.
  • the video display device 202 of the HMD 201 described in each of the above embodiments is an optical see-through display that allows a user to directly observe the outside world together with video. Therefore, a user (for example, the worker PA) can perform work while directly observing an object (for example, the device OB) in the outside world.
  • FIG. 27 schematically shows the relative positional relationship between the video display device 202 (optical see-through display), the imaging device 31, and the object OB ′ located at infinity.
  • the imaging optical axis A 1 of the imaging device 31 substantially coincides with (is almost parallel to) the line-of-sight direction (observation central axis) A 2 of the user observing the object OB ′ located at infinity.
  • however, when the vertical distance D between the video display device 202 and the imaging device 31 is not negligible relative to the distance L between the video display device 202 and the object OB ′, parallax occurs.
  • the video display device 202A displays the marker MB indicating the line-of-sight position of the instruction supervisor PB on the device OB that is the object.
  • in this case, the display of the marker MB deviates from the part of the object (device OB) that the worker PA is actually looking at. That is, in the relative positional relationship shown in FIG. 27, the display position of the marker MB shifts more as the object OB ′ approaches the video display device 202.
  • the HMD 201 desirably includes a measurement unit 41 and a parallax correction unit 42 in addition to the above-described configuration.
  • the measurement unit 41 measures the distance between the video display device 202 and the object OB ′ that the user observes with see-through.
  • since the captured image displayed by the HMD 201B is an image captured by the imaging device 31 of the HMD 201A, the object OB ′ can be identified from the line-of-sight detection result of the HMD 201B with respect to the captured image. The distance L to the object OB ′ can then easily be measured by the imaging device 31 of the HMD 201A (for example, by adopting a contrast method in which the imaging device 31 itself performs the distance measurement).
  • the parallax correction unit 42 calculates the parallax (angle θ) between the line-of-sight direction (observation central axis) A 2 of the user observing the object OB ′ and the imaging optical axis A 1 of the imaging device 31 that images the object OB ′, based on the relationship between the preset relative position of the imaging device 31 with respect to the video display device 202 (corresponding to the distance D in FIG. 27) and the distance L measured by the measurement unit 41, and corrects the position of the line of sight to be displayed based on the calculated parallax.
  • Such a parallax correction unit 42 can be configured by a CPU or a circuit for performing a specific calculation process.
  • the distance D is known in advance by design.
  • the parallax correction unit 42 corrects the position of the line of sight to be displayed based on the calculated parallax (correction step). For example, assuming that the total vertical angle of view of the display area of the video (virtual image) of the video display device 202 is 10°, that the angle θ corresponding to the parallax obtained above is 1°, and that the number of display pixels in the vertical direction of the video (corresponding to the number of pixels in the vertical direction of the display element 5 (see FIGS. 3 and 5)) is N1, the marker display position may be shifted in the vertical direction during video observation by (1/10) × N1 pixels from the regular position, that is, from the position corresponding to the gaze point of the instruction supervisor PB in the entire video display area of the video display device 202A. When parallax also occurs in the horizontal direction, the marker display position may likewise be shifted in the horizontal direction during video observation from the regular position by a number of pixels calculated by the same method.
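The numerical example above (a 10° total vertical angle of view, a 1° parallax angle, and N1 vertical display pixels giving a shift of (1/10) × N1 pixels) can be reproduced with a small helper. This is only a sketch; the function name, the use of millimeters, and the arctangent model for the parallax angle are assumptions:

```python
import math

def parallax_pixel_shift(offset_d_mm, distance_l_mm, fov_deg, n_pixels):
    """Vertical pixel shift that compensates display/camera parallax.

    offset_d_mm:   vertical offset D between the display and the imaging device
    distance_l_mm: measured distance L from the display to the observed object
    fov_deg:       total vertical angle of view of the virtual image (e.g. 10)
    n_pixels:      vertical display resolution N1
    """
    # Parallax angle theta subtended by the offset D at distance L.
    theta_deg = math.degrees(math.atan2(offset_d_mm, distance_l_mm))
    # Convert the angle into a fraction of the field of view, then into pixels.
    return round(theta_deg / fov_deg * n_pixels)
```

An offset D that subtends about 1° at the measured distance L yields a shift of one tenth of the vertical resolution, as in the text; for a distant object (large L) the shift tends to zero, consistent with FIG. 27.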
  • as described above, since the parallax is calculated and the position of the line of sight to be displayed is corrected, the line of sight can be displayed at an appropriate position in the display image. In other words, it is possible to avoid a situation in which the line of sight is displayed with a deviation from the intended position (the position where the work is instructed) on the object OB ′, and appropriate communication using the line of sight can be reliably realized.
  • FIG. 29 is a cross-sectional view showing another optical configuration of the HMD 201 of each embodiment described above.
  • a half mirror 24 may be provided on the surface of the HOE 23 on the deflection prism 22 side, and the imaging device 31 may be arranged so that the imaging optical axis A 1 is bent by the half mirror 24 and becomes coaxial with the line-of-sight direction (observation central axis) A 2. In this configuration, since no parallax occurs, a parallax correction mechanism such as the measurement unit 41 and the parallax correction unit 42 described above is unnecessary, and the configuration of the apparatus can be simplified accordingly.
  • FIG. 30 is a cross-sectional view showing still another optical configuration of the HMD 201.
  • the video display device 202 of the HMD 201 may be a video see-through display.
  • the optical see-through display constituting the video display device 202 may be replaced with a video see-through display.
  • the video see-through display is a shielding type display device that displays an external image (captured image) captured by the imaging device 31 so as to be visible to the user integrally with the video.
  • the above-described shielding type display can be realized by arranging the shielding plate 25 on the outside side of the eyepiece optical system 6 described above.
  • even when the video display device 202 is configured with a video see-through display, communication through the display of the line of sight can be achieved, and the operator can work on the object in the outside world while observing the captured image.
  • FIG. 31 is a cross-sectional view showing still another optical configuration of the HMD 201.
  • when the video display device 202 is configured by a video see-through display, the imaging device 31 may be disposed on the outside of the shielding plate 25, that is, on the side opposite to the eyepiece optical system 6 with respect to the shielding plate 25.
  • in this case as well, a parallax correction mechanism such as the measurement unit 41 and the parallax correction unit 42 described above is unnecessary, and the configuration of the apparatus can be simplified by eliminating the need for such a mechanism.
  • since the display is shielded, the imaging device 31 is not visually recognized by the user even if the imaging device 31 is arranged further outside the shielding plate 25, and it does not interfere with the user's observation of the outside world (video). Therefore, the configuration in which the imaging device 31 is arranged on the outside of the shielding plate 25 as described above, so that the imaging optical axis A 1 and the line-of-sight direction A 2 are coaxial, can easily be realized.
  • the line-of-sight information may be output from the HMD at an arbitrary timing by applying the input unit 39 described in the first embodiment.
  • the information input / output terminal may include a plurality of sensors, such as a GPS (Global Positioning System) sensor, a geomagnetic sensor, an acceleration sensor, and a temperature sensor, from the viewpoint of improving workability by grasping the worker's work state and work environment.
  • the GPS sensor acquires worker position information.
  • the geomagnetic sensor detects the direction in which the worker is facing.
  • the acceleration sensor detects the movement (posture) of the worker.
  • the temperature sensor detects the temperature of the work environment or the body temperature of the worker himself.
  • by transmitting the information acquired by these sensors to an external input / output terminal via the communication unit 34, the instruction supervisor outside can grasp the work environment and, if necessary, give appropriate instructions (for example, a work stoppage) to the worker (HMD).
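A minimal sketch of how such sensor readings might be bundled for transmission via the communication unit 34 (the function name, field names, and units are illustrative assumptions; the patent does not define a message format):

```python
def build_telemetry(position, heading_deg, acceleration, temperature_c):
    """Bundle optional sensor readings for transmission to the supervisor's terminal."""
    return {
        "position": position,            # GPS sensor: worker position (lat, lon)
        "heading_deg": heading_deg,      # geomagnetic sensor: facing direction
        "acceleration": acceleration,    # acceleration sensor: movement / posture
        "temperature_c": temperature_c,  # work-environment or body temperature
    }
```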
  • the line-of-sight information sharing method described in the embodiment of the present invention is a line-of-sight information sharing method for sharing user's line-of-sight information among a plurality of information input / output terminals connected via a communication line.
  • Each of the information input / output terminals includes a display device that presents an image to a user, a line-of-sight detection device that detects a user's line-of-sight direction, and a communication unit that inputs and outputs information to and from each other,
  • At least one of the information input / output terminals is a head-mounted terminal that is mounted on a user's head and displays the video as a virtual image so as to be visible to the user.
  • the head-mounted terminal further includes an imaging device whose relative position with respect to the display device is fixed and which images the outside world in front of the user. With the head-mounted terminal including the imaging device as a first terminal and another information input / output terminal as a second terminal, the line-of-sight information sharing method includes: a first step of outputting information of the captured image acquired by the imaging device of the first terminal to the second terminal; a second step of displaying, based on the input information of the captured image, the captured image on the second terminal so as to be visible to the user of the second terminal; a third step of detecting the line-of-sight direction of the user of the second terminal with respect to the displayed captured image; a fourth step of outputting line-of-sight information relating to the detected line-of-sight direction of the user of the second terminal to the first terminal; and a fifth step of displaying, based on the input line-of-sight information, the position of the line of sight of the user of the second terminal at the corresponding position within the display area of the video by the first terminal.
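The five steps can be sketched as one round trip between two terminal objects. This is a minimal illustration only; the Terminal class, its method names, and the normalized gaze coordinates are assumptions, not the patent's API:

```python
class Terminal:
    """Minimal stand-in for an information input/output terminal."""

    def __init__(self, name):
        self.name = name
        self.shown_image = None
        self.markers = []

    def capture_image(self):
        # Stand-in for the imaging device fixed relative to the display.
        return f"frame-from-{self.name}"

    def display(self, image):
        self.shown_image = image

    def detect_gaze(self):
        # Stub for the line-of-sight detection device; returns a
        # normalized (x, y) gaze point on the displayed image.
        return (0.5, 0.5)

    def display_marker(self, gaze):
        self.markers.append(gaze)


def share_gaze(first, second):
    image = first.capture_image()   # step 1: output the captured image
    second.display(image)           # step 2: display it at the second terminal
    gaze = second.detect_gaze()     # step 3: detect the second user's gaze
    first.display_marker(gaze)      # steps 4-5: return gaze info and display it
    return gaze
```

Because both terminals refer to the same captured frame, the gaze point detected at the second terminal can be displayed at the first terminal without any coordinate conversion.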
  • the line-of-sight information sharing system described in the embodiment of the present invention is a line-of-sight information sharing system that shares line-of-sight information of a user between a plurality of information input / output terminals connected via a communication line
  • Each of the plurality of information input / output terminals includes a display device that presents an image to the user, a line-of-sight detection device that detects a user's line-of-sight direction, and a communication unit for inputting and outputting information to and from each other
  • At least one of the plurality of information input / output terminals is a head-mounted terminal that is mounted on a user's head and displays the video as a virtual image so as to be visible to the user.
  • the head-mounted terminal further includes an imaging device whose relative position with respect to the display device is fixed and which images the outside world in front of the user. With the head-mounted terminal including the imaging device as a first terminal and another information input / output terminal as a second terminal, the first terminal outputs information of the captured image acquired by the imaging device of the first terminal to the second terminal via the communication unit of the first terminal; the second terminal displays, by the display device of the second terminal, the captured image based on the input information of the captured image, detects, by the line-of-sight detection device of the second terminal, the line-of-sight direction of the user of the second terminal with respect to the displayed captured image, and outputs line-of-sight information relating to the detected line-of-sight direction to the first terminal via the communication unit of the second terminal; and the first terminal displays, by the display device of the first terminal and based on the input line-of-sight information, the position of the line of sight of the user of the second terminal at the corresponding position within the display area of the video by the first terminal.
  • information of the captured image acquired by the imaging device of the first terminal is output to the second terminal via the communication unit.
  • in the second terminal, the captured image is displayed by the display device, and the line-of-sight direction of the user of the second terminal with respect to the displayed captured image is detected by the line-of-sight detection device.
  • the detected line-of-sight information is output from the second terminal to the first terminal via the communication unit.
  • the line-of-sight information of the user of the second terminal is shared between the first terminal and the second terminal.
  • in the first terminal, the display device displays the position of the line of sight of the user of the second terminal based on the line-of-sight information.
  • the imaging device has a fixed relative position with respect to the display device, and the captured image displayed on the second terminal is the captured image originally acquired by the first terminal. For this reason, the user's gaze point of the second terminal with respect to the captured image displayed on the second terminal is within the display area of the video (virtual image) by the display device whose relative position is fixed to the imaging device on the first terminal. Corresponding to any position. Therefore, the first terminal does not require complicated processing such as understanding the mutual spatial position of the user of the first terminal and the user of the second terminal and converting the position coordinates of the line of sight.
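The absence of coordinate conversion can be made concrete: because the captured image shown at the second terminal and the display area of the first terminal correspond point for point, a normalized gaze point carries over directly. A sketch under that assumption (the function name and the normalized-coordinate convention are illustrative):

```python
def map_gaze_to_display(gaze_norm, display_w, display_h):
    """Map a normalized gaze point on the shared captured image to pixel
    coordinates in the first terminal's display area. No spatial transform
    between the two users' positions is needed."""
    gx, gy = gaze_norm
    return (round(gx * display_w), round(gy * display_h))
```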
  • the user of the second terminal can grasp the object based on the captured image output from the first terminal, and the user of the first terminal can grasp the gaze point of the user of the second terminal through the line-of-sight information of the user of the second terminal with respect to the captured image being output to the first terminal and the line-of-sight position of the user of the second terminal being displayed on the first terminal.
  • the line-of-sight information sharing method may further include a sixth step of detecting the line-of-sight direction of the user of the first terminal, and in the fifth step, the position of the line of sight of the user of the first terminal based on the detection result in the sixth step may be displayed together with the position of the line of sight of the user of the second terminal.
  • the first terminal detects the line-of-sight direction of the user of the first terminal by using the line-of-sight detection device of the first terminal, and the display device of the first terminal Based on the detection result of the line-of-sight detection device, the position of the line of sight of the user of the first terminal may be displayed together with the position of the line of sight of the user of the second terminal.
  • the user of the first terminal uses the second terminal by displaying the position of the user's line of sight along with the position of the user's line of sight of the second terminal. It is possible to more accurately communicate with the person based on the line of sight.
  • the line-of-sight information sharing method includes: a seventh step of outputting line-of-sight information related to the line-of-sight direction of the user of the first terminal to the second terminal; and the second terminal by the first step In the acquired captured image, the position of the line of sight of the user of the first terminal based on the line of sight information and the position of the line of sight of the user of the second terminal based on the detection result in the third step. And an eighth step of displaying together.
  • the first terminal outputs line-of-sight information regarding the line-of-sight direction of the user of the first terminal to the second terminal via the communication unit of the first terminal
  • the second terminal uses the display device of the second terminal to position the line of sight of the user of the first terminal based on the line-of-sight information in the captured image and the position of the line of sight of the user of the second terminal. And may be displayed together.
  • the captured image output from the first terminal displays the position of the line of sight of the user of the first terminal and the position of the line of sight of the user of the second terminal, thereby displaying the second terminal. Can more accurately communicate with the user of the first terminal based on the line of sight.
  • preferably, the position of the line of sight of the user of the first terminal and the position of the line of sight of the user of the second terminal are displayed in mutually different patterns. In this case, since the mutual line-of-sight positions can easily be distinguished and grasped, quick and accurate communication becomes possible.
  • the gaze information sharing method at least two of the output of the captured image information, the display of the captured image, the detection of the gaze direction with respect to the captured image, the output of the gaze information, and the display of the position of the gaze You may perform mutually between information input / output terminals.
  • likewise, in the line-of-sight information sharing system, the output of the captured image information, the display of the captured image, the detection of the line-of-sight direction with respect to the captured image, the output of the line-of-sight information, and the display of the position of the line of sight may be performed mutually between at least two information input / output terminals.
  • in this way, the line-of-sight information can be shared between at least two information input / output terminals, so that collaborative work can be performed efficiently.
  • it is preferable that the other information input / output terminal outputs line-of-sight information relating to the line-of-sight direction of its user to the one information input / output terminal only when the line of sight of the user of the other information input / output terminal is located within the display area of the captured image output from the one information input / output terminal to the other information input / output terminal.
  • for this purpose, the other information input / output terminal may further include an input unit for designating the timing for outputting the line-of-sight information, and the line-of-sight information of the user of the other information input / output terminal may be output to the one information input / output terminal at the timing designated by the input unit.
  • when the user of the other information input / output terminal is viewing the display area of the captured image, the user designates the output timing of the line-of-sight information by the input unit, and the line-of-sight information is output from the other information input / output terminal to the one information input / output terminal at that timing. Therefore, the configuration in which the other information input / output terminal outputs the line-of-sight information to the one information input / output terminal only when the line of sight of its user is located within the display area of the captured image can be reliably realized.
  • as a result, the line-of-sight information output from the other information input / output terminal can reliably be treated as line-of-sight information obtained while its user is looking at the display area of the captured image, making simple and highly accurate communication using the line-of-sight information possible.
  • each of the plurality of information input / output terminals may further include an input device and an output device for voice information, and voice information may be input and output between the plurality of information input / output terminals. Since communication using voice information in addition to line-of-sight information is possible, communication is further simplified. In addition, since the content of the line-of-sight information can be supplemented with voice information, the accuracy of communication is further improved.
  • the display device of the head-mounted terminal may be an optical see-through display that allows a user to directly observe the outside world together with an image.
  • the user can perform work while directly observing the object in the outside world, and workability can be improved.
  • the line-of-sight information sharing method includes a step of measuring a distance between the display device of the head-mounted terminal and an object observed by a user through the see-through method, and a relative relationship of the imaging device with respect to the preset display device A step of calculating a parallax between an observation line of sight of a user observing the object and an imaging optical axis of the imaging apparatus that images the object based on a relationship between a position and the distance; And a step of correcting the position of the line of sight to be displayed based on the parallax.
  • the head-mounted terminal includes a measuring unit that measures a distance between the display device and an object that a user observes through, and the preset display device. Based on the relationship between the relative position of the imaging device and the distance, the parallax between the observation line of sight of the user observing the object and the imaging optical axis of the imaging device that images the object is calculated. And a parallax correction unit that corrects the position of the line of sight to be displayed based on the parallax.
  • the position of the line of sight to be displayed is corrected based on the parallax.
  • the line of sight can be displayed at an appropriate position in the display image.
  • the display device of the head-mounted terminal may be a shielding type display that shields the front and displays a captured image of the outside world so as to be visible to the user integrally with the video. Even if the user cannot directly observe the object through see-through, the user can grasp the object by observing the captured image of the outside world and can thereby work on the object.
  • the display device may include a display element that displays the video, and, where the axis that optically connects the center of the display surface of the display element with the pupil center of the user during video observation is defined as the observation center axis, the imaging device may be arranged so that the observation center axis and the imaging optical axis are coaxial.
  • since no parallax arises between the observation center axis and the imaging optical axis, no parallax-correction mechanism is required to maintain the positional accuracy of the displayed line of sight, and the configuration can be simplified.
  • the present invention can be used in a system for sharing user's line-of-sight information among a plurality of information input / output terminals connected via a communication line.
  • Display Element 31 Imaging Device 32 Gaze Detection Device 36 Voice Input Device (Input Device) 37 Audio output device (output device) 39 Input unit 41 Measuring unit 42 Parallax correcting unit 100 Line-of-sight information sharing system 200 Information input / output terminal 200a Display device 201, 201A, 201B, 201C, 201D HMD (head-mounted terminal) 202, 202A, 202B, 202C, 202D Video display device (display device) 300 Communication line

Abstract

A line-of-sight information sharing method that includes: a first step for outputting information about a photographic image obtained with an imaging device of a first terminal (HMD) to a second terminal; a second step for displaying the photographic image at the second terminal so as to be visible to a user of the second terminal on the basis of the input photographic image information; a third step for detecting the line-of-sight direction of the user of the second terminal with respect to the displayed photographic image; a fourth step for outputting line-of-sight information associated with the detected line-of-sight direction of the user of the second terminal to the first terminal; and a fifth step for displaying the line-of-sight position of the user of the second terminal at a position corresponding to a position within the field of view of the user of the first terminal on the basis of the input line-of-sight information.

Description

Line-of-sight information sharing method and line-of-sight information sharing system
 The present invention relates to a line-of-sight information sharing method and a line-of-sight information sharing system for sharing line-of-sight information of a user among a plurality of information input/output terminals connected via a communication line.
 Conventionally, a head-mounted see-through information display device called a see-through HMD (Head Mounted Display) has been known. This see-through HMD is a device that can provide desired information to the wearer by adding it as an image while the wearer observes the outside world, and is roughly classified into a video see-through type and an optical see-through type.
 The video see-through type adds a camera that captures the wearer's forward view to a shielded (closed) display device, superimposes a desired image on the captured image of the forward view, and presents them to the wearer as a single integrated image. The optical see-through type, on the other hand, uses a combiner that superimposes light from the outside world on the image light and guides both to the wearer's pupil, allowing the wearer to view the outside world directly together with the image.
 For either type of see-through HMD, applications have been proposed that assist the wearer's work by superimposing a desired image on the visually recognized image of the outside world (hereinafter also referred to as the external image). Making the outside world observable while preserving the hands-free nature of the HMD does not interfere with the wearer's normal work, and necessary information (for example, work instructions or manuals) can be superimposed on the external image, improving the wearer's workability.
 An example of supplementing the wearer's work using an HMD is proposed in, for example, Patent Document 1. In Patent Document 1, a worker and an instructor each wear a video see-through HMD; the outside world in front of the worker is captured by a camera, and the image is transmitted to the instructor at a remote location and shared. The instructor then adds supplementary information to the displayed image, thereby conveying instructions to the worker for collaborative work. A hand-held (handy-type) input device is used as the means for adding this information.
 However, when a hand-held input device is used to add such information, the device occupies the user's hands, so the hands-free advantage inherent to the HMD is lost. Consequently, when manual work on the object is required, for example actually tightening a screw or removing a part, workability deteriorates.
 In collaborative work among multiple people as described above, or when a worker performs actual work in response to instructions from an instructor, communication by voice, or by voice-input display (a method of converting speech to text and displaying it), is also conceivable. However, when specifying the work location on an actual object, ambiguous spoken or displayed instructions such as "this", "that", or "there" make the location difficult to identify. Conversely, attempts at detailed communication become long and complicated; for example, an instruction concerning a specific bolt might have to read "the second bolt from the top in the third column from the right". For these reasons, it is difficult to communicate accurately by voice or voice-input display alone.
 Therefore, in collaborative work among multiple people using HMDs, a means is required for accurately conveying each other's intentions without impairing the hands-free property that makes the HMD useful.
 Meanwhile, in recent years there have also been proposals to detect the instructor's line of sight and use it for communication with the worker. For example, in Patent Document 2, the line of sight of a person in the passenger seat of a vehicle (the instructor) is detected, and a gaze marker corresponding to the instructor's line-of-sight direction is coordinate-transformed and presented on a see-through display (transmissive head-up display) in front of the driver (the worker) so that it appears at the appropriate position from the driver's viewpoint, thereby achieving mutual communication.
 For example, the target of a demonstrative word such as "there" or "that" is determined from the gaze position of the passenger at the moment of utterance, and the speaker's gaze position is converted to a display position on the head-up display viewed by the driver and superimposed there, so that the speaker's gaze information can be shared with the driver. This allows the driver to know the speaker's intention accurately. Moreover, neither the driver nor the speaker needs manual operations to communicate, so the hands-free property is preserved.
JP 2006-209664 A (see claim 1, paragraphs [0007], [0019], [0102], FIG. 1, FIG. 2, etc.)
JP 2015-41197 A (see claim 1, paragraph [0010], FIG. 7, etc.)
 However, it is difficult to apply the system of Patent Document 2 to applications in which collaborative work is performed using a plurality of information input/output terminals including an HMD, for the following two reasons.
 (Reason 1)
 The system of Patent Document 2 presents to the driver the gaze information of a speaker regarding an object that both the nearby speaker and the driver can view directly and simultaneously. The system therefore cannot be used in situations where the two cannot view the object simultaneously or directly. For example, when either the instructor or the worker in a collaborative task (for example, the instructor) is at a remote location, the instructor cannot directly view the same object. Even when the instructor and the worker are next to each other, if the workspace is cramped and the worker is close to the object, the worker's own body hides the object from the instructor, who then cannot view it directly. In these cases the system of Patent Document 2 cannot be used, so accurate gaze-based communication among the users of multiple information input/output terminals is impossible.
 (Reason 2)
 In the system of Patent Document 2, the speaker's line of sight and the driver's line of sight toward the same object differ, so the speaker's gaze information for that object must be converted into gaze information as seen from the driver's viewpoint. This requires spatially grasping the relative positions of the speaker, the driver, and the object, and converting the speaker's gaze direction toward the object into the driver's gaze direction (position-coordinate conversion processing). If such a system were applied to collaborative work among multiple people, the system would become large and complicated, the processing load would grow, and the display corresponding to the worker's or instructor's line of sight could presumably not be performed immediately. A delayed gaze display hinders gaze-based communication and is therefore undesirable.
 Therefore, for applications in which collaborative work is performed using a plurality of information input/output terminals including an HMD, it is desirable to realize a system that enables accurate gaze-based communication even when one of the users cannot view the object, and that does so by a simple method without requiring complicated processing such as grasping mutual spatial positions and performing coordinate conversion.
 The present invention has been made to solve the above problems, and its object is to provide a line-of-sight information sharing method and a line-of-sight information sharing system that, in collaborative work using a plurality of information input/output terminals including an HMD, enable accurate gaze-based communication even in situations where one of the users cannot view the object, and do so by a simple method without requiring complicated processing.
 A line-of-sight information sharing method according to one aspect of the present invention is a method for sharing a user's line-of-sight information among a plurality of information input/output terminals connected via a communication line. Each of the plurality of information input/output terminals includes a display device that presents an image to its user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for inputting and outputting information to and from the other terminals. At least one of the plurality of information input/output terminals is a head-mounted terminal that is worn on the user's head and displays the image as a virtual image visible to the user; in addition to the display device, the line-of-sight detection device, and the communication unit, the head-mounted terminal further includes an imaging device whose position is fixed relative to the display device and which captures the outside world in front of the user. When one of the head-mounted terminals is designated the first terminal and at least one other information input/output terminal the second terminal, the method includes: a first step of outputting information on a captured image acquired by the imaging device of the first terminal to the second terminal; a second step of displaying, at the second terminal and based on the input captured-image information, the captured image so as to be visible to the user of the second terminal; a third step of detecting the line-of-sight direction of the user of the second terminal with respect to the displayed captured image; a fourth step of outputting line-of-sight information on the detected line-of-sight direction of the user of the second terminal to the first terminal; and a fifth step of displaying, based on the input line-of-sight information, the gaze position of the user of the second terminal at the corresponding position within the display area of the image of the first terminal.
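The five steps above can be sketched as a simple message exchange between two terminals. The `Terminal` class and all names below are hypothetical stand-ins for the disclosed communication units, and gaze coordinates are assumed to be expressed relative to the shared captured image, which is why no position-coordinate conversion between the terminals is required:

```python
class Terminal:
    """A terminal whose communication unit is modeled as a simple inbox."""
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def send(self, other, message):
        other.inbox.append(message)

def share_gaze(first, second, captured_image, detect_gaze):
    # Step 1: the first terminal outputs its captured image to the second.
    first.send(second, {"image": captured_image})
    # Step 2: the second terminal displays the received image to its user.
    displayed = second.inbox.pop(0)["image"]
    # Step 3: detect the second user's line-of-sight direction on the
    # displayed image (coordinates are relative to the shared image).
    gaze = detect_gaze(displayed)
    # Step 4: output the line-of-sight information back to the first terminal.
    second.send(first, {"gaze": gaze})
    # Step 5: the first terminal displays a marker at the corresponding
    # position within its own video display area.
    return first.inbox.pop(0)["gaze"]
```

Because the second user looks at the very image the first terminal captured, the gaze returned in step 5 already lies in the first terminal's image frame and can be rendered directly.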
 A line-of-sight information sharing system according to another aspect of the present invention is a system for sharing a user's line-of-sight information among a plurality of information input/output terminals connected via a communication line. Each of the plurality of information input/output terminals includes a display device that presents an image to its user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for inputting and outputting information to and from the other terminals. At least one of the plurality of information input/output terminals is a head-mounted terminal that is worn on the user's head and displays the image as a virtual image visible to the user; in addition to the display device and the line-of-sight detection device, the head-mounted terminal further includes an imaging device whose position is fixed relative to the display device and which captures the outside world in front of the user. When one of the head-mounted terminals is designated the first terminal and at least one other information input/output terminal the second terminal: the first terminal outputs information on a captured image acquired by its imaging device to the second terminal via its communication unit; the second terminal displays, on its display device and based on the input captured-image information, the captured image so as to be visible to its user, detects with its line-of-sight detection device the line-of-sight direction of its user with respect to the displayed captured image, and then outputs line-of-sight information on that direction to the first terminal via its communication unit; and the first terminal displays, on its display device and based on the input line-of-sight information, the gaze position of the user of the second terminal at the corresponding position within the display area of the image of the first terminal.
 According to the line-of-sight information sharing method and line-of-sight information sharing system described above, in collaborative work using a plurality of information input/output terminals including the first terminal (an HMD), accurate gaze-based communication is possible even in situations where one of the users cannot view the object, and gaze-based communication can be performed by a simple method without requiring complicated processing such as position-coordinate conversion.
An explanatory diagram showing a schematic configuration of a line-of-sight information sharing system according to an embodiment of the present invention.
A front view showing a schematic configuration of an HMD, an example of an information input/output terminal constituting the line-of-sight information sharing system.
A cross-sectional view showing the optical configuration of a video display device provided in the HMD.
A block diagram showing a detailed configuration of the HMD.
An explanatory diagram schematically showing a worker and an instructing supervisor performing collaborative work.
A flowchart showing an example of the flow of operations in the line-of-sight information sharing system.
An explanatory diagram showing an image of the worker's field of view.
An explanatory diagram showing the instructing supervisor observing the displayed image.
An explanatory diagram showing an image of the worker's field of view when a marker indicating the supervisor's gaze position is displayed.
A flowchart showing another example of the flow of operations in the line-of-sight information sharing system.
An explanatory diagram showing an image of the worker's field of view when the gaze positions of both the worker and the supervisor are displayed.
A flowchart showing still another example of the flow of operations in the line-of-sight information sharing system.
An explanatory diagram showing an image of the supervisor's field of view when the gaze positions of both the worker and the supervisor are displayed.
A block diagram showing another configuration of the HMD.
An explanatory diagram showing a worker performing work in response to instructions based on line-of-sight information from a remote supervisor.
An explanatory diagram schematically showing two workers performing collaborative work in a line-of-sight information sharing system according to another embodiment of the present invention.
An explanatory diagram showing an image of one worker's field of view.
An explanatory diagram showing an image of the other worker's field of view.
An explanatory diagram showing an image of one worker's field of view when the other worker's view image is displayed by the one worker's HMD.
An explanatory diagram showing an image of the other worker's field of view when the one worker's view image is displayed by the other worker's HMD.
An explanatory diagram showing an image of one worker's field of view when markers indicating the gaze positions of the two workers are displayed.
An explanatory diagram showing an image of the other worker's field of view when markers indicating the gaze positions of the two workers are displayed.
An explanatory diagram schematically showing an instructing supervisor observing the displayed images sent from two workers, in a line-of-sight information sharing system according to still another embodiment of the present invention.
An explanatory diagram showing an image of one worker's field of view when markers indicating the gaze positions of the two workers and the supervisor are displayed.
An explanatory diagram showing an image of the other worker's field of view when markers indicating the gaze positions of the two workers and the supervisor are displayed.
An explanatory diagram showing another configuration of an information input/output terminal constituting the line-of-sight information sharing system.
An explanatory diagram schematically showing the relative positional relationship among the video display device, the imaging device, and an object located at infinity, in a line-of-sight information sharing system according to still another embodiment of the present invention.
A block diagram showing the configuration of an HMD that performs parallax correction.
A cross-sectional view showing another optical configuration of the HMD.
A cross-sectional view showing still another optical configuration of the HMD.
A cross-sectional view showing still another optical configuration of the HMD.
 [Embodiment 1]
 An embodiment of the present invention is described below with reference to the drawings. In this specification, when a numerical range is written as a to b, the range includes the lower-limit value a and the upper-limit value b. The present invention is not limited to the following contents.
 (Configuration of the line-of-sight information sharing system)
 FIG. 1 is an explanatory diagram illustrating a schematic configuration of the line-of-sight information sharing system 100 of the present embodiment. The line-of-sight information sharing system 100 is configured by connecting a plurality of information input/output terminals 200 via a communication line 300 so that they can communicate with each other (so that information can be shared). The communication line 300 is realized by a wireless communication environment such as Wi-Fi (registered trademark), but may be a wired communication line using a cable such as an optical fiber.
 The plurality of information input/output terminals 200 can input and output (transmit and receive) information to and from each other. The information to be exchanged includes, for example, information on captured images (image data) acquired by each information input/output terminal 200 and information on the line-of-sight direction and gaze position of each terminal's user (hereinafter also simply referred to as line-of-sight information). The user's line-of-sight information is thereby shared among the plurality of information input/output terminals 200, enabling communication based on the users' lines of sight, as described later.
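Because the second user's line of sight is detected on the very image captured by the first terminal, the shared line-of-sight information can be as simple as image-relative coordinates that each terminal maps into its own display area. A minimal sketch of such a representation (all names are illustrative assumptions, not part of the disclosure):

```python
def to_normalized(x_px, y_px, width, height):
    """Gaze position in captured-image pixels -> resolution-independent
    coordinates in [0, 1], suitable for exchange between terminals."""
    return x_px / width, y_px / height

def to_display(norm, disp_width, disp_height):
    """Map normalized gaze coordinates into a terminal's display area
    (e.g. the video display region of the first terminal)."""
    u, v = norm
    return round(u * disp_width), round(v * disp_height)
```

For example, a gaze at pixel (320, 240) of a 640 x 480 captured image normalizes to (0.5, 0.5) and lands at the center of any display area, regardless of its resolution.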
 At least one of the plurality of information input/output terminals 200 is a head-mounted terminal that is worn on the user's head and displays an image as a virtual image visible to the user, that is, a head-mounted display (HMD). Details of the HMD are described below.
 (Configuration of the HMD)
 FIG. 2 is a front view illustrating a schematic configuration of the HMD 201, which is an example of the information input/output terminal 200 of the present embodiment. The HMD 201 includes a video display device 202 that presents an image to the user, and a support member 203. The support member 203 supports the video display device 202 in front of an eye of the observer (the user of the HMD 201), for example in front of the right eye ER; it includes a support portion 203a, corresponding to the frame and temples of a pair of glasses, and a nose pad 203b attached to the support portion 203a that rests on the user's nose when the HMD is worn.
 The HMD 201 may further include a right-eye lens and a left-eye lens supported by the support member 203. The HMD 201 may also have two video display devices 202, with the support member 203 supporting one in front of each of the user's right and left eyes. Furthermore, the HMD 201 may have a position adjustment mechanism that adjusts the position (for example, in the left-right and up-down directions) and the mounting angle (tilt angle) of the video display device 202.
 The video display device 202 is an optical see-through display that allows the user to directly observe the outside world together with the image. That is, the video display device 202 is a display optical system that guides light from the outside world to the user's pupil so that the user can observe the outside world, while displaying the image as a virtual image in part of the user's field of view. The optical configuration of the video display device 202 is described below.
 FIG. 3 is a cross-sectional view showing the optical configuration of the video display device 202. The video display device 202 has an illumination optical system 2, a polarizing plate 3, a polarization beam splitter (PBS) 4, a display element 5, and an eyepiece optical system 6. The illumination optical system 2, the polarizing plate 3, the PBS 4, the display element 5, and the upper end portion of the eyepiece optical system 6 are located in the housing 202a shown in FIG. 2.
 For convenience in the following description, directions are defined as follows. In FIG. 3, the axis that optically connects the center of the optical pupil P formed by the eyepiece optical system 6 (the user's pupil center during image observation) with the center of the display surface of the display element 5, together with its extension, is taken as the optical axis (observation center axis). The direction perpendicular to the optical-axis plane of the HOE (Holographic Optical Element) 23 of the eyepiece optical system 6 is taken as the X direction. Here, the optical-axis plane of the HOE 23 is the plane containing the incident ray and the reflected ray when a ray coinciding with the optical axis is incident on the HOE 23. The direction perpendicular to the X direction, within the plane perpendicular to the surface normal at each optical member's intersection with the optical axis, is taken as the Y direction, and the direction perpendicular to both the X and Y directions is taken as the Z direction. Under these definitions, for example, the cross-section that contains the normal of the display element 5 and the normals of two parallel surfaces 21b and 21c (described later) of the eyepiece optical system 6, and that also contains the center of the display surface of the display element 5, is the YZ cross-section.
 The illumination optical system 2 illuminates the display element 5 and includes a light source 11, an illumination mirror 12, and a diffusion plate 13.
 The light source 11 is an RGB-integrated LED that emits light corresponding to each of the colors R (red), G (green), and B (blue). Its multiple emission points (one for each of R, G, and B) are arranged substantially in a straight line in the horizontal direction (X direction). The wavelengths of the light emitted from the light source 11, expressed as the peak wavelength of the light intensity plus or minus the half-maximum wavelength width, are, for example, 462 ± 12 nm (B light), 525 ± 17 nm (G light), and 635 ± 11 nm (R light). The light source 11 may instead be a laser light source.
 The illumination mirror 12 is an optical element that reflects the light (illumination light) emitted from the light source 11 toward the diffusion plate 13 and bends the illumination light so that, with respect to the Y direction, the optical pupil P and the light source 11 are substantially conjugate.
 The diffusion plate 13 is a unidirectional diffuser that diffuses incident light by, for example, 40° in the X direction, in which the emission points of the light source 11 are arranged, and does not diffuse it in the Y direction. The diffusion plate 13 is held on the surface of the polarizing plate 3.
 The polarizing plate 3 transmits, of the light entering through the diffusion plate 13, only light of a predetermined polarization direction and guides it to the PBS 4.
 The PBS 4 is a flat-plate polarization separating element that reflects the light transmitted through the polarizing plate 3 toward the reflective display element 5 while transmitting, of the light reflected by the display element 5, the light corresponding to the image signal being ON (light whose polarization direction is orthogonal to that of the light transmitted through the polarizing plate 3). It is attached to the light incident surface 21a of the eyepiece prism 21 (described later) of the eyepiece optical system 6.
 The display element 5 modulates the light from the illumination optical system 2 to display an image; in the present embodiment, it is a reflective liquid crystal display element. The display element 5 may include a color filter, or it may be driven in a time-division manner so that the R, G, and B images corresponding to the emission color are displayed in synchronization with the time-division emission of the R, G, and B light of the light source 11.
 The display element 5 is arranged so that light incident substantially perpendicularly from the PBS 4 is reflected substantially perpendicularly back toward the PBS 4. Compared with configurations in which light strikes the reflective display element at a large angle of incidence, this makes an optical design that increases resolution easier. The display surface of the display element 5 is rectangular and is arranged with its long side along the X direction and its short side along the Y direction.
 The eyepiece optical system 6 guides the image light from the display element 5 to the user's pupil (optical pupil P) and has non-axisymmetric (non-rotationally-symmetric) positive optical power. The eyepiece optical system 6 includes an eyepiece prism 21, a deflection prism 22, and an HOE 23.
 The eyepiece prism 21 internally guides the image light entering from the display element 5 through the PBS 4 while transmitting light from the outside world (external light). It is shaped as a parallel plate whose upper end portion grows thicker toward the top and whose lower end portion grows thinner toward the bottom.
 In the eyepiece prism 21, the surface to which the PBS 4 is attached is the light incident surface 21a on which the image light from the display element 5 is incident, and the two mutually facing surfaces 21b and 21c, positioned substantially parallel to the optical pupil P, are total-reflection surfaces that guide the image light by total internal reflection. Of these, the surface 21b on the optical pupil P side also serves as the exit surface for the image light diffracted and reflected by the HOE 23.
 The eyepiece prism 21 is bonded to the deflection prism 22 with adhesive so as to sandwich the HOE 23 arranged at its lower end. In the present embodiment, among the surfaces of the eyepiece prism 21, the surfaces through which the image light passes other than the surface 21d in contact with the HOE 23 (the light incident surface 21a and the surface 21b) are flat. The surface 21d in contact with the HOE 23 may be flat, curved, or a combination of flat and curved portions.
 The deflection prism 22 is bonded to the eyepiece prism 21 via the HOE 23 to form a substantially parallel plate. By bonding the deflection prism 22 to the eyepiece prism 21, the refraction that occurs when external light passes through the wedge-shaped lower end of the eyepiece prism 21 can be canceled by the deflection prism 22, preventing distortion in the image of the outside world (external image) observed by the user.
 The HOE 23 is a volume-phase, reflective holographic optical element provided in contact with the eyepiece prism 21; it diffracts and reflects the image light guided inside the eyepiece prism 21. Expressed as the peak wavelength of diffraction efficiency plus or minus the half-maximum wavelength width, the HOE 23 diffracts (reflects) light in three wavelength ranges, for example 465 ± 5 nm (B light), 521 ± 5 nm (G light), and 634 ± 5 nm (R light). That is, the R, G, and B diffraction wavelengths of the HOE 23 substantially correspond to the wavelengths of the R, G, and B image light (the emission wavelengths of the light source 11).
 In the above configuration, the light emitted from the light source 11 of the illumination optical system 2 is reflected by the illumination mirror 12 and diffused only in the X direction by the diffusion plate 13, after which only light of a predetermined polarization direction passes through the polarizing plate 3. The light transmitted through the polarizing plate 3 is then reflected by the PBS 4 and enters the display element 5.
 At the display element 5, the incident light is modulated according to the image signal. Image light corresponding to the image signal being ON is converted by the display element 5 into light whose polarization direction is orthogonal to that of the incident light, so it passes through the PBS 4 and enters the eyepiece prism 21 through the light incident surface 21a. Image light corresponding to the image signal being OFF, by contrast, leaves the display element 5 with its polarization direction unchanged, so it is blocked by the PBS 4 and does not enter the eyepiece prism 21.
 In the eyepiece prism 21, the incident image light undergoes total internal reflection once at each of the two opposing surfaces 21c and 21b, then strikes the HOE 23, where it is diffracted and reflected, exits through the surface 21b, and reaches the optical pupil P. At the position of the optical pupil P, the user can therefore observe the image displayed on the display element 5 as a virtual image.
 Meanwhile, the eyepiece prism 21, the deflection prism 22, and the HOE 23 transmit almost all external light, so the user can observe the outside world see-through. The virtual image of the picture displayed on the display element 5 is therefore observed overlapping part of the outside world within the user's field of view.
 In the present embodiment, a reflective image display element is used as the display element 5, but a transmissive liquid crystal display element may be used instead, with the image display device 202 given a corresponding optical design. The eyepiece optical system 6 may be formed integrally with the right-eye lens or the left-eye lens of the HMD 201, or it may be a separate component.
 (Detailed configuration of the HMD)
 FIG. 4 is a block diagram showing the detailed configuration of the HMD 201. In addition to the image display device 202 and the support member 203 described above, the HMD 201 further includes an imaging device 31, a line-of-sight detection device 32, an image processing unit 33, a communication unit 34, a storage unit 35, an audio input device 36, an audio output device 37, and a control unit 38. The control unit 38 is composed of, for example, a central processing unit (CPU) and controls the operation of each part of the HMD 201, including the image display device 202. The control performed by the control unit 38 therefore includes, for example, switching the light source 11 of the image display device 202 on and off and controlling the display of information on the display element 5.
 The imaging device 31 is, for example, a camera capable of shooting moving images (a digital camera or digital video camera) and images the outside world ahead in the user's line-of-sight direction. The imaging device 31 is provided, for example, on the outer surface of the housing 202a shown in FIG. 2, but it may instead be provided inside the housing 202a. In the latter case, an opening through the housing 202a or a transparent window is provided in front of the imaging device 31 so that the imaging device 31 can capture the scene ahead.
 As shown in FIG. 2, the imaging center Co of the imaging device 31 (the intersection of the imaging optical axis and the image sensor) is located almost directly above the observation center Vo (pupil center) of the image display device 202, and the position of the imaging device 31 relative to the image display device 202 is fixed. In addition, the imaging optical axis of the imaging device 31 substantially coincides with the user's line-of-sight direction when observing at infinity (see the imaging optical axis A1 and the line-of-sight direction (observation central axis) A2 in FIG. 27). The imaging device 31 can thus capture the user's normal field of view (the outside world ahead).
 The line-of-sight detection device 32 is a sensor that detects the user's observation line of sight (line-of-sight direction). It includes an infrared LED and a camera and is provided, for example, at the left and right edges of the eyepiece optical system 6, as shown in FIG. 2. The line-of-sight detection device 32 directly detects the user's line-of-sight direction by shining a safe light beam (for example, infrared light) from the infrared LED toward the user's pupil or retina and measuring the reflected light with the camera. More specifically, when the infrared LED illuminates the user's face, the position at which the infrared light is reflected on the user's cornea (the corneal-reflection position) serves as a reference point, and the line-of-sight direction is detected by locating the pupil relative to this corneal-reflection position with the camera (by receiving the infrared light). For example, if the pupil of the right eye lies on the outer-corner side of the corneal-reflection position, it can be detected that the user is looking to the right (the user's line of sight points rightward); if the pupil lies on the inner-corner side of the corneal-reflection position, the user is looking to the left (the user's line of sight points leftward).
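 The pupil-center/corneal-reflection principle described above can be sketched in a few lines. The following is an illustrative sketch, not part of the claimed embodiment: the function names, calibration gains, and pixel coordinates are assumptions made for the example.

```python
# Illustrative sketch of pupil-center / corneal-reflection (PCCR) gaze
# estimation: the glint of the infrared LED on the cornea is the
# reference point, and the offset of the pupil center from it
# indicates the gaze direction.

def gaze_offset(glint, pupil):
    """Offset (dx, dy) of the pupil center from the corneal-reflection
    position, both given in camera-pixel coordinates."""
    return (pupil[0] - glint[0], pupil[1] - glint[1])

def gaze_angles(glint, pupil, gain=(0.12, 0.12)):
    """Convert the pixel offset to approximate gaze angles in degrees.
    The per-axis gains would come from a per-user calibration step;
    the values here are placeholders, not measured constants."""
    dx, dy = gaze_offset(glint, pupil)
    return (gain[0] * dx, gain[1] * dy)

# Example: pupil 20 px to one side of, and 5 px above, the glint in
# the camera image. Which sign corresponds to "looking left" or
# "looking right" depends on the camera orientation relative to the eye.
dx, dy = gaze_offset(glint=(320, 240), pupil=(300, 235))
assert (dx, dy) == (-20, -5)
```

 In an actual device, the glint and pupil positions would first be extracted from the camera image; only the final offset-to-direction step is shown here.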
 There may be one line-of-sight detection device 32 or several; several are preferable because line-of-sight detection can then be based on multiple pieces of information, improving detection accuracy. For this reason, in the present embodiment, line-of-sight detection devices 32 are provided at the left and right edges of the eyepiece optical system 6 (two locations in total), as shown in FIG. 2.
 Tobii Technology has proposed a glasses-type line-of-sight detection device that detects the line of sight by imaging, with multiple cameras, a pupil illuminated by multiple infrared light sources; the line-of-sight detection device 32 of the present embodiment can be configured in a similar manner.
 The image processing unit 33 generates, by image processing, the picture to be displayed on the display element 5, and is composed of a dedicated arithmetic circuit such as an ASIC (application-specific integrated circuit). Besides ordinary image processing (color correction, scaling, edge enhancement, and the like), this processing includes image processing for displaying on the display element 5, individually or composited, the indicator (marker) showing the user's line-of-sight position described later and captured images input from other terminals.
 The communication unit 34 is an interface for exchanging information between the HMD 201 and the other information input/output terminals 200, and includes a transmission circuit, a reception circuit, an antenna, and so on. The storage unit 35 is composed of a non-volatile memory such as a flash memory and stores the various kinds of information output from other terminals (including captured-image information (image data) and information indicating line-of-sight positions).
 The audio input device 36 is an audio-information input device composed of, for example, a microphone. Audio information input to the audio input device 36 is output to other terminals via the communication unit 34. The audio output device 37 is an audio-information output device composed of, for example, a speaker or earphones. Audio information input from another terminal via the communication unit 34 is output from the audio output device 37.
 The plurality of information input/output terminals 200 may also include terminals other than head-mounted ones. For example, they may include a terminal that combines the line-of-sight detection device 32 described above with a display device installed in a room (for example, a liquid crystal display device or an organic EL (Electro-Luminescence) display device).
 (Example use of the line-of-sight information sharing system)
 Next, an example use of the line-of-sight information sharing system 100 described above will be explained. FIG. 5 schematically shows a worker PA and an instructing supervisor PB performing collaborative work, each wearing an HMD 201A or 201B on the head. The worker PA and the supervisor PB are the users of the HMDs 201A and 201B, respectively. The HMDs 201A and 201B have exactly the same configuration as the HMD 201 described above. When components of the HMDs 201A and 201B need to be distinguished from each other, the suffix "A" or "B" is appended to the component's reference numeral, as in the image display devices 202A and 202B.
 Here, as an example, assume the task of removing a cover for maintenance from a device OB, an object placed on a table T. Three sides of the table T (the far, left, and right sides as seen from the worker PA) are enclosed by a built-in wall W, so that when the worker PA stands in front of the table T and works on the device OB, it is difficult for the supervisor PB, standing behind the worker PA, to directly access or see the device OB.
 The HMDs 201A and 201B are in a communication-enabled state (for example, able to communicate via Wi-Fi (registered trademark)) so that they can coordinate and exchange information with each other. Communication between the HMDs 201A and 201B is performed via their respective communication units 34 under the control of their respective control units 38.
 FIG. 6 is a flowchart showing an example of the operation flow in the line-of-sight information sharing system 100 of the present embodiment. The following description refers to this flowchart.
 First, the HMD 201A of the worker PA (the first terminal) outputs the information of the captured image acquired by its imaging device 31, the line-of-sight camera of the HMD 201A, to the HMD 201B of the supervisor PB (the second terminal) via the communication unit 34 of the HMD 201A (S1; first step).
 FIG. 7 shows an image of the field of view of the worker PA. The rectangular region IM(201A) indicated by the broken line shows the imaging range of the imaging device 31 of the HMD 201A worn by the worker PA. The worker PA sees the device OB, the worker's own hands (right hand RH(PA) and left hand LH(PA)), and the tool held in the worker's hand (screwdriver DR(PA)), and the field of view in front of the worker PA is captured by the imaging device 31 of the HMD 201A. The HMD 201A outputs the information (image data) of this captured image, taken from the viewpoint of the worker PA, to the HMD 201B.
 Next, based on the captured-image information input from the HMD 201A, the HMD 201B uses the image display device 202B to display the captured image so that the supervisor PB can see it (S2; second step).
 FIG. 8 shows the supervisor PB observing the picture (virtual image) V(202B) displayed by the image display device 202B of the supervisor's HMD 201B. As shown in FIG. 5, the supervisor PB stands behind the worker PA, who is actually working on the device OB, so as noted above, it is difficult for the supervisor to directly observe the worker's actual work or the device OB. However, by observing the picture V(202B) captured by the HMD 201A, transmitted to the HMD 201B, and displayed there, the supervisor PB can confirm the actual work of the worker PA.
 Next, the line-of-sight direction of the supervisor PB with respect to the displayed captured image (picture V(202B)) is detected by the line-of-sight detection device 32 of the HMD 201B (S3; third step).
 As shown in FIG. 2, the line-of-sight detection device 32 is attached to the image display device 202 (the eyepiece optical system 6), with its position fixed relative to the image display device 202. The line-of-sight detection device 32 of the HMD 201B can therefore detect which part of the picture V(202B) displayed by the image display device 202B the supervisor PB is looking at.
 Next, the HMD 201B outputs the line-of-sight information on the supervisor PB's line-of-sight direction detected by the line-of-sight detection device 32 to the HMD 201A via the communication unit 34 of the HMD 201B (S4; fourth step). Then, based on the line-of-sight information input from the HMD 201B, the HMD 201A uses the image display device 202A to display the position of the supervisor PB's line of sight at the corresponding position within the display area of the HMD 201A's picture, so that the worker PA can see it (S5; fifth step).
 In the worker PA's HMD 201A, the position of the imaging device 31 relative to the image display device 202 is fixed, and the captured image displayed on the supervisor PB's HMD 201B (picture V(202B)) was originally acquired by the worker PA's HMD 201A. Consequently, the supervisor PB's gaze point on the captured image displayed by the HMD 201B (the point ahead in the supervisor's line-of-sight direction) corresponds to a position within the entire display area of the picture (virtual image) of the image display device 202A, whose position relative to the imaging device 31 is known. The worker PA's HMD 201A can therefore immediately display the position of the supervisor PB's line of sight at the appropriate position in the display area of the image display device 202A (the position corresponding to the supervisor's gaze point).
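 The positional correspondence described above can be sketched in a few lines. This is a hypothetical illustration: the normalized-coordinate convention, the function name, and the display resolution are assumptions, not taken from the embodiment.

```python
def marker_position(gaze_uv, display_size):
    """Map a gaze point, expressed as normalized coordinates (u, v)
    in [0, 1] on the shared captured image, to pixel coordinates in
    the first terminal's display area.

    Because the image displayed on the second terminal IS the capture
    from the first terminal, and the camera-to-display correspondence
    on the first terminal is fixed, no spatial-pose estimation or
    coordinate transformation between the two users is needed: the
    normalized point carries over directly.
    """
    u, v = gaze_uv
    w, h = display_size
    return (round(u * w), round(v * h))

# The supervisor gazes at the center-right of the shared image;
# the worker's terminal places the marker MB accordingly.
assert marker_position((0.75, 0.5), (640, 480)) == (480, 240)
```

 This directness is the point of the scheme: the only data that must travel back from the second terminal is the small (u, v) pair, so the marker can be shown with effectively no processing delay.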
 FIG. 9 shows an image of the field of view of the worker PA as the worker looks at the outside world through the HMD 201A. By displaying a marker MB indicating the position of the supervisor PB's line of sight, the HMD 201A can show the worker PA which part of the device OB the supervisor PB is looking at. The worker PA can look at the displayed marker MB (the position of the supervisor's line of sight) and loosen the screw that the marker MB indicates. By displaying the position of the supervisor PB's line of sight on the HMD 201A in this way, the worker PA and the supervisor PB can communicate based on the line of sight, and the work can be carried out accurately and efficiently.
 Since the worker PA and the supervisor PB are next to each other (see FIG. 5), the supervisor PB may, for example, say aloud, "Loosen this screw!" The worker PA, looking at the marker MB displayed on the HMD 201A, can then easily recognize which screw is to be loosened. In other words, the supervisor PB can give the worker PA precise instructions through line of sight and voice, and based on these instructions, the worker PA can carry out the work accurately.
 As described above, in the present embodiment, a captured image showing the field of view of the worker PA's HMD 201A is output to the supervisor PB's HMD 201B, which displays it; the supervisor PB's line of sight with respect to that image is detected, and the line-of-sight information is output back to the HMD 201A. The HMD 201A can then immediately display, with the marker MB, the position of the supervisor PB's line of sight at the position in its own displayed picture corresponding to the supervisor's gaze point on the captured image. In collaborative work using the HMDs 201A and 201B, communication based on the line of sight thus becomes possible by a simple method, without complex processing such as determining the mutual spatial positions of the worker PA and the supervisor PB or transforming line-of-sight information (converting position coordinates).
 Moreover, even in a situation where the supervisor PB cannot see the device OB directly, the supervisor can grasp the device OB from the captured image output from the worker PA's HMD 201A (the displayed picture V(202B)). By outputting the supervisor PB's line-of-sight information on that captured image to the worker PA's HMD 201A and displaying the position of the supervisor's line of sight there, the worker PA can work on the device OB accurately based on the displayed line-of-sight position. In other words, even when the supervisor PB cannot see the device OB, accurate line-of-sight-based communication and work are possible.
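 The exchange in steps S1 through S5 can be summarized as message passing between the two terminals. The following is a minimal sketch under stated assumptions: the `Terminal` class, the message shapes, and the queue transport are illustrative stand-ins for the communication units 34, not part of the embodiment.

```python
# Sketch of steps S1-S5 as message passing between two terminals.
# Transport (e.g. Wi-Fi) is abstracted as a simple in-memory inbox.

from dataclasses import dataclass, field

@dataclass
class Terminal:
    name: str
    inbox: list = field(default_factory=list)

    def send(self, other, message):
        # Deliver a (sender, payload) pair to the other terminal.
        other.inbox.append((self.name, message))

worker = Terminal("HMD201A")      # first terminal (worker PA)
supervisor = Terminal("HMD201B")  # second terminal (supervisor PB)

# S1: the worker's terminal outputs its captured image.
worker.send(supervisor, {"type": "frame", "data": "<image bytes>"})

# S2: the supervisor's terminal displays the frame (display omitted),
# S3: and detects the supervisor's gaze on it as normalized (u, v).
gaze_uv = (0.75, 0.5)

# S4: the supervisor's terminal returns the line-of-sight information.
supervisor.send(worker, {"type": "gaze", "uv": gaze_uv})

# S5: the worker's terminal reads the gaze message and would place a
# marker at the corresponding display position.
sender, msg = worker.inbox[-1]
assert sender == "HMD201B" and msg["uv"] == (0.75, 0.5)
```

 Note how little state each side needs: no shared world coordinates, only the frame in one direction and the normalized gaze point in the other.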
 FIG. 10 is a flowchart showing another example of the operation flow in the line-of-sight information sharing system 100 of the present embodiment. The HMD 201A may detect the worker PA's own line-of-sight direction with its line-of-sight detection device 32 (S6; sixth step). In step S5 described above, the HMD 201A may then display the position of the worker PA's line of sight, based on the detection result of S6, together with the position of the supervisor PB's line of sight.
 FIG. 11 shows an image of the field of view of the worker PA when the positions of the lines of sight of both the worker PA and the instruction supervisor PB are displayed on the HMD 201A. In FIG. 11, the position of the line of sight of the worker PA is indicated by a marker MA, and the position of the line of sight of the instruction supervisor PB is indicated by a marker MB. To distinguish the markers MA and MB from each other, their shapes are made different here. More specifically, the marker MA indicating the line of sight of the wearer himself (the worker PA) is displayed as a square, and the marker MB indicating the line of sight of the other party (the instruction supervisor PB) is displayed as a circle. The display method of the markers MA and MB is not limited to this; the markers may be distinguished from each other by, for example, using different colors, using different line types (solid line, broken line, etc.), displaying text such as "instruction supervisor" or "worker" next to the markers, or combining any of shape, line type, color, and text.
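The shape, line-type, color, and text variations just listed amount to a small style table keyed on whose gaze a marker represents. A sketch follows; the specific style values are arbitrary examples for illustration, not the ones used in the figures.

```python
# Hypothetical marker style table distinguishing the wearer's own gaze
# marker from the other party's, by shape, line type, color, and label.
# All style values are arbitrary examples.
MARKER_STYLES = {
    "self":  {"shape": "square", "line": "solid",  "color": "green",
              "label": "worker"},
    "other": {"shape": "circle", "line": "dashed", "color": "red",
              "label": "instruction supervisor"},
}

def marker_style(is_self: bool) -> dict:
    """Return the display style for a gaze marker."""
    return MARKER_STYLES["self" if is_self else "other"]

# On worker PA's HMD, marker MA (self) is a square and marker MB
# (the supervisor's gaze) is a circle, matching FIG. 11.
assert marker_style(True)["shape"] == "square"
assert marker_style(False)["shape"] == "circle"
```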
 In this way, by displaying on the HMD 201A of the worker PA the position of the worker PA's own line of sight (marker MA) together with the position of the line of sight of the instruction supervisor PB (marker MB), the worker PA can simultaneously grasp the part of the device OB that he or she is looking at (the part to be worked on) and the part that the instruction supervisor PB is looking at (the part for which the work is instructed), and can immediately recognize whether the two match. If they do not match, the worker PA can take an appropriate measure, such as confirming by voice. The worker PA can therefore communicate more accurately with the instruction supervisor PB.
 FIG. 12 is a flowchart showing still another example of the operation flow in the line-of-sight information sharing system 100 of the present embodiment. The HMD 201A may output the line-of-sight information of the worker PA detected in step S6 to the HMD 201B of the instruction supervisor PB (S7; seventh step). The HMD 201B may then display, on the captured image output from the HMD 201A, the position of the worker PA's line of sight acquired in step S7 together with the position of the instruction supervisor PB's line of sight based on the detection result of step S3 (S8; eighth step).
 FIG. 13 shows an image of the field of view of the instruction supervisor PB when the positions of the lines of sight of both the worker PA and the instruction supervisor PB are displayed on the HMD 201B. In FIG. 13, the marker MB indicating the line of sight of the viewer himself (the instruction supervisor PB) is displayed as a square, and the marker MA indicating the line of sight of the other party (the worker PA) is displayed as a circle. Because the person viewing the video differs between FIG. 11 (the worker PA) and FIG. 13 (the instruction supervisor PB), even the marker MA indicating the line-of-sight position of the same worker PA has a different shape in FIG. 11 than in FIG. 13. However, FIGS. 11 and 13 are consistent in that the marker corresponding to the person viewing the video is displayed as a square and the other party's marker is displayed as a circle.
 In this way, by displaying on the HMD 201B of the instruction supervisor PB the position of the worker PA's line of sight (marker MA) and the position of the instruction supervisor PB's line of sight (marker MB) together on the captured image output from the HMD 201A of the worker PA, the instruction supervisor PB can simultaneously grasp the part of the device OB that he or she is looking at (the part for which work is to be instructed) and the part that the worker PA is looking at (the part to be worked on), and can immediately recognize whether the two match. If they do not match, the instruction supervisor PB can take an appropriate measure, such as adding an instruction by voice. The instruction supervisor PB can therefore communicate more accurately with the worker PA. In addition, by looking at the marker MB, the instruction supervisor PB can give accurate line-of-sight instructions to the worker PA while confirming for himself the part of the device OB to be designated for work.
 Also, as shown in FIG. 11, on the HMD 201A of the worker PA, the position of the worker PA's line of sight and the position of the instruction supervisor PB's line of sight are displayed with different patterns (markers MA and MB). Similarly, as shown in FIG. 13, on the HMD 201B of the instruction supervisor PB, the position of the worker PA's line of sight and the position of the instruction supervisor PB's line of sight are also displayed with different patterns (markers MA and MB). With such a display, the worker PA and the instruction supervisor PB can easily distinguish and grasp each other's line-of-sight positions on their respective HMDs 201A and 201B, which enables quick communication.
 (Timing of Gaze Information Output)
 It is desirable that the HMD 201B output the line-of-sight information of the instruction supervisor PB to the HMD 201A of the worker PA only when the line of sight of the instruction supervisor PB is located within the display area of the captured image output from the HMD 201A (one information input/output terminal) to the HMD 201B (another information input/output terminal).
 For example, when the instruction supervisor PB is not looking at the captured image (video V(202B)) output from the HMD 201A on the HMD 201B, the instruction supervisor PB presumably has no instruction or advice to give about the device OB contained in that captured image. In this case, line-of-sight communication between the instruction supervisor PB and the worker PA is unnecessary. Although the line-of-sight detection device 32 continuously detects the line of sight of the instruction supervisor PB, outputting line-of-sight information from the HMD 201B to the HMD 201A when communication is unnecessary (that is, line-of-sight information obtained while looking outside the display area of the captured image) would merely generate needless traffic and is wasteful.
 Therefore, by having the HMD 201B output the line-of-sight information of the instruction supervisor PB to the HMD 201A of the worker PA only when the line of sight of the instruction supervisor PB is located within the display area of the captured image on the HMD 201B, that is, only when the instruction supervisor PB is actually looking at the captured image on the HMD 201B and line-of-sight communication with the worker PA is actually needed, the output of useless information to the HMD 201A can be reduced and the system load lowered.
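This gating reduces to a point-in-rectangle test performed before each transmission. A sketch under stated assumptions: normalized display coordinates, and an illustrative off-center rectangle for the image's display area (a real terminal would use the actual layout of video V(202B) on the display element).

```python
# Send gaze information only while the gaze falls inside the display
# area of the received captured image. The rectangle values below are
# illustrative assumptions, not the embodiment's actual layout.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge, normalized display coordinates
    y: float  # top edge
    w: float  # width
    h: float  # height

    def contains(self, u: float, v: float) -> bool:
        return (self.x <= u <= self.x + self.w
                and self.y <= v <= self.y + self.h)

def maybe_send_gaze(gaze, display_area: Rect, send) -> bool:
    """Forward gaze info to the worker's terminal only when needed."""
    u, v = gaze
    if display_area.contains(u, v):
        send(gaze)   # communication worth doing
        return True
    return False     # gaze is off the image: suppress the output

sent = []
area = Rect(0.6, 0.6, 0.35, 0.35)              # image shown off-center
maybe_send_gaze((0.7, 0.8), area, sent.append)  # inside -> sent
maybe_send_gaze((0.2, 0.3), area, sent.append)  # outside -> suppressed
assert sent == [(0.7, 0.8)]
```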
 A system in which, as described above, the HMD 201B outputs the line-of-sight information of the instruction supervisor PB to the HMD 201A only when the line of sight of the instruction supervisor PB is located within the display area of the captured image can be realized concretely by adopting the following configuration.
 FIG. 14 is a block diagram showing another configuration of the HMD 201. In addition to the configuration shown in FIG. 4 and elsewhere, the HMD 201 further includes an input unit 39. The input unit 39 is a device for designating the timing at which the line-of-sight information of the user of the HMD 201 (for example, the instruction supervisor PB in the above example) is output. Such an input unit 39 can be constituted by, for example, a voice recognition device, a pupil observation device, or a small touch device.
 The voice recognition device outputs a signal instructing the output of line-of-sight information to the control unit 38 when it recognizes a specific utterance such as "this" or "that". The pupil observation device observes the user's pupil and outputs a signal instructing the output of line-of-sight information to the control unit 38 when it detects a characteristic blink of the user, such as three consecutive blinks, a blink larger than usual, or a blink longer than usual. The function of the pupil observation device may also be given to the line-of-sight detection device 32. The small touch device has, for example, a small ring-shaped member that can be worn on a finger of the user's hand, and outputs a signal instructing the output of line-of-sight information to the control unit 38 when the ring-shaped member is touched. Upon receiving the instruction signal from the input unit 39, the control unit 38 controls the communication unit 34 to output the user's line-of-sight information, detected by the line-of-sight detection device 32, from the communication unit 34 to the outside.
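All three forms of the input unit 39 reduce to the same contract: on a recognized event, emit the output-trigger signal to the control unit 38. A sketch follows; the trigger vocabulary, blink threshold, and event-dictionary shape are assumptions for illustration.

```python
# Sketch of the input unit 39 deciding when to trigger gaze output.
# The trigger words, blink threshold, and event field names are
# illustrative assumptions; the embodiment names three devices:
# voice recognition, pupil observation (characteristic blinks),
# and a small ring-shaped touch device.
TRIGGER_WORDS = {"this", "that"}
CONSECUTIVE_BLINKS_NEEDED = 3

def should_trigger(event: dict) -> bool:
    """Return True when the event should make the control unit 38
    output the detected gaze via the communication unit 34."""
    kind = event.get("kind")
    if kind == "voice":
        return event.get("word") in TRIGGER_WORDS
    if kind == "blink":
        return event.get("consecutive", 0) >= CONSECUTIVE_BLINKS_NEEDED
    if kind == "touch":
        return True  # the ring-shaped member was touched
    return False

assert should_trigger({"kind": "voice", "word": "this"})
assert not should_trigger({"kind": "voice", "word": "screw"})
assert should_trigger({"kind": "blink", "consecutive": 3})
assert should_trigger({"kind": "touch"})
```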
 By using the input unit 39 in this way, the HMD 201B can output line-of-sight information to the HMD 201A at the timing designated by the input unit 39, triggered by, for example, the detection of a specific utterance. That is, while observing the captured image, the instruction supervisor PB can cause the HMD 201B to output line-of-sight information to the HMD 201A by uttering a word such as "this", making a specific blink, or touching the ring-shaped member. This reliably realizes the configuration described above, in which the line-of-sight information of the instruction supervisor PB is output to the HMD 201A only when the line of sight of the instruction supervisor PB is located within the display area of the captured image. Moreover, since the line-of-sight information output from the HMD 201B is then assured to be information obtained while the instruction supervisor PB was looking at the display area of the captured image, simple and highly accurate communication can be achieved using that information. In addition, using the input unit 39 makes the intended designation of a part of the device OB simple, so further improvement in usability can be expected.
 In the small touch device, which is one example of the input unit 39, the ring-shaped member is worn on, for example, the little finger of the user's hand and can be touched with the thumb of the same hand. Accordingly, simply wearing the small touch device as the input unit 39 does not impair the user's hands-free operation. In other words, even if a small touch device is used as the input unit 39, the hands-free nature characteristic of the HMD 201 is preserved.
 In the present embodiment, the HMD 201B of the instruction supervisor PB is configured using the optical see-through video display device 202. However, since the instruction supervisor PB stands behind the worker PA and has difficulty seeing the device OB directly, the HMD 201B does not need to be optically see-through. Therefore, the HMD 201B may instead be configured using a video see-through video display device 202 described later (see FIGS. 30 and 31).
 The above description assumed that the worker PA and the instruction supervisor PB are located close to each other, as shown in FIG. 5, but the worker PA and the instruction supervisor PB may be far apart, as shown in FIG. 15. For example, even when the worker PA works at a location several hundred kilometers away from the instruction supervisor PB, the same effects as in the present embodiment can be obtained by applying the system of the present embodiment so that the worker PA receives instructions based on line-of-sight information from the instruction supervisor PB via the communication line 300 (see FIG. 1).
 At this time, audio information such as "this", "that", "Loosen this screw!", and "Understood!" may be exchanged between the HMDs 201A and 201B via the audio input device 36 and the audio output device 37 shown in FIG. 4. In this case, communication using audio information becomes possible in addition to the line-of-sight information of the instruction supervisor PB, making communication even easier. Furthermore, since the content of the line-of-sight information can be supplemented with audio information, the accuracy of communication is further improved.
 In the present embodiment, the case where the output of the captured image is unidirectional, from the HMD 201A to the HMD 201B, has been described. However, the same effects can be obtained by applying the configuration of the present embodiment to the bidirectional case as well. A specific example is described below as Embodiment 2.
 [Embodiment 2]
 Another embodiment of the present invention is described below with reference to the drawings. FIG. 16 schematically shows workers PA and PC wearing the HMDs 201A and 201C on their heads, respectively, and performing collaborative work. The workers PA and PC are the users of the HMDs 201A and 201C, respectively. The configurations of the HMDs 201A and 201C are exactly the same as that of the HMD 201 described in Embodiment 1 (see FIGS. 2 to 4 and elsewhere). When the components of the HMDs 201A and 201C need to be distinguished from each other, the symbol "A" or "C" is appended to the reference numeral of each component, as in the video display devices 202A and 202C.
 Here, as an example, assume that a device OB, the object of the work, is placed on a table T in the center of a workshop, and that a plurality of workers, the workers PA and PC, perform maintenance on this device OB.
 The HMDs 201A and 201C are in a communication state that allows mutual information exchange (for example, a state in which they can communicate via Wi-Fi (registered trademark)), and can transmit and receive information to and from each other. Communication between the HMDs 201A and 201C is performed via their respective communication units 34 under the control of their respective control units 38.
 In the present embodiment, steps S1 to S5 shown in FIG. 6 are performed mutually between the HMD 201A of the worker PA and the HMD 201C of the worker PC. A more specific description follows.
 (Output of Captured Image Information)
 The HMD 201A, which is one information input/output terminal, outputs the information of the captured image acquired by the imaging device 31 of the HMD 201A to the HMD 201C, which is another information input/output terminal. Similarly, the HMD 201C outputs the information of the captured image acquired by the imaging device 31 of the HMD 201C to the HMD 201A.
 FIG. 17 shows an image of the field of view of the worker PA. The rectangular region IM(201A) indicated by the broken line shows the imaging range of the imaging device 31 of the HMD 201A worn by the worker PA. The worker PA sees the device OB, the hands of the worker PA (right hand RH(PA), left hand LH(PA)), the tool held in the worker PA's hand (screwdriver DR(PA)), and the hands of the worker PC (right hand RH(PC), left hand LH(PC)), and this field of view of the worker PA is captured by the imaging device 31 of the HMD 201A. The HMD 201A outputs the information (image data) of the image captured by the imaging device 31 from the viewpoint of the worker PA to the HMD 201C via the communication unit 34 of the HMD 201A.
 FIG. 18, on the other hand, shows an image of the field of view of the worker PC. The rectangular region IM(201C) indicated by the broken line shows the imaging range of the imaging device 31 of the HMD 201C worn by the worker PC. The worker PC sees the device OB, the hands of the worker PA (right hand RH(PA), left hand LH(PA)), the tool held in the worker PA's hand (screwdriver DR(PA)), and the hands of the worker PC (right hand RH(PC), left hand LH(PC)), and this field of view of the worker PC is captured by the imaging device 31 of the HMD 201C. The HMD 201C outputs the information (image data) of the image captured by the imaging device 31 from the viewpoint of the worker PC to the HMD 201A via the communication unit 34 of the HMD 201C.
 (Display of Captured Images)
 Using the video display device 202C, the HMD 201C displays the captured image, based on the captured image information input from the HMD 201A, so that the worker PC using the HMD 201C can see it. Similarly, using the video display device 202A, the HMD 201A displays the captured image, based on the captured image information input from the HMD 201C, so that the worker PA using the HMD 201A can see it.
 FIG. 19 shows the field of view of the worker PA when the image (field-of-view image) captured and output by the HMD 201C of the worker PC is displayed as a video (virtual image) V(202A) on the video display device 202A of the HMD 201A of the worker PA. FIG. 20 shows the field of view of the worker PC when the image (field-of-view image) captured and output by the HMD 201A of the worker PA is displayed as a video (virtual image) V(202C) on the video display device 202C of the HMD 201C of the worker PC. In this way, the HMDs 201A and 201C mutually exchange the field-of-view images captured by their respective imaging devices 31, and each displays the received field-of-view image as a video (virtual image). As a result, the worker PA can observe the field-of-view image of the worker PC on the HMD 201A he or she is wearing, and the worker PC can observe the field-of-view image of the worker PA on the HMD 201C he or she is wearing.
 In particular, as shown in FIG. 19, the device OB, the hands of the worker PA, the screwdriver DR(PA), and the hands of the worker PC are real objects in the actual field of view of the worker PA. Likewise, as shown in FIG. 20, the device OB, the hands of the worker PA, the screwdriver DR(PA), and the hands of the worker PC are real objects in the actual field of view of the worker PC. By exchanging and displaying such field-of-view images between the HMDs 201A and 201C, the workers PA and PC can share each other's fields of view (the visual fields in their respective line-of-sight directions) and visually recognize the real objects in each field of view.
 Note that the video V(202A) displayed on the HMD 201A occupies part of the region that the video display device 202A can display as a virtual image, and is displayed away from the center of that region (on the display element 5, the video is displayed on part of the display surface of the display element 5 other than its center). When the worker PA works on the device OB, the worker PA is normally expected to position the device OB in the center of his or her field of view. Displaying the video off-center therefore prevents the video V(202A) from overlapping the device OB during the worker PA's work and degrading workability on the device OB (making the work harder). From the same standpoint, the video V(202C) displayed on the HMD 201C also occupies part of the region that the video display device 202C can display as a virtual image, and is displayed away from the center of that region.
 (Detection of the Line-of-Sight Direction with Respect to the Captured Image)
 The HMD 201C detects the line-of-sight direction of the worker PC with respect to the captured image (video V(202C)) displayed by the video display device 202C, using the line-of-sight detection device 32 of the HMD 201C. Similarly, the HMD 201A detects the line-of-sight direction of the worker PA with respect to the captured image (video V(202A)) displayed by the video display device 202A, using the line-of-sight detection device 32 of the HMD 201A.
 As shown in FIG. 2, the line-of-sight detection device 32 is attached to the video display device 202 (eyepiece optical system 6), and its position relative to the video display device 202 is fixed. Therefore, the line-of-sight detection device 32 of the HMD 201C can detect which part of the video V(202C) displayed by the video display device 202C the worker PC is looking at. Similarly, the line-of-sight detection device 32 of the HMD 201A can detect which part of the video V(202A) displayed by the video display device 202A the worker PA is looking at.
 (Output of Gaze Information and Display of Gaze Positions)
 The HMD 201C outputs line-of-sight information on the detected line-of-sight direction of the worker PC to the HMD 201A via the communication unit 34 of the HMD 201C. Using the video display device 202A, the HMD 201A then displays the position of the worker PC's line of sight, based on the line-of-sight information input from the HMD 201C, so that the worker PA can see it. Similarly, the HMD 201A outputs line-of-sight information on the detected line-of-sight direction of the worker PA to the HMD 201C via the communication unit 34 of the HMD 201A. Using the video display device 202C, the HMD 201C then displays the position of the worker PA's line of sight, based on the line-of-sight information input from the HMD 201A, so that the worker PC can see it.
 In the HMD 201A of the worker PA, the position of the imaging device 31 relative to the video display device 202 is fixed, and the captured image (video V(202C)) displayed on the HMD 201C of the worker PC is an image originally acquired by the HMD 201A of the worker PA. Consequently, the gaze point of the worker PC on the HMD 201A's captured image displayed on the HMD 201C (the point lying ahead in the worker PC's line-of-sight direction) corresponds, in the HMD 201A of the worker PA, to a position within the entire video display area of the video display device 202A (roughly equal to the field of view of the worker PA), whose position relative to the imaging device 31 is known. Therefore, the HMD 201A of the worker PA can immediately display the position of the worker PC's line of sight at the corresponding position within the display area of the video display device 202A (the position corresponding to the worker PC's gaze point).
 Similarly, in the HMD 201C of the worker PC, the position of the imaging device 31 relative to the video display device 202 is fixed, and the captured image (video V(202A)) displayed on the HMD 201A of the worker PA is an image originally acquired by the HMD 201C of the worker PC. Consequently, the gaze point of the worker PA on the captured image displayed on the HMD 201A (the point lying ahead in the worker PA's line-of-sight direction) corresponds, in the HMD 201C of the worker PC, to a position within the entire display area of the video display device 202C (roughly equal to the field of view of the worker PC), whose position relative to the imaging device 31 is known. Therefore, the HMD 201C of the worker PC can immediately display the position of the worker PA's line of sight at the corresponding position within the display area of the video display device 202C (the position corresponding to the worker PA's gaze point).
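Because the displayed captured image originated from the viewing terminal's own camera, whose position relative to the display is fixed and known, mapping the peer's reported gaze point back to a local marker position needs only the known placement of the camera's field of view within the display area, with no scene reconstruction or viewpoint conversion. A sketch follows; the placement parameters are illustrative assumptions.

```python
# Map a gaze point reported by the remote terminal (normalized to the
# captured image it displayed) to a marker position on the local
# display. Since the imaging device 31 has a fixed, known position
# relative to the video display device 202, this is a direct affine
# mapping. The cam_to_display values are illustrative assumptions.

def gaze_to_marker(u: float, v: float,
                   cam_to_display=(0.0, 0.0, 1.0, 1.0)):
    """(u, v): gaze position on the captured image, each in 0.0-1.0.
    cam_to_display: (x0, y0, sx, sy), the fixed placement of the
    camera's field of view within the display area."""
    x0, y0, sx, sy = cam_to_display
    return (x0 + u * sx, y0 + v * sy)

# With the camera's field of view covering the whole display area
# (the embodiment notes it is roughly equal to the wearer's field of
# view), the mapping is the identity: the marker appears exactly
# where the remote user was looking.
assert gaze_to_marker(0.25, 0.5) == (0.25, 0.5)
```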
 FIG. 21 shows an image of the field of view of the worker PA when the worker PA is looking at the outside world through the HMD 201A. By displaying a marker MC indicating the position of the worker PC's line of sight, the HMD 201A can show the worker PA which part of the device OB the worker PC is looking at. FIG. 22 shows an image of the field of view of the worker PC when the worker PC is looking at the outside world through the HMD 201C. By displaying the marker MA indicating the position of the worker PA's line of sight, the HMD 201C can show the worker PC which part of the device OB the worker PA is looking at.
 At this time, since worker PA and worker PC are adjacent to each other (see FIG. 16), when worker PC's hands are occupied, for example when worker PC is supporting the housing of the device OB with both hands, worker PC may utter a voice instruction such as "Loosen this screw!" while gazing at the intended part of the device OB. Worker PA can then easily recognize which screw should be loosened while looking at the marker MC displayed on the HMD 201A (see FIG. 21), and can perform the work accurately. Worker PC, in turn, can convey instructions to worker PA accurately, through line of sight and voice, without interrupting the manual work. A specific example of collaborative work by worker PA and worker PC using line of sight and voice is described below.
 In FIGS. 21 and 22, the numbers (1) to (4) attached near the markers MA and MC indicate the order in which the markers are displayed. Here, to distinguish the markers MA and MC from each other, the marker indicating the user's own line-of-sight position displayed on the user's own terminal is represented by a square, and the marker indicating another worker's line-of-sight position displayed on the user's own terminal is represented by a circle.
 First, when the captured image output from the HMD 201A of worker PA is displayed as video V(202C) on the video display device 202C of the HMD 201C, worker PC issues the voice instruction "Loosen this screw!" while looking at the screw to be loosened in the video V(202C). Worker PC's line-of-sight direction with respect to the video V(202C) is detected by the line-of-sight detection device 32 of the HMD 201C. The video display device 202C then displays, in the video V(202C), a marker MC indicating worker PC's line-of-sight position together with the captured image from the HMD 201A (see (1) in FIG. 22). This clearly identifies which screw should be loosened.
 Next, the line-of-sight information indicating worker PC's line-of-sight position detected by the HMD 201C, and the captured image information acquired by the HMD 201C, are output to the HMD 201A. Then, in the entire video display area of the video display device 202A of the HMD 201A (corresponding to worker PA's field of view), the marker MC indicating worker PC's line-of-sight position is displayed at the position corresponding to worker PC's gaze point on the video V(202C) (see (2) in FIG. 21). The captured image acquired by the HMD 201C is also displayed as video V(202A) by the video display device 202A of the HMD 201A.
 By viewing the marker MC displayed on his own HMD 201A, worker PA identifies the screw indicated by worker PC (the screw to be loosened) and proceeds to loosen it. At this time, if worker PA wants to request assistance from worker PC, for example to have the housing cover secured by that screw held for safety while the screw is loosened, worker PA makes the voice request "Please support this cover!" while looking at the part to be held in the video V(202A). Worker PA's line-of-sight direction with respect to the video V(202A) is detected by the line-of-sight detection device 32 of the HMD 201A. The video display device 202A then displays the marker MA indicating worker PA's line-of-sight position in the video V(202A) together with the captured image from the HMD 201C (see (3) in FIG. 21). This clearly identifies which cover worker PA wants supported.
 Thereafter, line-of-sight information indicating worker PA's line-of-sight position detected by the HMD 201A is output to the HMD 201C. Then, in the entire video display area of the video display device 202C of the HMD 201C (corresponding to worker PC's field of view), the marker MA indicating worker PA's line-of-sight position is displayed at the position corresponding to worker PA's gaze point on the video V(202A) (see (4) in FIG. 22). By looking at the marker MA, worker PC can immediately identify the cover to be supported and accurately assist worker PA's work.
 As described above, by performing the steps of outputting the captured image information, displaying the captured image, detecting the line-of-sight direction with respect to the captured image, outputting the line-of-sight information, and displaying the line-of-sight position mutually between the HMDs 201A and 201C, the HMDs 201A and 201C can share the line-of-sight information of worker PA and worker PC and carry out collaborative work. Furthermore, voice instructions whose place or object is ambiguous, such as "this" or "that", can be indicated visually using the markers MA and MC showing the line-of-sight positions, enabling good communication and efficient collaboration without interrupting manual work. In particular, sharing line-of-sight information among the plurality of HMDs 201A and 201C enables collaborative work that takes advantage of the hands-free nature peculiar to HMDs. Using voice information in addition to line-of-sight information enables even more accurate communication.
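The five steps exchanged between the two terminals can be sketched as one round of a symmetric message exchange. This is a minimal illustrative model, not the patent's implementation: the class, the fixed gaze value, and the dictionary-based "display" are assumptions standing in for the imaging device 31, the line-of-sight detection device 32, and the video display device 202.

```python
from dataclasses import dataclass, field

@dataclass
class HMD:
    name: str
    display: dict = field(default_factory=dict)  # what this HMD currently shows

    def capture(self):
        # Stand-in for the imaging device 31.
        return f"image-from-{self.name}"

    def detect_gaze(self):
        # Stand-in for the line-of-sight detection device 32:
        # a normalized gaze point on the video being viewed.
        return (0.4, 0.6)

def share_gaze(sender: HMD, receiver: HMD):
    """One round of the five steps: (1) the sender outputs its captured
    image, (2) the receiver displays it, (3) the receiver's gaze on that
    video is detected, (4) the gaze info is output back to the sender,
    and (5) the sender displays it as a marker."""
    image = sender.capture()            # (1) output captured image info
    receiver.display["video"] = image   # (2) display the captured image
    gaze = receiver.detect_gaze()       # (3) detect gaze on the video
    sender.display["marker"] = gaze     # (4)+(5) output gaze info, show marker

hmd_a, hmd_c = HMD("201A"), HMD("201C")
share_gaze(hmd_a, hmd_c)  # worker PC's gaze appears as a marker on HMD 201A
share_gaze(hmd_c, hmd_a)  # worker PA's gaze appears as a marker on HMD 201C
```

Running the exchange in both directions, as above, is what makes the sharing mutual: each terminal ends up showing both its own outgoing video and the other user's gaze marker.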
 [Embodiment 3]
 Still another embodiment of the present invention is described below with reference to the drawings. The second embodiment described an example in which two people, worker PA and worker PC, work in cooperation; the present embodiment describes a case in which three people work in cooperation. Here, an example is described in which, in the system of the second embodiment, an instruction supervisor PD at a distant location joins worker PA and worker PC to carry out the work (including giving work instructions and providing support).
 FIG. 23 schematically shows worker PA and worker PC wearing the HMDs 201A and 201C on their heads, the distant instruction supervisor PD wearing the HMD 201D, and the instruction supervisor PD observing the videos sent from the HMDs 201A and 201C and displayed on the HMD 201D. Worker PA and worker PC are the users of the HMDs 201A and 201C, respectively, and the instruction supervisor PD is the user of the HMD 201D. The HMD 201D is an information input/output terminal that can communicate with the HMD 201A and the HMD 201C via the communication line 300 (see FIG. 1). The configurations of the HMDs 201A, 201C, and 201D are exactly the same as that of the HMD 201 described in the first embodiment (see FIGS. 2 to 4, etc.). In the following description, when the components included in the HMDs 201A, 201C, and 201D are to be distinguished from one another, the suffix "A", "C", or "D" is appended to the reference numeral of each component, as in the video display devices 202A, 202C, and 202D.
 Input and output of information between the HMD 201A and the HMD 201C are the same as in the second embodiment. Input and output of information between the HMD 201A and the HMD 201D, and between the HMD 201C and the HMD 201D, are described below.
 First, the captured image information acquired by the HMDs 201A and 201C is output to the HMD 201D via the respective communication units 34. On the HMD 201D, the captured images from the HMDs 201A and 201C are displayed as video V(202D)-A and video V(202D)-C, respectively.
 When the instruction supervisor PD is observing the video V(202D)-A on the HMD 201D, the line-of-sight direction of the instruction supervisor PD at that time is detected by the line-of-sight detection device 32 of the HMD 201D. Line-of-sight information on the detected line-of-sight direction is then output from the HMD 201D to the HMD 201A via the communication unit 34. Thereafter, on the HMD 201A, the position of the instruction supervisor PD's line of sight is displayed at the corresponding position in the entire display area of the video display device 202A based on that line-of-sight information.
 On the other hand, when the instruction supervisor PD is observing the video V(202D)-C, the line-of-sight direction of the instruction supervisor PD at that time is detected by the line-of-sight detection device 32 of the HMD 201D. Line-of-sight information on the detected line-of-sight direction is then output from the HMD 201D to the HMD 201C via the communication unit 34. Thereafter, on the HMD 201C, the position of the instruction supervisor PD's line of sight is displayed at the corresponding position in the entire display area of the video display device 202C based on that line-of-sight information.
 FIG. 24 shows an image of worker PA's field of view when worker PA is viewing the outside world through the HMD 201A. By displaying a marker MD indicating the position of the instruction supervisor PD's line of sight in addition to the marker MC indicating the position of worker PC's line of sight, the HMD 201A can show worker PA which parts of the device OB worker PC and the instruction supervisor PD are looking at. FIG. 25 shows an image of worker PC's field of view when worker PC is viewing the outside world through the HMD 201C. By displaying the marker MD indicating the position of the instruction supervisor PD's line of sight in addition to the marker MA indicating the position of worker PA's line of sight, the HMD 201C can show worker PC which parts of the device OB worker PA and the instruction supervisor PD are looking at.
 By thus displaying the line-of-sight positions of worker PA, worker PC, and the instruction supervisor PD on the HMDs 201A and 201C, the three parties can communicate with one another while carrying out the work. A specific example of work by worker PA, worker PC, and the instruction supervisor PD is described below.
 In FIGS. 23 to 25, the numbers (1) to (10) attached near the markers MA, MC, and MD indicate the order in which the markers are displayed. Here, to distinguish the markers MA, MC, and MD from one another, the marker indicating the user's own line-of-sight position displayed on the user's own terminal is represented by a square, and the marker indicating another worker's line-of-sight position displayed on the user's own terminal is represented by a circle. In FIGS. 24 and 25, the marker indicating the instruction supervisor PD's line-of-sight position is represented by a triangle.
 First, the captured image information acquired by the HMD 201A of worker PA is output to the HMD 201C and the HMD 201D, where the captured image is displayed as video V(202C) and video V(202D)-A, respectively (see FIGS. 25 and 23).
 Worker PC issues the voice instruction "Loosen this screw!" while looking at the screw to be loosened in the video V(202C) displayed on the HMD 201C. Worker PC's line-of-sight direction with respect to the video V(202C) is detected by the line-of-sight detection device 32 of the HMD 201C. The video display device 202C then displays, in the video V(202C), the marker MC indicating worker PC's line-of-sight position together with the captured image from the HMD 201A (see (1) in FIG. 25).
 Next, the line-of-sight information indicating worker PC's line-of-sight position detected by the HMD 201C, and the captured image information acquired by the HMD 201C, are output to the HMD 201A. Then, in the entire video display area of the video display device 202A of the HMD 201A (corresponding to worker PA's field of view), the marker MC indicating worker PC's line-of-sight position is displayed at the position corresponding to worker PC's gaze point on the video V(202C) (see (2) in FIG. 24). The captured image acquired by the HMD 201C is displayed as video V(202A) by the video display device 202A of the HMD 201A (see FIG. 24).
 The line-of-sight information indicating worker PC's line-of-sight position detected by the HMD 201C is also output to the HMD 201D. On the HMD 201D, the video display device 202D accordingly displays the marker MC indicating worker PC's line-of-sight position in the video V(202D)-A (see (3) in FIG. 23). Since the video V(202C) and the video V(202D)-A are both images captured by the HMD 201A, displaying the marker MC at the position in the video V(202D)-A corresponding to worker PC's gaze point on the video V(202C) is easy to realize.
 If the instruction supervisor PD looks at the marker MC in the video V(202D)-A and judges that worker PC's instruction is wrong, the instruction supervisor PD issues the voice instruction "No, you should loosen this screw first!" while looking at the screw to be loosened in the video V(202D)-A. The instruction supervisor PD's line-of-sight direction with respect to the video V(202D)-A is detected by the line-of-sight detection device 32 of the HMD 201D. The video display device 202D then displays the marker MD indicating the instruction supervisor PD's line-of-sight position in the video V(202D)-A (see (4) in FIG. 23).
 The line-of-sight information on the instruction supervisor PD's line-of-sight direction is output to the HMD 201A. On the HMD 201A, the marker MD indicating the instruction supervisor PD's line-of-sight position is accordingly displayed at the position, in the entire video display area of the video display device 202A (corresponding to worker PA's field of view), corresponding to the instruction supervisor PD's gaze point on the video V(202D)-A (see (5) in FIG. 24). Whether the instruction supervisor PD is viewing the video V(202D)-A or the video V(202D)-C can be determined from the line-of-sight detection result of the line-of-sight detection device 32 of the HMD 201D. Therefore, when the instruction supervisor PD is viewing the video V(202D)-A, the instruction supervisor PD's line-of-sight information can be output to the HMD 201A, which is the provider of the video V(202D)-A.
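The routing decision here, which displayed video the supervisor's gaze falls on determines which terminal receives the gaze information, can be sketched as a point-in-rectangle test over the supervisor's display layout. The function name, the side-by-side layout, and the region table below are illustrative assumptions, not taken from the patent.

```python
def route_gaze(gaze_xy, regions):
    """Decide which source terminal should receive the supervisor's
    line-of-sight information, based on which displayed video region
    the detected gaze point falls in. `regions` maps a terminal id to
    the (x, y, w, h) rectangle of its video on the supervisor's
    display."""
    x, y = gaze_xy
    for terminal, (rx, ry, rw, rh) in regions.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return terminal  # the provider of the video being viewed
    return None              # gaze falls outside both videos

# Hypothetical layout: the two videos side by side on the HMD 201D display.
regions = {"HMD201A": (0, 0, 640, 720), "HMD201C": (640, 0, 640, 720)}
```

For example, `route_gaze((100, 100), regions)` selects `"HMD201A"`, so the gaze information would be sent to the terminal that provided video V(202D)-A.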
 By viewing the marker MD displayed on his own HMD 201A, worker PA proceeds to loosen not the screw indicated by worker PC but the screw indicated by the instruction supervisor PD. At this time, worker PA makes the voice request "Please support this cover!" while looking at the part to be held in the video V(202A). Worker PA's line-of-sight direction with respect to the video V(202A) is detected by the line-of-sight detection device 32 of the HMD 201A. The video display device 202A then displays the marker MA indicating worker PA's line-of-sight position in the video V(202A) together with the captured image from the HMD 201C (see (6) in FIG. 24).
 Thereafter, line-of-sight information indicating worker PA's line-of-sight position detected by the HMD 201A is output to the HMD 201C. Then, in the entire video display area of the video display device 202C of the HMD 201C (corresponding to worker PC's field of view), the marker MA indicating worker PA's line-of-sight position is displayed at the position corresponding to worker PA's gaze point on the video V(202A) (see (7) in FIG. 25).
 The line-of-sight information indicating worker PA's line-of-sight position detected by the HMD 201A is also output to the HMD 201D. On the HMD 201D, the video display device 202D accordingly displays the marker MA indicating worker PA's line-of-sight position in the video V(202D)-C (see (8) in FIG. 23). Since the video V(202A) and the video V(202D)-C are both images captured by the HMD 201C, displaying the marker MA at the position in the video V(202D)-C corresponding to worker PA's gaze point on the video V(202A) is easy to realize.
 If the instruction supervisor PD looks at the marker MA in the video V(202D)-C and judges that worker PA's instruction is wrong, the instruction supervisor PD issues the voice instruction "No, hold not only that cover but this cover as well!" while looking at the cover to be held in the video V(202D)-C. The instruction supervisor PD's line-of-sight direction with respect to the video V(202D)-C is detected by the line-of-sight detection device 32 of the HMD 201D. The video display device 202D then displays the marker MD indicating the instruction supervisor PD's line-of-sight position in the video V(202D)-C (see (9) in FIG. 23).
 The line-of-sight information on the instruction supervisor PD's line-of-sight direction is output to the HMD 201C. On the HMD 201C, the marker MD indicating the instruction supervisor PD's line-of-sight position is accordingly displayed at the position, in the entire video display area of the video display device 202C (corresponding to worker PC's field of view), corresponding to the instruction supervisor PD's gaze point on the video V(202D)-C (see (10) in FIG. 25). Worker PC can thereby hold the cover designated by the marker MD and appropriately assist worker PA in loosening the screw. As described above, whether the instruction supervisor PD is viewing the video V(202D)-A or the video V(202D)-C can be determined from the line-of-sight detection result of the line-of-sight detection device 32 of the HMD 201D, so when the instruction supervisor PD is viewing the video V(202D)-C, the instruction supervisor PD's line-of-sight information can be output to the HMD 201C, which is the provider of the video V(202D)-C.
 As described above, by mutually sharing the line-of-sight information of worker PA, worker PC, and the instruction supervisor PD among the three information input/output terminals, the HMDs 201A, 201C, and 201D, communication is good and collaborative work can be performed efficiently without interrupting manual work. In particular, by further utilizing voice information in addition to line-of-sight information, even more accurate communication becomes possible, and even more efficient work becomes possible based on that communication.
 In the present embodiment, an example has been described in which the head-mountable HMD 201D is used as the information input/output terminal 200 of the instruction supervisor PD. However, when the instruction supervisor PD does not particularly require hands-free operation or mobility, the information input/output terminal 200 may take another form.
 FIG. 26 is an explanatory diagram showing another configuration of the information input/output terminal 200. As shown in the figure, the information input/output terminal 200 may include at least a display device 200a and a line-of-sight detection device 32 fixed to the display device 200a. The display device 200a is a direct-view display (monitor), such as a television set installed in a room. The line-of-sight detection device 32 detects the line-of-sight direction of the instruction supervisor PD observing the video V(200)-A and the video V(200)-C displayed on the display device 200a. The position of the line-of-sight detection device 32 relative to the display device 200a is identified. A line-of-sight detection device 32 of this form has also been proposed by the aforementioned Tobii Technology and can be used here. The video V(200)-A is a captured image of the HMD 201A, acquired by the HMD 201A of worker PA and input to and displayed on the information input/output terminal 200. The video V(200)-C is a captured image of the HMD 201C, acquired by the HMD 201C of worker PC and input to and displayed on the information input/output terminal 200.
 When, as in the present embodiment, the instruction supervisor PD only gives instructions to worker PA and worker PC (that is, when the instruction supervisor PD does not actually perform manual work on the device OB), communication based on line of sight is possible in the same manner as in FIGS. 23 to 25 by displaying the others' videos on the information input/output terminal 200 of the above configuration, detecting the instruction supervisor PD's observation line of sight with respect to the displayed videos, and outputting that line-of-sight information to the others. Therefore, the information input/output terminal 200 of the instruction supervisor PD does not necessarily have to be an HMD.
 [Embodiment 4]
 Still another embodiment of the present invention is described below with reference to the drawings. The present embodiment describes configurations that are incidental to, or can replace, configurations of the first to third embodiments described above.
 (Parallax correction)
 The video display device 202 of the HMD 201 described in each of the above embodiments is an optical see-through display that allows the user to directly observe the outside world together with the displayed video. In this case, the user (for example, worker PA) can work while directly observing an object (for example, the device OB) in the outside world, which improves workability.
 Here, FIG. 27 schematically shows the relative positional relationship among the video display device 202 (optical see-through display), the imaging device 31, and an object OB' located at infinity. As described in the first embodiment, the imaging optical axis A1 of the imaging device 31 substantially coincides with (is approximately parallel to) the user's line-of-sight direction (observation central axis) A2 when observing the object OB' located at infinity. In such an optical design, when the vertical (up-down) distance D between the video display device 202 and the imaging device 31 is relatively too large with respect to the distance L between the video display device 202 and the object OB', parallax occurs. As described above, for example, the HMD 201A uses the video display device 202A to display the marker MB indicating the instruction supervisor PB's line-of-sight position superimposed on the device OB, which is the object; when this parallax is large, the displayed marker MB deviates from the object (device OB) that worker PA is actually looking at. That is, in the relative positional relationship shown in FIG. 27, the closer the object OB' comes to the video display device 202, the more the display position of the marker MB deviates.
 Therefore, to correct such parallax, the HMD 201 desirably includes a measurement unit 41 and a parallax correction unit 42 in addition to the configuration described above, as shown in FIG. 28. The measurement unit 41 measures the distance between the video display device 202 and the object OB' that the user observes through the see-through display. For example, in the first embodiment, the captured image displayed on the HMD 201B is an image captured by the imaging device 31 of the HMD 201A, and the object OB' is identified by the line-of-sight detection result on the HMD 201B with respect to that captured image; it is therefore easy to measure the distance L to the object OB' with the imaging device 31 of the HMD 201A (for example, the distance L can be measured by adopting a contrast method in which the imaging device 31 itself performs the distance measurement).
 The parallax correction unit 42 calculates the parallax (angle θ) between the line-of-sight direction (observation central axis) A2 of the user observing the object OB' and the imaging optical axis A1 of the imaging device 31 that images the object OB', based on the relationship between the preset relative position of the imaging device 31 with respect to the video display device 202 (corresponding to the distance D in FIG. 27) and the distance L measured by the measurement unit 41, and corrects the position of the displayed line of sight based on the calculated parallax. Such a parallax correction unit 42 can be configured by a CPU, a circuit that performs specific arithmetic processing, or the like.
 For example, in FIG. 27, the distance D is known in advance from the design. When the measurement unit 41 measures the distance L (measurement step), the parallax correction unit 42 calculates the angle θ (°) corresponding to the parallax based on the relationship between the distance D and the distance L (calculation step). That is, from the geometric relationship shown in FIG. 27, tan θ = D/L, so the parallax correction unit 42 can calculate the angle θ from this equation.
 The parallax correction unit 42 corrects the position of the displayed line of sight based on the calculated parallax (correction step). For example, suppose the total vertical angle of view of the display area of the video (virtual image) of the video display device 202 is 10°, the angle θ corresponding to the parallax obtained above is 1°, and the number of vertical display pixels of the video (corresponding to the number of vertical pixels of the display element 5; see FIGS. 3 and 5) is N1. Then the display position of the marker need only be shifted by (1/10) × N1 pixels in the vertical direction of the observed video from its normal position (for example, the position corresponding to the gazing point of the supervising instructor PB in the entire video display area of the video display device 202A). Likewise, when the imaging device 31 is offset in the horizontal (left-right) direction with respect to the video display device 202, the marker display position need only be shifted from its normal position in the horizontal direction of the observed video by a number of pixels calculated in the same manner.
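 The two steps above (computing θ from D and L, then converting it to a pixel shift as a fraction of the angle of view) can be sketched as follows. This is only an illustrative sketch; the function and parameter names are assumptions, not part of the patent.

```python
import math

def parallax_shift_pixels(d, l, fov_deg, n_pixels):
    """Return the marker shift in display pixels, given the vertical
    display-to-camera offset D and the measured object distance L
    (same units), for a display whose vertical angle of view is
    fov_deg degrees and whose display element has n_pixels vertical pixels."""
    theta_deg = math.degrees(math.atan(d / l))  # calculation step: tan(theta) = D / L
    return (theta_deg / fov_deg) * n_pixels     # correction step: proportional share of the angle of view
```

 With the numbers in the text (θ = 1°, a 10° vertical angle of view, N1 vertical pixels), this yields a shift of (1/10) × N1 pixels, matching the example above.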
 In this way, even when parallax occurs depending on the distance between the video display device 202 and the object OB', the parallax is calculated and the position of the displayed line of sight is corrected, so the line of sight can be displayed at the appropriate position in the displayed video. In other words, a situation in which the line of sight is displayed deviating from the intended position on the object OB' (the position at which work is being instructed) is avoided, and appropriate communication via the line of sight can be reliably achieved.
 (Other configuration examples of the HMD)
 FIG. 29 is a cross-sectional view showing another optical configuration of the HMD 201 of each embodiment described above. As shown in the figure, a half mirror 24 may be provided on the surface of the HOE 23 on the deflection prism 22 side, and the imaging device 31 may be arranged so that the imaging optical axis A1 is bent by the half mirror 24 and becomes coaxial with the line-of-sight direction (observation central axis) A2. In this configuration, no parallax occurs, so a parallax correction mechanism such as the measurement unit 41 and parallax correction unit 42 described above is unnecessary, and the configuration of the device can be simplified accordingly.
 FIG. 30 is a cross-sectional view showing yet another optical configuration of the HMD 201. As shown in the figure, the video display device 202 of the HMD 201 may be a video see-through display. That is, in the HMD 201 described in each of the above embodiments, the optical see-through display constituting the video display device 202 may be replaced with a video see-through display. A video see-through display is a display device that uses an occluding display and displays the image of the outside world (captured image) captured by the imaging device 31 integrally with the video so as to be visible to the user. The occluding display can be realized by arranging a shielding plate 25 on the outside-world side of the eyepiece optical system 6 described above.
 Even when the video display device 202 is configured as a video see-through display in this way, communication can be achieved through the line-of-sight display. In addition, the worker can work on an object in the outside world while observing the captured image.
 FIG. 31 is a cross-sectional view showing yet another optical configuration of the HMD 201. As shown in the figure, when the video display device 202 is configured as a video see-through display, the imaging device 31 may be arranged on the outside-world side of the shielding plate 25 (the side of the shielding plate 25 opposite the eyepiece optical system 6) so that the imaging optical axis A1 is coaxial with the line-of-sight direction (observation central axis) A2. In this configuration, no parallax occurs, so a parallax correction mechanism such as the measurement unit 41 and parallax correction unit 42 described above is unnecessary, and the configuration of the device can be simplified accordingly. Furthermore, in a video see-through display the shielding plate 25 is positioned in front of the user, so even if the imaging device 31 is arranged further toward the outside world than the shielding plate 25, the imaging device 31 is not seen by the user and does not obstruct the user's observation of the outside world (video). A configuration in which the imaging device 31 is arranged on the outside-world side of the shielding plate 25 so that the imaging optical axis A1 and the line-of-sight direction A2 are coaxial can therefore be realized easily.
 (Other)
 It is of course also possible to combine the configurations and methods of the embodiments described above as appropriate. For example, in the systems described in Embodiments 2 and 3, the input unit 39 described in Embodiment 1 may be applied so that the line-of-sight information is output from the HMD at an arbitrary timing.
 In the above, an example in which at most three people work in cooperation has been described; however, even when four or more people work together, it is of course possible to apply the systems, configurations, and methods described in each embodiment so that the participants mutually share line-of-sight information and work in cooperation.
 From the viewpoint of grasping the worker's work state and work environment and thereby improving workability, the information input/output terminal of the present embodiment may include a plurality of sensors such as a GPS (Global Positioning System) sensor, a geomagnetic sensor, an acceleration sensor, and a temperature sensor. For example, the GPS sensor acquires the worker's position information. The geomagnetic sensor detects the direction the worker is facing. The acceleration sensor detects the worker's movement (posture). The temperature sensor detects the temperature of the work environment or the worker's own body temperature. By transmitting the information acquired by these sensors to an external input/output terminal via the communication unit 34, the supervising instructor can grasp the work environment from the outside and, as necessary, give appropriate instructions (for example, to stop the work) to the worker (HMD).
 Although embodiments of the present invention have been described above, the scope of the present invention is not limited thereto, and the invention can be carried out with various modifications without departing from its spirit.
 The line-of-sight information sharing method and line-of-sight information sharing system described above can be expressed as follows, and thereby provide the following effects.
 The line-of-sight information sharing method described in the embodiments of the present invention is a method for sharing a user's line-of-sight information among a plurality of information input/output terminals connected via a communication line. Each of the plurality of information input/output terminals includes a display device that presents video to the user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for mutually inputting and outputting information. At least one of the plurality of information input/output terminals is a head-mounted terminal that is worn on the user's head and displays the video as a virtual image visible to the user; in addition to the display device, the line-of-sight detection device, and the communication unit, the head-mounted terminal further includes an imaging device whose position is fixed relative to the display device and which images the outside world in front of the user. When one of the head-mounted terminals among the plurality of information input/output terminals is taken as a first terminal and at least one other information input/output terminal is taken as a second terminal, the line-of-sight information sharing method includes: a first step of outputting information of the captured image acquired by the imaging device of the first terminal to the second terminal; a second step of displaying, at the second terminal, the captured image visibly to the user of the second terminal based on the input information of the captured image; a third step of detecting the line-of-sight direction of the user of the second terminal with respect to the displayed captured image; a fourth step of outputting line-of-sight information regarding the detected line-of-sight direction of the user of the second terminal to the first terminal; and a fifth step of displaying, based on the input line-of-sight information, the position of the line of sight of the user of the second terminal at the corresponding position within the display area of the video on the first terminal.
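 The five steps above can be sketched as a minimal message flow between the two terminals. This is a hypothetical illustration only; the class and method names are assumptions, and the real terminals exchange this data over the communication line.

```python
from dataclasses import dataclass

@dataclass
class GazeInfo:
    x: float  # normalized horizontal gaze position in the captured image (0..1)
    y: float  # normalized vertical gaze position in the captured image (0..1)

class FirstTerminal:
    """Head-mounted terminal whose camera is rigidly fixed to its display."""
    def capture(self):
        return "captured-image"  # step 1: captured image sent to the second terminal
    def show_remote_gaze(self, gaze):
        # step 5: because camera and display have a fixed relative position,
        # normalized image coordinates map directly into the display area
        return ("marker", gaze.x, gaze.y)

class SecondTerminal:
    def display(self, image):    # step 2: show the received captured image
        self.shown = image
    def detect_gaze(self):       # step 3: detect gaze on the displayed image
        return GazeInfo(0.5, 0.5)  # e.g. the user looks at the image center

t1, t2 = FirstTerminal(), SecondTerminal()
t2.display(t1.capture())         # steps 1-2
gaze = t2.detect_gaze()          # steps 3-4 (gaze info sent back to t1)
marker = t1.show_remote_gaze(gaze)  # step 5: marker drawn at the matching position
```

 The key point the sketch reflects is that no coordinate transformation between the two users' spatial positions is needed: the second user's gaze coordinates on the shared image are meaningful on the first terminal's display as-is.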
 The line-of-sight information sharing system described in the embodiments of the present invention is a system for sharing a user's line-of-sight information among a plurality of information input/output terminals connected via a communication line. Each of the plurality of information input/output terminals includes a display device that presents video to the user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for mutually inputting and outputting information. At least one of the plurality of information input/output terminals is a head-mounted terminal that is worn on the user's head and displays the video as a virtual image visible to the user; in addition to the display device and the line-of-sight detection device, the head-mounted terminal further includes an imaging device whose position is fixed relative to the display device and which images the outside world in front of the user. When one of the head-mounted terminals among the plurality of information input/output terminals is taken as a first terminal and at least one other information input/output terminal is taken as a second terminal, the first terminal outputs information of the captured image acquired by its imaging device to the second terminal via its communication unit; the second terminal displays, with its display device, the captured image visibly to the user of the second terminal based on the input information of the captured image, detects, with its line-of-sight detection device, the line-of-sight direction of the user of the second terminal with respect to the displayed captured image, and then outputs line-of-sight information regarding that line-of-sight direction to the first terminal via its communication unit; and the first terminal displays, with its display device and based on the input line-of-sight information, the position of the line of sight of the user of the second terminal at the corresponding position within the display area of the video on the first terminal.
 According to the above method and system, the information of the captured image acquired by the imaging device of the first terminal (HMD) is output to the second terminal via the communication unit. At the second terminal, the captured image is displayed by the display device, and the line-of-sight direction of the user of the second terminal with respect to the displayed captured image is detected by the line-of-sight detection device. The detected line-of-sight information is output from the second terminal to the first terminal via the communication unit. The line-of-sight information of the user of the second terminal is thereby shared between the first terminal and the second terminal. At the first terminal, the display device displays the position of the line of sight of the user of the second terminal based on that line-of-sight information.
 Here, in the first terminal, the position of the imaging device is fixed relative to the display device, and the captured image displayed at the second terminal is the captured image originally acquired at the first terminal. For this reason, the gazing point of the user of the second terminal on the captured image displayed at the second terminal corresponds to a position within the display area of the video (virtual image) of the display device at the first terminal, whose position relative to the imaging device is fixed. Therefore, without requiring complicated processing such as grasping the mutual spatial positions of the users of the first and second terminals and converting the position coordinates of the line of sight, the first terminal can immediately display the position of the line of sight of the user of the second terminal at the predetermined position (the position corresponding to the gazing point of the user of the second terminal), and the users can communicate with each other based on that display. That is, in collaborative work using a plurality of information input/output terminals, the users can communicate with each other based on the line of sight by a simple method, without requiring the above complicated processing.
 Also, for example, even in a situation where, during collaborative work on an object, the user of the second terminal cannot directly see the object, the user of the second terminal can grasp the object based on the captured image output from the first terminal. Then, by outputting the line-of-sight information of the user of the second terminal on that captured image to the first terminal and displaying the line-of-sight position of the user of the second terminal at the first terminal, the user of the first terminal can work on the object based on the displayed line of sight. That is, even in a situation where one of the users (the user of the second terminal in the above example) cannot see the object, accurate communication based on the line of sight is possible.
 The above line-of-sight information sharing method may further include a sixth step of detecting the line-of-sight direction of the user of the first terminal, and in the fifth step the position of the line of sight of the user of the first terminal, based on the detection result of the sixth step, may be displayed together with the position of the line of sight of the user of the second terminal. Similarly, in the above line-of-sight information sharing system, the first terminal may detect the line-of-sight direction of the user of the first terminal with its line-of-sight detection device and, based on the detection result of that device, display with its display device the position of the line of sight of the user of the first terminal together with the position of the line of sight of the user of the second terminal.
 By displaying, at the first terminal, the position of the line of sight of the user of the first terminal together with the position of the line of sight of the user of the second terminal, the user of the first terminal can communicate with the user of the second terminal more accurately based on the lines of sight.
 The above line-of-sight information sharing method may further include a seventh step of outputting line-of-sight information regarding the line-of-sight direction of the user of the first terminal to the second terminal, and an eighth step of displaying, at the second terminal, on the captured image acquired in the first step, the position of the line of sight of the user of the first terminal based on that line-of-sight information together with the position of the line of sight of the user of the second terminal based on the detection result of the third step. Similarly, in the above line-of-sight information sharing system, the first terminal may output line-of-sight information regarding the line-of-sight direction of the user of the first terminal to the second terminal via the communication unit of the first terminal, and the second terminal may display, with its display device, on the captured image, the position of the line of sight of the user of the first terminal based on that line-of-sight information together with the position of the line of sight of the user of the second terminal.
 By displaying, at the second terminal, on the captured image output from the first terminal, the position of the line of sight of the user of the first terminal together with the position of the line of sight of the user of the second terminal, the user of the second terminal can communicate with the user of the first terminal more accurately based on the lines of sight.
 In the above line-of-sight information sharing method and line-of-sight information sharing system, the position of the line of sight of the user of the first terminal and the position of the line of sight of the user of the second terminal are desirably displayed in different patterns. In this case, the two line-of-sight positions can easily be distinguished and grasped, enabling quick and accurate communication.
 In the above line-of-sight information sharing method, the output of the captured image information, the display of the captured image, the detection of the line-of-sight direction with respect to the captured image, the output of the line-of-sight information, and the display of the line-of-sight position may be performed mutually between at least two of the information input/output terminals. Likewise, in the above line-of-sight information sharing system, these operations may be performed mutually between at least two of the information input/output terminals. In this case, at least two information input/output terminals can mutually share line-of-sight information and carry out collaborative work efficiently.
 In the above line-of-sight information sharing method and line-of-sight information sharing system, it is desirable that the other information input/output terminal output the line-of-sight information regarding its user's line-of-sight direction to the one information input/output terminal only when the line of sight of the user of the other information input/output terminal is located within the display area of the captured image output from the one information input/output terminal to the other information input/output terminal.
 By outputting the line-of-sight information of the user of the other information input/output terminal to the one information input/output terminal only when the user of the other information input/output terminal is looking at the captured image, and communication based on the line of sight with the user of the one information input/output terminal is therefore actually needed, the output of unnecessary (wasted) information to the one information input/output terminal is reduced and the system load can be lowered.
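 This gating condition can be sketched as a simple bounds check; the function name and the screen-coordinate convention below are illustrative assumptions, not from the patent.

```python
def should_send_gaze(gaze_x, gaze_y, region):
    """Return True only while the user's detected gaze point falls inside
    the display region of the shared captured image, so that gaze
    information is transmitted only when it is actually meaningful."""
    x0, y0, x1, y1 = region  # display-area bounds of the captured image, in screen coordinates
    return x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1

# e.g. the captured image occupies the rectangle (100, 100)-(740, 580) on the screen:
# a gaze at (300, 200) would be sent; a gaze at (50, 200) would be suppressed.
```

 Suppressing transmission outside the region keeps the communication line free of gaze samples that the receiving terminal could not relate to the shared image.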
 In the above line-of-sight information sharing method and line-of-sight information sharing system, the other information input/output terminal may further include an input unit for designating the timing at which the line-of-sight information is output, and may output the line-of-sight information of its user to the one information input/output terminal at the timing designated by the input unit.
 For example, at the other information input/output terminal, when its user is looking at the display area of the captured image, the user designates the output timing of the line-of-sight information with the input unit, and at that timing the line-of-sight information is output from the other information input/output terminal to the one information input/output terminal. A configuration in which the other information input/output terminal outputs line-of-sight information to the one information input/output terminal only when its user's line of sight is located within the display area of the captured image can therefore be realized reliably. Moreover, the line-of-sight information output from the other information input/output terminal is then guaranteed to be line-of-sight information obtained while its user was looking at the display area of the captured image, so simple and highly accurate communication can be realized using that information.
 In the above line-of-sight information sharing method and line-of-sight information sharing system, each of the plurality of information input/output terminals may further include an input device and an output device for audio information, and the audio information may be input and output between the plurality of information input/output terminals. Since communication using audio information in addition to line-of-sight information becomes possible, communication becomes even simpler. In addition, since the content of the line-of-sight information can be supplemented with audio information, the accuracy of communication is further improved.
 In the above line-of-sight information sharing method and line-of-sight information sharing system, the display device of the head-mounted terminal may be an optical see-through display that allows the user to directly observe the outside world together with the video. In this case, the user can work while directly observing the object in the outside world, and workability can be improved.
 The above line-of-sight information sharing method may further include a step of measuring the distance between the display device of the head-mounted terminal and the object that the user observes in see-through; a step of calculating, based on the relationship between the preset relative position of the imaging device with respect to the display device and that distance, the parallax between the observation line of sight of the user observing the object and the imaging optical axis of the imaging device that images the object; and a step of correcting the position of the displayed line of sight based on the calculated parallax. Similarly, in the above line-of-sight information sharing system, the head-mounted terminal may further include a measurement unit that measures the distance between the display device and the object that the user observes in see-through, and a parallax correction unit that calculates, based on the relationship between the preset relative position of the imaging device with respect to the display device and that distance, the parallax between the observation line of sight of the user observing the object and the imaging optical axis of the imaging device that images the object, and corrects the position of the displayed line of sight based on the calculated parallax.
 Even when parallax arises between the user's observation line of sight and the imaging optical axis of the imaging device depending on the distance between the display device and the object, the displayed position of the line of sight is corrected based on that parallax, so the line of sight can be shown at the appropriate position in the displayed image.
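As a purely illustrative sketch (not part of the patent disclosure; the function name, parameters, and units are assumptions), the parallax correction described above can be modeled as follows: the camera's fixed offset from the observation axis subtends a smaller angle as the measured object distance grows, and that angle, converted to display pixels, is the shift applied to the displayed gaze position.

```python
import math

def parallax_correction(gaze_xy, distance_m, cam_offset_m, px_per_deg):
    """Shift a displayed gaze point to compensate for the parallax
    between the user's observation line of sight and the camera axis.

    gaze_xy      : (x, y) gaze position in display pixels
    distance_m   : measured distance from the display to the observed object
    cam_offset_m : (dx, dy) camera position relative to the observation
                   axis, in meters (fixed by the headset design)
    px_per_deg   : display resolution in pixels per degree of view
    """
    # Angular parallax per axis: the angle subtended by the camera
    # offset at the measured object distance.
    ang_x = math.degrees(math.atan2(cam_offset_m[0], distance_m))
    ang_y = math.degrees(math.atan2(cam_offset_m[1], distance_m))
    # Convert the angular error to a pixel shift and correct.
    return (gaze_xy[0] - ang_x * px_per_deg,
            gaze_xy[1] - ang_y * px_per_deg)
```

Note how the correction vanishes for distant objects, which is why a distance measurement step is needed at all: a fixed offset cannot be compensated by a constant shift.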
 In the above line-of-sight information sharing method and line-of-sight information sharing system, the display device of the head-mounted terminal may be a video see-through display whose front is shielded and which displays a captured image of the outside world integrally with the image so as to be visible to the user. Even though the user cannot observe the object directly through the display, the user can grasp the object by observing the captured image of the outside world, and can therefore work on the object.
 In the above line-of-sight information sharing method and line-of-sight information sharing system, the display device may include a display element that displays the image, and, with the axis optically connecting the center of the display surface of the display element and the center of the user's pupil during image observation defined as the observation center axis, the imaging device may be arranged such that the observation center axis and the imaging optical axis are coaxial. Since no parallax arises between the observation center axis and the imaging optical axis, no parallax-correcting mechanism is needed to improve the positional accuracy of the displayed line of sight, and the configuration of the device can be simplified accordingly.
 The present invention is applicable to systems that share users' line-of-sight information between a plurality of information input/output terminals connected via a communication line.
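As an illustrative sketch only (the data structure and function names are assumptions, not part of the disclosure), the gaze-position exchange underlying such a system can be expressed with normalized image coordinates, so that terminals with different display resolutions can each map a shared gaze position to the corresponding point of their own display area:

```python
from dataclasses import dataclass

# Gaze positions are exchanged in normalized image coordinates
# (0..1 on each axis) so that terminals with different display
# resolutions can map them into their own display areas.
@dataclass
class GazeInfo:
    user_id: str
    x: float  # horizontal position, 0.0 (left) .. 1.0 (right)
    y: float  # vertical position, 0.0 (top) .. 1.0 (bottom)

def to_display_px(gaze: GazeInfo, width_px: int, height_px: int):
    """Map a shared gaze position into a terminal's display area,
    i.e. show the remote user's gaze at the position of the local
    image that corresponds to where that user was looking."""
    return (gaze.x * width_px, gaze.y * height_px)
```

The same `GazeInfo` record could carry the gaze of either terminal's user, with the `user_id` field selecting a distinct display pattern for each user.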
   5   Display element
  31   Imaging device
  32   Line-of-sight detection device
  36   Audio input device (input device)
  37   Audio output device (output device)
  39   Input unit
  41   Measurement unit
  42   Parallax correction unit
 100   Line-of-sight information sharing system
 200   Information input/output terminal
 200a  Display device
 201, 201A, 201B, 201C, 201D   HMD (head-mounted terminal)
 202, 202A, 202B, 202C, 202D   Video display device (display device)
 300   Communication line

Claims (24)

  1.  A line-of-sight information sharing method for sharing line-of-sight information of users between a plurality of information input/output terminals connected via a communication line, wherein
     each of the plurality of information input/output terminals includes a display device that presents an image to a user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for mutually inputting and outputting information;
     at least one of the plurality of information input/output terminals is a head-mounted terminal that is worn on the user's head and displays the image as a virtual image visible to the user;
     the head-mounted terminal further includes, in addition to the display device, the line-of-sight detection device, and the communication unit, an imaging device whose position is fixed relative to the display device and which images the outside world in front of the user; and
     with one of the head-mounted terminals among the plurality of information input/output terminals taken as a first terminal and at least one other information input/output terminal taken as a second terminal,
     the line-of-sight information sharing method comprises:
     a first step of outputting information on a captured image acquired by the imaging device of the first terminal to the second terminal;
     a second step of displaying, at the second terminal, the captured image visibly to the user of the second terminal based on the input captured-image information;
     a third step of detecting the line-of-sight direction of the user of the second terminal with respect to the displayed captured image;
     a fourth step of outputting line-of-sight information on the detected line-of-sight direction of the user of the second terminal to the first terminal; and
     a fifth step of displaying, based on the input line-of-sight information, the position of the line of sight of the user of the second terminal at the corresponding position within the display area of the image on the first terminal.
  2.  The line-of-sight information sharing method according to claim 1, further comprising a sixth step of detecting the line-of-sight direction of the user of the first terminal,
     wherein, in the fifth step, the position of the line of sight of the user of the first terminal, based on the detection result of the sixth step, is displayed together with the position of the line of sight of the user of the second terminal.
  3.  The line-of-sight information sharing method according to claim 2, further comprising:
     a seventh step of outputting line-of-sight information on the line-of-sight direction of the user of the first terminal to the second terminal; and
     an eighth step of displaying, at the second terminal, on the captured image acquired in the first step, the position of the line of sight of the user of the first terminal based on that line-of-sight information together with the position of the line of sight of the user of the second terminal based on the detection result of the third step.
  4.  The line-of-sight information sharing method according to claim 2 or 3, wherein the position of the line of sight of the user of the first terminal and the position of the line of sight of the user of the second terminal are displayed in different patterns.
  5.  The line-of-sight information sharing method according to any one of claims 1 to 4, wherein the output of the captured-image information, the display of the captured image, the detection of the line-of-sight direction with respect to the captured image, the output of the line-of-sight information, and the display of the position of the line of sight are performed mutually between at least two of the information input/output terminals.
  6.  The line-of-sight information sharing method according to any one of claims 1 to 5, wherein, only when the line of sight of the user of another information input/output terminal is located within the display area of the captured image output from one information input/output terminal to the other information input/output terminal, the other information input/output terminal outputs line-of-sight information on the line-of-sight direction of that user to the one information input/output terminal.
  7.  The line-of-sight information sharing method according to claim 6, wherein the other information input/output terminal further includes an input unit for designating the timing at which the line-of-sight information is output, and outputs the line-of-sight information of its user to the one information input/output terminal at the timing designated via the input unit.
  8.  The line-of-sight information sharing method according to any one of claims 1 to 7, wherein each of the plurality of information input/output terminals further includes an input device and an output device for audio information, and
     the audio information is input and output between the plurality of information input/output terminals.
  9.  The line-of-sight information sharing method according to any one of claims 1 to 8, wherein the display device of the head-mounted terminal is an optical see-through display that allows the user to observe the outside world directly together with the image.
  10.  The line-of-sight information sharing method according to claim 9, further comprising:
     a step of measuring the distance between the display device of the head-mounted terminal and an object that the user observes through the see-through display;
     a step of calculating, based on the relationship between the preset position of the imaging device relative to the display device and the measured distance, the parallax between the observation line of sight of the user observing the object and the imaging optical axis of the imaging device imaging the object; and
     a step of correcting the displayed position of the line of sight based on the calculated parallax.
  11.  The line-of-sight information sharing method according to any one of claims 1 to 8, wherein the display device of the head-mounted terminal is a video see-through display whose front is shielded and which displays a captured image of the outside world integrally with the image so as to be visible to the user.
  12.  The line-of-sight information sharing method according to claim 9 or 11, wherein the display device includes a display element that displays the image, and,
     with the axis optically connecting the center of the display surface of the display element and the center of the user's pupil during image observation defined as the observation center axis,
     the imaging device is arranged such that the observation center axis and the imaging optical axis are coaxial.
  13.  A line-of-sight information sharing system for sharing line-of-sight information of users between a plurality of information input/output terminals connected via a communication line, wherein
     each of the plurality of information input/output terminals includes a display device that presents an image to a user, a line-of-sight detection device that detects the user's line-of-sight direction, and a communication unit for mutually inputting and outputting information;
     at least one of the plurality of information input/output terminals is a head-mounted terminal that is worn on the user's head and displays the image as a virtual image visible to the user;
     the head-mounted terminal further includes, in addition to the display device and the line-of-sight detection device, an imaging device whose position is fixed relative to the display device and which images the outside world in front of the user; and
     with one of the head-mounted terminals among the plurality of information input/output terminals taken as a first terminal and at least one other information input/output terminal taken as a second terminal,
     the first terminal outputs information on a captured image acquired by the imaging device of the first terminal to the second terminal via the communication unit of the first terminal;
     the second terminal displays, by the display device of the second terminal, the captured image visibly to the user of the second terminal based on the input captured-image information, detects, by the line-of-sight detection device of the second terminal, the line-of-sight direction of the user of the second terminal with respect to the displayed captured image, and then outputs line-of-sight information on that line-of-sight direction to the first terminal via the communication unit of the second terminal; and
     the first terminal displays, by the display device of the first terminal and based on the input line-of-sight information, the position of the line of sight of the user of the second terminal at the corresponding position within the display area of the image on the first terminal.
  14.  The line-of-sight information sharing system according to claim 13, wherein the first terminal detects the line-of-sight direction of the user of the first terminal by the line-of-sight detection device of the first terminal, and displays, by the display device of the first terminal, the position of the line of sight of the user of the first terminal, based on the detection result of the line-of-sight detection device, together with the position of the line of sight of the user of the second terminal.
  15.  The line-of-sight information sharing system according to claim 14, wherein the first terminal outputs line-of-sight information on the line-of-sight direction of the user of the first terminal to the second terminal via the communication unit of the first terminal, and
     the second terminal displays, by the display device of the second terminal, on the captured image, the position of the line of sight of the user of the first terminal based on that line-of-sight information together with the position of the line of sight of the user of the second terminal.
  16.  The line-of-sight information sharing system according to claim 14 or 15, wherein the position of the line of sight of the user of the first terminal and the position of the line of sight of the user of the second terminal are displayed in different patterns.
  17.  The line-of-sight information sharing system according to any one of claims 13 to 16, wherein the output of the captured-image information, the display of the captured image, the detection of the line-of-sight direction with respect to the captured image, the output of the line-of-sight information, and the display of the position of the line of sight are performed mutually between at least two of the information input/output terminals.
  18.  The line-of-sight information sharing system according to any one of claims 13 to 17, wherein, only when the line of sight of the user of another information input/output terminal is located within the display area of the captured image output from one information input/output terminal to the other information input/output terminal, the other information input/output terminal outputs line-of-sight information on the line-of-sight direction of that user to the one information input/output terminal.
  19.  The line-of-sight information sharing system according to claim 18, wherein the other information input/output terminal further includes an input unit for designating the timing at which the line-of-sight information is output, and outputs the line-of-sight information of its user to the one information input/output terminal at the timing designated via the input unit.
  20.  The line-of-sight information sharing system according to any one of claims 13 to 19, wherein each of the plurality of information input/output terminals further includes an input device and an output device for audio information, and
     the audio information is input and output between the plurality of information input/output terminals.
  21.  The line-of-sight information sharing system according to any one of claims 13 to 20, wherein the display device of the head-mounted terminal is an optical see-through display that allows the user to observe the outside world directly together with the image.
  22.  The line-of-sight information sharing system according to claim 21, wherein the head-mounted terminal further includes:
     a measurement unit that measures the distance between the display device and an object that the user observes through the see-through display; and
     a parallax correction unit that calculates, based on the relationship between the preset position of the imaging device relative to the display device and the measured distance, the parallax between the observation line of sight of the user observing the object and the imaging optical axis of the imaging device imaging the object, and corrects the displayed position of the line of sight based on the calculated parallax.
  23.  The line-of-sight information sharing system according to any one of claims 13 to 20, wherein the display device of the head-mounted terminal is a video see-through display whose front is shielded and which displays a captured image of the outside world integrally with the image so as to be visible to the user.
  24.  The line-of-sight information sharing system according to claim 21 or 23, wherein the display device includes a display element that displays the image, and,
     with the axis optically connecting the center of the display surface of the display element and the center of the user's pupil during image observation defined as the observation center axis,
     the imaging device is arranged such that the observation center axis and the imaging optical axis are coaxial.
PCT/JP2018/002142 2017-02-07 2018-01-24 Line-of-sight information sharing method and line-of-sight information sharing system WO2018147084A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-020535 2017-02-07
JP2017020535 2017-02-07

Publications (1)

Publication Number Publication Date
WO2018147084A1 true WO2018147084A1 (en) 2018-08-16

Family

ID=63107995

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/002142 WO2018147084A1 (en) 2017-02-07 2018-01-24 Line-of-sight information sharing method and line-of-sight information sharing system

Country Status (1)

Country Link
WO (1) WO2018147084A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015041197A (en) * 2013-08-21 2015-03-02 三菱電機株式会社 Display control device
JP2015132787A (en) * 2014-01-16 2015-07-23 コニカミノルタ株式会社 Spectacle type display device
JP2016181245A (en) * 2015-03-24 2016-10-13 富士ゼロックス株式会社 Attention/comment method and computing device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020035191A (en) * 2018-08-30 2020-03-05 株式会社安川電機 Food product processing system
US11238861B2 (en) 2018-08-30 2022-02-01 Kabushiki Kaisha Yaskawa Denki Food processing system
CN113014982A (en) * 2021-02-20 2021-06-22 咪咕音乐有限公司 Video sharing method, user equipment and computer storage medium

Similar Documents

Publication Publication Date Title
US10564919B2 (en) Display system, display apparatus, method for controlling display apparatus, and program
US10579320B2 (en) Display system, display device, information display method, and program
US10874284B2 (en) Display control device, display device, surgical endoscopic system and display control system
US20160284129A1 (en) Display, control method of display, and program
JP6089705B2 (en) Display device and control method of display device
WO2015012280A1 (en) Sight line detection device
CN111295702A (en) Virtual image display device and head-mounted display using the same
JP7087481B2 (en) Head-mounted display device, display control method, and computer program
JP5960799B2 (en) Head mounted display and display method
US9563058B2 (en) Display device, display method, and readable medium
JP2018182570A (en) Line-of-sight information sharing method and line-of-sight information sharing system
JP2018142857A (en) Head mounted display device, program, and control method of head mounted display device
US20160021360A1 (en) Display device, method of controlling display device, and program
JP2018170554A (en) Head-mounted display
WO2018147084A1 (en) Line-of-sight information sharing method and line-of-sight information sharing system
US11061237B2 (en) Display apparatus
JP6540426B2 (en) Display system, display device, information display method, and program
JP2018054976A (en) Head-mounted display device and display control method of head-mounted display device
JP2016090853A (en) Display device, control method of display device and program
WO2012176683A1 (en) Image display device and image display system
JP2017055233A (en) Display device, display system, and control method of display device
JP2016219897A (en) Display device, control method for the same and program
US11635625B2 (en) Virtual image display device and optical unit
JP2018061850A (en) Endoscope system, display control system, and display control device
JP2018101034A (en) Video display device and head-mounted display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18750735

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18750735

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP