WO2018073969A1 - Image display device and image display system - Google Patents


Info

Publication number
WO2018073969A1
WO2018073969A1 (PCT/JP2016/081358)
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
type
user
eye
Prior art date
Application number
PCT/JP2016/081358
Other languages
English (en)
Japanese (ja)
Inventor
貴拓 伊達
康博 中村
Original Assignee
サン電子株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by サン電子株式会社 filed Critical サン電子株式会社
Priority to JP2018546133A priority Critical patent/JP6867566B2/ja
Priority to PCT/JP2016/081358 priority patent/WO2018073969A1/fr
Publication of WO2018073969A1 publication Critical patent/WO2018073969A1/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 - Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 - Control arrangements or circuits characterised by the display of a graphic pattern, with means for controlling the display position

Definitions

  • the technology disclosed in this specification relates to an image display device that is used while being worn on a user's head, and an image display system including the image display device.
  • Patent Document 1 (Japanese Patent Laid-Open No. 2014-93050) discloses an image display device that is used while worn on a user's head.
  • This type of image display device includes a camera that captures an image of a range corresponding to the user's field of view (that is, a real image), a display unit that displays the captured real image, and a computer that combines an object image with the real image displayed on the display unit.
  • In the technique of Patent Document 1, when an object image is combined with a real image and displayed, it can be difficult to adjust the display position of the object image, and the user may feel uncomfortable; for example, the object image may not be seen stereoscopically.
  • This specification discloses a technology that can reduce the user's uncomfortable feeling.
  • A first image display device disclosed in this specification comprises: a first frame that can be worn on the head of a first user; a first right-eye display unit provided on the first frame at a position facing the right eye of the first user wearing the first frame; a first left-eye display unit provided on the first frame at a position facing the left eye of the first user wearing the first frame; a right camera that captures a range corresponding to the visual field of the first user's right eye; a left camera that captures a range corresponding to the visual field of the first user's left eye; and a first control unit.
  • The first control unit generates first-type right image data using at least the right camera image, out of the right camera image captured by the right camera and the left camera image captured by the left camera, and generates first-type left image data using at least the left camera image out of the two camera images.
  • The first control unit displays the first-type right display image represented by the first-type right image data on the first right-eye display unit, and displays the first-type left display image represented by the first-type left image data on the first left-eye display unit.
  • The first control unit further executes a first adjustment process that changes at least one of the display position of the first-type right display image on the first right-eye display unit and the display position of the first-type left display image on the first left-eye display unit, in accordance with a change instruction input by the first user.
  • With this configuration, when a target article is captured by both the right camera and the left camera, the image display device displays the first-type right display image on the first right-eye display unit and the first-type left display image on the first left-eye display unit.
  • The first user wearing the first frame can then visually recognize a first-type display image with both eyes, based on the first-type right display image and the first-type left display image.
  • Since the first user can input a change instruction as appropriate, positional deviation between the portion displayed by the first-type right display image and the portion displayed by the first-type left display image can be eliminated.
  • the “position” here includes a horizontal position, a vertical position, and a depth direction position.
  • Here, "displaying" images (that is, the first-type right display image and the first-type left display image) on the display unit (that is, the first right-eye display unit and the first left-eye display unit) only requires that the user wearing the image display device can see the images when viewing the display unit. That is, it includes the case where the display unit merely reflects projected light and the user recognizes the images by viewing the light reflected by the display unit (that is, by that light reaching the retina). The same applies hereinafter.
  • The first adjustment process may include: displaying, on the first right-eye display unit, a first right mark image and a second right mark image that is visually recognized as being arranged farther from the first user than the first right mark image; displaying, on the first left-eye display unit, a first left mark image and a second left mark image that is visually recognized as being arranged farther from the first user than the first left mark image; and changing at least one of the display positions of the first right mark image and the second right mark image and the display positions of the first left mark image and the second left mark image, so that, as viewed from the first user, the first right mark image and the first left mark image appear to overlap and the second right mark image and the second left mark image appear to overlap.
  • In general, the way a human views an object differs depending on whether the object is at a near position or at a far position.
  • The human eye adjusts to compensate for this difference in appearance when an object at a near position and an object at a far position are viewed at different timings. Therefore, if the adjustment for an image displayed at a near position and the adjustment for an image displayed at a far position were performed at different timings, the first user might find it difficult to view the resulting images three-dimensionally.
  • In the first adjustment process above, the first user can change the display positions so that, as viewed from the first user, the first right mark image and the first left mark image appear to overlap and, at the same time, the second right mark image and the second left mark image appear to overlap. Therefore, according to this configuration, even if the first-type display image includes both an image at a near position and an image at a far position as viewed from the first user, the first-type display image can be displayed on the display unit in a manner that the first user can easily recognize three-dimensionally.
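The two-depth alignment above can be sketched in code. The following is an illustrative sketch only; the patent describes no implementation, and all names here (`EyeOffset`, `adjust`, the direction strings) are hypothetical. Each eye's display offset is nudged one step at a time by the user's change instructions until both the near and the far mark pairs appear to overlap.

```python
# Hypothetical sketch of the first adjustment process: per-eye display
# offsets are changed step by step according to user change instructions.
from dataclasses import dataclass


@dataclass
class EyeOffset:
    dx: int = 0  # horizontal display offset, in pixels
    dy: int = 0  # vertical display offset, in pixels


def apply_change_instruction(offset: EyeOffset, instruction: str) -> EyeOffset:
    """Move one eye's display position by one step per user instruction."""
    moves = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}
    dx, dy = moves[instruction]
    return EyeOffset(offset.dx + dx, offset.dy + dy)


def adjust(right: EyeOffset, left: EyeOffset, instructions):
    """Replay a sequence of (eye, direction) change instructions.

    Because the near and far mark pairs are adjusted in the same session,
    horizontal and vertical alignment are settled together, as the text
    explains.
    """
    for eye, direction in instructions:
        if eye == "right":
            right = apply_change_instruction(right, direction)
        else:
            left = apply_change_instruction(left, direction)
    return right, left


r, l = adjust(EyeOffset(), EyeOffset(),
              [("right", "left"), ("right", "left"), ("left", "up")])
assert r == EyeOffset(-2, 0) and l == EyeOffset(0, -1)
```

The asserted end state simply reflects replaying the three sample instructions; a real device would persist the resulting offsets, which is the subject of the storage-device passage below.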
  • Alternatively, the first adjustment process may include: recognizing the shape of an object included in the right camera image captured by the right camera and the left camera image captured by the left camera; displaying, on the first right-eye display unit, a right mark image having a shape that matches the shape of the object; displaying, on the first left-eye display unit, a left mark image having a shape that matches the right mark image; displaying at least a part of a first image, which is one of the right mark image and the left mark image, so that it is visually recognized as overlapping the object when viewed from the first user; and changing the display position of a second image, which is the other of the right mark image and the left mark image, so that the second image appears to overlap the first image as viewed from the first user.
  • In this case, the first user can change the display position of the second image so that the first image, displayed with at least a part of it overlapping the object, and the second image appear to overlap each other.
  • That is, the two images can be aligned with the real object as a reference, so they can be aligned more accurately than when no object is used as a reference. Therefore, according to this configuration, the first-type display image can be displayed on the display unit even more appropriately, in a manner that the first user can easily recognize three-dimensionally.
  • the first image display device may further include a storage device.
  • After at least one of the display position of the first-type right display image on the first right-eye display unit and the display position of the first-type left display image on the first left-eye display unit has been changed by the first adjustment process, the first control unit may store in the storage device, in association with each other: a right adjustment position, which is the display position of the first-type right display image on the first right-eye display unit after the change; a left adjustment position, which is the display position of the first-type left display image on the first left-eye display unit after the change; and identification information indicating the first user.
  • When the identification information is input after the right adjustment position, the left adjustment position, and the identification information have been stored in the storage device, the first control unit may display the first-type right display image at a position corresponding to the right adjustment position on the first right-eye display unit and display the first-type left display image at a position corresponding to the left adjustment position on the first left-eye display unit.
  • With this configuration, the first image display device stores the right adjustment position and the left adjustment position for the first user in association with the identification information, and then, whenever the identification information is input, displays the first-type right display image and the first-type left display image based on the stored positions. That is, when the identification information is input, the result of the first adjustment process previously performed by the first user can be reflected.
  • Therefore, once the identification information is input, the image display device can display the first-type display image on the display unit in a manner that the first user can easily recognize.
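The association between identification information and adjustment positions amounts to a keyed store. A minimal sketch, with assumed names (`save_adjustment`, `restore_adjustment`) and an in-memory dictionary standing in for the storage device:

```python
# Hypothetical storage of per-user adjustment positions, keyed by the
# user's identification information. A real device would persist this
# (e.g. in flash memory); a dict suffices to show the mechanism.
store = {}  # identification info -> (right adjustment pos, left adjustment pos)


def save_adjustment(user_id, right_pos, left_pos):
    """Store the post-adjustment display positions for this user."""
    store[user_id] = (right_pos, left_pos)


def restore_adjustment(user_id):
    """Return stored positions, or default (unadjusted) positions if unknown."""
    return store.get(user_id, ((0, 0), (0, 0)))


save_adjustment("U1", (3, -1), (-2, 0))
assert restore_adjustment("U1") == ((3, -1), (-2, 0))
assert restore_adjustment("unknown") == ((0, 0), (0, 0))
```

The point of the design, as the text notes, is that the first adjustment process need only be performed once per user; entering the identification information thereafter restores its result.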
  • the first image display device may further include a first communication interface for executing communication with an external device different from the first image display device.
  • the first control unit may generate the second type of right image data based on the first type of right image data.
  • the second type of right image data may be data for displaying the second type of right display image on the target right eye display unit of the external device.
  • the second type of right display image may be an image related to the first type of right display image.
  • the first control unit may generate the second type of left image data based on the first type of left image data.
  • the second type of left image data may be data for displaying the second type of left display image on the target left eye display unit of the external device.
  • the second type of left display image may be an image related to the first type of left display image.
  • the first control unit may transmit the second type of right image data and the second type of left image data to the external device via the first communication interface.
  • the first image display device can allow the user of the external device to visually recognize the display image related to the first type of display image visually recognized by the first user.
  • the first user and the user of the external device can visually recognize images related to each other.
  • The first control unit may receive third-type right image data and third-type left image data from the external device via the first communication interface, display the third-type right display image represented by the third-type right image data on the first right-eye display unit, and display the third-type left display image represented by the third-type left image data on the first left-eye display unit.
  • the third type right display image may be an image related to the target right display image displayed on the target right eye display unit of the external device.
  • the third type of left display image may be an image related to the target left display image displayed on the target left eye display unit of the external device.
  • With this configuration, the first image display device can cause the first user to visually recognize a display image related to the display image visually recognized by the user of the external device.
  • the first user and the user of the external device can visually recognize images related to each other.
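The exchange of second-type and third-type image data is, at bottom, a two-way message protocol carrying a right-eye payload and a left-eye payload. The sketch below is an assumption for illustration only: the patent does not specify a wire format, real image data would be binary rather than lists, and the names (`make_frame_message`, `parse_frame_message`, the `kind` tags) are hypothetical.

```python
# Hypothetical message layout for device-to-device image-data exchange:
# "type2" frames are derived from the sender's first-type data; "type3"
# frames are the peer's reply, related to its own displayed images.
import json


def make_frame_message(kind, right_image_data, left_image_data):
    """Serialize one stereo frame (kind: 'type2' or 'type3')."""
    return json.dumps({"kind": kind,
                       "right": right_image_data,
                       "left": left_image_data})


def parse_frame_message(raw):
    """Deserialize a stereo frame back into (kind, right, left)."""
    msg = json.loads(raw)
    return msg["kind"], msg["right"], msg["left"]


raw = make_frame_message("type2", [1, 2, 3], [4, 5, 6])
assert parse_frame_message(raw) == ("type2", [1, 2, 3], [4, 5, 6])
```

Keeping the right and left payloads in one message mirrors the text: both eyes' data are generated together and must arrive together for the receiver to compose a binocular display image.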
  • the present specification further discloses an image display system including the first image display device and the second image display device.
  • The second image display device includes: a second frame that can be worn on the head of a second user different from the first user; a second right-eye display unit provided on the second frame at a position facing the right eye of the second user wearing the second frame; a second left-eye display unit provided on the second frame at a position facing the left eye of the second user wearing the second frame; a second communication interface for executing communication with the first image display device; and a second control unit.
  • The second control unit receives the second-type right image data and the second-type left image data from the first image display device via the second communication interface, displays the second-type right display image on the second right-eye display unit, and displays the second-type left display image on the second left-eye display unit.
  • The second control unit also executes a second adjustment process that changes, in accordance with a change instruction input by the second user, at least one of the display position of the second-type right display image on the second right-eye display unit and the display position of the second-type left display image on the second left-eye display unit, so that there is no positional deviation between the display portion of the second-type right display image and the display portion of the second-type left display image in the second-type display image that can be viewed with both eyes based on those two display images.
  • the second image display device can cause the second user to visually recognize the second type of display image related to the first type of display image visually recognized by the first user. Images related to each other can be visually recognized by the first user and the second user.
  • Further, the first image display device can display the first-type display image on the display unit in a manner that the first user can easily recognize, and the second image display device can display the second-type display image on the display unit in a manner that the second user can easily recognize.
  • A control method, a computer program, and a computer-readable recording medium storing the computer program for realizing the first and second image display devices described above are also novel and useful.
  • FIG. 1 shows an outline of the image display system.
  • FIG. 2 shows a block diagram of the image display system.
  • FIG. 3 illustrates an outline of the image display method used by the image display device.
  • FIG. 4 shows a flowchart of the startup process of the first embodiment.
  • FIG. 5 shows an example of the vertical adjustment screen of the first embodiment.
  • FIG. 6 shows an example of the horizontal adjustment screen of the first embodiment.
  • FIG. 7 shows an example of the fine adjustment screen of the first embodiment.
  • FIG. 8 shows an example of the menu screen of the first embodiment.
  • FIG. 9 shows a flowchart of the display process of the first embodiment.
  • FIGS. 10 and 11 show display examples of the images of the respective image display devices in the display process of the first embodiment.
  • FIG. 12 shows a flowchart of the display process of the second embodiment.
  • FIG. 13 shows a display example of the images of the respective image display devices in the display process of the second embodiment.
  • FIG. 14 shows an example of the adjustment screen of the third embodiment.
  • FIG. 15 shows an example of the adjustment screen of the fourth embodiment.
  • FIG. 16 shows the appearance of an image display device according to the fifth embodiment.
  • the image display system 2 includes image display devices 10 and 50 and a server 80.
  • the image display apparatuses 10 and 50 and the server 80 can perform wireless communication (specifically, Wi-Fi communication) with each other via the Internet 4.
  • the image display device 10 is an image display device (a so-called head mounted display) used by being mounted on a user's head. As shown in FIG. 1, the image display device 10 includes a frame 12, a right display unit 14R, a left display unit 14L, a right projection unit 15R, a left projection unit 15L, a right camera 16R, and a left camera 16L. A control box 18 and an operation unit 19 are provided.
  • the frame 12 is a spectacle frame-shaped member.
  • the user can wear the image display device 10 on the head by wearing the frame 12 like wearing glasses.
  • the right display portion 14R and the left display portion 14L are translucent display members, respectively.
  • the right display unit 14R is disposed at a position facing the user's right eye
  • the left display unit 14L is disposed at a position facing the left eye.
  • the right display unit 14R and the left display unit 14L may be collectively referred to as “display unit 14”.
  • the right projection unit 15R and the left projection unit 15L are members that project images onto the right display unit 14R and the left display unit 14L, respectively.
  • the right projection unit 15R and the left projection unit 15L are provided on the sides of the right display unit 14R and the left display unit 14L, respectively.
  • the right projection unit 15R and the left projection unit 15L may be collectively referred to as “projection unit 15”.
  • the projection unit 15 projects a predetermined virtual image (hereinafter referred to as “object image”) on the display unit 14 in accordance with an instruction from the control unit 30.
  • As a result, the user can perceive the object image as if it were combined at a predetermined position within the real-world object and/or space that the user can see through the display unit 14.
  • Hereinafter, the image visually recognized by the user, in which the object image is combined with the real-world object and/or space, may be referred to as a "display image".
  • In addition, when describing the operation in which the control unit 30 displays a desired image on the display unit 14 by instructing the projection unit 15 to project the image, the description of the operation of the projection unit 15 may be omitted, and the operation may simply be expressed as "the control unit 30 causes the display unit 14 to display a desired image".
  • the right camera 16R is a camera disposed in the frame 12 at a position above the right display unit 14R (that is, a position corresponding to above the user's right eye).
  • the left camera 16L is a camera disposed in the frame 12 at an upper position of the left display unit 14L (that is, a position corresponding to the upper portion of the user's left eye).
  • Each of the right camera 16R and the left camera 16L can capture a range corresponding to the field of view of the user wearing the image display device 10 (hereinafter may be referred to as a “specific range”) from different angles.
  • the right camera 16R captures the viewing range of the user's right eye
  • the left camera 16L captures the viewing range of the user's left eye.
  • the right camera 16R and the left camera 16L may be collectively referred to as “camera 16”.
  • the control box 18 is a box attached to a part of the frame 12.
  • the control box 18 accommodates each element that controls the control system of the image display apparatus 10. Specifically, as shown in FIG. 2, the control box 18 houses a sensor 20, a Wi-Fi interface 22, a control unit 30, and a memory 32.
  • Hereinafter, "interface" is abbreviated as "I/F".
  • the operation unit 19 is provided on the outer surface of the control box 18.
  • the operation unit 19 is a button that can be operated by the user, and the user can input various instructions to the image display device 10 by operating the operation unit 19.
  • the sensor 20 is a triaxial acceleration sensor.
  • the sensor 20 detects acceleration of three axes of X, Y, and Z.
  • Using the detection values of the sensor 20, the control unit 30 can specify the posture and motion state of the image display device 10.
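The patent does not say how the control unit derives posture from the triaxial acceleration sensor, but a common approach (shown here as an assumed illustration, not the device's actual method) is that when the head is roughly at rest, gravity dominates the readings, so pitch and roll follow from the ratios of the three axis values:

```python
# Illustrative posture estimation from a triaxial accelerometer such as
# sensor 20: with the device at rest, the measured acceleration is
# approximately the gravity vector, from which tilt angles follow.
import math


def pitch_roll_from_accel(ax, ay, az):
    """Return (pitch, roll) in degrees from accelerometer readings in m/s^2."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll


# Device level, gravity entirely on the Z axis: both angles are zero.
p, r = pitch_roll_from_accel(0.0, 0.0, 9.81)
assert abs(p) < 1e-9 and abs(r) < 1e-9
```

Motion state (e.g. whether the head is moving) can likewise be inferred from how far the magnitude of the acceleration vector departs from 1 g over time; an accelerometer alone cannot observe yaw, which is one reason such sketches are only a partial model of posture.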
  • the Wi-Fi I / F 22 is an I / F for executing Wi-Fi communication with an external device (for example, the server 80) via the Internet 4.
  • the control unit 30 executes various processes according to the program stored in the memory 32. Details of processing executed by the control unit 30 will be described later in detail (see FIGS. 4 and 9).
  • the control unit 30 is electrically connected to the display unit 14, the projection unit 15, the camera 16, the sensor 20, the Wi-Fi I / F 22, and the memory 32, and controls the operation of each of these elements. Can do.
  • the memory 32 stores various programs.
  • the memory 32 is provided with a user information storage area 34 for storing user information related to the user of the image display device 10.
  • the user information includes a user ID for identifying the user, a password used for authentication of the user, and adjustment information.
  • The adjustment information is information for adjusting the display positions of the left and right display images to positions appropriate for the user corresponding to the user ID, for use when that user visually recognizes images (that is, display images) with the image display device 10.
  • The user information is stored in the memory 32 by executing the startup process (see FIG. 4) described later.
  • the memory 32 also stores a device ID for identifying the image display device 10.
  • the image display device 50 has the same configuration as the image display device 10. That is, the image display device 50 also has the frame 52, the right display unit 54R, the left display unit 54L, the right camera 56R, the left camera 56L, the control box 58, and the operation unit 59, as with the image display device 10. And have.
  • a sensor 60, a Wi-Fi I / F 62, a control unit 70, and a memory 72 are provided in the control box 58.
  • the memory 72 is provided with a user information storage area 74.
  • the memory 72 also stores a device ID for identifying the image display device 50.
  • The control unit 70 can execute all of the same processes (for example, the startup process of FIG. 4 and the display process of FIG. 9) as those executed by the control unit 30 of the image display device 10.
  • the server 80 is a server installed by an administrator of the image display devices 10 and 50 (for example, a company that provides the image display devices 10 and 50).
  • the server 80 manages the image display devices 10 and 50 as devices included in a predetermined device group, and relays image data communication between the image display devices 10 and 50.
  • the server 80 includes a display unit 82, an operation unit 84, a Wi-Fi I / F 86, a control unit 88, and a memory 90.
  • the display unit 82 is a display capable of displaying various information.
  • the operation unit 84 includes a keyboard and a mouse.
  • the user of the server 80 can input various instructions to the server 80 by operating the operation unit 84.
  • the Wi-Fi I / F 86 is an I / F for executing Wi-Fi communication with an external device (for example, the image display devices 10 and 50) via the Internet 4.
  • the control unit 88 executes various processes according to programs stored in the memory 90.
  • the memory 90 stores the device IDs of the image display devices 10 and 50 in association with predetermined group IDs. That is, the memory 90 stores information indicating that the image display device 10 and the image display device 50 are included in the device group represented by the group ID.
  • With reference to FIG. 3, an outline of the technique (mechanism) by which the image display device 10 of the present embodiment displays the display image 110 on the display unit 14 will be described.
  • In FIG. 3, an example is assumed in which the user U1 visually recognizes a display image 110 in which an object image 102, which is a virtual image, is combined with a real image 101 of a mug 100, which is an actual article.
  • the object image 102 includes a virtual image for making the mug 100 appear as if it has a face, and a virtual image for making it appear as if a balloon is coming out of the mug 100.
  • the user U1 is viewing the mug 100, which is an actual article, via the display unit 14 with the image display device 10 attached to his / her head.
  • the shooting range of the camera 16 includes the mug 100.
  • the right camera 16R and the left camera 16L each photograph the mug 100 from different angles.
  • the control unit 30 generates right image data for displaying the right display image 110R on the right display unit 14R based on the right camera image photographed by the right camera 16R. Similarly, the control unit 30 generates left image data for displaying the left display image 110L on the left display unit 14L based on the left camera image captured by the left camera 16L.
  • Then, the control unit 30 displays the right display image 110R (including the right object image 102R) on the right display unit 14R, and displays the left display image 110L (including the left object image 102L) on the left display unit 14L.
  • the right display image 110R includes the right real image 101R of the mug 100 that the user U1 is viewing with the right eye, and the right object image 102R synthesized with the right real image 101R.
  • the left display image 110L includes a left reality image 101L of the mug 100 that the user U1 is viewing with the left eye, and a left object image 102L synthesized with the left reality image 101L.
  • the display position of the right object image 102R in the right display image 110R and the display position of the left object image 102L in the left display image 110L are slightly different from each other. This takes into account the parallax between the right eye and the left eye.
  • the right eye of the user U1 visually recognizes the right display image 110R, and the left eye visually recognizes the left display image 110L. As a result, the user U1 can visually recognize the display image 110 with both eyes.
  • control unit 30 displays the right display image 110R on the right display unit 14R and displays the left display image 110L on the left display unit 14L, thereby allowing the user to visually recognize the display image 110 with both eyes. It may be simply described as “the control unit 30 causes the display unit 14 to display the display image 110”.
  • As described above, the image display device 10 generates right image data based on the right camera image captured by the right camera 16R, and generates left image data based on the left camera image captured by the left camera 16L. Then, based on the respective image data, the right display image 110R and the left display image 110L are displayed on the right display unit 14R and the left display unit 14L. Since the images 110R and 110L are displayed based on the camera images obtained by the two cameras 16R and 16L, the user U1 can more easily recognize the object image 102 included in the display image 110 three-dimensionally than in the case where images are displayed based on camera images obtained by a single camera.
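The slight difference between the display positions of the right object image 102R and the left object image 102L is what produces the depth impression. A minimal sketch of that compositing step, under assumptions: the function name `composite`, the symmetric split of the disparity, and the sign convention are all illustrative, not taken from the patent.

```python
# Illustrative parallax compositing: the object image is placed at
# slightly different horizontal positions in the right and left display
# images, so that the fused image appears at the intended depth.
def composite(base_position, disparity_px):
    """Return (right-eye position, left-eye position) for an object image.

    Half the disparity is applied to each eye; larger disparity makes the
    fused object image appear nearer to the viewer.
    """
    x, y = base_position
    right_pos = (x - disparity_px / 2, y)
    left_pos = (x + disparity_px / 2, y)
    return right_pos, left_pos


right_pos, left_pos = composite((100, 40), 6)
assert right_pos == (97.0, 40) and left_pos == (103.0, 40)
```

Note that only the horizontal coordinate differs between the eyes; any vertical difference between the two images does not correspond to real binocular geometry and is exactly the kind of deviation the adjustment processes are meant to remove.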
  • When the right display image 110R and the left display image 110L are displayed on the right display unit 14R and the left display unit 14L by the above method, the user U1 can visually recognize a single display image 110.
  • However, if the images 110R and 110L were always displayed at fixed positions on the right display unit 14R and the left display unit 14L, the user U1 might not be able to view the display image 110 properly; for example, a shift could occur between the image viewed by the right eye and the image viewed by the left eye, so that the object image 102 appears double.
  • Hereinafter, such a situation may be referred to as "image misalignment".
  • Image misalignment can also occur when the position of the focal point in the user's field of view (that is, the distance to the feature point on which the user is focusing) changes.
  • This is because the convergence angle of both eyes differs between when the user focuses on a relatively near position and when the user focuses on a relatively far position, and this difference in convergence angle can cause the image misalignment described above.
  • In the image display device 10 of the present embodiment, the startup process (see FIG. 4) described later is executed in order to suppress image misalignment.
  • In the startup process, the display position of the right display image 110R on the right display unit 14R and the display position of the left display image 110L on the left display unit 14L are adjusted to positions suitable for the user wearing the image display device 10.
  • Then, the display process (see FIG. 9) for displaying images on the display unit 14 in the adjusted state is executed.
  • the startup process (see FIG. 4) and the display process (see FIG. 9) will be described in detail.
  • In S10, the control unit 30 causes the display unit 14 to display a predetermined login screen (not shown).
  • The login screen is a screen for requesting the user to input a registered user ID and password. While the login screen is displayed, the user can input his or her registered user ID and password by performing predetermined gestures within the shooting range of the camera 16 (that is, within the specific range).
  • If the user ID and password are not yet registered, the user can perform a predetermined new registration start operation for requesting new registration.
  • In the present embodiment, the user inputs various operations to the image display device 10 by performing gestures within the specific range.
  • Alternatively, the user may perform the various operations, including the input operation for the user ID and the like and the new registration start operation, by operating the operation unit 19.
  • In S12, the control unit 30 determines whether a user ID and a password have been input.
  • When the control unit 30 detects that an operation for inputting a user ID and a password has been performed within the specific range while the login screen is displayed on the display unit 14, the control unit 30 determines YES in S12 and proceeds to S14.
  • On the other hand, when the new registration start operation is detected, the control unit 30 determines NO in S12 and proceeds to S20.
  • In S14, the control unit 30 determines whether or not the authentication of the user ID and password input by the user succeeds. Specifically, the control unit 30 determines whether user information including a combination that matches the combination of the input user ID and password (hereinafter referred to as the "specific user ID etc.") exists in the user information storage area 34 of the memory 32.
  • When such user information exists, the control unit 30 determines YES in S14 and proceeds to S16.
  • When such user information does not exist, the control unit 30 determines NO in S14 and returns to S10.
  • In S16, the control unit 30 reads the adjustment information stored in the user information storage area 34 in association with the specific user ID etc. Thereafter, in S18, the control unit 30 displays the right display image 110R on the right display unit 14R and the left display image 110L on the left display unit 14L in accordance with the display positions indicated by the read adjustment information.
  • When S18 ends, the process proceeds to S32.
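The authentication and adjustment-information lookup of S12 to S18 can be sketched as a simple keyed store. The dictionary layout and function name below are illustrative assumptions; the description does not specify the user information storage area 34 at this level of detail:

```python
# Illustrative model of the user information storage area 34:
# user ID -> (password, adjustment information).  Values are made up.
user_info_store = {
    "user01": ("pw01", {"vertical": 2, "horizontal": -1, "fine": (0, 1)}),
}

def authenticate(user_id, password):
    """S14: return the stored adjustment information when the input
    combination matches registered user information, else None."""
    entry = user_info_store.get(user_id)
    if entry is not None and entry[0] == password:
        return entry[1]   # YES in S14 -> S16: read adjustment information
    return None           # NO in S14 -> back to the login screen (S10)

adj = authenticate("user01", "pw01")
# S18: the display positions indicated by `adj` would then be applied to
# the right display unit 14R and the left display unit 14L.
```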
  • In S20, the control unit 30 displays a predetermined new information input screen (not shown) on the display unit 14 and monitors the input of a new user ID and a new password.
  • While the new information input screen is displayed on the display unit 14, the user can input a new user ID and a new password by performing predetermined input gestures within the specific range.
  • When a new user ID and a new password are input, the control unit 30 determines YES in S20 and proceeds to S22.
  • When the input new user ID and new password cannot be used as they are, the control unit 30 causes the display unit 14 to display a message requesting that the new user ID and the new password be changed.
  • In S22, the control unit 30 adjusts the vertical positions of the images 110R and 110L displayed on the right display unit 14R and the left display unit 14L. Specifically, in S22, the control unit 30 causes the display unit 14 to display the vertical adjustment screen 210 of FIG. 5.
  • The vertical adjustment screen 210 includes marks 212 and 214 for adjusting the vertical position, an up movement button 216 for raising the image display position, a down movement button 218 for lowering the image display position, and an OK button 220 for confirmation.
  • Of the two marks 212 and 214, the mark 212 is displayed on the right display unit 14R, and the mark 214 is displayed on the left display unit 14L.
  • The up movement button 216, the down movement button 218, and the OK button 220 are displayed on one of the right display unit 14R and the left display unit 14L.
  • By viewing these elements with both eyes, the user can visually recognize the vertical adjustment screen 210 as shown in FIG. 5. While viewing the two marks 212 and 214 with both eyes, the user operates the buttons 216 and 218 so that the two marks 212 and 214 overlap. By adjusting the two marks 212 and 214 so that they overlap, the vertical positions of the images 110R and 110L displayed on the right display unit 14R and the left display unit 14L are adjusted.
  • When the two marks 212 and 214 have been adjusted so as to overlap (that is, when the vertical position has been adjusted), the user operates the OK button 220 to determine the vertical position. In this case, the control unit 30 temporarily stores the determined vertical position in the memory 32 and proceeds to S24.
  • In S24, the control unit 30 adjusts the left and right positions of the images 110R and 110L. Specifically, in S24, the control unit 30 causes the display unit 14 to display the left/right adjustment screen 230 of FIG. 6.
  • The left/right adjustment screen 230 includes marks 232 and 234 for adjusting the left and right positions, a separation movement button 236 for increasing the display interval between the left and right images, a proximity movement button 238 for reducing the display interval between the left and right images, and an OK button 240 for confirmation.
  • The mark 232 is displayed on the right display unit 14R, and the mark 234 is displayed on the left display unit 14L.
  • The separation movement button 236, the proximity movement button 238, and the OK button 240 are displayed on one of the right display unit 14R and the left display unit 14L.
  • By viewing these elements with both eyes, the user can visually recognize the left/right adjustment screen 230 as shown in FIG. 6. While viewing the two marks 232 and 234 with both eyes, the user operates the buttons 236 and 238 so that the two marks 232 and 234 overlap. By adjusting the two marks 232 and 234 so that they overlap, the left and right positions (in other words, the interval) of the images 110R and 110L displayed on the right display unit 14R and the left display unit 14L are adjusted.
  • When the two marks 232 and 234 have been adjusted so as to overlap (that is, when the left and right positions have been adjusted), the user operates the OK button 240 to determine the left and right positions. In this case, the control unit 30 temporarily stores the determined left and right positions in the memory 32 and proceeds to S26.
  • In S26, the control unit 30 finely adjusts the display positions of the images 110R and 110L. Specifically, in S26, the control unit 30 causes the display unit 14 to display the fine adjustment screen 250 of FIG. 7.
  • The fine adjustment screen 250 includes marks 252 and 254 for position adjustment, a right adjustment button 256 for finely adjusting, vertically and horizontally, the display position of the image displayed on the right display unit 14R, a left adjustment button 258 for finely adjusting, vertically and horizontally, the display position of the image displayed on the left display unit 14L, and an OK button 260 for confirmation.
  • The mark 252 is displayed on the right display unit 14R, and the mark 254 is displayed on the left display unit 14L.
  • The right adjustment button 256, the left adjustment button 258, and the OK button 260 are displayed on one of the right display unit 14R and the left display unit 14L.
  • By viewing these elements with both eyes, the user can visually recognize the fine adjustment screen 250 as shown in FIG. 7. While viewing the two marks 252 and 254 with both eyes, the user operates the buttons 256 and 258 so that the two marks 252 and 254 overlap. By adjusting the two marks 252 and 254 so that they overlap, the display positions of the images 110R and 110L displayed on the right display unit 14R and the left display unit 14L are each finely adjusted.
  • When the two marks 252 and 254 have been adjusted so as to overlap, the user operates the OK button 260 to determine the position. In this case, the control unit 30 temporarily stores the determined display position in the memory 32 and proceeds to S28.
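The three adjustment stages of S22 to S26 can be thought of as offsets applied to the nominal display positions of the images 110R and 110L: a common vertical shift, a change in the left-right interval, and independent per-eye nudges. The following sketch is an illustrative model only; the function name, parameters, and the way the offsets compose are assumptions, not taken from this description:

```python
def apply_adjustment(base_right, base_left, vertical, spacing,
                     fine_right, fine_left):
    """Return adjusted (x, y) display positions for the right and left
    display units.  `vertical` shifts both images up/down together
    (S22), `spacing` widens (+) or narrows (-) the left-right interval
    (S24), and `fine_right`/`fine_left` nudge each image independently
    (S26).  All units are arbitrary pixels."""
    rx, ry = base_right
    lx, ly = base_left
    right = (rx + spacing // 2 + fine_right[0], ry + vertical + fine_right[1])
    left = (lx - spacing // 2 + fine_left[0], ly + vertical + fine_left[1])
    return right, left

right, left = apply_adjustment((100, 50), (-100, 50),
                               vertical=3, spacing=4,
                               fine_right=(1, 0), fine_left=(0, -1))
```

The tuple returned here corresponds to what S28 would record as adjustment information for later reuse.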
  • In S28, the control unit 30 generates adjustment information indicating the display positions determined in S22 to S26. Next, in S30, the control unit 30 stores the adjustment information generated in S28 in the user information storage area 34 in association with the new user ID and the new password. Thereafter, the control unit 30 displays the right display image 110R on the right display unit 14R and the left display image 110L on the left display unit 14L in accordance with the display positions indicated by the adjustment information generated in S28. When S30 ends, the process proceeds to S32.
  • In S32, the control unit 30 causes the display unit 14 to display the menu screen 300 (see FIG. 8).
  • The menu screen 300 includes a menu object image 320.
  • Since the display unit 14 is a translucent display, the user can see the menu screen 300 in a form in which the menu object image 320 is combined with the actual articles (that is, the indoor scene) that can be viewed through the display unit 14.
  • The menu object image 320 represents the main menu and includes three selectable icons 322, 324, and 326, and an end button 328 for ending the display of the menu object image 320.
  • Each of the icons 322 to 326 corresponds to a function of the image display device 10: the icon 322 indicates a game application, the icon 324 indicates an education application, and the icon 326 indicates a setting function for changing various settings related to the image display device 10.
  • The user can activate the function corresponding to a desired icon by performing a gesture of touching that icon (that is, by performing an operation of selecting the icon).
  • When the menu screen 300 is displayed, the activation process of FIG. 4 ends.
  • Even after the activation process of FIG. 4 ends, the display positions of the images 110R and 110L displayed on the right display unit 14R and the left display unit 14L can be readjusted.
  • In that case, the control unit 30 re-executes the processes of S22 to S30 of FIG. 4, performing again the vertical position adjustment (S22), the horizontal position adjustment (S24), and the fine adjustment (S26) according to the user's instructions.
  • The control unit 30 then displays the right display image 110R on the right display unit 14R and the left display image 110L on the left display unit 14L in accordance with the display positions indicated by the adjustment information newly generated in S28.
  • Therefore, the user can adjust the display positions of the images as needed, in accordance with his or her own state, even after the startup process of FIG. 4 has been completed.
  • In S50, the control unit 30 acquires the right camera image currently being captured by the right camera 16R from the right camera 16R, and acquires the left camera image currently being captured by the left camera 16L from the left camera 16L.
  • In S52, the control unit 30 receives from the server 80 the display image data (that is, right display image data and left display image data) displayed on the display unit of another image display device (for example, the image display device 50) in the device group including the own device.
  • In S54, the control unit 30 generates right image data for displaying a right display image (see 110R in FIG. 3) on the right display unit 14R of the own device based on the right camera image acquired in S50 and the right display image data received in S52. Likewise, the control unit 30 generates left image data for displaying a left display image (see 110L in FIG. 3) on the left display unit 14L of the own device based on the left camera image acquired in S50 and the left display image data received in S52.
  • the right image data and the left image data include object image data to be combined with a real image.
  • In S56, the control unit 30 causes the display unit 14 to display a display image (see 110 in FIG. 3) based on the right image data and the left image data generated in S54. That is, the control unit 30 displays the right display image on the right display unit 14R based on the right image data and displays the left display image on the left display unit 14L based on the left image data.
  • The display image displayed in S56 is, for example, an image in a form in which an object image (see 102 in FIG. 3) is combined with a real image (see 101 in FIG. 3).
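The combination in S54 of a camera image with object image data, which is then displayed in S56, can be illustrated with a toy overlay. Pixel grids are represented as nested lists and all names are hypothetical; this is only a sketch of the compositing idea, not the device's implementation:

```python
def composite(camera_frame, object_image, top, left):
    """Overlay `object_image` onto a copy of `camera_frame` at the
    given row/column offset; None entries in the object image are
    treated as transparent."""
    frame = [row[:] for row in camera_frame]   # leave the input intact
    for dy, obj_row in enumerate(object_image):
        for dx, pixel in enumerate(obj_row):
            if pixel is not None:
                frame[top + dy][left + dx] = pixel
    return frame

# Real image (camera frame) with an arrow-like object image combined in,
# analogous to combining an object image with a real image for display.
camera = [["bg"] * 4 for _ in range(3)]
arrow = [["ar", None], ["ar", "ar"]]
display = composite(camera, arrow, top=1, left=1)
```

In the device, the same combination would be performed once per eye, on the right camera image and the left camera image separately.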
  • In S58, the control unit 30 generates right display image data indicating the right display image displayed in S56 (that is, an image including the real image and the object image) and left display image data indicating the left display image (that is, an image including the real image and the object image), and transmits them to the server 80.
  • When the server 80 receives the right display image data and the left display image data from the image display device 10, the server 80 transmits the received right display image data and left display image data to the other image display devices (for example, the image display device 50) in the device group including the image display device 10. Thereby, a display image in a form in which the same object is visually recognized from the own viewpoint is displayed on each of the display units 14 and 54 of the image display apparatuses 10 and 50.
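The exchange of S58 and S52 through the server 80 can be sketched as a relay that forwards each device's display image data to the other devices in the same device group. The class, identifiers, and in-memory inboxes below are illustrative assumptions:

```python
class RelayServer:
    """Toy model of server 80: forwards display image data received
    from one device to every other device in the same device group."""

    def __init__(self, device_group):
        # device id -> inbox list of pending (sender, right, left) tuples
        self.device_group = device_group

    def receive(self, sender_id, right_data, left_data):
        for dev_id, inbox in self.device_group.items():
            if dev_id != sender_id:   # never echo back to the sender
                inbox.append((sender_id, right_data, left_data))

group = {"display10": [], "display50": []}
server = RelayServer(group)
# Image display device 10 transmits its display image data (S58)...
server.receive("display10", "right-110R", "left-110L")
# ...and device 50 later receives it (its S52) to build its own display.
```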
  • the control unit 30 repeatedly executes the processes of S50 to S58 until a predetermined end operation is performed.
  • the control unit 30 can change the display content of the display image according to the operation, such as changing the display mode of the object image included in the display image according to the user's operation. Further, the control unit 30 can change the display content of the display image according to the contents of the right image data and the left image data received from the server 80.
  • Next, with reference to FIGS. 10 and 11, specific examples of the display images displayed on the display units 14 and 54 of the devices 10 and 50 when the display process (see FIG. 9) is executed will be described.
  • In these examples, a user U1 wearing the image display device 10 and a user U2 wearing the image display device 50 are present in the room R.
  • In each of the devices 10 and 50, a game application (that is, the application corresponding to the icon 322 in FIG. 8) has been activated and the display process (see FIG. 9) is being executed.
  • Each of the users U1 and U2 sees the apple 400, an actual article existing in the room R, from a different position (that is, the apple 400 exists within the shooting range of each of the cameras 16 and 56).
  • FIG. 10 shows a state in which an object image 410 imitating an arrow has been stuck into the apple 400 by an operation of the user U1 in the game (specifically, an operation in which the user U1 shoots an arrow using a virtual bow).
  • As a result of the image display device 10 executing the processing of S50 to S58 of FIG. 9, a display image 450 is displayed on the display unit 14.
  • The display image 450 includes a real image 401 of the apple 400 and the arrow object image 410.
  • In the display image 450, the object image 410 represents a state in which the arrow is stuck in toward the back side of the page.
  • Meanwhile, a display image 460 is displayed on the display unit 54 as a result of the image display device 50 executing the processing of S50 to S58 of FIG. 9.
  • The display image 460 also includes the real image 401 of the apple 400 and the arrow object image 410.
  • In the display image 460, however, the object image 410 represents a state in which the arrow is stuck in toward the left side of the page.
  • In this way, display images 450 and 460, each in a form in which the same object is viewed from the viewpoint of the own device, are displayed on the display units 14 and 54 of the image display apparatuses 10 and 50, respectively.
  • FIG. 11 shows a state in which an object image 420 imitating a flag has further been stuck into the apple 400 by an operation of the user U2 in the game (specifically, an operation in which the user U2 pushes up a virtual flagpole).
  • In this case, a display image 470 is displayed on the display unit 14 of the image display device 10.
  • The display image 470 further includes the flag object image 420.
  • In the display image 470, the object image 420 represents a state in which the flag is stuck in on the left side of the apple 400.
  • Meanwhile, a display image 480 is displayed on the display unit 54 of the image display device 50.
  • The display image 480 also includes the flag object image 420.
  • In the display image 480, the object image 420 represents a state in which the flag is stuck in front of the apple 400.
  • In this way, display images 470 and 480, each in a form in which the same object is visually recognized from the viewpoint of the own device, are displayed on the display units 14 and 54 of the image display devices 10 and 50, respectively.
  • As described above, the image display apparatus 10 of the present embodiment captures the target article with both the right camera 16R and the left camera 16L, displays the right display image on the right display unit 14R based on the right camera image, and displays the left display image on the left display unit 14L based on the left camera image (see FIGS. 3 and 9).
  • The user wearing the image display device 10 can visually recognize the display image with both eyes based on the right display image and the left display image.
  • Therefore, compared with a configuration in which a display image is displayed based only on an image obtained by photographing the article with a single camera, the image display device 10 can display the display image on the display unit 14 in a manner that is easy for the user to recognize three-dimensionally.
  • In addition, by executing the processing of S22 to S30 of FIG. 4, the control unit 30 can eliminate, when displaying a display image on the display unit 14, a misalignment between the display by the right display image and the display by the left display image. Therefore, according to the image display apparatus 10 of the present embodiment, a display image can be displayed in a manner that causes less discomfort to the user; that is, the user's sense of discomfort can be reduced.
  • Further, the control unit 30 stores the generated adjustment information in the user information storage area 34 in association with the new user ID and the new password (S28 and S30 of FIG. 4). Then, when a user ID and password stored in the user information storage area 34 are input in a subsequent startup process (FIG. 4) (YES in S14), the control unit 30 reads the adjustment information stored in the user information storage area 34 in association with that user ID etc. (S16). Thereafter, the control unit 30 displays the right display image on the right display unit 14R and the left display image on the left display unit 14L in accordance with the display positions indicated by the read adjustment information.
  • the image display apparatus 10 can store adjustment information indicating the display position for each user, and can reflect the display position indicated by the adjustment information when the user authentication is successful.
  • the image display device 10 can display a display image on the display unit 14 in a manner that is easy for the user to recognize for each user.
  • The control unit 30 also receives from the server 80 the display image data (that is, right display image data and left display image data) displayed on the display unit of another image display device in the device group including the own device (S52 of FIG. 9), and generates, based on that display image data, the right image data and the left image data to be displayed on the display unit of the own device (S54). Then, the control unit 30 causes the display unit 14 to display a display image (see 110 in FIG. 3) based on the right image data and the left image data generated in S54 (S56). Further, the control unit 30 generates right display image data and left display image data representing the display image displayed in S56 and transmits them to the server 80 (S58).
  • When the server 80 receives the right display image data and the left display image data from the image display device 10, the server 80 transmits the received data to the other image display devices (for example, the image display device 50) in the device group including the image display device 10. Therefore, as shown in FIGS. 10 and 11, the image display devices 10 and 50 can display mutually related display images on the display units 14 and 54.
  • the image display device 10 is an example of a “first image display device”.
  • the image display device 50 is an example of an “external device” or “second image display device”.
  • the right display unit 14R and the left display unit 14L are examples of a “right eye display unit” and a “left eye display unit”, respectively.
  • the right image data and the left image data generated in S54 of FIG. 9 are examples of “first type right image data” and “first type left image data”, respectively.
  • the right display image and the left display image displayed in S56 of FIG. 9 are examples of the “first type right display image” and the “first type left display image”, respectively.
  • The vertical position adjustment instruction performed in S22, the left/right position adjustment instruction performed in S24, and the fine adjustment instruction performed in S26 are examples of the "change instruction".
  • the vertical position adjustment performed in S22, the horizontal position adjustment performed in S24, and the fine adjustment performed in S26 are examples of the “first adjustment process”.
  • the display position for the right display unit and the display position for the left display unit indicated by the adjustment information generated in S28 are examples of “right adjustment position” and “left adjustment position”, respectively.
  • the user ID is an example of “identification information”.
  • the right display image data and the left display image data transmitted to the server 80 in S58 of FIG. 9 are examples of “second type right image data” and “second type left image data”, respectively.
  • The right display image data and the left display image data received from the server 80 in S52 of FIG. 9 (that is, received from the image display device 50 via the server 80) are examples of the "third type right image data" and the "third type left image data", respectively.
  • Next, a display process executed by the control unit 30 of the image display apparatus 10 of the present embodiment will be described. After the activation process (see FIG. 4) ends and the menu screen 300 is displayed on the display unit 14, the display process of FIG. 12 is started when the user performs an operation of selecting the icon corresponding to a desired application (for example, the icon 324).
  • the control unit 30 determines whether or not the setting of the own device is the transmission side.
  • the memory 32 of the image display apparatus 10 stores information indicating whether the own device functions as a transmission side terminal or a reception side terminal.
  • the transmission-side terminal is a device that transmits data representing a display image displayed on its own device to another device via the server 80.
  • The receiving-side terminal is a terminal that receives, via the server 80, data representing the display image displayed on the transmitting-side terminal, and displays the same display image as that displayed on the transmitting-side terminal.
  • the user can set in advance whether the image display device 10 functions as a transmission-side terminal or a reception-side terminal.
  • When the own device is set as the transmitting-side terminal, the control unit 30 determines YES in S70 and proceeds to S72.
  • When the own device is set as the receiving-side terminal, the control unit 30 determines NO in S70 and proceeds to S80.
  • In S72, the control unit 30 acquires the right camera image currently being captured by the right camera 16R from the right camera 16R, and acquires the left camera image currently being captured by the left camera 16L from the left camera 16L.
  • In S74, the control unit 30 generates right image data for displaying a right display image (see 110R in FIG. 3) on the right display unit 14R of the own device based on the right camera image acquired in S72. At the same time, the control unit 30 generates left image data for displaying a left display image (see 110L in FIG. 3) on the left display unit 14L of the own device based on the left camera image acquired in S72.
  • The right image data and the left image data include object image data to be combined with a real image.
  • In S76, the control unit 30 causes the display unit 14 to display a display image (see 110 in FIG. 3) based on the right image data and the left image data generated in S74. That is, the control unit 30 displays the right display image on the right display unit 14R based on the right image data and displays the left display image on the left display unit 14L based on the left image data.
  • In subsequent S78, the control unit 30 generates right display image data indicating the right display image displayed in S76 and left display image data indicating the left display image, and transmits them to the server 80.
  • When the server 80 receives the right display image data and the left display image data from the image display device 10, the server 80 transmits the received right display image data and left display image data to the other image display devices (for example, the image display device 50) that are the receiving-side terminals in the device group.
  • As a result, the same display image is displayed on the display units of the image display devices 10 and 50.
  • Thereafter, the control unit 30 repeatedly executes the processes of S72 to S78 until a predetermined end operation is performed.
  • the control unit 30 can change the display content of the display image according to the operation, such as changing the display mode of the object image included in the display image according to the user's operation.
  • In S80, the control unit 30 receives from the server 80 the data of the display image displayed on the display unit of the image display device that is the transmitting-side terminal (that is, the right display image data and the left display image data).
  • In S82, the control unit 30 causes the display unit 14 to display a display image based on the right display image data and the left display image data received in S80. That is, the control unit 30 displays the right display image on the right display unit 14R based on the right display image data and displays the left display image on the left display unit 14L based on the left display image data. Thereby, the same image as the display image displayed on the display unit of the other image display device, which is the transmitting-side terminal, is displayed on the display unit 14.
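The branch in S70 between the transmitting-side terminal (S72 to S78) and the receiving-side terminal (S80 and S82) can be sketched as one iteration of the display process. The function shape and the string encoding of image data are illustrative assumptions only:

```python
def display_step(is_sender, capture=None, received=None):
    """One iteration of the FIG. 12 display process (toy model).
    A sender builds its display image from its own camera images and
    returns data to transmit; a receiver simply shows received data."""
    if is_sender:                               # YES in S70
        right_cam, left_cam = capture()         # S72: camera images
        shown = ("R:" + right_cam, "L:" + left_cam)  # S74/S76: build, show
        return shown, shown                     # S78: transmit what is shown
    else:                                       # NO in S70
        return received, None                   # S80/S82: show received data

sender_shown, transmitted = display_step(
    True, capture=lambda: ("cam16R", "cam16L"))
receiver_shown, _ = display_step(False, received=transmitted)
# The receiving-side terminal ends up displaying exactly the same
# display image as the transmitting-side terminal.
```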
  • Next, with reference to FIG. 13, a specific example of the display images displayed on the display units 14 and 54 of the devices 10 and 50 when the display process (see FIG. 12) is executed by the image display device 10 and the image display device 50 of the present embodiment will be described.
  • In the example of FIG. 13, a user U1 wearing the image display device 10 and a user U2 wearing the image display device 50 are present in the room R.
  • In each of the devices 10 and 50, an education application (that is, the application corresponding to the icon 324 in FIG. 8) has been activated and the display process (see FIG. 12) is being executed.
  • The image display device 10 is set as the transmitting-side terminal, and the image display device 50 is set as the receiving-side terminal.
  • In this example, the user U1, who is a lecturer, teaches the user U2, who plays the role of a student, how to cook using the image display system 2.
  • The user U1 is looking at the bread 500, which is an actual article existing in the room R (that is, the bread 500 exists within the shooting range of the camera 16).
  • A part of the bread 500 has been sliced in advance.
  • On the display unit 14 of the image display device 10, a display image 550 is displayed as a result of executing the processing of S72 to S78 of FIG. 12.
  • The display image 550 includes a real image 501 of the bread 500, an object image 510 of an explanatory message, and a knife object image 512.
  • On the display unit 54 of the image display device 50, a display image 600 is displayed as a result of executing the processing of S80 and S82 of FIG. 12.
  • The display image 600 is the same image as the display image 550 displayed on the display unit 14 of the image display device 10.
  • In this way, the display images 550 and 600 having the same contents are displayed on the display units 14 and 54 of the image display devices 10 and 50, respectively.
  • the image display system 2 of the present embodiment can also exhibit the same operational effects as the image display system 2 of the first embodiment. Further, as described above, in the present embodiment, the display images 550 and 600 having the same contents are displayed on the display units 14 and 54 of the image display devices 10 and 50, respectively. That is, each user can visually recognize the same image.
  • the right image data and the left image data generated in S74 of FIG. 12 are examples of “first type right image data” and “first type left image data”, respectively.
  • the right display image data and the left display image data transmitted to the server 80 in S78 are examples of “second type right image data” and “second type left image data”, respectively.
  • The right display image data and the left display image data received from the server 80 in S80 (that is, received from the transmitting-side terminal via the server 80) are examples of the "third type right image data" and the "third type left image data", respectively.
  • The adjustment screen 710 of the present embodiment includes marks 712 and 714 for position adjustment at a position close to the user, marks 722 and 724 for position adjustment at a position far from the user, a right adjustment button 730, a left adjustment button 732, and an OK button 750 for confirmation.
  • the marks 712 and 714 are mark images that are visually recognized so as to be displayed at positions close to the user.
  • the marks 722 and 724 are mark images that are visually recognized so as to be displayed at positions far from the user.
  • The marks 712, 714, 722, and 724 are displayed separately in four places. Among them, the mark 712 and the mark 722 are displayed on the right display unit 14R, and the mark 714 and the mark 724 are displayed on the left display unit 14L.
  • The right adjustment button 730 is an adjustment button for adjusting, vertically and horizontally, the display positions of the images displayed on the right display unit 14R (in this case, the mark 712 and the mark 722), and the left adjustment button 732 is an adjustment button for adjusting, vertically and horizontally, the display positions of the images displayed on the left display unit 14L (in this case, the mark 714 and the mark 724).
  • When the right adjustment button 730 is operated, the display positions of the mark 712 and the mark 722 change simultaneously with the operation.
  • Likewise, when the left adjustment button 732 is operated, the display positions of the mark 714 and the mark 724 change simultaneously with the operation.
  • the right adjustment button 730, the left adjustment button 732, and the OK button 750 are displayed on one of the right display portion 14R and the left display portion 14L.
  • the user can visually recognize the adjustment screen 710 as shown in FIG. 14 by viewing the above elements with both eyes.
  • While viewing the marks 712, 714, 722, and 724 with both eyes, the user operates the buttons 730 and 732 so that the mark 712 and the mark 714 overlap and the mark 722 and the mark 724 overlap.
  • By adjusting the marks 712 and 714 so that they overlap, the display position of the portion of the images 110R and 110L that is displayed at a position close to the user, as viewed from the user, is adjusted.
  • By adjusting the marks 722 and 724 so that they overlap, the display position of the portion of the images 110R and 110L that is displayed at a position far from the user is adjusted.
  • the user operates the OK button 750 to determine the position.
  • the control unit 30 temporarily stores the determined display position in the memory 32, and proceeds to S28 of FIG.
  • the subsequent processing is the same as in the first embodiment.
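The per-eye adjustment flow described above (one button moves both marks of one eye at once, OK stores the result) can be sketched as follows. This is an illustrative sketch only; the class and method names (`EyeAdjustment`, `nudge_right`, etc.) are not from the patent.

```python
class EyeAdjustment:
    """Tracks a (dx, dy) display offset for each eye's display unit."""

    def __init__(self):
        self.right = (0, 0)  # offset for the right display unit (marks 712 and 722)
        self.left = (0, 0)   # offset for the left display unit (marks 714 and 724)

    def nudge_right(self, dx, dy):
        # Operating the right adjustment button moves both right-eye marks together.
        x, y = self.right
        self.right = (x + dx, y + dy)

    def nudge_left(self, dx, dy):
        # Operating the left adjustment button moves both left-eye marks together.
        x, y = self.left
        self.left = (x + dx, y + dy)

    def confirm(self):
        # Pressing OK returns the offsets to be stored (e.g. in a memory).
        return {"right": self.right, "left": self.left}


adj = EyeAdjustment()
adj.nudge_right(2, -1)
adj.nudge_right(1, 0)
adj.nudge_left(-3, 2)
print(adj.confirm())  # {'right': (3, -1), 'left': (-3, 2)}
```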
  • the way an object appears to a human differs depending on whether the object is at a close position or at a distant position.
  • the human eye has a function of adjusting so as to compensate for this difference in appearance when an object at a close position and an object at a distant position are viewed at different timings. Therefore, if the image display device 10 were to perform the adjustment process for an image displayed at a close position and the adjustment process for an image displayed at a distant position at different timings, then when an image displayed at a close position and an image displayed at a distant position are later displayed at the same time, it may be difficult for the user to visually recognize them.
  • in contrast, in the present embodiment, the user adjusts the display positions so that the marks 712 and 714 appear to overlap and the marks 722 and 724 appear to overlap while looking at the single adjustment screen 710. Therefore, according to the present embodiment, even when the images 110R and 110L include an image at a close position and an image at a distant position when viewed from the user, the images 110R and 110L can be displayed on the display unit 14 in a manner that the user can easily recognize three-dimensionally.
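The reason near and far marks need their own alignment can be illustrated with the standard binocular-disparity relation: the horizontal offset between the two eyes' views of a point shrinks with distance, so an adjustment that aligns a near mark does not automatically align a far one. The baseline and focal-length values below are arbitrary illustrative assumptions, not values from the patent.

```python
def disparity_px(baseline_m, focal_px, depth_m):
    """Horizontal disparity (in pixels) between left and right views of a point."""
    return baseline_m * focal_px / depth_m

near = disparity_px(0.065, 800, 0.5)  # object 0.5 m away; near ≈ 104 px
far = disparity_px(0.065, 800, 5.0)   # object 5 m away;  far ≈ 10.4 px
print(near, far)
```

Because the near disparity is roughly ten times the far disparity here, a single alignment screen showing both near marks (712, 714) and far marks (722, 724) lets the user satisfy both constraints at once.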
  • the marks 712 and 714 of the present embodiment are examples of “first right mark image” and “first left mark image”, respectively.
  • the marks 722 and 724 are examples of a “second right mark image” and a “second left mark image”, respectively.
  • the adjustment screen 810 of this embodiment includes a mark 812, a mark 814, a right adjustment button 830, a left adjustment button 832, and an OK button 840 for confirmation.
  • the mark 812 is a mark image displayed on the right display unit 14R
  • the mark 814 is a mark image displayed on the left display unit 14L.
  • the mark 812 has a shape that matches the outer shape of the actual monitor portion of the television monitor 850 that exists in the user's field of view, and is displayed at a position that overlaps the outer shape of the monitor portion. That is, in this embodiment, the control unit 30 recognizes the shape of the monitor portion of the television monitor 850 from the right camera image taken by the right camera 16R and the left camera image taken by the left camera 16L, and monitors the monitor. An image of the mark 812 having a shape that matches the portion is generated and displayed on the right display unit 14R.
  • the mark 814 has the same shape as the mark 812 (that is, a shape that matches the external shape of the monitor portion of the television monitor).
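The step of generating a mark whose shape matches the monitor portion seen in the camera image can be sketched very simply. A real implementation would use proper rectangle detection on the camera frames; the toy version below, with an assumed binary image and a hypothetical `monitor_outline` helper, just takes the bounding box of "monitor" pixels.

```python
def monitor_outline(binary_image):
    """Return (top, left, bottom, right) of the detected monitor region."""
    rows = [r for r, row in enumerate(binary_image) if any(row)]
    cols = [c for c in range(len(binary_image[0]))
            if any(row[c] for row in binary_image)]
    return (min(rows), min(cols), max(rows), max(cols))

# Toy binary camera image: 1s mark pixels belonging to the monitor portion.
img = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
print(monitor_outline(img))  # (1, 1, 2, 3)
```

The returned outline would then be drawn as the mark image at a position overlapping the recognized monitor portion.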
  • the right adjustment button 830, the left adjustment button 832, and the OK button 840 are displayed on one of the right display portion 14R and the left display portion 14L.
  • the user can visually recognize the adjustment screen 810 as shown in FIG. 15 by viewing each of the above elements with both eyes.
  • the user operates the buttons 830 and 832 while viewing the marks 812 and 814 with both eyes, and adjusts the marks 812 and 814 so that they overlap each other.
  • the user can move the mark 814 and adjust it so as to overlap the mark 812, which is already displayed so as to overlap the television monitor 850. That is, the user can make the adjustment based on the television monitor 850, which is an actual object.
  • the positions of the images 110R and 110L displayed on the right display unit 14R and the left display unit 14L are adjusted.
  • the user operates the OK button 840 to determine the position.
  • the control unit 30 temporarily stores the determined display position in the memory 32, and proceeds to S28 of FIG. The subsequent processing is the same as in the first embodiment.
  • the user can change the display position of the mark 814 so that the mark 812 displayed so as to overlap the television monitor 850 and the mark 814 appear to overlap each other.
  • the marks 812 and 814 can be aligned with reference to the television monitor 850 (that is, an actual object). Therefore, the marks 812 and 814 can be accurately aligned as compared with the case where the television monitor 850 is not used as a reference. Therefore, according to the above configuration, it is possible to display the images 110R and 110L on the display unit 14 in a more appropriate manner that allows the user to easily recognize three-dimensionally.
  • the marks 812 and 814 of the present embodiment are examples of a “right mark image” and a “left mark image”, respectively.
  • the marks 812 and 814 in the above example are examples of “first image” and “second image”, respectively.
  • the television monitor 850 is an example of “object”.
  • the configuration of the image display apparatus 1000 is different from that of the first embodiment.
  • the image display apparatus 1000 of the present embodiment differs from the first embodiment in that the display unit 1014 is a light-shielding display that blocks the user's field of view when the user wears the image display apparatus 1000. The other components are almost the same as those in the first embodiment.
  • when displaying the display image on the display unit 1014, the control unit 30 displays the right display image in the area of the display unit 1014 facing the user's right eye, and displays the left display image in the area of the display unit 1014 facing the user's left eye.
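On a single light-shielding panel, the per-eye display areas amount to partitioning the panel into a left-eye region and a right-eye region. The sketch below assumes a simple side-by-side split; the function name and the region layout are illustrative, not from the patent.

```python
def eye_regions(panel_width, panel_height):
    """Split a single panel into (left-eye, right-eye) regions as (x, y, w, h)."""
    half = panel_width // 2
    left_region = (0, 0, half, panel_height)                    # faces the left eye
    right_region = (half, 0, panel_width - half, panel_height)  # faces the right eye
    return left_region, right_region

print(eye_regions(1920, 1080))
# ((0, 0, 960, 1080), (960, 0, 960, 1080))
```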
  • the image display device 10 (50, 1000) is configured as a single device in which all of the components 14 to 32 (54 to 72, 1014) are mounted on the frame 12 (52).
  • the present invention is not limited to this, and some of the components of the image display device may be provided outside the frame.
  • the memory 32 may be provided outside the frame 12 so as to be able to communicate with the control unit 30.
  • the image display device 10 may be formed by combining the components mounted on the frame 12 and the components provided outside the frame 12.
  • the image display device 10 has the operation unit 19 provided outside the control box 18.
  • the image display device 10 is not limited to this, and the physical operation unit 19 may not be provided.
  • the input of various instructions may all be performed by the control unit 30 detecting a user gesture performed within the shooting range of the camera 16 (that is, the specific range).
  • a connection unit that can connect an external input device such as a keyboard may be provided. In that case, input of various instructions may be performed by an external input device connected to the connection unit.
  • the right camera 16R and the left camera 16L are disposed above the right display unit 14R and the left display unit 14L, respectively.
  • the arrangement positions of the right camera 16R and the left camera 16L are not limited to this; as long as the right camera can capture a range corresponding to the view range of the user's right eye and the left camera can capture a range corresponding to the view range of the user's left eye, they may be arranged at arbitrary positions.
  • the right image data for displaying the right display image on the right display unit 14R is generated based on the right camera image, and the left image data for displaying the left display image on the left display unit 14L is generated based on the left camera image.
  • instead, the right image data may be generated based on the right portion of a combined image of the right camera image and the left camera image, and the left image data may be generated based on the left portion of the combined image.
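The combined-image variant above can be sketched as splitting a side-by-side stitched frame into its two halves. Rows here are plain lists standing in for pixel rows; the function name is illustrative.

```python
def split_combined(combined):
    """Return (left_half, right_half) of a side-by-side combined image."""
    width = len(combined[0])
    mid = width // 2
    left = [row[:mid] for row in combined]
    right = [row[mid:] for row in combined]
    return left, right

# Toy 2x4 combined frame: left-eye pixels L*, right-eye pixels R*.
combined = [
    ["L1", "L2", "R1", "R2"],
    ["L3", "L4", "R3", "R4"],
]
left, right = split_combined(combined)
print(left)   # [['L1', 'L2'], ['L3', 'L4']]
print(right)  # [['R1', 'R2'], ['R3', 'R4']]
```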
  • the image display devices 10 and 50 may perform wireless communication directly without using the server 80.
  • the image display device 10 and the image display device 50 may execute wireless communication other than Wi-Fi communication, such as Bluetooth (registered trademark) communication.
  • the image display device set in the receiving terminal may not be provided with a camera. That is, the device that is the receiving terminal may be a reception-only device.
  • each of the image display devices has a substantially glasses-like frame, and can be worn on the user's head like wearing glasses.
  • the image display device is not limited to this, and may have an arbitrary support frame such as a hat shape or a helmet shape as long as the image display device can be mounted on the user's head.
  • An image display device is formed by mounting a camera, a control box, etc. on eyewear (glasses, sunglasses, etc.) generally used for purposes such as vision correction and eye protection. May be. In that case, the lens portion of the eyewear may be used as the display unit.
  • (Modification 9) In the third embodiment, instead of displaying the adjustment screen 710, the control unit 30 may perform the adjustment by displaying, in order, an up/down adjustment screen for adjusting the vertical position, a left/right adjustment screen for adjusting the horizontal position, and a fine adjustment screen for fine adjustment. Similarly, in the fourth embodiment, instead of displaying the adjustment screen 810, the control unit may perform the adjustment by displaying, in order, a vertical adjustment screen for adjusting the vertical position, a horizontal adjustment screen for adjusting the horizontal position, and a fine adjustment screen for fine adjustment.
  • the control unit 30 may display the mark 814, instead of the mark 812, so as to overlap the television monitor 850.
  • an object that exists in the user's field of view and is used as a reference for position adjustment is not limited to the television monitor 850 shown in FIG. 15, and may be any object. Therefore, for example, a dedicated article for position adjustment may be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention relates to an image display device that images a candidate article using a right camera and a left camera, and causes a right display image to be displayed on a right display unit based on the right camera image and a left display image to be displayed on a left display unit based on the left camera image. A user wearing the image display device can visually recognize a display image with both eyes, based on the right display image and the left display image. A control unit adjusts the display position of the right display image on the right display unit and the display position of the left display image on the left display unit in accordance with an instruction from the user. When the display image is displayed on a display unit (14), a positional aberration between the portion displayed by the right display image and the portion displayed by the left display image can thereby be resolved.
PCT/JP2016/081358 2016-10-21 2016-10-21 Dispositif d'affichage d'image et système d'affichage d'image WO2018073969A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018546133A JP6867566B2 (ja) 2016-10-21 2016-10-21 画像表示装置及び画像表示システム
PCT/JP2016/081358 WO2018073969A1 (fr) 2016-10-21 2016-10-21 Dispositif d'affichage d'image et système d'affichage d'image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/081358 WO2018073969A1 (fr) 2016-10-21 2016-10-21 Dispositif d'affichage d'image et système d'affichage d'image

Publications (1)

Publication Number Publication Date
WO2018073969A1 true WO2018073969A1 (fr) 2018-04-26

Family

ID=62019585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/081358 WO2018073969A1 (fr) 2016-10-21 2016-10-21 Dispositif d'affichage d'image et système d'affichage d'image

Country Status (2)

Country Link
JP (1) JP6867566B2 (fr)
WO (1) WO2018073969A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022038637A (ja) * 2020-08-27 2022-03-10 グローブライド株式会社 釣具識別装置及び釣具管理システム

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005093687A1 (fr) * 2004-03-26 2005-10-06 Atsushi Takahashi Système de verre grossissant numérique d'entité 3d ayant une fonction d'instruction visuelle 3d
JP2006293605A (ja) * 2005-04-08 2006-10-26 Canon Inc 情報処理方法およびシステム
JP2012138654A (ja) * 2010-12-24 2012-07-19 Sony Corp ヘッド・マウント・ディスプレイ
JP2012165085A (ja) * 2011-02-04 2012-08-30 Seiko Epson Corp 頭部装着型表示装置および頭部装着型表示装置の制御方法
JP2013232744A (ja) * 2012-04-27 2013-11-14 Bi2−Vision株式会社 ディスプレイシステム、ディスプレイ調整システム、ディスプレイ調整方法、およびプログラム
JP2016004493A (ja) * 2014-06-18 2016-01-12 キヤノン株式会社 画像処理装置およびその制御方法
JP2016126365A (ja) * 2014-12-26 2016-07-11 セイコーエプソン株式会社 表示システム、表示装置、情報表示方法、及び、プログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022038637A (ja) * 2020-08-27 2022-03-10 グローブライド株式会社 釣具識別装置及び釣具管理システム
JP7464479B2 (ja) 2020-08-27 2024-04-09 グローブライド株式会社 釣具識別装置及び釣具管理システム

Also Published As

Publication number Publication date
JP6867566B2 (ja) 2021-04-28
JPWO2018073969A1 (ja) 2019-08-08

Similar Documents

Publication Publication Date Title
JP6339239B2 (ja) 頭部装着型表示装置、及び映像表示システム
JP6378781B2 (ja) 頭部装着型表示装置、及び映像表示システム
KR20180103066A (ko) 화상 표시 시스템, 화상 표시 시스템의 제어방법, 화상 전송 시스템 및 헤드 마운트 디스플레이
KR102117376B1 (ko) 정보 처리 장치
WO2016017245A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et système d'affichage d'images
JP2016031439A (ja) 情報処理装置及び情報処理方法、コンピューター・プログラム、並びに画像表示システム
WO2017085974A1 (fr) Appareil de traitement d'informations
GB2495159A (en) A head-mounted somatosensory control and display system based on a user's body action
JP2014192550A (ja) 頭部装着型表示装置および頭部装着型表示装置の制御方法
JP6459380B2 (ja) 頭部装着型表示装置、頭部装着型表示装置の制御方法、および、コンピュータープログラム
JPWO2016013272A1 (ja) 情報処理装置及び情報処理方法、並びに画像表示システム
WO2017022291A1 (fr) Dispositif de traitement d'informations
JP6631014B2 (ja) 表示システム、及び表示制御方法
KR20190069480A (ko) 융합기능 개선을 위한 시력훈련장치
EP3402410B1 (fr) Système de détection
JP6867566B2 (ja) 画像表示装置及び画像表示システム
WO2017191702A1 (fr) Dispositif de traitement d'image
US20120120051A1 (en) Method and system for displaying stereoscopic images
US20150237338A1 (en) Flip-up stereo viewing glasses
WO2012147482A1 (fr) Dispositif d'affichage d'image stéréoscopique et procédé d'affichage d'image stéréoscopique
WO2017163649A1 (fr) Dispositif de traitement d'image
JP2016090853A (ja) 表示装置、表示装置の制御方法、及び、プログラム
JP5037713B1 (ja) 立体画像表示装置および立体画像表示方法
JP6683218B2 (ja) 頭部装着型表示装置および頭部装着型表示装置の制御方法
KR20110136326A (ko) 삼차원 입체안경의 수평각 정보를 반영한 삼차원 스테레오스코픽 렌더링 시스템

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16919414

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018546133

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16919414

Country of ref document: EP

Kind code of ref document: A1