WO2021220407A1 - Head-mounted display device and display control method - Google Patents


Info

Publication number
WO2021220407A1
WO2021220407A1 (PCT/JP2020/018133; JP2020018133W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
head
display device
mounted display
optotype
Prior art date
Application number
PCT/JP2020/018133
Other languages
French (fr)
Japanese (ja)
Inventor
Nobutaka Okuyama
Hitoshi Akiyama
Original Assignee
Maxell, Ltd.
Priority date
Filing date
Publication date
Application filed by Maxell, Ltd.
Priority to PCT/JP2020/018133
Publication of WO2021220407A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/361 Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background

Definitions

  • the present invention relates to VR (Virtual Reality) display technology.
  • HMD: head-mounted display
  • VR sickness: when a user wears an HMD device and views a moving image, so-called VR sickness may occur.
  • Patent Document 1 discloses a method by which a computer provides a virtual space to an HMD device, stating (summary excerpt): "The processing executed by the processor of the computer that provides the virtual reality includes a step of defining the virtual space in a memory; a step of detecting an operation of the user wearing the HMD device; a step of determining the flight direction of an object flying in the virtual space based on the user's operation; a step of generating view image data so that the user's view in the virtual space moves in the flight direction, and displaying the view image on the monitor based on the generated view image data; and a step of, when the object reaches a target in the virtual space, moving the position of the virtual user to the target at high speed."
  • However, when viewing an image in which the visual field is moved by the virtual user's own movement operation (hereinafter referred to as viewpoint movement), this technique cannot reduce the VR sickness caused by the illusion that the user is moving.
  • The present invention has been made in view of the above points, and an object of the present invention is to provide an information display technique that reduces VR sickness caused by using an HMD device, by making it easy to keep the binocular parallax substantially constant: the user consciously looks at an optotype while viewing an image accompanied by viewpoint movement.
  • To achieve the above object, the present invention is a head-mounted display device that is worn on the user's head and with which a right-eye image and a left-eye image are stereoscopically viewed with both eyes, comprising an operation detection unit that outputs a detection signal while a viewpoint movement operation is detected.
  • The device further comprises a display control unit that superimposes and displays an optotype on the right-eye image and the left-eye image. The optotype is characterized in that, when stereoscopically viewed with both eyes, it is displayed on a virtual surface at a predetermined distance from the device itself in the depth direction, and that it has a shape that can secure the field of view.
  • FIG. 1 is a schematic diagram for explaining an outline of the present embodiment.
  • FIG. 1 shows the positions at which the virtual image surface 310 and the binocular stereoscopic image 410 appear to be in focus when the user 101 wears the head-mounted display device (hereinafter, HMD device) 100 and views them with only one of the left and right eyes.
  • HMD device: head-mounted display device
  • the binocular stereoscopic image 410 is a stereoscopic image in which the position in the depth direction appears as an illusion based on the parallax of both eyes.
  • the position where the image displayed on the display of the HMD device 100 is seen as a virtual image in which the right-eye image and the left-eye image viewed through the optical element are in focus is the virtual image plane 310.
  • the optotype 500 is displayed on a predetermined virtual surface (virtual surface 300) so that binocular stereoscopic viewing can be performed.
  • the optotype 500 is a mark for facilitating the stabilization of the binocular parallax of the user's line of sight 612.
  • the virtual surface 300 that displays the optotype 500 in binocular stereoscopic vision is a surface that is always at the same distance from the HMD device 100 (own device) in the depth direction when viewed from the user. In this embodiment, for example, it is displayed on the virtual image plane 310.
  • the virtual image surface 310 is a virtual surface in which the left-eye and right-eye display images displayed by the HMD device 100 appear to be formed as virtual images. That is, it is a virtual surface at a distance for optically forming an image of the HMD device 100. This distance is optically determined by the distance between the lens and the display included in the HMD device 100.
  • the virtual image surface 310 need not be a flat surface, and need not coincide between the right-eye display and the left-eye display.
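For reference, the optical relationship described above (the virtual image distance being determined by the lens-to-display spacing) can be illustrated with the thin-lens equation. This sketch is not part of the disclosure; the focal length and spacing below are illustrative values only:

```python
def virtual_image_distance(focal_length_m: float, display_distance_m: float) -> float:
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image
    distance d_i. With the display placed inside the focal length
    (d_o < f), d_i comes out negative: a virtual image on the same
    side as the display, which is how HMD optics place the virtual
    image plane a comfortable distance from the eye."""
    f, d_o = focal_length_m, display_distance_m
    return 1.0 / (1.0 / f - 1.0 / d_o)

# Illustrative: display 45 mm behind a 50 mm lens.
d_i = virtual_image_distance(0.050, 0.045)  # negative: virtual image
```

With these example values the virtual image forms 0.45 m in front of the lens, on the display side, which corresponds to the fixed distance of the virtual image surface 310.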
  • the optotype 500 has a shape that can secure a field of view.
  • the annular arrangement mark 510 which is a mark arranged in an annular shape, is displayed as the optotype 500.
  • the annular arrangement mark 510 of the present embodiment includes a first mark 511 and a second mark 512.
  • the first mark 511 is shown as a white circle, and the second mark 512 as a black circle.
  • the first marks 511 and the second marks 512 are all arranged on the same circumference, alternating with each other.
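The alternating arrangement of first marks 511 and second marks 512 on a common circumference can be sketched as follows; the radius, mark count, and coordinate convention are illustrative assumptions, not from the publication:

```python
import math

def annular_marks(cx: float, cy: float, radius: float, count: int = 8):
    """Place `count` marks evenly on a circumference (cf. 513),
    alternating white circles (first marks 511) and black circles
    (second marks 512). Returns (x, y, color) tuples."""
    marks = []
    for i in range(count):
        theta = 2.0 * math.pi * i / count
        color = "white" if i % 2 == 0 else "black"  # alternate 511 / 512
        marks.append((cx + radius * math.cos(theta),
                      cy + radius * math.sin(theta),
                      color))
    return marks

marks = annular_marks(0.0, 0.0, 1.0, count=8)
```

An even `count` keeps the alternation consistent around the full circle; the open center leaves the middle of the field of view unobstructed, as described below.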
  • the binocular stereoscopic image 410 (display object 410), together with the background image, can be viewed while moving the viewpoint by operating the operation unit.
  • This viewpoint movement means that the appearance of the display object 410 changes as if the user 101 has moved. For example, if the viewpoint is moved forward, the user 101 can stereoscopically view the display object 410 as if it were close to the user 101. By operating the viewpoint to move backward, the display object 410 can be stereoscopically viewed with both eyes so as to move away from the user 101.
  • the HMD device 100 displays the optotype 500 (annular arrangement mark 510) only while detecting an operation involving the viewpoint movement by the user 101 (hereinafter, referred to as a viewpoint movement operation).
  • if the user 101 grasps only the relative positions of the user 101 and the binocular stereoscopic image 410, a sensation called VR sickness may occur. This is because the user 101 falls into the illusion of moving even though he or she is not actually moving.
  • therefore, the optotype 500, whose right-eye and left-eye positions coincide, is displayed on the virtual image plane 310. By keeping the optotype 500 in the field of view, the user 101 can easily grasp that he or she is not moving, which has the effect of reducing VR sickness.
  • the optotype 500 of the present embodiment has a configuration that does not block the center of the field of view. Therefore, even if the optotype 500 is displayed, it does not interfere with viewing the displayed image. Further, in the present embodiment, since the optotype 500 is not displayed when the viewpoint movement operation is not performed, it does not interfere with viewing the entire displayed image.
  • 110 is the HMD main unit
  • 120 is the HMD control device
  • 130 is the HMD operation device
  • 611 and 612 are the line of sight
  • 119R and 119L are the cameras. Details of each will be described later.
  • FIG. 2 is a hardware configuration diagram of the HMD device 100 of the present embodiment.
  • the HMD device 100 includes an HMD main body device 110, an HMD control device 120, and an HMD operation device 130.
  • the HMD main unit 110 includes a bus 111, a right-eye display 114R, a left-eye display 114L, a right-eye camera 119R, a left-eye camera 119L, and an inter-device interface (I / F) 117.
  • the right-eye display 114R and the left-eye display 114L are each display devices (displays) such as liquid crystal panels, and present image data processed by the image processing unit 126 (described later) of the HMD control device 120 to the user 101.
  • the right-eye display 114R and the left-eye display 114L may be transmissive displays.
  • the right-eye camera 119R and the left-eye camera 119L are external cameras that acquire an image of the surroundings of the HMD device 100.
  • the right-eye camera 119R photographs the line-of-sight area of the right eye of the user 101, and the left-eye camera 119L photographs that of the left eye. The two are arranged a predetermined distance apart.
  • the HMD control device 120 includes a bus 121, a CPU (Central Processing Unit) 122, a RAM 123, a flash ROM 124, an OSD generation unit 125, an image processing unit 126, an inter-device I/F 127, a sensor device 128, and a communication device 129.
  • CPU: Central Processing Unit
  • the bus 121 is a data communication path for transmitting and receiving data between the CPU 122 and each part in the HMD device 100.
  • the CPU 122 is a main control unit that controls the entire HMD device 100 according to a predetermined program.
  • the CPU 122 may be realized by a microprocessor unit (MPU).
  • the CPU 122 performs processing according to a clock signal measured and output by the timer.
  • the RAM 123 is a program execution area for the basic operation program and other application programs. It also serves as a temporary storage area that holds data as needed during execution of various application programs. The RAM 123 may be integrated with the CPU 122.
  • the flash ROM 124 stores information necessary for processing, such as each operation setting value of the HMD device 100.
  • the flash ROM 124 may store still image data, moving image data, and the like taken by the HMD device 100. Further, it is assumed that the function of the HMD device 100 can be expanded by downloading a new application program from the application server via the Internet. At this time, the downloaded new application program is stored in the flash ROM 124.
  • the HMD device 100 realizes various functions when the CPU 122 develops and executes a new application program stored in the flash ROM 124 in the RAM 123.
  • the flash ROM 124 is provided with, for example, an operation program holding unit 124a and an image data holding unit 124b.
  • the operation program is stored in the operation program holding unit 124a, and the image data is stored in the image data holding unit 124b.
  • the flash ROM 124 retains stored information even when power is not supplied to the HMD device 100. The storage device is not limited to a flash ROM; for example, an SSD (Solid State Drive) or an HDD (Hard Disk Drive) may be used.
  • SSD: Solid State Drive
  • HDD: Hard Disk Drive
  • the OSD generation unit 125 generates information (OSD information) to be displayed overlaid on the video.
  • the OSD information to be generated is, for example, a setting menu or operation information of the HMD device 100 itself.
  • the optotype 500 is also retained as OSD information.
  • the OSD information is stored in the flash ROM 124 or the like in advance.
  • the image processing unit 126 is an image (video) processor; it processes images acquired by the right-eye camera 119R and the left-eye camera 119L and generates images to be displayed on the right-eye display 114R and the left-eye display 114L, respectively. Further, the image processing unit 126 superimposes an object created by the CPU 122 or the like on an input image to generate the image data to be displayed on each display.
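The superimposition performed by the image processing unit 126 amounts to compositing OSD pixels (such as the optotype 500) over each eye's frame. The sketch below is illustrative only; the array shapes and the binary-mask convention are assumptions, not from the publication:

```python
import numpy as np

def superimpose_osd(frame: np.ndarray, osd: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Overlay OSD pixels onto a display frame wherever mask == 1.
    frame, osd: (H, W, 3) uint8 images; mask: (H, W) array of 0/1.
    The input frame is left unmodified; a composited copy is returned."""
    out = frame.copy()
    out[mask == 1] = osd[mask == 1]
    return out

# Tiny illustrative composition: one white OSD pixel over a black frame.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
osd = np.full((4, 4, 3), 255, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=np.uint8)
mask[1, 1] = 1
out = superimpose_osd(frame, osd, mask)
```

In practice the same composition would be run once per eye, with the mask generated from the optotype's display data.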
  • the device-to-device I / F 127 is a data transmission / reception interface with the HMD main device 110.
  • the HMD main body device 110 and the HMD control device 120 are connected by wire. Therefore, the device-to-device I / F 127 is, for example, an interface for a wired connection.
  • the sensor device 128 is a group of sensors for detecting the state of the HMD device 100.
  • an acceleration sensor 128a, a geomagnetic sensor 128b, a GPS (Global Positioning System) receiving device 128c, and a distance sensor 128d are provided.
  • other sensors such as a gyro sensor may be provided.
  • These sensor groups detect the position, movement, tilt, direction, etc. of the HMD device 100.
  • the distance sensor 128d is a depth sensor and acquires distance information from the HMD device 100 to the object.
  • the communication device 129 is a communication processor that performs communication processing.
  • a network (N / W) communication device 129a and a short-range wireless communication device 129b are provided.
  • the network communication device 129a is an interface for transmitting and receiving content via a network such as a LAN (Local Area Network). Data is transmitted and received by wirelessly connecting to an Internet access point.
  • the short-range wireless communication device 129b transmits / receives data to / from the HMD operation device 130 by short-range wireless communication.
  • a standard such as Bluetooth® may be adopted.
  • the network communication device 129a and the short-range wireless communication device 129b each include a coding circuit, a decoding circuit, an antenna, and the like.
  • the communication device 129 may further include an infrared communication device or the like.
  • the HMD operation device 130 includes a bus 131, a CPU 132, a RAM 133, a flash ROM 134, a touch sensor 137, a gyro sensor 138, and a short-range wireless communication device 139.
  • the bus 131, the CPU 132, the RAM 133, the flash ROM 134, and the short-range wireless communication device 139 have the same configurations as the identically named components of the HMD control device 120.
  • the touch sensor 137 is a touch pad that receives input of operation instructions from the user 101.
  • the HMD operation device 130 may include operation keys such as a power key, a volume key, and a home key.
  • the gyro sensor 138 is a posture detection device that detects the rotational angular velocity of the HMD operation device 130.
  • the posture detection device included in the HMD operation device 130 does not have to be the gyro sensor 138 as long as the posture of the HMD operation device 130 can be detected.
  • FIG. 3 is a functional block diagram of the HMD device 100 of the present embodiment.
  • the HMD apparatus 100 of the present embodiment includes an overall control unit 151, a reception unit 152, an operation detection unit 153, a display control unit 154, and processing data 155.
  • the overall control unit 151 controls the overall operation by controlling each function of the HMD device 100.
  • the reception unit 152 accepts operations from the user 101. In the present embodiment, operations are received via the touch sensor 137. Information about the operations received from the user 101 is output to the operation detection unit 153 and the display control unit 154.
  • the operation detection unit 153 detects the viewpoint movement operation.
  • the operation received by the reception unit 152 is analyzed, and the viewpoint movement operation is detected. For example, when a predetermined operation is performed, it is determined that the viewpoint movement operation has been performed. The determination result is notified to the display control unit 154.
  • the viewpoint movement operation detection signal is output while it is determined that the viewpoint movement operation is being performed (that is, while the viewpoint movement operation is being detected).
  • the display control unit 154 controls the display on the right-eye display 114R and the left-eye display 114L.
  • in the present embodiment, this includes display control of the optotype 500. While the operation detection unit 153 is detecting the viewpoint movement operation, that is, while the viewpoint movement operation detection signal is being received, the OSD generation unit 125 generates the display data of the optotype 500 and outputs it to the image processing unit 126.
  • the image processing unit 126 displays the optotype 500 by superimposing the received display data on the display image (image composition).
  • the processing data 155 is data necessary for executing the display control processing described later in the HMD device 100.
  • the processed data 155 is held in, for example, the RAM 123 and the flash ROM 124.
  • the overall control unit 151, the reception unit 152, the operation detection unit 153, and the display control unit 154 are realized by the CPU 122 loading the program held in the flash ROM 124 into the RAM 123 and executing it.
  • the program is held in the operation program holding unit 124a.
  • FIG. 4 is an explanatory diagram of an operation when the viewpoint is moved according to the present embodiment.
  • in the present embodiment, while the touch pad is being touched, it is determined that the viewpoint movement operation is being performed. It is determined that the viewpoint movement operation is completed when the finger is separated from the touch pad.
  • the user 101 touches the point 211 at a predetermined position with a fingertip or the like on the touch pad on which the touch sensor 137 is arranged. After that, the fingertip is slid in the moving direction (arrow 221) and stopped in a touched state at the point 212.
  • the operation detection unit 153 detects it as a viewpoint movement operation and outputs a viewpoint movement operation detection signal. As a result, the display control unit 154 continues to move the viewpoint forward (in the depth direction) in the direction of the arrow 221 at a speed corresponding to the length of the arrow.
  • when the finger is separated from the touch pad, the reception unit 152 no longer accepts the operation. Since the operation detection unit 153 then no longer detects the viewpoint movement operation, the output of the viewpoint movement operation detection signal is stopped, and the display control unit 154 stops the viewpoint movement.
  • when the fingertip is slid back toward the first touched point 211, the operation detection unit 153 outputs the viewpoint movement operation detection signal so that the movement decelerates to a speed corresponding to the distance from the point 211.
  • when the fingertip is moved to another position, the operation detection unit 153 outputs the viewpoint movement operation detection signal so that the viewpoint movement continues in the direction and at the speed corresponding to the arrow 223 from the first touched point 211.
  • similarly, when the fingertip is slid to the right, the operation detection unit 153 outputs the viewpoint movement operation detection signal so that the viewpoint movement continues at a speed corresponding to the length of the right-pointing arrow 222.
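The touch-pad behavior described above maps the vector from the first touched point to the current fingertip position onto a movement direction and speed. The function below is an illustrative sketch; the `gain` factor and the coordinate convention are assumptions, not from the publication:

```python
import math

def viewpoint_velocity(start, current, gain=0.01):
    """Map the vector from the first touched point (e.g. 211) to the
    current fingertip position (e.g. 212) to a movement direction
    (unit vector) and a speed proportional to the arrow length.
    The velocity persists while the fingertip stays put, and drops
    to zero if the finger returns to the starting point."""
    dx, dy = current[0] - start[0], current[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return (0.0, 0.0), 0.0  # back at the start: no movement
    return (dx / length, dy / length), gain * length

# Slide 60 touch-pad units "upward" (toward the depth direction).
direction, speed = viewpoint_velocity((100, 200), (100, 140))
```

Sliding back toward the start point shortens the vector and thus decelerates the movement, matching the behavior described for point 211.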
  • the pan, tilt, and rotation may be reflected in the viewpoint movement by detecting the posture of the HMD operating device 130 with a posture detecting device such as a gyro sensor 138.
  • the operation detection unit 153 outputs a viewpoint movement operation detection signal based on the detection result of the gyro sensor 138.
  • Pan means a horizontal turn and is called yawing in the aviation field.
  • Tilt means a tilt in the vertical direction and is called pitching in the aviation field.
  • Rotation means a tilt of the angle of view and is called rolling in the aviation field.
  • FIG. 5A is a diagram for explaining the binocular stereoscopic image 410 and the annular arrangement mark 510 displayed as the optotype 500.
  • the fish of the display object 410 (binocular stereoscopic image 410) is recognized as one object existing at a predetermined position in the depth direction, owing to the binocular parallax of the user 101 who visually recognizes the right-eye image 410R and the left-eye image 410L displayed on the virtual image surface 310.
  • This predetermined position in the depth direction is a position where the line of sight 611R of the right eye and the line of sight 611L of the left eye intersect.
  • the annular arrangement mark 510 (511, 512) is displayed as a binocular stereoscopic image on the virtual image plane 310 by the line of sight 612.
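The depth at which the two lines of sight intersect (611R and 611L for the display object, or 612 for the annular arrangement mark at zero disparity) follows from similar triangles. The sketch below is illustrative; the symbol names and the crossed-disparity sign convention are assumptions:

```python
def perceived_depth(ipd_m, plane_dist_m, disparity_m):
    """Depth at which the lines of sight of the two eyes intersect,
    given the interpupillary distance (IPD), the distance to the
    virtual image plane (e.g. 310), and the on-plane separation of
    the right-eye and left-eye images. Crossed disparity > 0 brings
    the fused image closer than the plane; 0 puts it on the plane."""
    return plane_dist_m * ipd_m / (ipd_m + disparity_m)

# Zero disparity: the optotype fuses exactly on the virtual image plane.
d_plane = perceived_depth(0.064, 2.0, 0.0)
# Crossed disparity of 16 mm: the fused image appears nearer than 2 m.
d_near = perceived_depth(0.064, 2.0, 0.016)
```

This is why the annular arrangement mark 510, drawn with identical left and right positions, always fuses on the virtual image plane 310, while the display object 410 can fuse in front of or behind it.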
  • FIG. 5B is a diagram showing how the display object 410 (binocular stereoscopic image 410) and the annular arrangement mark 510 are alternately viewed in binocular stereoscopic view.
  • the images perceived when the display object 410 (binocular stereoscopic image 410) and the annular arrangement mark 510 are visually recognized are shown combined into one view.
  • when either one of them is stereoscopically viewed with both eyes, the other appears as a double image shifted to the left and right.
  • the annular arrangement mark 510 includes a plurality of white circles (first marks 511) and black circles (second marks 512) arranged alternately on a circular auxiliary line (circumference 513).
  • the auxiliary line (circumference 513) is a virtual line that is not actually displayed.
  • FIG. 6B shows the display state when the user 101 performs a viewpoint movement operation so as to approach the display object 410 while it is visible at the position shown in FIG. 6A.
  • FIG. 6C is a display state in the case where the positional relationship between the display position of the display object 410 and the virtual image surface 310 is the aspect shown in FIG. 6A.
  • FIG. 6D is a display state in the case where the positional relationship between the display position of the display object 410 and the virtual image surface 310 is the aspect shown in FIG. 6B.
  • as shown in FIG. 6D, when either one of the display object 410 and the annular arrangement mark 510 is stereoscopically viewed with both eyes, the other can be seen without becoming a double image shifted to the left and right.
  • by displaying the annular arrangement mark 510 (511, 512), whose position relative to the viewpoint of the user 101 does not change, and keeping it in the field of view of the user 101, the user 101 can be prevented from falling into the sensation of moving despite not actually moving. This can reduce VR sickness.
  • since the annular arrangement mark 510 consists of discrete marks arranged in an annular shape, the central portion of the visual field is not blocked. Therefore, viewing of the display object 410 is not hindered, and the surrounding background image can be viewed with its continuity maintained.
  • FIG. 7 is a processing flow of the display control process including the optotype display by the HMD device 100 of the present embodiment. This process is started when the HMD device 100 is started and the operation program is started.
  • the overall control unit 151 makes initial settings (step S1101).
  • the initial setting is executed by reading a saved setting value for a predetermined setting item, or by inputting the setting value by the user 101 or the like.
  • the setting items are, for example, various specifications of the optotype 500.
  • the annular arrangement mark 510 is selected and set as the optotype 500. Further, the annular arrangement mark 510 is set to be displayed on the virtual image surface 310 only during the viewpoint movement operation.
  • the overall control unit 151 starts an application that has received an instruction from the user 101 (step S1102).
  • the reception unit 152 receives an instruction to specify an application to be started.
  • the reception unit 152 receives various operations by the user 101 for the application, such as a viewpoint movement operation, at predetermined time intervals (step S1103).
  • the operation detection unit 153 analyzes the operation received by the reception unit 152 and determines whether or not the viewpoint movement operation has been performed (step S1104). Then, when it is determined that the viewpoint movement operation is present (S1104; Yes), the viewpoint movement operation detection signal is output to the display control unit 154.
  • when the display control unit 154 receives the viewpoint movement operation detection signal, it determines whether or not the optotype 500 is currently displayed (step S1105).
  • when the optotype 500 is displayed (S1105; Yes), the display control unit 154 keeps the optotype 500 displayed as it is, and the overall control unit 151 determines whether or not the reception unit 152 has received an instruction to terminate the application (step S1109). When an instruction to terminate the application has not been received (S1109; No), the process returns to step S1103 and continues.
  • when an instruction to terminate the application is received (S1109; Yes), the overall control unit 151 determines whether or not an instruction to terminate the HMD device 100 itself (an operation program termination instruction; a power-OFF instruction) has been received (step S1110).
  • if the termination instruction has not been received (S1110; No), the overall control unit 151 returns to step S1102. On the other hand, when the termination instruction is received (S1110; Yes), the process ends.
  • in step S1105, when the optotype 500 is not displayed (S1105; No), the display control unit 154 displays the optotype 500 so as to form an image at the predetermined distance (step S1106), and the process proceeds to step S1109.
  • in step S1104, when the viewpoint movement operation is not performed (S1104; No), the operation detection unit 153 does not output the viewpoint movement operation detection signal.
  • in this case, the display control unit 154 determines whether or not the optotype 500 is currently displayed (step S1107).
  • when the optotype 500 is displayed (S1107; Yes), the display of the optotype 500 is stopped (the optotype 500 is erased) (step S1108), and the process proceeds to step S1109.
  • when the optotype 500 is not displayed (S1107; No), the process proceeds to step S1109 as it is.
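The flow of steps S1103 to S1110 above can be sketched as a loop. The function and callback names below are illustrative assumptions; only the show/hide logic of the optotype follows the described flow:

```python
def display_control_loop(receive_op, is_viewpoint_move, show, hide, app_should_end):
    """Sketch of the FIG. 7 flow: the optotype is shown only while a
    viewpoint movement operation is being detected, and hidden
    otherwise. Callbacks stand in for the reception unit, operation
    detection unit, and display control unit."""
    optotype_shown = False
    while True:
        op = receive_op()                 # S1103: accept an operation
        if is_viewpoint_move(op):         # S1104: viewpoint movement?
            if not optotype_shown:        # S1105 / S1106: show if hidden
                show()
                optotype_shown = True
        else:
            if optotype_shown:            # S1107 / S1108: erase if shown
                hide()
                optotype_shown = False
        if app_should_end():              # S1109: application end check
            break
    return optotype_shown

# Illustrative run: two move operations, then a non-move operation.
events = []
ops = iter([True, True, False])
remaining = [3]
def _end():
    remaining[0] -= 1
    return remaining[0] == 0
shown = display_control_loop(lambda: next(ops), lambda op: op,
                             lambda: events.append("show"),
                             lambda: events.append("hide"), _end)
```

The optotype is shown once at the first move operation, not re-shown on the second, and erased when the operation stops, mirroring the S1105/S1107 checks.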
  • the initial settings may be made after the application is started.
  • various specifications of the optotype 500 can be set according to the application.
  • as described above, the HMD device 100 of the present embodiment is worn on the head of the user 101 and displays the right-eye image 410R and the left-eye image 410L as a binocular stereoscopic image 410. It includes the operation detection unit 153, which outputs the viewpoint movement operation detection signal when the viewpoint movement operation is detected, and the display control unit 154, which superimposes and displays the optotype 500 on the binocular stereoscopic image while the viewpoint movement operation detection signal is being output.
  • the optotype 500 is displayed, when stereoscopically viewed with both eyes, on a virtual surface at a predetermined distance from the device itself in the depth direction, and has a shape that can secure the field of view.
  • in other words, an optotype 500 whose position relative to the viewpoint of the user 101 does not change is displayed during the viewpoint movement operation.
  • the optotype 500 serves as a guide for fixing the viewpoint of the user 101.
  • the optotype 500 is displayed on the virtual image plane 310.
  • the optotype 500 can be stereoscopically viewed on the virtual image plane 310.
  • as a result, the user 101 is less likely to fall into the illusion that he or she is moving. Therefore, VR sickness caused by using the HMD device 100 can be reduced while viewing an image accompanied by viewpoint movement.
  • further, the optotype 500 is displayed only while the viewpoint movement operation is detected. Coupled with its field-of-view-securing shape, it does not interfere with the user 101's viewing of the display object.
  • the optotype 500 can be displayed by the function of the HMD device 100 even if the application such as a game does not have the function of displaying the optotype 500.
  • in the first embodiment, the display position of the optotype 500 is preset and fixed during application execution.
  • in the present embodiment, the display position of the optotype 500 on the virtual surface is changed according to the displayed image. Further, in the present embodiment, the initial settings of the optotype 500 can be made and changed by interruption during execution of the application.
  • the hardware configuration of the HMD device 100 of this embodiment is the same as that of the first embodiment. Further, the functional configuration of the HMD device 100 of the present embodiment is also the same as that of the first embodiment. However, the processing of the operation detection unit 153 and the display control unit 154 is different.
  • in the present embodiment, the display mode of the optotype 500 is switched according to the viewpoint movement operation.
  • a plurality of threshold values are set for the forward speed of the viewpoint movement, and the depth-direction position of the optotype 500 (the front-back position in binocular stereoscopic viewing) is switched in steps for display.
  • similarly, a plurality of threshold values are set for the left-right speed of the viewpoint movement, and the left-right position of the optotype 500 is switched in steps for display.
  • the operation detection unit 153 calculates the operation direction and the operation speed.
  • the operation direction and operation speed are calculated from the immediately preceding and latest viewpoint movement operation positions and the time interval between accepting them. Then, the operation detection unit 153 outputs a viewpoint movement operation detection signal including the operation direction and the operation speed.
  • the viewpoint movement operation uses the method described in the first embodiment. For example, in FIG. 4, when the operation of moving from the first touched point 211 to the point 213 without releasing the finger is accepted, the direction and speed are determined from the vector indicated by the arrow 223.
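The calculation of the operation direction and operation speed from two successive operation positions and the time interval between them can be sketched as follows (the names and units are illustrative assumptions):

```python
import math

def operation_vector(prev_pos, latest_pos, dt_s):
    """Operation direction (unit vector) and speed from the immediately
    preceding and latest viewpoint movement operation positions and
    the time interval (seconds) between accepting them."""
    dx, dy = latest_pos[0] - prev_pos[0], latest_pos[1] - prev_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dt_s <= 0.0:
        return (0.0, 0.0), 0.0  # no displacement: no movement signal
    return (dx / dist, dy / dist), dist / dt_s

# A 3-4-5 displacement over half a second.
direction, speed = operation_vector((0.0, 0.0), (3.0, 4.0), 0.5)
```

The resulting speed is what the display control unit 154 compares against the thresholds of the display position tables described below.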
  • the display control unit 154 controls the display position of the optotype 500 based on the speed and direction information included in the viewpoint movement operation detection signal.
  • a table for designating a display position for each speed threshold value in the depth direction and the left-right direction is held as processing data 155.
  • the display position table is stored in, for example, the flash ROM 124 or the like.
  • the display control unit 154 refers to this display position table and controls the display position.
  • Examples of display position tables are shown in FIGS. 8A and 8B.
  • FIG. 8A is an example of the depth direction display position table 161, showing the relationship between the speed threshold and the display position in the depth direction.
  • FIG. 8B is an example of the left-right display position table 162, showing the relationship between the speed threshold and the display position in the left-right direction.
  • the display position table for each one or more speed thresholds, information that can specify the display positions in the depth direction and the left-right direction when the speed threshold is exceeded is provided in the depth direction display position and the display position. Hold as the left-right display position.
  • the depth direction display position and the left-right direction display position may be held for each type of the optotype 500.
For example, the depth-direction table holds a position D0 corresponding to speed threshold 0 and coinciding with the virtual image plane 310, a position D1 corresponding to speed threshold V1, and a position D2 corresponding to speed threshold V2, where V1 and V2 are arbitrary speed thresholds satisfying V1 < V2. As the position information, for example, coordinate positions on the virtual image plane 310 are held. The speed thresholds set for the depth direction and the left-right direction may differ.
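Such a threshold table and its stepwise lookup might be sketched as follows (illustrative only; the threshold values V1 = 1.0 and V2 = 2.0 and the position labels are placeholders for the entries of tables 161 and 162):

```python
# Hypothetical depth-direction table: (speed threshold, display position).
# Position D0 (threshold 0) coincides with the virtual image plane 310;
# the placeholder thresholds satisfy V1 < V2 as in the text.
DEPTH_TABLE = [(0.0, "D0"), (1.0, "D1"), (2.0, "D2")]

def stepwise_position(speed, table):
    """Return the position for the largest threshold not exceeding speed."""
    position = table[0][1]
    for threshold, pos in table:
        if speed >= threshold:
            position = pos
    return position
```

The same lookup would be applied independently to the left-right table, whose thresholds may differ.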
FIG. 8(c) is a conceptual diagram of the display state when a user 101 wearing the HMD device 100 is playing a car chase game. Configurations identical to those of the first embodiment are given the same reference numerals, and their description is omitted. In this example, an image imitating the frame of the windshield of the user's own vehicle (hereinafter, windshield image 423) is displayed in the display area of the display, together with the vehicle to be tracked (tracking target vehicle 421) and the road 422.

The HMD device 100 obtains the geographical shape of the road 422 via the communication device 129 according to an instruction from the user 101, and likewise obtains the traveling position of the tracking target vehicle 421 via the communication device 129. The position of the virtual vehicle that the user 101 drives in pursuit of the tracking target vehicle 421 is operated with the HMD operation device 130 of the HMD device 100.

FIG. 8(c) shows the image while the user 101 is tracking the tracking target vehicle 421 traveling on a right-curving road 422, with the windshield image 423 superimposed. The display control unit 154 displays the tracking target vehicle 421 so that it is perceived stereoscopically farther away than (beyond) the virtual image plane 310, and displays the windshield image 423 so that it is perceived stereoscopically nearer than the virtual image plane 310.
When the user performs a viewpoint movement operation, the operation detection unit 153 detects it, calculates the operation direction and the operation speed, and generates and outputs a viewpoint movement signal including this information. The display control unit 154, on receiving the viewpoint movement signal, refers to the display position table and displays the optotype 500. Here, the case where the annular arrangement mark 510 is displayed as the optotype 500 is shown.

Because the mark is positioned so that the center of the line of sight falls within the area enclosed by the virtual circumference 513 of the annular arrangement mark 510, the user 101 can bring the annular arrangement mark 510 into view more easily than if it were displayed at the center of the screen. This reduces the illusion that the user 101 is moving, which is otherwise induced by the road 422 that appears to flow toward the user 101. VR sickness can therefore be reduced.
FIG. 9 is a processing flow of the display control processing of the present embodiment. This process, too, is started when the HMD device 100 is started. Processing identical to that of the first embodiment is given the same step numbers, and its description is omitted.

First, the overall control unit 151 starts the application instructed by the user 101 (step S2101). The overall control unit 151 then performs the initial settings according to instructions from the user 101 (step S2102). As in the first embodiment, various specifications of the optotype 500 are set here; the specifications may also be predetermined for each application. In the following, as in the first embodiment, it is assumed that the annular arrangement mark 510 is set to be displayed.
The reception unit 152 receives the various operations of an application such as a game at predetermined time intervals, as in the first embodiment (step S1103). At this time, the reception unit 152 of the present embodiment also determines whether a setting-change interrupt has been received (step S2104). When a setting-change interrupt has been received (S2104; Yes), the overall control unit 151 changes the settings according to the received instruction (step S2105) and returns to step S1103.
Next, the operation detection unit 153 determines whether a viewpoint movement operation has been performed, as in the first embodiment (step S1104). When it determines that a viewpoint movement operation has occurred (S1104; Yes), the operation detection unit 153 calculates the operation direction and the operation speed and outputs a viewpoint movement operation detection signal including this information to the display control unit 154 (step S2106). The display control unit 154 refers to the display position table based on the operation direction and operation speed in the signal, determines the display position, and displays the optotype 500 at the determined position (steps S2107, S1109). The processing when no viewpoint movement operation is accepted (S1104; No), and steps S1107, S1108, S1109, and S1110, are the same as in the first embodiment, so their description is omitted here.
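The flow of steps S2101 through S2107 can be condensed into a loop sketch (a toy model under simplifying assumptions, not the patent's implementation: the event stream and position table are stand-ins):

```python
def display_control_loop(events, position_table, move_threshold=0.1):
    """Process (kind, payload) events in the spirit of FIG. 9.
    'setting' events change settings (S2104/S2105); 'op' events carry a
    (direction, speed) pair (S1104/S2106). Returns the display actions
    taken (S2107)."""
    settings = {"optotype": "annular"}        # S2102: initial setting
    actions = []
    for kind, payload in events:              # S1103: receive operations
        if kind == "setting":                 # S2104: setting-change interrupt
            settings.update(payload)          # S2105
            continue
        direction, speed = payload
        if speed >= move_threshold:           # S1104: viewpoint movement?
            key = max(t for t in position_table if t <= speed)
            actions.append(("show", settings["optotype"], position_table[key]))
        else:
            actions.append(("hide",))
    return actions
```

Note that the setting change by interrupt takes effect for subsequent displays, matching the statement that settings can be changed while the application is running.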
As described above, the operation detection unit 153 of the present embodiment additionally detects the operation direction and operation speed of the detected viewpoint movement operation and includes this information in the viewpoint movement operation detection signal. Based on it, the display control unit 154 displaces the optotype 500 in the direction in which the viewpoint of the user 101 has moved. The initial settings of the optotype 500, and setting changes by interrupt, can be performed while the application is running.

In particular, the depth-direction position of the optotype 500 (its front-back position in binocular stereoscopic viewing) is switched stepwise based on a plurality of thresholds on the forward speed of the viewpoint movement, and its left-right position is switched stepwise based on a plurality of thresholds on the left-right speed. This makes it easier to keep the optotype 500 in the field of view while watching the application's image, so the VR-sickness-reducing effect is obtained more easily.
<< Third Embodiment >>
In the present embodiment, the target is a captured moving image. In the HMD device 100 of the present embodiment, a viewpoint movement is judged to have occurred when the amount of movement of the binocular stereoscopic image detected from the captured moving image is equal to or greater than a predetermined value. The HMD device 100 of this embodiment is basically the same as that of the first embodiment; only the viewpoint-movement detection processing by the operation detection unit 153 differs. That is, in the present embodiment, viewpoint movement is detected not from an operation by the user 101 but from the movement of the image. The present embodiment is described below with a focus on the configurations that differ from the first embodiment.
The operation detection unit 153 of the present embodiment analyzes the displayed video. The video is acquired by the right-eye camera 119R and the left-eye camera 119L; an independent camera for capturing the surroundings may be provided instead. The operation detection unit 153 detects the movement of elements in the image, using existing motion detection processing, from the video generated at predetermined time intervals by the video processing unit 126, for example from the latest image and the immediately preceding image. When a movement equal to or greater than a predetermined threshold is detected, it determines that a viewpoint movement has occurred and outputs the viewpoint movement operation detection signal to the display control unit 154. The movement direction and movement speed may also be detected, as the operation direction and operation speed respectively, and included in the viewpoint movement operation detection signal. For the motion detection, a known technique can be used in which motion vectors are calculated from the preceding and following images, intermediate images are interpolated, and an evaluation value determined from the motion vectors yields the direction and speed.
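As a much-simplified stand-in for the motion-vector analysis described above, per-pixel frame differencing can illustrate the thresholding idea (illustrative only; a real implementation would use block matching or optical flow to obtain direction as well):

```python
def mean_abs_difference(prev_frame, curr_frame):
    """Average absolute pixel difference between two grayscale frames,
    given as lists of rows; a crude scalar measure of image motion."""
    total = count = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            total += abs(c - p)
            count += 1
    return total / count

def viewpoint_moved(prev_frame, curr_frame, threshold=10.0):
    """True when the motion measure reaches the predetermined threshold."""
    return mean_abs_difference(prev_frame, curr_frame) >= threshold
```

When `viewpoint_moved` is true, the detection signal would be output and the optotype displayed.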
The display control unit 154 displays the optotype 500 while it receives the viewpoint movement operation detection signal. FIG. 10 is a diagram for explaining the viewpoint-movement detection processing of the present embodiment. It shows a case where the user wearing the HMD device 100 is viewing an image of a landscape that changes as the orientation of the cameras (right-eye camera 119R and left-eye camera 119L) changes: the image of the mountain moves from the broken line 431 to the solid line 432.

The operation detection unit 153 of the present embodiment detects motion from this video. When the detected amount of movement is equal to or greater than the predetermined threshold, it determines that a viewpoint movement has been detected and outputs the viewpoint movement operation signal. The display control unit 154 receives it and displays the optotype 500; here, the case where the annular arrangement mark 510 is displayed is illustrated.

As described above, in the present embodiment the operation detection unit 153 detects a viewpoint movement when the amount of movement of the binocular stereoscopic image detected from the moving image is equal to or greater than a predetermined threshold, and the optotype 500 is then displayed automatically. As a result, the user 101 is less likely to be confused by the movement of the image into the illusion of moving, and VR sickness can be reduced.
In the present embodiment the optotype 500 is displayed automatically when motion is detected, but the present invention is not limited to this. For example, the operation of the user 101 may also be analyzed to detect a viewpoint movement operation: the operation detection unit 153 analyzes the operations received by the reception unit 152 and determines whether they move the viewpoint.
<Modification example 1> In each of the above embodiments, the display position of the optotype 500 is not limited to the virtual image plane 310. For example, the display control unit 154 may display the optotype 500 (annular arrangement mark 510) beyond the virtual image plane 310 (farther in the depth direction). That is, the depth-direction distance of the virtual plane on which the optotype 500 is displayed is larger than the depth-direction distance of the virtual image plane 310 and smaller than the depth-direction distance to the binocular stereoscopic image 410.
<Modification example 2> In each of the above embodiments, an annular arrangement mark 510 in which black circles and white circles are alternately arranged on a virtual circle is used, but the optotype 500 is not limited to this. For example, patterns such as those shown in FIGS. 12(a) and 12(b) may be used. The optotype 500 shown in FIG. 12(a) is a first optotype 520 composed of a rounded-corner rectangular two-dot chain line 521. The optotype 500 shown in FIG. 12(b) is a second optotype 530 in which arcs 532 and figures 533 are arranged on a virtual circumference.

Any shape may be used for the optotype 500 as long as it lets the user 101 fix the viewpoint on it and recognize that he or she is not moving, and does not block the central portion of the visual field or interfere with viewing the image. A plurality of types of optotype 500 may be prepared in advance so that one can be selected according to the preference of the user 101, or the type of optotype 500 to be displayed may be predetermined for each application. The same applies to the display position.
<Modification example 3> In each of the above embodiments, an example in which the HMD main body device 110 and the HMD control device 120 are connected by wire is shown, but the present invention is not limited to this; the connection may be wireless. Conversely, instead of connecting the HMD control device 120 and the HMD operation device 130 wirelessly, they may be connected by wire.

In each of the above embodiments, the HMD device 100 is composed of three devices, the HMD main body device 110, the HMD control device 120, and the HMD operation device 130, but the present invention is not limited to this. For example, the function of the HMD control device 120 may be included in the HMD main body device 110; in this case the inter-device I/Fs 117 and 127, which are the connection interfaces between the two, can be omitted. Likewise, the HMD control device 120 may include the function of the HMD operation device 130; in this case the short-range wireless communication devices 129b and 139, which are the connection interfaces between the two, can be omitted.

Further, the HMD device 100 may be integrated and the function of the HMD operation device 130 omitted altogether. In this case, the operation of the user 101 may be detected by, for example, gesture detection, which is realized by capturing images of the surroundings with the cameras (right-eye camera 119R and left-eye camera 119L) and analyzing the images with the CPU 122.

As the operation device, an external device different from the HMD device 100 may also be used. The external device is, for example, a smartphone or a game controller, in which an application for functioning as the operation device of the HMD device 100 is installed. With a game controller, the viewpoint movement operation may be performed by operating its joystick.
In each of the above embodiments, the case where the HMD device 100 is used has been described as an example, but the device is not limited to the HMD device 100. For example, it may be VR goggles that realize the function of an HMD device by attaching a smartphone, or a smartphone, a PC (Personal Computer), a tablet, or the like.
The present invention is not limited to the above-described embodiments and modifications, and includes various further modifications. The above embodiments and modifications have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described elements.
Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware by designing part or all of them as, for example, an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that realizes each function. Information such as the programs, tables, and files that realize each function can be placed in a memory unit, a recording device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.

The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines in the product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.
Reference signs list: 100: HMD device, 101: user, 110: HMD main body device, 111: bus, 114L: left-eye display, 114R: right-eye display, 117: inter-device I/F, 119L: left-eye camera, 119R: right-eye camera, 120: HMD control device, 121: bus, 122: CPU, 123: RAM, 124: flash ROM, 124a: operation program holding unit, 124b: image data holding unit, 125: OSD generation unit, 126: video processing unit, 127: inter-device I/F, 128: sensor device, 128a: acceleration sensor, 128b: geomagnetic sensor, 128c: GPS receiver, 128d: distance sensor, 129: communication device, 129a: network communication device, 129b: short-range wireless communication device, 130: HMD operation device, 131: bus, 132: CPU, 133: RAM, 134: flash ROM, 137: touch sensor, 138: gyro sensor, 139: short-


Abstract

The present invention minimizes VR sickness resulting from the use of an HMD device while keeping intact a moving picture that involves viewpoint shifting. A head-mounted display device that is worn on a user's head and that achieves binocular stereopsis with a right-eye image and a left-eye image, the head-mounted display device being characterized by being provided with: an operation detection unit which outputs a detection signal when a viewpoint shifting operation has been detected; and a display control unit which causes a visual target to be displayed in such a manner as to be superimposed on the right-eye image and the left-eye image while the detection signal is being outputted, wherein the visual target is displayed on a virtual plane at a predetermined distance from the device in terms of the depth direction when the visual target is viewed by means of binocular stereopsis, and also has a shape that enables ensuring the user's field of vision.

Description

Head-mounted display device and display control method

The present invention relates to VR (Virtual Reality) display technology.
There is a head-mounted display (Head Mounted Display, hereinafter HMD) device that is worn on the user's head and displays images through a separate optical system for each eye. When a user wears an HMD device and views moving images, visually induced motion sickness, so-called VR sickness, may occur.
As a technique for reducing this, for example, Patent Document 1 discloses a method by which a computer provides a virtual space to an HMD device, in which "the processing executed by the processor of the computer providing the virtual reality includes: defining a virtual space in a memory; detecting the motion of a user wearing the HMD device; determining, based on the user's motion, the flight direction of an object flying through the virtual space; generating view-image data so that the user's view in the virtual space moves in the flight direction, and displaying the view image on a monitor based on the generated view-image data; and, when the object reaches a target object in the virtual space, moving the position of the virtual user to the target object at high speed (summary excerpt)."
Patent Document 1: Japanese Patent No. 6126273
In the technique disclosed in Patent Document 1, the movement direction of the virtual user in the virtual space is displayed based on the user's motion in real space, so the user's motion and the movement direction in the virtual space are linked. As a result, a user viewing the view image can predict the movement direction in the virtual space from his or her own motion, and VR sickness is suppressed.
However, when viewing an image in which the field of view moves as a result of the virtual user's own movement operation (hereinafter, viewpoint movement), this cannot reduce the VR sickness that arises when the user falls into the illusion of moving.
The present invention has been made in view of the above, and its object is to provide an information display technique that, while leaving the viewpoint-moving video intact, makes it easy to keep the binocular parallax substantially constant by consciously looking at an optotype, thereby reducing the VR sickness caused by using an HMD device.
The present invention is a head-mounted display device that is worn on a user's head and provides binocular stereoscopic viewing of a right-eye image and a left-eye image, comprising: an operation detection unit that outputs a detection signal when a viewpoint movement operation is detected; and a display control unit that superimposes an optotype on the right-eye image and the left-eye image while the detection signal is being output, wherein the optotype, when viewed in binocular stereopsis, is displayed on a virtual plane at a predetermined depth-direction distance from the device itself and has a shape that keeps the user's field of view open.
According to the present invention, it becomes easy to keep the binocular parallax substantially constant by consciously looking at the optotype while leaving the viewpoint-moving video intact, and VR sickness caused by using an HMD device can be reduced. Problems, configurations, and effects other than those described above will be clarified in the description of the following embodiments.
FIG. 1 is an explanatory diagram for explaining the outline of the first embodiment.
FIG. 2 is a hardware configuration diagram of the HMD device of the first embodiment.
FIG. 3 is a functional block diagram of the HMD device of the first embodiment.
FIG. 4 is an explanatory diagram for explaining the viewpoint movement operation of the first embodiment.
FIG. 5(a) is an explanatory diagram for explaining the positional relationship between the virtual image and the binocular stereoscopic image of the first embodiment, and FIG. 5(b) is an explanatory diagram for explaining an example of how the optotype of the first embodiment looks.
FIGS. 6(a) to 6(d) are explanatory diagrams for explaining the change in appearance when the viewpoint movement operation of the first embodiment is performed.
FIG. 7 is a flowchart of the display control processing of the first embodiment.
FIGS. 8(a) and 8(b) are explanatory diagrams for explaining the display position tables of the second embodiment, and FIG. 8(c) is an explanatory diagram for explaining an example of the display state of the second embodiment.
FIG. 9 is a flowchart of the display control processing of the second embodiment.
FIG. 10 is an explanatory diagram for explaining an example of the viewpoint-movement detection processing of the third embodiment.
FIG. 11 is an explanatory diagram for explaining an optotype display mode of a modification of the embodiment of the present invention.
FIGS. 12(a) and 12(b) are explanatory diagrams for explaining modifications of the optotype of the embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. Elements having the same reference numerals in the drawings have the same functions and processes.
<< First Embodiment >>
The first embodiment of the present invention will be described.
[Overview]
Prior to the detailed description of each device and function, an overview of the present embodiment is given. FIG. 1 is a schematic diagram for explaining the outline of the present embodiment. It shows the virtual image plane 310, which appears in focus when the user 101, wearing the head-mounted display device (hereinafter, HMD device) 100, looks with only one eye at a time, and the binocular stereoscopic image 410.
As shown in FIG. 1, the user 101 wears the HMD device 100 and sees the binocular stereoscopic image 410. The binocular stereoscopic image 410 is a stereoscopic image whose depth-direction position is perceived, as an illusion, based on the parallax between the two eyes. The virtual image plane 310 is the position where the right-eye and left-eye images displayed on the displays of the HMD device 100 and viewed through the optical elements come into focus and appear as virtual images.
At this time, in the present embodiment, the optotype 500 is displayed on a predetermined virtual plane (virtual plane 300) so that it can be viewed in binocular stereopsis. The optotype 500 is a mark that helps stabilize the binocular parallax of the user's line of sight 612.
The virtual plane 300 on which the optotype 500 is displayed for binocular stereoscopic viewing is a plane that, as seen from the user, is always at the same depth-direction distance from the HMD device 100 (the device itself). In this embodiment it coincides with, for example, the virtual image plane 310.
The virtual image plane 310 is a virtual plane on which the left-eye and right-eye display images of the HMD device 100 appear to be formed as virtual images, i.e., the plane at the distance at which the image of the HMD device 100 is optically formed. This distance is determined optically by the spacing between the lens and the display provided in the HMD device 100. The virtual image plane 310 need not be flat, and the planes for the right-eye and left-eye displays need not coincide.
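As a sketch of that optical relationship, under a simple thin-lens assumption (an idealization; real HMD optics are more complex), a display placed at distance $d_o$ inside the focal length $f$ of the lens produces a virtual image at distance

$$\frac{1}{d_o} + \frac{1}{d_i} = \frac{1}{f} \quad\Rightarrow\quad |d_i| = \frac{f\,d_o}{f - d_o} \qquad (d_o < f),$$

so the virtual image plane recedes as the display approaches the focal plane. This is meant only to illustrate how the lens-display spacing fixes the virtual image distance.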
Further, the optotype 500 has a shape that keeps the field of view open. In the present embodiment, for example, an annular arrangement mark 510, a mark whose elements are arranged in a ring, is displayed as the optotype 500.
As shown in the figure, the annular arrangement mark 510 of the present embodiment includes first marks 511 and second marks 512; in the figure, the first marks 511 are shown as white circles and the second marks 512 as black circles. The first marks 511 and the second marks 512 are arranged alternately on the same circumference.
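The geometry of such a ring of alternating marks can be sketched as follows (illustrative Python; the mark count and radius are arbitrary choices, not values from the patent):

```python
import math

def annular_mark_points(center_x, center_y, radius, n_marks=8):
    """Return (kind, x, y) for marks evenly spaced on a virtual circle,
    alternating between the first marks 511 and the second marks 512."""
    points = []
    for k in range(n_marks):
        theta = 2.0 * math.pi * k / n_marks
        kind = "first(511)" if k % 2 == 0 else "second(512)"
        points.append((kind,
                       center_x + radius * math.cos(theta),
                       center_y + radius * math.sin(theta)))
    return points
```

Rendering each point as a small filled or open circle yields the pattern of FIG. 1 while leaving the interior of the ring, and hence the center of the field of view, unobstructed.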
In an application that displays video on the HMD device 100, operating the operation unit allows the binocular stereoscopic image 410 (display object 410) to be viewed, together with the background image, while the viewpoint moves. Viewpoint movement means that the appearance of the display object 410 changes as if the user 101 had moved. For example, if the user operates to move the viewpoint forward, the display object 410 appears stereoscopically to approach the user 101; if backward, it appears to recede from the user 101.
In the present embodiment, the HMD device 100 displays the optotype 500 (annular arrangement mark 510) only while it detects an operation by the user 101 that involves viewpoint movement (hereinafter, a viewpoint movement operation).
While the viewpoint is moving, grasping only the relative positions of the user 101 and the binocular stereoscopic image 410 can produce the sensation called VR sickness, because the user 101, though actually stationary, falls into the illusion of moving. In the present embodiment, however, the optotype 500, whose binocular stereoscopic position is fixed, is displayed on the virtual image plane 310. By bringing the optotype 500 into the field of view, the user 101 can easily recognize that he or she is not moving, which reduces VR sickness.
In addition, the optotype 500 of the present embodiment is configured not to block the center of the field of view, so displaying it does not interfere with viewing the displayed image. Furthermore, the optotype 500 is not displayed when no viewpoint movement operation is being performed, so it does not interfere with viewing the displayed image as a whole.
In the figure, 110 is the HMD main body device, 120 is the HMD control device, 130 is the HMD operation device, 611 and 612 are lines of sight, and 119R and 119L are cameras. Details of each are described later.
 Next, the HMD device 100 of the present embodiment will be described with reference to FIGS. 2 and 3.
 [Hardware configuration]
 FIG. 2 is a hardware configuration diagram of the HMD device 100 of the present embodiment. As shown in this figure, the HMD device 100 includes an HMD main body device 110, an HMD control device 120, and an HMD operation device 130.
 The HMD main body device 110 includes a bus 111, a right-eye display 114R, a left-eye display 114L, a right-eye camera 119R, a left-eye camera 119L, and an inter-device interface (I/F) 117.
 The right-eye display 114R and the left-eye display 114L are each a display device (display) such as a liquid crystal panel, and present to the user 101 the image data processed by the video processing unit 126 (described later) of the HMD control device 120. The right-eye display 114R and the left-eye display 114L may be transmissive displays.
 The right-eye camera 119R and the left-eye camera 119L are external cameras that acquire images of the surroundings of the HMD device 100. The right-eye camera 119R captures the line-of-sight region of the right eye of the user 101, and the left-eye camera 119L captures the line-of-sight region of the left eye of the user 101. The two cameras are spaced apart by a predetermined distance.
 The HMD control device 120 includes a bus 121, a CPU (Central Processing Unit) 122, a RAM 123, a flash ROM 124, an OSD generation unit 125, a video processing unit 126, an inter-device I/F 127, a sensor device 128, and a communication device 129.
 The bus 121 is a data communication path for transmitting and receiving data between the CPU 122 and each unit in the HMD device 100.
 The CPU 122 is a main control unit that controls the entire HMD device 100 according to a predetermined program. The CPU 122 may be realized by a microprocessor unit (MPU). The CPU 122 performs processing according to a clock signal measured and output by a timer.
 The RAM 123 serves as a program area when the basic operation program and other application programs are executed. The RAM 123 is also a temporary storage area that temporarily holds data as needed while various application programs are running. The RAM 123 may be integrated with the CPU 122.
 The flash ROM 124 stores information necessary for processing, such as the operation setting values of the HMD device 100. The flash ROM 124 may also store still image data, moving image data, and the like captured by the HMD device 100. It is further assumed that the functions of the HMD device 100 can be extended by downloading a new application program from an application server via the Internet. In this case, the downloaded new application program is stored in the flash ROM 124. The HMD device 100 realizes a variety of functions when the CPU 122 loads the new application program stored in the flash ROM 124 into the RAM 123 and executes it.
 The flash ROM 124 is provided with, for example, an operation program holding unit 124a and an image data holding unit 124b. The operation program is stored in the operation program holding unit 124a, and the image data is stored in the image data holding unit 124b.
 The flash ROM 124 can retain stored information even while no power is supplied to the HMD device 100. The device is not limited to a flash ROM; for example, a device such as an SSD (Solid State Drive) or an HDD (Hard Disc Drive) may be used instead.
 The OSD generation unit 125 generates information to be displayed superimposed on the video (OSD information). The generated OSD information is, for example, the setting menu and operation information of the HMD device 100 itself. In the present embodiment, the optotype 500 is also held as OSD information. The OSD information is stored in advance in the flash ROM 124 or the like.
 The video processing unit 126 is an image (video) processor; it processes the images acquired by the right-eye camera 119R and the left-eye camera 119L, and generates the images to be displayed on the right-eye display 114R and the left-eye display 114L, respectively. The video processing unit 126 also generates the image data to be displayed on each display by superimposing objects created by the CPU 122 or the like onto the input image.
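 The superimposition performed by the video processing unit 126 can be pictured, as a rough sketch only (the actual hardware implementation is not disclosed here, and the function names are invented for the illustration), as per-pixel alpha blending of an OSD layer over the input image:

```python
# Sketch of OSD superimposition (image composition): the OSD layer is blended
# over the input image per pixel using its alpha value. Names and data layout
# are illustrative assumptions, not taken from the patent.

def blend_pixel(base, osd, alpha):
    """Blend one RGB pixel: alpha=1.0 shows only the OSD mark."""
    return tuple(round(o * alpha + b * (1.0 - alpha)) for b, o in zip(base, osd))

def superimpose(frame, osd_layer):
    """frame: rows of RGB tuples; osd_layer: rows of ((R,G,B), alpha) or None."""
    out = []
    for frame_row, osd_row in zip(frame, osd_layer):
        row = []
        for base, osd in zip(frame_row, osd_row):
            row.append(base if osd is None else blend_pixel(base, osd[0], osd[1]))
        out.append(row)
    return out

# A 1x2 input frame with an opaque white mark drawn on the second pixel only:
frame = [[(10, 20, 30), (10, 20, 30)]]
osd = [[None, ((255, 255, 255), 1.0)]]
print(superimpose(frame, osd))  # [[(10, 20, 30), (255, 255, 255)]]
```

 Pixels where the OSD layer is empty pass the input image through unchanged, which matches the requirement that the optotype occupy only a small part of the field of view.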
 The inter-device I/F 127 is an interface for transmitting and receiving data to and from the HMD main body device 110. In the present embodiment, the HMD main body device 110 and the HMD control device 120 are connected by wire. Accordingly, the inter-device I/F 127 is, for example, an interface for wired connection.
 The sensor device 128 is a group of sensors for detecting the state of the HMD device 100. In the present embodiment, it includes an acceleration sensor 128a, a geomagnetic sensor 128b, a GPS (Global Positioning System) receiver 128c, and a distance sensor 128d. Other sensors, such as a gyro sensor, may also be provided. This group of sensors detects the position, movement, tilt, orientation, and the like of the HMD device 100. The distance sensor 128d is a depth sensor and acquires distance information from the HMD device 100 to an object.
 The communication device 129 is a communication processor that performs communication processing. It includes a network (N/W) communication device 129a and a short-range wireless communication device 129b. In the present embodiment, the network communication device 129a is an interface for transmitting and receiving content via a network such as a LAN (Local Area Network). It connects by wireless communication to a wireless access point to the Internet to transmit and receive data.
 The short-range wireless communication device 129b transmits and receives data to and from the HMD operation device 130 by short-range wireless communication. For example, a standard such as Bluetooth (registered trademark) may be adopted.
 The network communication device 129a and the short-range wireless communication device 129b each include an encoding circuit, a decoding circuit, an antenna, and the like. The communication device 129 may further include an infrared communication device or the like.
 The HMD operation device 130 includes a bus 131, a CPU 132, a RAM 133, a flash ROM 134, a touch sensor 137, a gyro sensor 138, and a short-range wireless communication device 139.
 The bus 131, the CPU 132, the RAM 133, the flash ROM 134, and the short-range wireless communication device 139 are similar to the components of the same names in the HMD control device 120.
 The touch sensor 137 is a touch pad that receives input of operation instructions from the user 101. The HMD operation device 130 may also include operation keys such as a power key, a volume key, and a home key.
 The gyro sensor 138 is an attitude detection device that detects the rotational angular velocity of the HMD operation device 130. The attitude detection device provided in the HMD operation device 130 need not be the gyro sensor 138, as long as it can detect the attitude of the HMD operation device 130.
 [Functional configuration]
 Next, the functions of the HMD device 100 of the present embodiment will be described. FIG. 3 is a functional block diagram of the HMD device 100 of the present embodiment. As shown in this figure, the HMD device 100 of the present embodiment includes an overall control unit 151, a reception unit 152, an operation detection unit 153, a display control unit 154, and processing data 155.
 The overall control unit 151 controls the overall operation by controlling each function of the HMD device 100.
 The reception unit 152 receives operations from the user 101. In the present embodiment, operations are received via the touch sensor 137. Information on the operations received from the user 101 is output to the operation detection unit 153 and the display control unit 154.
 The operation detection unit 153 detects viewpoint movement operations. In the present embodiment, it analyzes the operations received by the reception unit 152 and detects a viewpoint movement operation. For example, when a predetermined operation is performed, it determines that a viewpoint movement operation has been performed. The determination result is notified to the display control unit 154. In the present embodiment, for example, a viewpoint movement operation detection signal is output while it is determined that a viewpoint movement operation is being performed (while the viewpoint movement operation is being detected).
 The display control unit 154 controls the display on the right-eye display 114R and the left-eye display 114L. In the present embodiment, this includes display control of the optotype 500. While the operation detection unit 153 is detecting a viewpoint movement operation, that is, while the viewpoint movement operation detection signal is being received, the display control unit 154 has the OSD generation unit 125 generate the display data for the optotype 500 and output it to the video processing unit 126. The video processing unit 126 displays the optotype 500 by superimposing the received display data on the display image (image composition).
 The processing data 155 is data necessary for the HMD device 100 to execute the display control processing described later. The processing data 155 is held in, for example, the RAM 123 and the flash ROM 124.
 The overall control unit 151, the reception unit 152, the operation detection unit 153, and the display control unit 154 are realized by the CPU 132 loading the program held in the flash ROM 134 into the RAM 133 and executing it. The program is held in the operation program holding unit 124a.
 [Viewpoint movement operation]
 Here, the operations performed by the user 101 when moving the viewpoint in the present embodiment will be described. FIG. 4 is an explanatory diagram of the viewpoint movement operations of the present embodiment.
 In the present embodiment, when the operation detection unit 153 detects, for example, an operation of sliding a finger a predetermined amount in a predetermined direction while touching the touch pad on which the touch sensor 137 is arranged, it determines that a viewpoint movement operation has been performed. When the finger is released from the touch pad, it determines that the viewpoint movement operation has ended.
 Specifically, the user 101 touches a point 211 at a predetermined position with a fingertip or the like on the touch pad on which the touch sensor 137 is arranged. The user then slides the fingertip in the movement direction (arrow 221) and stops at a point 212 while still touching. When the reception unit 152 receives such an operation, the operation detection unit 153 detects it as a viewpoint movement operation and outputs the viewpoint movement operation detection signal. The display control unit 154 thereby continues to move the viewpoint forward (in the depth direction), in the direction of the arrow 221, at a speed corresponding to the length of the arrow.
 Thereafter, when the finger is lifted from the touched state at the point 212, the reception unit 152 no longer receives the operation. In this case, the operation detection unit 153 does not detect a viewpoint movement operation and stops outputting the viewpoint movement operation detection signal. The display control unit 154 thereby stops the viewpoint movement.
 When the reception unit 152 receives an operation of sliding back so that the arrow 221 becomes shorter, the operation detection unit 153 outputs the viewpoint movement operation detection signal so as to decelerate to a speed corresponding to the distance from the first-touched point 211.
 When the reception unit 152 receives an operation of moving from the point 212 to a point 213 by the length of an arrow 222 without lifting the touching finger, the operation detection unit 153 outputs the viewpoint movement operation detection signal so that the viewpoint movement continues in the direction and at the speed corresponding to an arrow 223 extending from the first-touched point 211.
 If the user once lifts the finger to stop the viewpoint movement operation, and the reception unit 152 then receives an operation of moving the finger again from the point 212 to the point 213 by the arrow 222, the operation detection unit 153 outputs the viewpoint movement signal so that the viewpoint movement continues rightward at a speed corresponding to the length of the arrow 222.
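 The mapping described above, from the drag vector relative to the first-touched point to a viewpoint velocity, can be sketched as follows (a minimal illustration; the scale factor and function names are assumptions, not values given in the patent):

```python
import math

# Illustrative scale: drag length (in touch-pad units) to viewpoint speed.
SPEED_PER_UNIT = 0.5  # assumed value for this sketch

def viewpoint_velocity(start, current):
    """Return (direction_unit_vector, speed) from the first-touched point
    to the current finger position; a zero-length drag means no movement."""
    dx, dy = current[0] - start[0], current[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return (0.0, 0.0), 0.0
    return (dx / length, dy / length), length * SPEED_PER_UNIT

# Drag from point 211 at (0, 0) forward to point 212 at (0, 4):
direction, speed = viewpoint_velocity((0, 0), (0, 4))
print(direction, speed)  # (0.0, 1.0) 2.0
```

 Sliding back toward the start point shortens the drag vector and so lowers the returned speed, and moving on to a new point without lifting the finger changes both direction and speed, matching the behaviour of arrows 221 to 223.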
 Note that pan, tilt, and rotation may be reflected in the viewpoint movement by detecting the attitude of the HMD operation device 130 with an attitude detection device such as the gyro sensor 138. In this case, the operation detection unit 153 outputs the viewpoint movement operation detection signal based on the detection result of the gyro sensor 138.
 Here, pan means a horizontal turn and is called yawing in the aircraft field. Tilt means a vertical inclination and is called pitching in the aircraft field. Rotation means a tilt of the angle of view and is called rolling in the aircraft field.
 FIG. 5(a) is a diagram explaining the binocular stereoscopic image 410 and the annular arrangement mark 510 displayed as the optotype 500.
 As shown in this figure, the fish that is the display object 410 (binocular stereoscopic image 410) is recognized, through the binocular parallax of the user 101 viewing the right-eye image 410R and the left-eye image 410L displayed on the virtual image plane 310, as a single object located at a predetermined position in the depth direction. This predetermined position in the depth direction is the position where the line of sight 611R of the right eye and the line of sight 611L of the left eye intersect.
 On the other hand, the annular arrangement mark 510 (511, 512) is displayed, along the lines of sight 612, as a binocular stereoscopic image on the virtual image plane 310.
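 The depth at which the two lines of sight intersect follows from similar triangles. As a sketch (the interpupillary distance and plane distance below are assumed example values, not figures from the patent), for eyes separated by `ipd` viewing a virtual image plane at distance `d_plane`, a crossed horizontal disparity `disparity` between the right-eye and left-eye image positions places the fused object at depth `ipd * d_plane / (ipd + disparity)`:

```python
def perceived_depth(ipd, d_plane, disparity):
    """Depth (from the eyes) at which the fused binocular image appears.
    disparity > 0: crossed disparity, object appears in front of the plane;
    disparity = 0: the two image positions coincide, as for the optotype 500,
    so the image appears exactly on the virtual image plane."""
    return ipd * d_plane / (ipd + disparity)

IPD = 0.064      # assumed interpupillary distance, metres
D_PLANE = 2.0    # assumed distance of the virtual image plane 310, metres

print(perceived_depth(IPD, D_PLANE, 0.0))    # 2.0 -> on the plane (optotype case)
print(perceived_depth(IPD, D_PLANE, 0.064))  # 1.0 -> halfway to the viewer
```

 This is why drawing the optotype at identical positions in the right-eye and left-eye images (zero disparity, lines of sight 612) pins it to the virtual image plane 310, while the disparity between images 410R and 410L places the fish at a different depth.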
 FIG. 5(b) is a diagram showing how the display object 410 (binocular stereoscopic image 410) and the annular arrangement mark 510 appear when each is viewed stereoscopically in turn. Here, the images seen when viewing the display object 410 (binocular stereoscopic image 410) and when viewing the annular arrangement mark 510 are shown composited together. In reality, while one of the display object 410 (binocular stereoscopic image 410) and the annular arrangement mark 510 is being viewed stereoscopically, the other appears as a double image shifted left and right.
 As described above, the annular arrangement mark 510 includes a plurality of white circles (first marks 511) and black circles (second marks 512) arranged alternately on a circular auxiliary line (circumference 513). The auxiliary line (circumference 513) is a virtual line that is not actually displayed.
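 The layout of the alternating marks on the invisible circumference 513 can be sketched as follows (the mark count and radius are assumed example values, and the function name is invented for the illustration):

```python
import math

def annular_marks(n_marks, radius, center=(0.0, 0.0)):
    """Place n_marks alternately coloured marks evenly on a circle.
    Even indices are first marks 511 ('white'), odd indices are second
    marks 512 ('black'); the circle itself (circumference 513) is a
    construction line only and is never drawn."""
    marks = []
    for i in range(n_marks):
        angle = 2.0 * math.pi * i / n_marks
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        colour = "white" if i % 2 == 0 else "black"
        marks.append((round(x, 6), round(y, 6), colour))
    return marks

for mark in annular_marks(8, 100.0):
    print(mark)
```

 Because only the discrete marks are emitted, the centre of the ring (and thus the centre of the field of view) stays empty, which is the property the embodiment relies on.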
 FIGS. 6(a) to 6(d) show how the display object 410 appears when a viewpoint movement operation that brings the display object 410 closer is performed. FIG. 6(b) shows the display state when the user 101 performs a viewpoint movement operation to bring the display object 410 closer while it appears at the position of FIG. 6(a). FIG. 6(c) shows the display state when the positional relationship between the display position of the display object 410 and the virtual image plane 310 is as shown in FIG. 6(a). FIG. 6(d) shows the display state when the positional relationship between the display position of the display object 410 and the virtual image plane 310 is as shown in FIG. 6(b). FIGS. 6(c) and 6(d), like FIG. 5(a), show the display object 410 and the annular arrangement mark 510 composited together. In FIG. 6(d), while either one of the display object 410 and the annular arrangement mark 510 is being viewed stereoscopically, the other can also be seen without becoming a double image shifted left and right.
 When the user gazes only at the fish of the display object 410 during a viewpoint movement that changes the state shown in FIG. 6(a) into the state shown in FIG. 6(b), the binocular stereoscopic lines of sight of the user 101 change from 611 to 613 even though the user is not actually moving. As a result, the user 101 may fall into the sensation of moving and suffer VR sickness.
 On the other hand, by displaying the annular arrangement mark 510 (511, 512), whose position relative to the viewpoint of the user 101 does not change, and keeping it in the field of view of the user 101, the user 101 can be prevented from falling into the sensation of moving while not actually moving. This reduces VR sickness. Moreover, since the annular arrangement mark 510 consists of discrete marks arranged in a ring, it does not block the central part of the field of view. It therefore does not interfere with viewing the display object 410, and allows viewing that preserves the continuity of the surrounding background image.
 Next, the operation of the HMD device 100 of the present embodiment will be described. FIG. 7 is a processing flow of the display control processing, including the optotype display, performed by the HMD device 100 of the present embodiment. This processing is started when the HMD device 100 is started and the operation program is launched.
 When the HMD device 100 starts, the overall control unit 151 performs initial settings (step S1101). The initial settings are executed by reading saved setting values for predetermined setting items, or by input from the user 101 or the like. The setting items are, for example, the various specifications of the optotype 500.
 Specifically, they include the type of the optotype 500; the display position of the optotype 500 (on the virtual image plane 310 / shifted forward or backward by a predetermined distance); the display mode of the optotype 500 (only during viewpoint movement operations / when motion is detected in the video / manual switching); and the color of the optotype 500 (a fixed color / a combination of colors that alternates over short intervals / light-dark inversion of the display image at the superimposed location). In the present embodiment, for example, the annular arrangement mark 510 is selected and set as the optotype 500. The annular arrangement mark 510 is also set to be displayed on the virtual image plane 310 only during viewpoint movement operations.
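 These setting items can be pictured as a small configuration record (the keys and default values below are illustrative assumptions for the sketch, not names defined in the patent):

```python
# Illustrative initial settings for the optotype 500 (step S1101).
DEFAULT_SETTINGS = {
    "optotype_type": "annular_arrangement_mark",   # mark 510 is chosen here
    "display_position": "virtual_image_plane",     # plane 310, no offset
    "display_mode": "during_viewpoint_move_only",  # vs. "on_video_motion", "manual"
    "color": "fixed",                              # vs. "alternating", "invert_background"
}

def load_initial_settings(saved=None):
    """Merge saved setting values over the defaults, ignoring unknown keys."""
    settings = dict(DEFAULT_SETTINGS)
    for key, value in (saved or {}).items():
        if key in settings:
            settings[key] = value
    return settings

print(load_initial_settings({"color": "alternating", "bogus": 1}))
```

 Reading saved values first and falling back to defaults mirrors the two ways the initial settings are described as being supplied: stored values or input by the user 101.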
 The overall control unit 151 starts the application for which an instruction has been received from the user 101 (step S1102). The reception unit 152 receives the instruction designating the application to be started.
 The reception unit 152 then receives, at predetermined time intervals, various operations by the user 101 on the application, such as viewpoint movement operations (step S1103).
 The operation detection unit 153 analyzes the operations received by the reception unit 152 and determines whether a viewpoint movement operation has been performed (step S1104). When it determines that a viewpoint movement operation has been performed (S1104; Yes), it outputs the viewpoint movement operation detection signal to the display control unit 154.
 When the display control unit 154 receives the viewpoint movement operation detection signal, it determines whether the optotype 500 is currently displayed (step S1105).
 When the optotype 500 is displayed (S1105; Yes), the display control unit 154 keeps the optotype 500 displayed, and the overall control unit 151 determines whether the reception unit 152 has received an instruction to terminate the application (step S1109). When no instruction to terminate the application has been received (S1109; No), the processing returns to step S1103 and continues.
 On the other hand, when an instruction to terminate the application has been received (S1109; Yes), the overall control unit 151 terminates the application and determines whether an instruction to terminate the HMD device 100 itself (an operation program termination instruction; a power-off instruction) has been received (step S1110).
 When no termination instruction has been received (S1110; No), the overall control unit 151 returns to step S1102. On the other hand, when a termination instruction has been received (S1110; Yes), the processing ends.
 When the optotype 500 is not displayed in step S1105 (S1105; No), the display control unit 154 displays the optotype 500 so that it is imaged at the predetermined distance (step S1106), and the processing proceeds to step S1109.
 When no viewpoint movement operation has been performed in step S1104 (S1104; No), the viewpoint movement operation detection signal is not output.
 If the display control unit 154 does not receive the viewpoint movement operation detection signal for a predetermined period, it determines whether the optotype 500 is currently displayed (step S1107). When the optotype 500 is displayed (S1107; Yes), it stops displaying the optotype 500 (erases the optotype 500) (step S1108), and the processing proceeds to step S1109.
 On the other hand, when the optotype 500 is not displayed (S1107; No), the processing proceeds directly to step S1109.
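 Putting steps S1104 to S1108 together, one iteration of the display control loop can be sketched as follows (a simplified model; the class and method names are invented for the illustration):

```python
class DisplayController:
    """Toggles the optotype with the viewpoint movement operation detection
    signal, mirroring steps S1104-S1108 of the processing flow in FIG. 7."""

    def __init__(self):
        self.optotype_visible = False

    def on_frame(self, move_signal_received):
        if move_signal_received:           # S1104: Yes
            if not self.optotype_visible:  # S1105: No -> display (S1106)
                self.optotype_visible = True
        else:                              # S1104: No
            if self.optotype_visible:      # S1107: Yes -> erase (S1108)
                self.optotype_visible = False
        return self.optotype_visible

ctrl = DisplayController()
print(ctrl.on_frame(True))   # True  -> optotype displayed (S1106)
print(ctrl.on_frame(True))   # True  -> kept displayed (S1105: Yes)
print(ctrl.on_frame(False))  # False -> erased (S1108)
```

 The visibility flag changes only on the edges of the detection signal, so repeated iterations with an unchanged signal leave the display untouched, just as the flow skips S1106 and S1108 in those cases.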
 The initial settings may instead be made after the application is started. In this case, the various specifications of the optotype 500 can be set for each application.
 As described above, the HMD device 100 of the present embodiment is worn on the head of the user 101 and displays the right-eye image 410R and the left-eye image 410L as the binocular stereoscopic image 410. It includes the operation detection unit 153, which outputs the viewpoint movement operation detection signal when a viewpoint movement operation is detected, and the display control unit 154, which displays the optotype 500 superimposed on the binocular stereoscopic image while the viewpoint movement operation detection signal is being output. When viewed binocularly, the optotype 500 is displayed on a virtual plane at a predetermined distance from the device in the depth direction, and has a shape that keeps the field of view clear.
 Thus, according to the present embodiment, the optotype 500, whose position relative to the viewpoint of the user 101 does not change, is displayed during viewpoint movement operations. The optotype 500 serves as an anchor for fixing the viewpoint of the user 101. In the present embodiment, for example, the optotype 500 is displayed on the virtual image plane 310.
 Therefore, according to the present embodiment, the optotype 500 can be viewed stereoscopically on the virtual image plane 310. By watching video with viewpoint movement while keeping the optotype 500 in sight, the user 101 is less likely to be caught in the illusion that the user 101 is moving. VR sickness caused by using the HMD device 100 can therefore be reduced without modifying the video involving viewpoint movement.
 In the present embodiment, the optotype 500 is also displayed only while a viewpoint movement operation is being detected. Combined with its shape that keeps the field of view clear, it does not interfere with the user 101 viewing the display object.
 Furthermore, in the present embodiment, even if an application such as a game has no function for displaying the optotype 500, the optotype 500 can be displayed by the function of the HMD device 100.
 <<Second Embodiment>>
 Next, a second embodiment of the present invention will be described. In the first embodiment, the display position of the optotype 500 is set in advance and remains fixed while an application is running. In this embodiment, the display position of the optotype 500 on the virtual plane is changed according to the displayed image. Furthermore, this embodiment allows the initial settings of the optotype 500 to be made, and to be changed via an interrupt, while an application is running.
 This embodiment is described below with a focus on the points that differ from the first embodiment.
 The hardware configuration of the HMD device 100 of this embodiment is the same as in the first embodiment, as is its functional configuration. However, the processing of the operation detection unit 153 and the display control unit 154 differs.
 In this embodiment, the display mode of the optotype 500 is switched. For example, a plurality of thresholds are set for the forward speed of the viewpoint movement, and the depth-direction position of the optotype 500 (its front-to-back position when viewed stereoscopically) is switched in steps accordingly. Likewise, a plurality of thresholds are set for the lateral speed of the viewpoint movement, and the lateral position of the optotype 500 is switched in steps.
 When the operation detection unit 153 detects a viewpoint movement operation, it calculates the operation direction and operation speed. These are calculated from the previous and latest viewpoint movement operation positions and the time interval between accepted operations. The operation detection unit 153 then outputs a viewpoint movement operation detection signal that includes the operation direction and speed. The viewpoint movement operation itself uses the method described in the first embodiment. For example, in FIG. 4, when an operation of moving from the initially touched point 211 to the point 213 without lifting the finger is accepted, the direction and speed are determined from the vector indicated by the arrow 223.
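 The direction-and-speed calculation described above can be sketched as follows. This is a minimal illustration under assumed conventions (screen coordinates, angle measured from the rightward axis); the function name and interface are hypothetical, not from the patent.

```python
import math

def motion_from_touch(prev_pos, cur_pos, dt):
    """Derive the operation direction (angle in degrees, 0 = rightward)
    and operation speed from two successive touch positions (x, y)
    sampled dt seconds apart."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    speed = math.hypot(dx, dy) / dt      # distance over elapsed time
    angle = math.degrees(math.atan2(dy, dx))
    return angle, speed
```

 A signal carrying these two values would then be handed to the display control unit 154, as in step S2106 of FIG. 9.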
 The display control unit 154 controls the display position of the optotype 500 based on the speed and direction information included in the viewpoint movement operation detection signal. In this embodiment, for example, a table (display position table) that specifies a display position for each speed threshold, for both the depth direction and the lateral direction, is held as processing data 155. The display position table is stored in, for example, the flash ROM 124. The display control unit 154 refers to this table to control the display position.
 Examples of the display position tables are shown in FIGS. 8(a) and 8(b). FIG. 8(a) is an example of the depth-direction display position table 161, showing the relationship between speed thresholds and depth-direction display positions; FIG. 8(b) is an example of the lateral display position table 162, showing the relationship between speed thresholds and lateral display positions.
 As shown in these figures, the display position table holds, for each of one or more speed thresholds, information specifying the depth-direction display position and the lateral display position to be used when that threshold is exceeded. These positions may be held per type of optotype 500. For example, suppose FIG. 8(a) stores, as depth-direction display positions, a position D0 coinciding with the virtual image plane 310 for speed threshold 0, a position D1 for speed threshold V1, and a position D2 for speed threshold V2. Then the optotype 500 is displayed at D0 for speeds from 0 up to V1, at D1 for speeds above V1 up to V2, and at D2 for speeds above V2. Here V1 and V2 are arbitrary speed thresholds satisfying V1 < V2.
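 The stepwise lookup just described can be sketched as a simple table scan. The threshold and position values below are illustrative placeholders standing in for V1, V2 and D0 through D2; the actual table contents come from FIG. 8(a).

```python
# Illustrative depth-direction display position table: ascending
# (speed threshold, position) pairs, analogous to table 161.
DEPTH_TABLE = [
    (0.0, "D0"),  # speed 0 up to V1: on the virtual image plane 310
    (1.5, "D1"),  # speed above V1 up to V2 (V1 = 1.5 here, assumed)
    (3.0, "D2"),  # speed above V2 (V2 = 3.0 here, assumed)
]

def depth_position(speed, table=DEPTH_TABLE):
    """Return the display position for the largest threshold that the
    given speed strictly exceeds (table sorted by ascending threshold)."""
    position = table[0][1]
    for threshold, candidate in table[1:]:
        if speed > threshold:
            position = candidate
    return position
```

 The lateral table 162 would be consulted the same way with the lateral speed component.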
 For example, in the case of the annular arrangement mark 510, the table holds the center position of the virtual circle on which the white circles (first marks 511) and black circles (second marks 512) are arranged, and the number of white or black circles. As the position information, for example, a coordinate position on the virtual image plane 310 is held. The speed thresholds may also differ between the depth direction and the lateral direction.
 FIG. 8(c) is a conceptual diagram of the display state while the user 101 wearing the HMD device 100 plays a car chase game. Components that are the same as in the first embodiment carry the same reference numerals, and their description is omitted.
 In the car chase game, an image imitating the frame of the windshield of the player's own car (hereinafter, windshield image 423) is displayed in the display area. The car being chased (tracked car 421) and the road 422 are also displayed.
 The HMD device 100 obtains the geographical shape of the road 422 via the communication device 129 in response to an instruction from the user 101. The driving position of the tracked car 421 is also obtained via the communication device 129. The position of the virtual car driven by the user 101 in pursuit of the tracked car 421, on the other hand, is controlled with the HMD operation device 130 of the HMD device 100.
 FIG. 8(c) shows the image while the user 101 is chasing the tracked car 421 along a road 422 curving to the right. Here, the windshield image 423 is superimposed on the scene.
 In this embodiment, the display control unit 154 displays the tracked car 421 so that it is perceived stereoscopically far beyond (ahead of) the virtual image plane 310, while the windshield image 423 is displayed stereoscopically nearer than the virtual image plane 310.
 When the user 101 performs an operation to chase the tracked car 421 along the rightward-curving road 422, the user 101's gaze moves to the right. In this case, the optotype 500 is displayed toward the right. The operation detection unit 153 detects the corresponding viewpoint movement operation, calculates the operation direction and speed, and generates and outputs a viewpoint movement operation detection signal containing this information.
 On receiving the viewpoint movement operation detection signal, the display control unit 154 refers to the display position table and displays the optotype 500. The figure shows the case where the annular arrangement mark 510 is displayed as the optotype 500.
 With this processing, the center of the user's gaze falls inside the region enclosed by the virtual circle 513 of the annular arrangement mark 510, so the user 101 can keep the mark in view more easily than if it were displayed at the center of the screen. This reduces the illusion of self-motion induced by the road 422 appearing to flow past the user 101, and VR sickness can thus be reduced.
 [Display control processing]
 Next, the flow of the display control processing of this embodiment is described. FIG. 9 is the processing flow of the display control processing of this embodiment. This processing also starts when the HMD device 100 is started. Steps that are the same as in the first embodiment carry the same reference numerals, and their description is not repeated.
 In this embodiment, the overall control unit 151 first starts the application designated by the user 101 (step S2101).
 Next, the overall control unit 151 performs initial setup according to instructions from the user 101 (step S2102). As in the first embodiment, the various specifications of the optotype 500 are set. These specifications may also be predetermined per application. Here, as in the first embodiment, the annular arrangement mark 510 is assumed to have been selected for display.
 Thereafter, as in the first embodiment, the reception unit 152 accepts the various operations of the application, such as a game, at predetermined time intervals (step S1103). At this point, the reception unit 152 of this embodiment determines whether a setting-change interrupt has been received (step S2104).
 If a setting-change interrupt has been received (S2104; Yes), the overall control unit 151 changes the settings according to the received instruction (step S2105) and returns to step S1103.
 If no setting-change interrupt has been received (S2104; No), the operation detection unit 153 determines, as in the first embodiment, whether a viewpoint movement operation has been performed (step S1104). If it determines that one has (S1104; Yes), the operation detection unit 153 calculates the operation direction and speed and outputs a viewpoint movement operation detection signal containing this information to the display control unit 154 (step S2106).
 Based on the viewpoint movement operation detection signal containing the operation direction and speed, the display control unit 154 refers to the display position table, determines the display position, displays the optotype 500 at that position (step S2107), and proceeds to step S1109.
 If no viewpoint movement operation has been accepted in step S1104 (S1104; No), the processing is the same as in the first embodiment and is not described here.
 The processing of steps S1107, S1108, S1109, and S1110 is likewise the same as in the first embodiment and is not described here.
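 The branching of the flow above (setting-change interrupt, viewpoint movement detection, table lookup) can be sketched as a small event handler. This is an illustrative reconstruction of the FIG. 9 logic only; the event dictionary keys, table values, and class interface are assumptions, not the patent's implementation.

```python
# Sketch of the FIG. 9 branching (steps S2104-S2107); names assumed.
class DisplayController:
    def __init__(self, depth_table):
        # depth_table: ascending (speed threshold, position) pairs,
        # analogous to the depth-direction display position table 161
        self.depth_table = depth_table
        self.settings = {"optotype": "annular"}  # initial setup (S2102)
        self.visible_at = None                   # None = optotype hidden

    def handle(self, event):
        if "setting_change" in event:            # S2104 Yes -> S2105
            self.settings.update(event["setting_change"])
        elif event.get("viewpoint_move"):        # S1104 Yes -> S2106
            position = self.depth_table[0][1]
            for threshold, candidate in self.depth_table[1:]:
                if event["speed"] > threshold:   # table lookup (S2107)
                    position = candidate
            self.visible_at = position
        else:                                    # S1104 No: hide optotype
            self.visible_at = None
```

 Each call to `handle` corresponds to one pass through the periodic operation-acceptance loop of step S1103.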
 In this way, the operation detection unit 153 of this embodiment additionally detects the direction and speed of the detected viewpoint movement operation and includes this information in the viewpoint movement operation detection signal, and the display control unit 154, based on that information, displaces the optotype 500 in the direction in which the user 101's viewpoint has moved.
 Also, unlike in the first embodiment, the initial settings of the optotype 500 can be made, and changed via an interrupt, while an application is running. Furthermore, in this embodiment, the depth-direction position of the optotype 500 (its front-to-back position when viewed stereoscopically) is switched in steps based on a plurality of thresholds on the forward speed of the viewpoint movement, and its lateral position is switched in steps based on a plurality of thresholds on the lateral speed. This makes the optotype 500 easier to keep in view while watching the application's displayed image, so the VR-sickness reduction effect is easier to obtain.
 <<Third Embodiment>>
 Next, a third embodiment of the present invention will be described. In this embodiment, the target is captured video. In the HMD device 100 of this embodiment, when the amount of movement of the binocular stereoscopic image detected from that captured video is at or above a predetermined value, it is determined that a viewpoint movement operation has occurred.
 The HMD device 100 of this embodiment is basically the same as in the first embodiment, except for the viewpoint movement operation detection processing performed by the operation detection unit 153. That is, in this embodiment, the viewpoint movement operation is detected not from an operation by the user 101 but from the motion in the video. This embodiment is described below with a focus on the points that differ from the first embodiment.
 The operation detection unit 153 of this embodiment analyzes the displayed video. The video is acquired by the right-eye camera 119R and the left-eye camera 119L. An independent camera for capturing the surroundings may also be provided.
 The operation detection unit 153 detects the motion of elements within the images (video) that the video processing unit 126 generates at predetermined time intervals, using an existing motion detection process. For example, motion is detected by comparing the latest image with the immediately preceding one. When motion at or above a predetermined threshold is found, the unit determines that viewpoint movement has occurred and outputs a viewpoint movement operation detection signal to the display control unit 154.
 At this time, as in the second embodiment, the movement direction and speed may also be detected as the operation direction and operation speed, respectively, and included in the viewpoint movement operation detection signal. For example, when the video is encoded with an MPEG coding scheme, motion vectors are computed from the preceding and following pictures to interpolate intermediate pictures; in this case, an evaluation value can be determined from those motion vectors to detect the direction and speed.
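 As a minimal stand-in for the "existing motion detection process" mentioned above, inter-frame motion can be estimated by frame differencing. A real implementation would more likely use block matching or the MPEG motion vectors just described; the threshold value and function names below are illustrative assumptions.

```python
# Frame-differencing sketch of motion detection between consecutive
# grayscale frames, given as equal-length flat lists of pixel values.
def mean_abs_diff(prev_frame, cur_frame):
    """Average absolute per-pixel difference between two frames."""
    total = sum(abs(a - b) for a, b in zip(prev_frame, cur_frame))
    return total / len(cur_frame)

def viewpoint_moved(prev_frame, cur_frame, threshold=10.0):
    """Signal a viewpoint movement when the measured motion is at or
    above the predetermined threshold (value assumed for illustration)."""
    return mean_abs_diff(prev_frame, cur_frame) >= threshold
```

 When `viewpoint_moved` returns true, the unit would emit the viewpoint movement operation detection signal that triggers display of the optotype 500.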
 As in the first embodiment, the display control unit 154 displays the optotype 500 when it receives the viewpoint movement operation detection signal.
 Next, an overview of a concrete viewpoint movement operation detection process performed by the operation detection unit 153 of this embodiment is given. FIG. 10 is a diagram for explaining the viewpoint movement operation detection processing of this embodiment.
 Shown here is a case where the user is viewing, on the HMD device 100, footage of scenery that changes as the orientation of the cameras (right-eye camera 119R and left-eye camera 119L) changes; that is, an example in which the image of a mountain moves from the broken line 431 to the solid line 432.
 The operation detection unit 153 of this embodiment detects motion from this video. When the detected amount of motion is at or above a predetermined threshold, it determines that a viewpoint movement operation has been detected and outputs a viewpoint movement operation signal. The display control unit 154 receives it and displays the optotype 500. Here, the case where the annular arrangement mark 510 is displayed as the optotype 500 is illustrated.
 As described above, in this embodiment, the operation detection unit 153 treats a viewpoint movement operation as detected when the amount of movement of the binocular stereoscopic image detected from the video is at or above a predetermined threshold.
 Therefore, according to this embodiment, the optotype 500 is displayed automatically when motion is detected. By keeping the optotype 500 in view, the user 101 is less likely to be misled by the motion of the video into the illusion of self-motion. VR sickness can thus be reduced.
 <Modification of the third embodiment>
 In this embodiment, the optotype 500 is displayed automatically when motion is detected, but the invention is not limited to this. For example, as in the first embodiment, the actions of the user 101 may be analyzed to detect a viewpoint movement operation.
 Specifically, when the user 101 sees a moving image, the user follows the motion of the image and performs the same viewpoint movement operation as in the first embodiment. The operation detection unit 153 analyzes the operation accepted by the reception unit 152 and determines whether viewpoint movement has occurred.
 <Modification 1>
 The display position of the optotype 500 is not limited to the virtual image plane 310. For example, as shown in FIG. 11, the display control unit 154 may display the optotype 500 (annular arrangement mark 510) beyond the virtual image plane 310 (farther in the depth direction). That is, the depth-direction distance of the virtual plane on which the optotype 500 is displayed is greater than the depth-direction distance of the virtual image plane 310, and less than the depth-direction distance to the binocular stereoscopic image 410.
 As a result, when the user alternately looks at the optotype 500 and the road 422, less adjustment of the binocular parallax between the stereoscopic lines of sight 611 and 612 is required, so eye strain can be reduced.
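 A rough geometric illustration of this effect: under a simple pinhole model with eyes separated by the interpupillary distance `ipd`, a point on the midline at depth `Z` fuses when its on-plane disparity on a display plane at distance `D` is `ipd * (1 - D/Z)`. Placing the optotype at an intermediate depth therefore narrows the disparity gap the eyes must re-converge across when switching gaze targets. All numeric values here are illustrative assumptions, not from the patent.

```python
# Disparity, in meters on the display plane, needed for a midline point
# to fuse at the given depth (simple pinhole model; values assumed).
def disparity_on_plane(depth, plane_dist, ipd=0.064):
    """Horizontal separation between the left- and right-eye images of
    a point at `depth`, projected onto a plane at `plane_dist`."""
    return ipd * (1.0 - plane_dist / depth)
```

 With a virtual image plane at 2 m, a scene element at 20 m needs far more disparity than an optotype on the plane itself; an optotype at an intermediate 5 m sits between the two, which is the situation of Modification 1.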
 <Modification 2>
 In the above embodiments, the annular arrangement mark 510, in which black and white circles are arranged alternately on a virtual circle, is used as the optotype 500. However, the optotype 500 is not limited to this. For example, it may have a pattern such as those shown in FIGS. 12(a) and 12(b).
 The optotype 500 shown in FIG. 12(a) is a first optotype 520 composed of a rounded-corner rectangular two-dot chain line 521. The optotype 500 shown in FIG. 12(b) is a second optotype 530 in which an arc 532 and a figure 533 are arranged on a virtual circle. Any shape may be used for the optotype 500 as long as it lets the user 101 fix the viewpoint, so that the user can recognize that they themselves are not moving, and neither blocks the center of the visual field nor interferes with viewing the image.
 In any of the above embodiments, a plurality of types of optotypes 500 may be prepared in advance so that the user 101 can choose one according to preference, or the type of optotype 500 to be displayed may be predetermined per application. The same applies to the display position.
 <Modification 3>
 In each of the above embodiments, an example was shown in which the HMD main unit 110 and the HMD control device 120 are connected by wire, but the connection is not limited to this. For example, if the HMD main unit 110 has its own power supply, the connection may be wireless. Likewise, although a wireless connection between the HMD control device 120 and the HMD operation device 130 was shown, it may be wired.
 Also, in each of the above embodiments, an example was shown in which the HMD device 100 is composed of three devices, the HMD main unit 110, the HMD control device 120, and the HMD operation device 130, but the invention is not limited to this.
 For example, the HMD main unit 110 may incorporate the functions of the HMD control device 120; in this case, the inter-device I/Fs 117 and 127 that connect them can be eliminated. The HMD control device 120 may likewise incorporate the functions of the HMD operation device 130; in this case, the short-range wireless communication devices 129b and 139 that connect them can be eliminated.
 The HMD device 100 may also be a single integrated unit without the functions of the HMD operation device 130. In this case, operations by the user 101 may be detected by, for example, gesture detection, realized by capturing images of the surroundings with the cameras (right-eye camera 119R and left-eye camera 119L) and analyzing those images with the CPU 122.
 An external device different from the HMD device 100 may also be used as the HMD operation device 130, for example a smartphone or a game controller. With a smartphone, an application is installed to make it function as the operation device of the HMD device 100. With a game controller, the viewpoint movement operation may be performed with a joystick.
 <Modification 4>
 Each of the above embodiments was described using the HMD device 100 as an example, but the device is not limited to the HMD device 100. For example, it may be VR goggles that realize the functions of an HMD device when a smartphone is attached.
 Furthermore, as long as a binocular stereoscopic image can be provided to the user 101, the device may be, for example, a smartphone, a PC (Personal Computer), a tablet, or the like.
 The present invention is not limited to the embodiments and modifications described above, and includes various variations. For example, the embodiments and modifications above were described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations having all of the described components. Part of the configuration of one embodiment or modification may be replaced with the configuration of another, and the configuration of another embodiment or modification may be added to the configuration of one embodiment or modification. Further, for part of the configuration of each embodiment or modification, other configurations may be added, deleted, or substituted.
 Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, in whole or in part, for example by being designed as an integrated circuit. They may also be realized in software by a processor interpreting and executing programs that implement the respective functions. The information of the programs, tables, files, and the like that implement the functions can be placed in a memory unit; in a recording device such as a hard disk or an SSD (Solid State Drive); or on a recording medium such as an IC card, an SD card, or a DVD.
 The control lines and information lines shown are those considered necessary for explanation, and not all control lines and information lines in a product are necessarily shown. In practice, almost all the components may be considered mutually connected.
 100: HMD device, 101: user,
 110: HMD main unit, 111: bus, 114L: left-eye display, 114R: right-eye display, 117: inter-device I/F, 119L: left-eye camera, 119R: right-eye camera,
 120: HMD control device, 121: bus, 122: CPU, 123: RAM, 124: flash ROM, 124a: operation program holding unit, 124b: image data holding unit, 125: OSD generation unit, 126: video processing unit, 127: inter-device I/F, 128: sensor device, 128a: acceleration sensor, 128b: geomagnetic sensor, 128c: GPS receiver, 128d: distance sensor, 129: communication device, 129a: network communication device, 129b: short-range wireless communication device,
 130: HMD operation device, 131: bus, 132: CPU, 133: RAM, 134: flash ROM, 137: touch sensor, 138: gyro sensor, 139: short-range wireless communication device,
 151: overall control unit, 152: reception unit, 153: operation detection unit, 154: display control unit, 155: processing data,
 161: depth-direction display position table, 162: lateral display position table,
 211: point, 212: point, 213: point, 221: arrow, 222: arrow, 223: arrow,
 300: virtual plane, 310: virtual image plane,
 410: binocular stereoscopic image (display object), 410L: left-eye image (virtual image), 410R: right-eye image (virtual image), 421: tracked car, 422: road, 423: windshield image, 431: broken line, 432: solid line,
 500: optotype, 510: annular arrangement mark, 511: first mark, 512: second mark, 513: circle, 520: first optotype, 521: two-dot chain line, 530: second optotype, 532: arc, 533: figure,
 611: line of sight, 611L: line of sight, 611R: line of sight, 612: line of sight

Claims (8)

  1.  A head-mounted display device that is worn on a user's head and provides binocular stereoscopic viewing of a right-eye image and a left-eye image, the head-mounted display device comprising:
      an operation detection unit that outputs a detection signal when a viewpoint movement operation is detected; and
      a display control unit that superimposes an optotype on the right-eye image and the left-eye image while the detection signal is being output,
      wherein, when the optotype is viewed stereoscopically with both eyes, the optotype is displayed on a virtual surface at a predetermined distance in the depth direction from the device, and has a shape capable of securing the field of view.
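As an editorial illustration of claim 1 (not part of the patent text), the interplay between the operation detection unit and the display control unit can be sketched as follows; the class and method names here are hypothetical, chosen only to mirror the claim language.

```python
class OperationDetector:
    """Outputs a detection signal while a viewpoint movement operation is active."""

    def __init__(self):
        self._active = False

    def set_viewpoint_moving(self, moving: bool):
        # In a real device this would be driven by sensor or UI input.
        self._active = moving

    @property
    def detection_signal(self) -> bool:
        return self._active


class DisplayController:
    """Superimposes the optotype on both eye images only while the signal is output."""

    def compose(self, left_img: str, right_img: str, signal: bool):
        overlay = "+optotype" if signal else ""
        return left_img + overlay, right_img + overlay


detector = OperationDetector()
controller = DisplayController()

detector.set_viewpoint_moving(True)
print(controller.compose("L", "R", detector.detection_signal))  # optotype superimposed

detector.set_viewpoint_moving(False)
print(controller.compose("L", "R", detector.detection_signal))  # optotype removed
```

The point of the sketch is the gating: the overlay exists exactly while the detection signal is asserted, matching "while the detection signal is being output".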
  2.  The head-mounted display device according to claim 1, further comprising a reception unit that receives instructions from the user,
      wherein the operation detection unit regards the viewpoint movement operation as detected while the reception unit is receiving the viewpoint movement operation as the instruction.
  3.  The head-mounted display device according to claim 1,
      wherein the right-eye image and the left-eye image viewed stereoscopically with both eyes are moving images, and
      the operation detection unit regards the viewpoint movement operation as detected when the amount of movement of the position of the binocular stereoscopic image, detected from the moving images, is equal to or greater than a predetermined threshold.
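The detection condition of claim 3 can be illustrated with a minimal sketch (not from the patent; the threshold value and function name are assumptions): the viewpoint movement operation counts as detected when the frame-to-frame displacement of the stereoscopic image position reaches the threshold.

```python
import math

def viewpoint_moved(prev_pos, cur_pos, threshold=10.0):
    """Return True when the movement of the stereoscopic image position
    between frames is at or above the predetermined threshold (claim 3)."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    return math.hypot(dx, dy) >= threshold

print(viewpoint_moved((0, 0), (3, 4)))  # displacement 5.0 < 10.0 -> False
print(viewpoint_moved((0, 0), (6, 8)))  # displacement 10.0 >= 10.0 -> True
```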
  4.  The head-mounted display device according to claim 1,
      wherein the operation detection unit further detects the direction and speed of the detected viewpoint movement operation, and includes the detected direction and speed information in the detection signal it outputs, and
      the display control unit displaces the optotype, based on the direction and speed information, in the direction in which the user performed the viewpoint movement operation.
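The displacement behavior of claim 4 amounts to shifting the optotype position along the detected direction, scaled by the detected speed. A minimal sketch (illustrative only; units, names, and the linear model are assumptions):

```python
def displace_optotype(pos, direction, speed, dt):
    """Shift the optotype position along the detected operation direction,
    scaled by the detected speed over the elapsed time dt (claim 4 sketch).

    pos: current (x, y) of the optotype on the virtual surface
    direction: unit vector (dx, dy) taken from the detection signal
    speed: scalar speed taken from the detection signal
    """
    return (pos[0] + direction[0] * speed * dt,
            pos[1] + direction[1] * speed * dt)

# Moving right at speed 5 for 2 time units shifts the optotype by 10 in x.
print(displace_optotype((0.0, 0.0), (1.0, 0.0), 5.0, 2.0))
```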
  5.  The head-mounted display device according to claim 1,
      wherein the virtual surface is the virtual image plane of the right-eye image and the left-eye image of the head-mounted display device.
  6.  The head-mounted display device according to claim 1,
      wherein the depth-direction distance of the virtual surface is greater than the depth-direction distance of the virtual image plane of the right-eye image and the left-eye image of the head-mounted display device, and smaller than the depth-direction distance to the binocular stereoscopic image.
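The geometric constraint of claim 6 is an ordering of three depth distances: the virtual surface lies beyond the display's virtual image plane but in front of the stereoscopic image. A one-line check (illustrative only; function and parameter names are not from the patent):

```python
def valid_virtual_surface_distance(d_surface, d_virtual_image, d_stereo_image):
    """Claim 6 sketch: the virtual surface must be farther than the
    virtual image plane and nearer than the binocular stereoscopic image."""
    return d_virtual_image < d_surface < d_stereo_image

# A surface at 2 m is valid between a 1 m virtual image plane and a 5 m stereo image.
print(valid_virtual_surface_distance(2.0, 1.0, 5.0))
```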
  7.  The head-mounted display device according to claim 1,
      wherein the optotype has a shape in which white circles and black circles are alternately arranged on the circumference of a virtual circle of a predetermined diameter.
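The optotype shape of claim 7 can be generated procedurally: evenly spaced marks on a circle, alternating white and black. A sketch (illustrative; the mark count and diameter defaults are assumptions, not values from the patent):

```python
import math

def optotype_marks(n_marks=8, diameter=100.0):
    """Claim 7 sketch: alternate white and black circular marks evenly
    spaced on the circumference of a virtual circle of the given diameter.
    Returns a list of (x, y, color) tuples centered on the origin."""
    r = diameter / 2.0
    marks = []
    for i in range(n_marks):
        theta = 2.0 * math.pi * i / n_marks
        color = "white" if i % 2 == 0 else "black"
        marks.append((r * math.cos(theta), r * math.sin(theta), color))
    return marks

# Four marks at 90-degree spacing on a circle of diameter 10.
for mark in optotype_marks(4, 10.0):
    print(mark)
```

Because the marks sit only on the circumference, the interior of the circle stays empty, which is one way such an optotype can keep the user's field of view clear as claim 1 requires.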
  8.  A display control method in a head-mounted display device that is worn on a user's head and displays a right-eye image and a left-eye image, the method comprising:
      a step of outputting a detection signal when a viewpoint movement operation is detected; and
      a step of displaying an optotype while the detection signal is being output,
      wherein, when the optotype is viewed stereoscopically with both eyes, the optotype is displayed on a virtual surface at a predetermined distance in the depth direction from the device, and has a shape capable of securing the field of view.
PCT/JP2020/018133 2020-04-28 2020-04-28 Head-mounted display device and display control method WO2021220407A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/018133 WO2021220407A1 (en) 2020-04-28 2020-04-28 Head-mounted display device and display control method


Publications (1)

Publication Number Publication Date
WO2021220407A1 true WO2021220407A1 (en) 2021-11-04

Family

ID=78332313


Country Status (1)

Country Link
WO (1) WO2021220407A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07181424A (en) * 1993-12-22 1995-07-21 Canon Inc Compound lens picture display device
JPH08111834A (en) * 1994-10-12 1996-04-30 Olympus Optical Co Ltd Head mounted video image display device
JP2005084569A (en) * 2003-09-11 2005-03-31 Brother Ind Ltd Picture display device
JP2018157331A (en) * 2017-03-16 2018-10-04 株式会社スクウェア・エニックス Program, recording medium, image generating apparatus, image generation method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116781884A (en) * 2023-06-29 2023-09-19 广州视景医疗软件有限公司 Data acquisition method and device for monocular stereoscopic vision
CN116781884B (en) * 2023-06-29 2024-03-12 广州视景医疗软件有限公司 Data acquisition method and device for monocular stereoscopic vision

Similar Documents

Publication Publication Date Title
US11235871B2 (en) Control method, control system, and smart glasses for first person view unmanned aerial vehicle flight
US10198870B2 (en) Information processing apparatus, information processing system, and information processing method
US11030771B2 (en) Information processing apparatus and image generating method
JP6642432B2 (en) Information processing apparatus, information processing method, and image display system
JP4900741B2 (en) Image recognition apparatus, operation determination method, and program
US10416835B2 (en) Three-dimensional user interface for head-mountable display
EP3349107B1 (en) Information processing device and image generation method
JP6479199B2 (en) Information processing device
JP5114795B2 (en) Image recognition apparatus, operation determination method, and program
JP6899875B2 (en) Information processing device, video display system, information processing device control method, and program
CN106817913A (en) Head mounted display, personal digital assistant device, image processing apparatus, display control program, display control method and display system
JP6507827B2 (en) Display system
KR20170062439A (en) Control device, control method, and program
JP2019125215A (en) Information processing apparatus, information processing method, and recording medium
WO2021220407A1 (en) Head-mounted display device and display control method
EP3109734A1 (en) Three-dimensional user interface for head-mountable display
JP2016126687A (en) Head-mounted display, operation reception method, and operation reception program
US11972037B2 (en) Head mounted information processing apparatus and head mounted display system
KR20180055637A (en) Electronic apparatus and method for controlling thereof
EP3958095A1 (en) A mobile computer-tethered virtual reality/augmented reality system using the mobile computer as a man machine interface
WO2024057783A1 (en) Information processing device provided with 360-degree image viewpoint position identification unit
JP2021177277A (en) Program, information processing method, information processing device and information processing system
JP2022118501A (en) Display system, display device, control method and program thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20933898; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20933898; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)