WO2021220407A1 - Head-mounted display device and display control method - Google Patents


Info

Publication number
WO2021220407A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
head
display device
mounted display
optotype
Prior art date
Application number
PCT/JP2020/018133
Other languages
English (en)
Japanese (ja)
Inventor
宣隆 奥山
仁 秋山
Original Assignee
マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by マクセル株式会社
Priority to PCT/JP2020/018133
Publication of WO2021220407A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/361: Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background

Definitions

  • the present invention relates to VR (Virtual Reality) display technology.
  • HMD: head-mounted display
  • VR sickness: when a user wears an HMD device and views a moving image, so-called VR sickness may occur.
  • Patent Document 1 discloses a method by which a computer provides a virtual space to an HMD device, stating: "The processing executed by the processor of a computer that provides virtual reality includes: a step of defining a virtual space in memory; a step of detecting an action of the user wearing the HMD device; a step of determining, based on the user's action, the flight direction of an object flying in the virtual space; a step of generating view-image data such that the user's view in the virtual space moves in the flight direction, and displaying the view image on the monitor based on the generated view-image data; and a step of moving the position of the virtual user to the object at high speed when the object reaches a target in the virtual space" (summary excerpt).
  • However, when viewing an image whose visual field is moved by the virtual user's own movement operation (hereinafter, viewpoint movement), such techniques cannot reduce the VR sickness caused by the illusion that the user is moving.
  • The present invention has been made in view of the above points. Its object is to provide an information display technique that reduces VR sickness caused by using an HMD device: by consciously looking at an optotype while continuing to view an image accompanied by viewpoint movement, the user can easily keep the binocular parallax substantially constant.
  • The present invention is a head-mounted display device that is worn on the user's head and presents a right-eye image and a left-eye image for binocular stereoscopic viewing. It includes an operation detection unit that outputs a detection signal when a viewpoint movement operation is detected.
  • It further includes a display control unit that superimposes an optotype on the right-eye image and the left-eye image while the detection signal is output. The invention is characterized in that the optotype, when viewed in binocular stereoscopic vision, is displayed on a virtual surface at a predetermined distance from the device itself in the depth direction and has a shape that leaves the field of view unobstructed.
  • FIG. 1 is a schematic diagram for explaining an outline of the present embodiment.
  • FIG. 1 shows where the virtual image plane 310 and the binocular stereoscopic image 410 appear to be in focus when the user 101 wears the head-mounted display device (hereinafter, HMD device) 100 and views them with only one of the left and right eyes.
  • HMD device: head-mounted display device
  • the binocular stereoscopic image 410 is a stereoscopic image in which the position in the depth direction appears as an illusion based on the parallax of both eyes.
  • The virtual image plane 310 is the position at which the right-eye and left-eye images displayed on the displays of the HMD device 100, viewed through the optical elements, appear as an in-focus virtual image.
  • the optotype 500 is displayed on a predetermined virtual surface (virtual surface 300) so that binocular stereoscopic viewing can be performed.
  • the optotype 500 is a mark for facilitating the stabilization of the binocular parallax of the user's line of sight 612.
  • The virtual surface 300 on which the optotype 500 is displayed in binocular stereoscopic vision is a surface that is always at the same distance from the HMD device 100 (the device itself) in the depth direction as seen from the user. In the present embodiment it coincides with, for example, the virtual image plane 310.
  • The virtual image plane 310 is the virtual surface on which the left-eye and right-eye display images presented by the HMD device 100 appear to be formed as virtual images; that is, it is the surface at the optical imaging distance of the HMD device 100. This distance is determined optically by the distance between the lenses and the displays of the HMD device 100.
  • Note that the virtual image plane 310 need not be flat, and the planes for the right-eye and left-eye displays need not coincide.
  • the optotype 500 has a shape that can secure a field of view.
  • the annular arrangement mark 510 which is a mark arranged in an annular shape, is displayed as the optotype 500.
  • the annular arrangement mark 510 of the present embodiment includes a first mark 511 and a second mark 512.
  • In the present embodiment, the first mark 511 is shown as a white circle and the second mark 512 as a black circle.
  • The first marks 511 and the second marks 512 are arranged alternately on the same circumference.
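  The geometry of the annular arrangement mark described above can be sketched as follows. This is a hypothetical illustration only; the function name, mark count, and coordinate convention are assumptions, not part of the patent:

  ```python
  import math

  def annular_mark_positions(center, radius, n_marks=12):
      """Place alternating marks on one circumference, as in mark 510.

      Even indices stand for the white first marks (511), odd indices
      for the black second marks (512); all lie on the same circle
      (circumference 513), leaving the centre of the field of view
      unobstructed.
      """
      cx, cy = center
      marks = []
      for i in range(n_marks):
          angle = 2.0 * math.pi * i / n_marks
          x = cx + radius * math.cos(angle)
          y = cy + radius * math.sin(angle)
          color = "white" if i % 2 == 0 else "black"  # alternate 511/512
          marks.append((x, y, color))
      return marks
  ```

  An even `n_marks` keeps the white/black alternation consistent all the way around the circle.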
  • The binocular stereoscopic image 410 (display object 410) can be viewed while the viewpoint is moved, together with the background image, by operating the operation unit.
  • This viewpoint movement means that the appearance of the display object 410 changes as if the user 101 has moved. For example, if the viewpoint is moved forward, the user 101 can stereoscopically view the display object 410 as if it were close to the user 101. By operating the viewpoint to move backward, the display object 410 can be stereoscopically viewed with both eyes so as to move away from the user 101.
  • the HMD device 100 displays the optotype 500 (annular arrangement mark 510) only while detecting an operation involving the viewpoint movement by the user 101 (hereinafter, referred to as a viewpoint movement operation).
  • Merely from grasping the relative positions of the user 101 and the binocular stereoscopic image 410, a sensation called VR sickness may occur. This is because the user 101 falls into the illusion of moving even though he or she is not actually moving.
  • Therefore, in the present embodiment, the optotype 500, whose left-eye and right-eye display positions coincide in binocular stereoscopic vision, is displayed on the virtual image plane 310. By keeping the optotype 500 in the field of view, the user 101 can easily grasp that he or she is not moving, which has the effect of reducing VR sickness.
  • the optotype 500 of the present embodiment has a configuration that does not block the center of the field of view. Therefore, even if the optotype 500 is displayed, it does not interfere with viewing the displayed image. Further, in the present embodiment, since the optotype 500 is not displayed when the viewpoint movement operation is not performed, it does not interfere with viewing the entire displayed image.
  • 110 is the HMD main body device, 120 is the HMD control device, 130 is the HMD operation device, 611 and 612 are lines of sight, and 119R and 119L are cameras. Details of each will be described later.
  • FIG. 2 is a hardware configuration diagram of the HMD device 100 of the present embodiment.
  • the HMD device 100 includes an HMD main body device 110, an HMD control device 120, and an HMD operation device 130.
  • The HMD main body device 110 includes a bus 111, a right-eye display 114R, a left-eye display 114L, a right-eye camera 119R, a left-eye camera 119L, and an inter-device interface (I/F) 117.
  • The right-eye display 114R and the left-eye display 114L are display devices (displays) such as liquid crystal panels; they present the image data processed by the image processing unit 126 (described later) of the HMD control device 120 to the user 101.
  • the right-eye display 114R and the left-eye display 114L may be transmissive displays.
  • the right-eye camera 119R and the left-eye camera 119L are external cameras that acquire an image of the surroundings of the HMD device 100.
  • The right-eye camera 119R photographs the line-of-sight area of the user 101's right eye, and the left-eye camera 119L photographs that of the left eye. The two cameras are arranged a predetermined distance apart.
  • The HMD control device 120 includes a bus 121, a CPU (Central Processing Unit) 122, a RAM 123, a flash ROM 124, an OSD generation unit 125, an image processing unit 126, an inter-device I/F 127, a sensor device 128, and a communication device 129.
  • the bus 121 is a data communication path for transmitting and receiving data between the CPU 122 and each part in the HMD device 100.
  • the CPU 122 is a main control unit that controls the entire HMD device 100 according to a predetermined program.
  • the CPU 122 may be realized by a microprocessor unit (MPU).
  • the CPU 122 performs processing according to a clock signal measured and output by the timer.
  • The RAM 123 is the program area used when executing the basic operation program and other application programs. It also serves as a temporary storage area that holds data as needed while various application programs run. The RAM 123 may be integrated with the CPU 122.
  • the flash ROM 124 stores information necessary for processing, such as each operation setting value of the HMD device 100.
  • The flash ROM 124 may also store still image data, moving image data, and the like captured by the HMD device 100. It is further assumed that the functions of the HMD device 100 can be expanded by downloading new application programs from an application server via the Internet; such downloaded programs are stored in the flash ROM 124.
  • The HMD device 100 realizes various functions when the CPU 122 loads an application program stored in the flash ROM 124 into the RAM 123 and executes it.
  • the flash ROM 124 is provided with, for example, an operation program holding unit 124a and an image data holding unit 124b.
  • the operation program is stored in the operation program holding unit 124a, and the image data is stored in the image data holding unit 124b.
  • The flash ROM 124 retains its stored information even while the HMD device 100 is not powered. The device is not limited to a flash ROM; for example, an SSD (Solid State Drive) or HDD (Hard Disk Drive) may be used instead.
  • the OSD generation unit 125 generates information (OSD information) to be displayed overlaid on the video.
  • the OSD information to be generated is, for example, a setting menu or operation information of the HMD device 100 itself.
  • the optotype 500 is also retained as OSD information.
  • the OSD information is stored in the flash ROM 124 or the like in advance.
  • The image processing unit 126 is an image (video) processor. It processes the images acquired by the right-eye camera 119R and the left-eye camera 119L and generates the images to be displayed on the right-eye display 114R and the left-eye display 114L, respectively. The image processing unit 126 also superimposes objects created by the CPU 122 or the like on an input image to generate the image data displayed on each display.
  • The inter-device I/F 127 is an interface for transmitting and receiving data to and from the HMD main body device 110.
  • In the present embodiment, the HMD main body device 110 and the HMD control device 120 are connected by wire; accordingly, the inter-device I/F 127 is, for example, an interface for a wired connection.
  • the sensor device 128 is a group of sensors for detecting the state of the HMD device 100.
  • an acceleration sensor 128a, a geomagnetic sensor 128b, a GPS (Global Positioning System) receiving device 128c, and a distance sensor 128d are provided.
  • other sensors such as a gyro sensor may be provided.
  • These sensor groups detect the position, movement, tilt, direction, etc. of the HMD device 100.
  • the distance sensor 128d is a depth sensor and acquires distance information from the HMD device 100 to the object.
  • the communication device 129 is a communication processor that performs communication processing.
  • a network (N / W) communication device 129a and a short-range wireless communication device 129b are provided.
  • The network communication device 129a is an interface for transmitting and receiving content via a network such as a LAN (Local Area Network). It transmits and receives data by wirelessly connecting to an Internet access point.
  • the short-range wireless communication device 129b transmits / receives data to / from the HMD operation device 130 by short-range wireless communication.
  • a standard such as Bluetooth® may be adopted.
  • the network communication device 129a and the short-range wireless communication device 129b each include a coding circuit, a decoding circuit, an antenna, and the like.
  • the communication device 129 may further include an infrared communication device or the like.
  • the HMD operation device 130 includes a bus 131, a CPU 132, a RAM 133, a flash ROM 134, a touch sensor 137, a gyro sensor 138, and a short-range wireless communication device 139.
  • The bus 131, CPU 132, RAM 133, flash ROM 134, and short-range wireless communication device 139 have the same configurations as their same-named counterparts in the HMD control device 120.
  • the touch sensor 137 is a touch pad that receives input of operation instructions from the user 101.
  • the HMD operation device 130 may include operation keys such as a power key, a volume key, and a home key.
  • the gyro sensor 138 is a posture detection device that detects the rotational angular velocity of the HMD operation device 130.
  • the posture detection device included in the HMD operation device 130 does not have to be the gyro sensor 138 as long as the posture of the HMD operation device 130 can be detected.
  • FIG. 3 is a functional block diagram of the HMD device 100 of the present embodiment.
  • the HMD apparatus 100 of the present embodiment includes an overall control unit 151, a reception unit 152, an operation detection unit 153, a display control unit 154, and processing data 155.
  • the overall control unit 151 controls the overall operation by controlling each function of the HMD device 100.
  • The reception unit 152 accepts operations from the user 101; in the present embodiment they are received via the touch sensor 137. Information about the operations received from the user 101 is output to the operation detection unit 153 and the display control unit 154.
  • the operation detection unit 153 detects the viewpoint movement operation.
  • Specifically, the operation detection unit 153 analyzes the operations received by the reception unit 152; for example, when a predetermined operation is performed, it determines that a viewpoint movement operation has occurred and notifies the display control unit 154 of the determination result.
  • The viewpoint movement operation detection signal is output for as long as the viewpoint movement operation is determined to be in progress (that is, while the viewpoint movement operation is being detected).
  • the display control unit 154 controls the display on the right-eye display 114R and the left-eye display 114L.
  • In the present embodiment, this includes display control of the optotype 500. While the operation detection unit 153 is detecting the viewpoint movement operation, that is, while the viewpoint movement operation detection signal is being received, the OSD generation unit 125 generates the display data of the optotype 500 and outputs it to the image processing unit 126.
  • the image processing unit 126 displays the optotype 500 by superimposing the received display data on the display image (image composition).
  • the processing data 155 is data necessary for executing the display control processing described later in the HMD device 100.
  • the processed data 155 is held in, for example, the RAM 123 and the flash ROM 124.
  • The overall control unit 151, reception unit 152, operation detection unit 153, and display control unit 154 are realized by the CPU 122 loading the program held in the operation program holding unit 124a of the flash ROM 124 into the RAM 123 and executing it.
  • FIG. 4 is an explanatory diagram of an operation when the viewpoint is moved according to the present embodiment.
  • In the present embodiment, a viewpoint movement operation is determined to be in progress while the fingertip is touching the touch pad, and to have ended when the finger leaves the touch pad.
  • For example, the user 101 touches a point 211 at a predetermined position on the touch pad on which the touch sensor 137 is arranged, then slides the fingertip in the moving direction (arrow 221) and stops it, still touching, at a point 212.
  • While the touch continues, the operation detection unit 153 detects this as a viewpoint movement operation and outputs the viewpoint movement operation detection signal; as a result, the display control unit 154 continues to move the viewpoint forward (in the depth direction) in the direction of the arrow 221 at a speed corresponding to the arrow's length.
  • When the finger is released, the reception unit 152 stops accepting the operation, the operation detection unit 153 no longer detects a viewpoint movement operation and stops outputting the viewpoint movement operation detection signal, and the display control unit 154 stops the viewpoint movement.
  • If the fingertip is slid back toward the first touched point 211, the operation detection unit 153 outputs the viewpoint movement operation detection signal so that the movement decelerates to a speed corresponding to the remaining distance from point 211.
  • If the fingertip is slid to, for example, point 213 without being released, the operation detection unit 153 outputs the viewpoint movement operation detection signal so that the viewpoint movement continues in the direction and at the speed corresponding to the vector from point 211 (arrow 223).
  • If the fingertip is slid sideways, for example along the rightward arrow 222, the operation detection unit 153 outputs the viewpoint movement operation detection signal so that the viewpoint movement continues at a speed corresponding to the length of arrow 222.
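  As a rough sketch of the touch-pad behaviour in FIG. 4, the drag vector from the first touched point to the current fingertip position could set the viewpoint velocity each frame. The function name, coordinate units, and `gain` constant are assumptions for illustration, not values from the patent:

  ```python
  def viewpoint_velocity(first_touch, current_touch, gain=0.01):
      """Map the drag vector (point 211 -> current point) to a velocity.

      While the finger stays on the pad, the returned velocity keeps
      being applied, so the viewpoint movement continues; sliding back
      toward point 211 shrinks the vector and thus decelerates the
      movement. `gain` is a hypothetical speed-per-pixel constant.
      """
      dx = current_touch[0] - first_touch[0]
      dy = current_touch[1] - first_touch[1]
      return (dx * gain, dy * gain)
  ```

  Releasing the finger corresponds to no touch sample at all, so no velocity is applied and the viewpoint stops.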
  • the pan, tilt, and rotation may be reflected in the viewpoint movement by detecting the posture of the HMD operating device 130 with a posture detecting device such as a gyro sensor 138.
  • the operation detection unit 153 outputs a viewpoint movement operation detection signal based on the detection result of the gyro sensor 138.
  • Pan means a horizontal turn and is called yawing in the aviation field.
  • Tilt means tilting in the vertical direction and is called pitching.
  • Rotation means tilting of the angle of view and is called rolling.
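  Where the posture of the HMD operation device 130 is reflected in the viewpoint, the gyro output could be accumulated into pan (yaw), tilt (pitch), and rotation (roll). The following is a simplified sketch (plain Euler-angle integration is an assumption for illustration; a production HMD would typically use quaternions):

  ```python
  def integrate_attitude(attitude, angular_velocity, dt):
      """Accumulate gyro angular velocity into the viewpoint attitude.

      attitude: (yaw, pitch, roll) in radians, i.e. pan/tilt/rotation.
      angular_velocity: rotational rates as reported by a gyro such as
      the gyro sensor 138, in radians per second.
      Simple small-angle Euler integration, for illustration only.
      """
      yaw, pitch, roll = attitude
      wy, wp, wr = angular_velocity
      return (yaw + wy * dt, pitch + wp * dt, roll + wr * dt)
  ```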
  • FIG. 5A is a diagram for explaining the binocular stereoscopic image 410 and the annular arrangement mark 510 displayed as the optotype 500.
  • The fish shown as the display object 410 (binocular stereoscopic image 410) is recognized as a single object existing at a predetermined position in the depth direction, owing to the binocular parallax of the user 101 viewing the right-eye image 410R and the left-eye image 410L displayed on the virtual image plane 310.
  • This predetermined position in the depth direction is a position where the line of sight 611R of the right eye and the line of sight 611L of the left eye intersect.
  • the annular arrangement mark 510 (511, 512) is displayed as a binocular stereoscopic image on the virtual image plane 310 by the line of sight 612.
  • FIG. 5B shows how the display object 410 (binocular stereoscopic image 410) and the annular arrangement mark 510 appear when binocular stereoscopic viewing is directed alternately at each of them.
  • The images seen when viewing the display object 410 and the annular arrangement mark 510 are shown combined: when one of the two is fused stereoscopically, the other appears as a double image shifted to the left and right.
  • As described above, the annular arrangement mark 510 consists of a plurality of white circles (first marks 511) and black circles (second marks 512) arranged alternately on a circular auxiliary line (circumference 513).
  • the auxiliary line (circumference 513) is a virtual line that is not actually displayed.
  • FIG. 6B is the display state when the user 101 performs a viewpoint movement operation so as to bring the display object 410 closer while it is visible at the position shown in FIG. 6A.
  • FIG. 6C is the display state when the positional relationship between the display position of the display object 410 and the virtual image plane 310 is as shown in FIG. 6A.
  • FIG. 6D is the display state when the positional relationship between the display position of the display object 410 and the virtual image plane 310 is as shown in FIG. 6B.
  • As shown in FIG. 6D, when either one of the display object 410 and the annular arrangement mark 510 is viewed in binocular stereoscopic vision, the other can be seen without becoming a double image shifted to the left and right.
  • By displaying the annular arrangement mark 510 (511, 512), whose position does not change relative to the viewpoint of the user 101, and keeping it within the user 101's field of view, the user can be prevented from falling into the feeling of moving while not actually moving. This can reduce VR sickness.
  • Moreover, since the annular arrangement mark 510 consists of discrete marks arranged in a ring, the central portion of the visual field is not blocked. Viewing of the display object 410 is therefore not hindered, and the surrounding background image can be viewed with its continuity maintained.
  • FIG. 7 is a processing flow of the display control process including the optotype display by the HMD device 100 of the present embodiment. This process is started when the HMD device 100 is started and the operation program is started.
  • the overall control unit 151 makes initial settings (step S1101).
  • The initial settings are made by reading saved setting values for predetermined setting items, or by having the user 101 or the like input them.
  • the setting items are, for example, various specifications of the optotype 500.
  • In the present embodiment, the annular arrangement mark 510 is selected and set as the optotype 500, and it is set to be displayed on the virtual image plane 310 only during the viewpoint movement operation.
  • the overall control unit 151 starts an application that has received an instruction from the user 101 (step S1102).
  • the reception unit 152 receives an instruction to specify an application to be started.
  • the reception unit 152 receives various operations by the user 101 for the application, such as a viewpoint movement operation, at predetermined time intervals (step S1103).
  • the operation detection unit 153 analyzes the operation received by the reception unit 152 and determines whether or not the viewpoint movement operation has been performed (step S1104). Then, when it is determined that the viewpoint movement operation is present (S1104; Yes), the viewpoint movement operation detection signal is output to the display control unit 154.
  • When the display control unit 154 receives the viewpoint movement operation detection signal, it determines whether the optotype 500 is currently displayed (step S1105).
  • If the optotype 500 is already displayed (S1105; Yes), the display control unit 154 keeps displaying it, and the overall control unit 151 determines whether the reception unit 152 has received an instruction to terminate the application (step S1109). If no termination instruction has been received (S1109; No), the process returns to step S1103 and continues.
  • If the instruction to terminate the application has been received (S1109; Yes), the overall control unit 151 determines whether it has also received an instruction to terminate the HMD device 100 itself (an operation-program termination instruction, i.e., a power-OFF instruction) (step S1110).
  • If that instruction has not been received (S1110; No), the overall control unit 151 returns to step S1102; if it has been received (S1110; Yes), the process ends.
  • If, in step S1105, the optotype 500 is not displayed (S1105; No), the display control unit 154 displays the optotype 500 so that it forms an image at the predetermined distance (step S1106), then proceeds to step S1109.
  • If, in step S1104, no viewpoint movement operation is detected (S1104; No), the operation detection unit 153 does not output the viewpoint movement operation detection signal. In this case, the display control unit 154 determines whether the optotype 500 is currently displayed (step S1107). If it is displayed (S1107; Yes), the display of the optotype 500 is stopped (the optotype 500 is erased) (step S1108) and the process proceeds to step S1109; if it is not displayed (S1107; No), the process proceeds directly to step S1109.
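  The branching in steps S1104 to S1108 above amounts to a small state machine: show the optotype while the viewpoint movement operation is detected, erase it otherwise. A minimal sketch (class and method names are assumptions, not from the patent):

  ```python
  class OptotypeController:
      """Sketch of the FIG. 7 display-control flow, steps S1104-S1108."""

      def __init__(self):
          self.optotype_displayed = False

      def tick(self, viewpoint_move_detected):
          """Process one iteration; returns whether the optotype is shown."""
          if viewpoint_move_detected:            # S1104: Yes
              if not self.optotype_displayed:    # S1105: No
                  self.optotype_displayed = True   # S1106: display optotype
          else:                                  # S1104: No
              if self.optotype_displayed:        # S1107: Yes
                  self.optotype_displayed = False  # S1108: erase optotype
          return self.optotype_displayed
  ```

  Repeated ticks with the same input are idempotent, matching the S1105/S1107 checks that avoid re-drawing or re-erasing an optotype already in the right state.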
  • the initial settings may be made after the application is started.
  • various specifications of the optotype 500 can be set according to the application.
  • As described above, the HMD device 100 of the present embodiment is worn on the head of the user 101 and displays the right-eye image 410R and the left-eye image 410L as a binocular stereoscopic image 410. It includes the operation detection unit 153, which outputs the viewpoint movement operation detection signal when a viewpoint movement operation is detected, and the display control unit 154, which superimposes the optotype 500 on the binocular stereoscopic image while the detection signal is output.
  • When viewed in binocular stereoscopic vision, the optotype 500 is displayed on a virtual surface at a predetermined distance from the device itself in the depth direction, and it has a shape that leaves the field of view unobstructed.
  • In other words, during the viewpoint movement operation, an optotype 500 whose position does not change with the viewpoint of the user 101 is displayed.
  • The optotype 500 thus serves as a guide for keeping the viewpoint of the user 101 fixed.
  • the optotype 500 is displayed on the virtual image plane 310.
  • the optotype 500 can be stereoscopically viewed on the virtual image plane 310.
  • As a result, the user 101 is less likely to fall into the illusion of moving. Therefore, VR sickness caused by using the HMD device 100 can be reduced while the user continues to view an image accompanied by viewpoint movement.
  • Moreover, the optotype 500 is displayed only while the viewpoint movement operation is being detected. Combined with its field-of-view-preserving shape, this means it does not interfere with the user 101's viewing of the display object.
  • the optotype 500 can be displayed by the function of the HMD device 100 even if the application such as a game does not have the function of displaying the optotype 500.
  • In the first embodiment, the display position of the optotype 500 is preset and fixed while an application runs.
  • In the present embodiment, by contrast, the display position of the optotype 500 on the virtual surface is changed according to the displayed image. Furthermore, the initial settings of the optotype 500 can be changed by interruption during application execution.
  • the hardware configuration of the HMD device 100 of this embodiment is the same as that of the first embodiment. Further, the functional configuration of the HMD device 100 of the present embodiment is also the same as that of the first embodiment. However, the processing of the operation detection unit 153 and the display control unit 154 is different.
  • the display mode of the optotype 500 is switched.
  • For example, a plurality of thresholds are set for the forward speed of the viewpoint movement, and the depth-direction position of the optotype 500 (its front-back position in binocular stereoscopic vision) is switched stepwise for display.
  • Likewise, a plurality of thresholds are set for the left-right speed of the viewpoint movement, and the left-right position of the optotype 500 is switched stepwise for display.
  • the operation detection unit 153 calculates the operation direction and the operation speed.
  • The operation direction and operation speed are calculated from the immediately preceding and the latest viewpoint movement operation positions and the time interval between accepting them. The operation detection unit 153 then outputs a viewpoint movement operation detection signal that includes the operation direction and operation speed.
  • the viewpoint movement operation uses the method described in the first embodiment. For example, in FIG. 4, when the operation of moving from the first touched point 211 to the point 213 without releasing the finger is accepted, the direction and speed are determined from the vector indicated by the arrow 223.
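  The direction/speed calculation described above can be sketched as follows, assuming touch positions are 2-D pad coordinates sampled at interval `dt` (the function name and units are illustrative assumptions):

  ```python
  import math

  def operation_direction_and_speed(prev_pos, curr_pos, dt):
      """Derive the operation direction and speed from two successive
      viewpoint movement operation positions and their time interval,
      as the operation detection unit 153 is described as doing.
      """
      dx = curr_pos[0] - prev_pos[0]
      dy = curr_pos[1] - prev_pos[1]
      speed = math.hypot(dx, dy) / dt
      direction = math.atan2(dy, dx)  # radians; 0 points rightward
      return direction, speed
  ```

  The returned vector corresponds to the arrow-223 case in FIG. 4: its angle gives the movement direction and its magnitude per unit time gives the speed.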
  • the display control unit 154 controls the display position of the optotype 500 based on the speed and direction information included in the viewpoint movement operation detection signal.
  • a table for designating a display position for each speed threshold value in the depth direction and the left-right direction is held as processing data 155.
  • the display position table is stored in, for example, the flash ROM 124 or the like.
  • the display control unit 154 refers to this display position table and controls the display position.
  • FIGS. 8 (a) and 8 (b) Examples of display position tables are shown in FIGS. 8 (a) and 8 (b).
  • FIG. 8(a) is an example of the depth-direction display position table 161, showing the relationship between the speed threshold and the display position in the depth direction.
  • FIG. 8(b) is an example of the left-right display position table 162, showing the relationship between the speed threshold and the display position in the left-right direction.
  • Each display position table holds, for each of one or more speed thresholds, information that specifies the display positions in the depth direction and the left-right direction to be used when that threshold is exceeded, stored as the depth-direction display position and the left-right display position.
  • the depth direction display position and the left-right direction display position may be held for each type of the optotype 500.
  • For example, the depth-direction display position table holds the position D0 corresponding to speed threshold 0 (coinciding with the virtual image plane 310), the position D1 corresponding to speed threshold V1, and the position D2 corresponding to speed threshold V2.
  • V1 and V2 are arbitrary speed thresholds satisfying V1 < V2.
  • As the position information, for example, the coordinate position on the virtual image plane 310 is held.
  • the set speed threshold value may be different in the depth direction and the left-right direction.
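The stepwise table lookup described above can be sketched as follows; this is a minimal illustration, assuming the tables of FIG. 8 are sorted by ascending threshold and assuming V1 = 1.0 and V2 = 2.0, with symbolic positions standing in for coordinates on the virtual image plane 310.

```python
# One row per speed threshold: (threshold, display position).
# Separate tables would exist for the depth and left-right directions,
# mirroring tables 161 and 162. Values here are illustrative only.
DEPTH_TABLE = [(0.0, "D0"), (1.0, "D1"), (2.0, "D2")]  # assumed V1=1.0, V2=2.0

def lookup_display_position(table, speed):
    """Return the position paired with the largest threshold that the
    absolute speed meets or exceeds (stepwise switching)."""
    chosen = table[0][1]
    for threshold, position in table:
        if abs(speed) >= threshold:
            chosen = position
        else:
            break
    return chosen
```

Per the text, the thresholds (and thus the tables) may differ between the depth direction and the left-right direction, and per-optotype variants could be held the same way.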
  • FIG. 8(c) is a conceptual diagram of the display state when a user 101 wearing the HMD device 100 is playing a car chase game.
  • Components having the same configuration as in the first embodiment are assigned the same reference numerals, and their description is omitted.
  • An image imitating the windshield frame of the user's own vehicle (hereinafter referred to as the windshield image 423) is displayed in the display area of the display.
  • The tracking target vehicle 421 and the road 422 are also displayed.
  • The HMD device 100 obtains the geographical shape of the road 422 via the communication device 129 according to the instruction of the user 101. The traveling position of the tracking target vehicle 421 is also obtained via the communication device 129. Meanwhile, the position of the virtual vehicle that the user 101 drives in pursuit of the tracking target vehicle 421 is controlled through the HMD operation device 130 of the HMD device 100.
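The three data sources just described — road shape and target position via the communication device 129, own-vehicle position via the HMD operation device 130 — might be combined as sketched below. All names and the flat update format are assumptions for illustration, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class ChaseState:
    road_shape: list        # geographical shape of road 422 (communication device 129)
    target_position: tuple  # traveling position of tracking target vehicle 421
    own_position: tuple     # virtual vehicle driven by user 101 (operation device 130)

def update_state(state, comm_update, operation_delta):
    """Apply one network update and one operation input to the game state."""
    if "target_position" in comm_update:
        state.target_position = comm_update["target_position"]
    x, y = state.own_position
    dx, dy = operation_delta
    state.own_position = (x + dx, y + dy)
    return state
```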
  • FIG. 8(c) shows the image when the user 101 is tracking the tracking target vehicle 421 traveling on a right curve of the road 422.
  • the windshield image 423 is superimposed and displayed.
  • The display control unit 154 displays the tracking target vehicle 421 so that it is stereoscopically viewed with both eyes beyond (deeper than) the virtual image plane 310.
  • the windshield image 423 is stereoscopically displayed in front of the virtual image plane 310.
  • the operation detection unit 153 detects the corresponding viewpoint movement operation and calculates the operation direction and the operation speed. Then, a viewpoint movement signal including such information is generated and output.
  • the display control unit 154 that has received the viewpoint movement signal refers to the display position table and displays the optotype 500.
  • the case where the annular arrangement mark 510 is displayed as the optotype 500 is shown.
  • Because the center of the line of sight naturally falls within the region enclosed by the virtual circumference 513 of the annular arrangement mark 510, the user 101 can capture the annular arrangement mark 510 in view more easily than if a mark were displayed at the center of the screen.
  • This makes it less likely that the user 101, misled by the road 422 that appears to flow toward them, has the illusion of moving. VR sickness can therefore be reduced.
  • FIG. 9 is a processing flow of the display control processing of the present embodiment. This processing is also started when the HMD device 100 is started. Processes identical to those of the first embodiment are designated by the same reference numerals, and their description is omitted.
  • the overall control unit 151 starts the application instructed by the user 101 (step S2101).
  • the overall control unit 151 performs the initial setting according to the instruction from the user 101 (step S2102). Similar to the first embodiment, various specifications of the optotype 500 are set. The specifications may be predetermined according to the application. Here, as in the first embodiment, it is assumed that the annular arrangement mark 510 is set to be displayed.
  • the reception unit 152 receives various operations of the application such as a game at predetermined time intervals as in the first embodiment (step S1103). At this time, the reception unit 152 of the present embodiment determines whether or not the setting change interrupt has been received (step S2104).
  • When a setting change interrupt is received (S2104; Yes), the overall control unit 151 changes the settings according to the received instruction (step S2105) and proceeds to step S1103.
  • The operation detection unit 153 determines whether a viewpoint movement operation has been performed, as in the first embodiment (step S1104). When it determines that there is a viewpoint movement operation (S1104; Yes), the operation detection unit 153 calculates the operation direction and the operation speed, and outputs a viewpoint movement operation detection signal including that information to the display control unit 154 (step S2106).
  • The display control unit 154 refers to the display position table based on the operation direction and operation speed included in the viewpoint movement operation detection signal, determines the display position, and displays the optotype 500 at the determined display position (steps S2107, S1109).
  • The processing when the viewpoint movement operation is not accepted (S1104; No) is the same as in the first embodiment, so its description is omitted here.
  • Since steps S1107, S1108, S1109, and S1110 are the same as in the first embodiment, their description is omitted here.
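One pass of the flow of FIG. 9 can be sketched as below. This is a simplified reconstruction: the event format, state dictionary, and `lookup_position` callback are assumptions, and the branches only mirror steps S2104 to S2107 described above.

```python
def display_control_step(event, state, lookup_position):
    """Handle one received event: a setting-change interrupt (S2104/S2105)
    or a possible viewpoint movement operation (S1104/S2106/S2107)."""
    if "setting_change" in event:
        state["settings"].update(event["setting_change"])  # S2105
        return state
    motion = event.get("viewpoint_motion")  # None when no movement (S1104; No)
    if motion is not None:
        direction, speed = motion  # S2106: contents of the detection signal
        state["optotype"] = {      # S2107: position from the display position table
            "visible": True,
            "direction": direction,
            "position": lookup_position(speed),
        }
    return state
```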
  • the operation detection unit 153 of the present embodiment further detects the operation direction and operation speed of the detected viewpoint movement operation, and includes the detected information on the operation direction and operation speed in the viewpoint movement operation detection signal.
  • Based on that information, the display control unit 154 displaces the optotype 500 in the direction in which the viewpoint of the user 101 has moved, and displays it.
  • the initial setting of the optotype 500 and the setting change by interrupt can be performed in the execution state of the application.
  • The depth-direction position of the optotype 500 (the front-back position in the case of binocular stereoscopic viewing) is switched in steps based on a plurality of thresholds on the forward speed of the viewpoint movement.
  • Likewise, the left-right position of the optotype 500 is switched in steps and displayed based on a plurality of thresholds on the left-right speed of the viewpoint movement. This makes it easier to keep the optotype 500 in the field of view while watching the display image of the application, so the effect of reducing VR sickness is more easily obtained.
  • In the present embodiment, the target is a captured moving image. In the HMD device 100 of the present embodiment, when the amount of movement of the binocular stereoscopic image detected from the captured moving image is equal to or greater than a predetermined value, it is determined that a viewpoint movement operation has been performed.
  • the HMD device 100 of this embodiment is basically the same as that of the first embodiment.
  • the viewpoint movement operation detection process by the operation detection unit 153 is different. That is, in the present embodiment, the viewpoint movement operation is detected not from the operation by the user 101 but from the movement of the image.
  • the present embodiment will be described with a focus on a configuration different from that of the first embodiment.
  • the operation detection unit 153 of the present embodiment analyzes the displayed image.
  • the video is acquired by the right-eye camera 119R and the left-eye camera 119L.
  • An independent camera may be provided for taking an image of the surroundings.
  • The operation detection unit 153 detects the movement of elements in the image by applying existing motion detection processing to the images (video) generated at predetermined time intervals by the video processing unit 126. For example, the movement of an element is detected between the latest image and the immediately preceding image. When a movement equal to or greater than a predetermined threshold is detected, it is determined that the viewpoint has moved, and a viewpoint movement operation detection signal is output to the display control unit 154.
  • the movement direction and the movement speed may be detected together as the operation direction and the operation speed, respectively, and these information may be included in the viewpoint movement operation detection signal.
  • For example, the technique of calculating a motion vector from the preceding and following images and interpolating an intermediate image may be used: an evaluation value is determined from the motion vector to detect the direction and speed.
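As a minimal stand-in for the motion analysis described above (the embodiment itself points to motion vectors, which also yield a direction, not just a magnitude), a mean absolute frame difference can serve as the movement amount compared against the threshold:

```python
def motion_amount(prev_frame, curr_frame):
    """Mean absolute luminance difference between two same-sized frames,
    given as 2D lists. A deliberately simple proxy for a motion-vector
    magnitude; all names here are illustrative."""
    total = count = 0
    for row_prev, row_curr in zip(prev_frame, curr_frame):
        for a, b in zip(row_prev, row_curr):
            total += abs(a - b)
            count += 1
    return total / count if count else 0.0

def viewpoint_moved(prev_frame, curr_frame, threshold):
    """True when the inter-frame movement amount reaches the threshold,
    i.e. when a viewpoint movement operation would be reported."""
    return motion_amount(prev_frame, curr_frame) >= threshold
```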
  • the display control unit 154 displays the optotype 500 when it receives the viewpoint movement operation detection signal.
  • FIG. 10 is a diagram for explaining the viewpoint movement operation detection process of the present embodiment.
  • Shown is a case where the user wearing the HMD device 100 is viewing an image of a landscape that changes as the orientation of the cameras (the right-eye camera 119R and the left-eye camera 119L) changes; that is, an example in which the image of the mountain moves from the broken line 431 to the solid line 432.
  • the operation detection unit 153 of the present embodiment detects motion from this image. Then, when the detected movement amount is equal to or greater than a predetermined threshold value, it is determined that the viewpoint movement operation has been detected, and the line-of-sight movement operation signal is output. Then, the display control unit 154 receives it and displays the optotype 500. Here, a case where the annular arrangement mark 510 is displayed as the optotype 500 will be illustrated.
  • the operation detection unit 153 detects the viewpoint movement operation when the movement amount of the binocular stereoscopic image detected from the moving image is equal to or more than a predetermined threshold value.
  • When such movement is detected, the optotype 500 is automatically displayed.
  • As a result, the user 101 is less likely to be misled by the movement of the image into the illusion of moving. VR sickness can thereby be reduced.
  • In the above, the optotype 500 is automatically displayed when motion is detected, but the present invention is not limited to this.
  • the operation of the user 101 may be analyzed to detect the viewpoint movement operation.
  • the operation detection unit 153 analyzes the operation received by the reception unit 152 and determines whether or not the viewpoint is moved.
  • the display position of the optotype 500 is not limited to the virtual image plane 310.
  • For example, the display control unit 154 may display the optotype 500 (annular arrangement mark 510) beyond the virtual image plane 310 (farther in the depth direction). That is, the depth-direction distance of the virtual surface on which the optotype 500 is displayed is larger than the depth-direction distance of the virtual image plane 310 and smaller than the depth-direction distance to the binocular stereoscopic image 410.
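The depth ordering stated above, and the on-screen disparity needed to fuse the optotype at a chosen depth, can be sketched as follows. The similar-triangles disparity formula is standard stereoscopy, not taken from the patent, and the numeric values in the test are assumptions.

```python
def screen_disparity(eye_separation, d_plane, d_target):
    """Horizontal disparity (same units as eye_separation) to draw on a
    display plane at distance d_plane so that a point fuses at d_target.
    Positive (uncrossed) disparity places the point beyond the plane."""
    return eye_separation * (d_target - d_plane) / d_target

def valid_optotype_depth(d_virtual_plane, d_optotype, d_scene):
    """Ordering described above: virtual image plane 310, then the
    surface carrying the optotype 500, then the binocular stereoscopic
    image 410, in increasing depth."""
    return d_virtual_plane < d_optotype < d_scene
```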
  • an annular arrangement mark 510 in which black circles and white circles are alternately arranged on a virtual circle is used.
  • the optotype 500 is not limited to this.
  • the pattern may be as shown in FIGS. 12 (a) and 12 (b).
  • the optotype 500 shown in FIG. 12A is a first optotype 520 composed of a rectangular two-dot chain line 521 with rounded corners. Further, the optotype 500 shown in FIG. 12B is a second optotype 530 in which an arc 532 and a figure 533 are arranged on a virtual circumference.
  • Any shape may be used for the optotype 500 as long as it allows the user 101 to fix the viewpoint and recognize that he or she is not moving, without blocking the central portion of the visual field or interfering with image viewing.
  • Alternatively, a plurality of types of optotypes 500 may be prepared in advance so that one can be selected according to the preference of the user 101, or the type of optotype 500 to be displayed may be predetermined for each application. The same applies to the display position.
  • ⁇ Modification example 3> In each of the above embodiments, an example of connecting the HMD main unit 110 and the HMD control device 120 by wire is shown, but the present invention is not limited to this.
  • the connection may be wireless.
  • Conversely, although an example of wirelessly connecting the HMD control device 120 and the HMD operation device 130 is shown, they may be connected by wire.
  • An example in which the HMD device 100 is composed of three devices, the HMD main body device 110, the HMD control device 120, and the HMD operation device 130, has been shown, but the present invention is not limited thereto.
  • the function of the HMD control device 120 may be included in the HMD main body device 110.
  • In that case, the inter-device I/Fs 117 and 127, which are the connection interfaces between the two, can be omitted.
  • the HMD control device 120 may include the function of the HMD operation device 130.
  • In that case, the short-range wireless communication devices 129b and 139, which are the connection interfaces between the two, can be omitted.
  • Further, the HMD device 100 may be integrated without providing the function of the HMD operation device 130.
  • the operation of the user 101 may be detected by gesture detection or the like.
  • Gesture detection is realized, for example, by taking an image of the surroundings with a camera (camera for the right eye 119R and a camera for the left eye 119L) and analyzing the image with the CPU 122.
  • an external device different from the HMD device 100 may be used.
  • the external device is, for example, a smartphone, a game controller, or the like.
  • an application for functioning as an operating device of the HMD device 100 is installed.
  • the viewpoint movement operation may be performed by operating the joystick.
  • In each of the above embodiments, the case where the HMD device 100 is used has been described as an example, but the device is not limited to the HMD device 100.
  • For example, the device may be VR goggles that realize the functions of the HMD device by attaching a smartphone.
  • Alternatively, a smartphone, a PC (Personal Computer), a tablet, or the like may be used.
  • the present invention is not limited to the above-described embodiments and modifications, and includes various modifications.
  • the above-described embodiments and modifications have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the described configurations.
  • Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware by designing some or all of them as, for example, an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that realizes each function. Information such as programs, tables, and files that realize each function can be placed in a memory, a hard disk, a recording device such as an SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.
  • Control lines and information lines indicate those considered necessary for explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.
  • 100: HMD device, 101: user, 110: HMD main body device, 111: bus, 114L: left-eye display, 114R: right-eye display, 117: inter-device I/F, 119L: left-eye camera, 119R: right-eye camera, 120: HMD control device, 121: bus, 122: CPU, 123: RAM, 124: flash ROM, 124a: operation program holding unit, 124b: image data holding unit, 125: OSD generation unit, 126: video processing unit, 127: inter-device I/F, 128: sensor device, 128a: acceleration sensor, 128b: geomagnetic sensor, 128c: GPS receiver, 128d: distance sensor, 129: communication device, 129a: network communication device, 129b: short-range wireless communication device, 130: HMD operation device, 131: bus, 132: CPU, 133: RAM, 134: flash ROM, 137: touch sensor, 138: gyro sensor, 139: short-range wireless communication device

Abstract

The present invention reduces VR sickness resulting from the use of an HMD device while leaving intact a moving image that involves viewpoint movement. A head-mounted display device worn on the head of a user, which realizes binocular stereopsis with a right-eye image and a left-eye image, the head-mounted display device being characterized by comprising: an operation detection unit that outputs a detection signal when a viewpoint movement operation has been detected; and a display control unit that causes an optotype to be displayed superimposed on the right-eye image and the left-eye image while the detection signal is being output, wherein the optotype is displayed on a virtual plane at a predetermined distance from the device in the depth direction when viewed by binocular stereopsis, and has a shape that preserves the user's field of view.
PCT/JP2020/018133 2020-04-28 2020-04-28 Head-mounted display device and display control method WO2021220407A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/018133 WO2021220407A1 (fr) 2020-04-28 2020-04-28 Head-mounted display device and display control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/018133 WO2021220407A1 (fr) 2020-04-28 2020-04-28 Head-mounted display device and display control method

Publications (1)

Publication Number Publication Date
WO2021220407A1 true WO2021220407A1 (fr) 2021-11-04

Family

ID=78332313

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/018133 WO2021220407A1 (fr) 2020-04-28 2020-04-28 Head-mounted display device and display control method

Country Status (1)

Country Link
WO (1) WO2021220407A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116781884A (zh) * 2023-06-29 2023-09-19 广州视景医疗软件有限公司 Data acquisition method and device for monocular stereopsis

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07181424A (ja) * 1993-12-22 1995-07-21 Canon Inc Compound-eye image display device
JPH08111834A (ja) * 1994-10-12 1996-04-30 Olympus Optical Co Ltd Head-mounted video display device
JP2005084569A (ja) * 2003-09-11 2005-03-31 Brother Ind Ltd Image display device
JP2018157331A (ja) * 2017-03-16 2018-10-04 株式会社スクウェア・エニックス Program, recording medium, image generation device, image generation method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116781884A (zh) * 2023-06-29 2023-09-19 广州视景医疗软件有限公司 Data acquisition method and device for monocular stereopsis
CN116781884B (zh) * 2023-06-29 2024-03-12 广州视景医疗软件有限公司 Data acquisition method and device for monocular stereopsis

Similar Documents

Publication Publication Date Title
US10198870B2 (en) Information processing apparatus, information processing system, and information processing method
US11235871B2 (en) Control method, control system, and smart glasses for first person view unmanned aerial vehicle flight
US11030771B2 (en) Information processing apparatus and image generating method
JP6642432B2 Information processing device, information processing method, and image display system
JP4900741B2 Image recognition device, operation determination method, and program
US10416835B2 (en) Three-dimensional user interface for head-mountable display
EP3349107B1 Information processing device and image generation method
JP6479199B2 Information processing device
JP5114795B2 Image recognition device, operation determination method, and program
JP6899875B2 Information processing device, video display system, control method of information processing device, and program
CN106817913A Head-mounted display, mobile information terminal, image processing device, display control program, display control method, and display system
JP6507827B2 Display system
KR20170062439A Control device, control method, and program
WO2019142560A1 Information processing device for guiding the gaze
WO2021220407A1 Head-mounted display device and display control method
JP2016126687A Head-mounted display, operation reception method, and operation reception program
US11972037B2 (en) Head mounted information processing apparatus and head mounted display system
KR20180055637A Electronic device and control method thereof
EP3958095A1 (fr) Système de réalité virtuelle/augmentée relié à un ordinateur mobile utilisant l'ordinateur mobile comme une interface homme-machine
WO2024057783A1 Information processing device provided with a unit for identifying the viewpoint position of a 360-degree image
JP2021177277A Program, information processing method, information processing device, and information processing system
JP2024012898A Electronic device
JP2022118501A Display system, display device, control method therefor, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20933898

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20933898

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP