WO2017169230A1 - Image display device and image display method - Google Patents
Image display device and image display method
- Publication number
- WO2017169230A1 (PCT/JP2017/005508)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- information
- unit
- display
- control unit
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62J—CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
- B62J50/00—Arrangements specially adapted for use on cycles not provided for in main groups B62J1/00 - B62J45/00
- B62J50/20—Information-providing devices
- B62J50/21—Information-providing devices intended to provide information to rider or passenger
- B62J50/22—Information-providing devices intended to provide information to rider or passenger electronic, e.g. displays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62J—CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
- B62J45/00—Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
- B62J45/40—Sensor arrangements; Mounting thereof
- B62J45/41—Sensor arrangements; Mounting thereof characterised by the type of sensor
- B62J45/412—Speed sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62J—CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
- B62J45/00—Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
- B62J45/40—Sensor arrangements; Mounting thereof
- B62J45/41—Sensor arrangements; Mounting thereof characterised by the type of sensor
- B62J45/414—Acceleration sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62J—CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
- B62J45/00—Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
- B62J45/40—Sensor arrangements; Mounting thereof
- B62J45/41—Sensor arrangements; Mounting thereof characterised by the type of sensor
- B62J45/415—Inclination sensors
- B62J45/4151—Inclination sensors for sensing lateral inclination of the cycle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/096855—Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver
- G08G1/096861—Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver where the immediate route instructions are output to the driver, e.g. arrow signs for next turn
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/0969—Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/34—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators for rolling or scrolling
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
- G02B2027/0174—Head mounted characterised by optical features holographic
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0261—Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/10—Automotive applications
Definitions
- the present invention relates to an image display device and an image display method.
- This application claims priority based on Japanese Patent Application No. 2016-073634 filed on Mar. 31, 2016, the contents of which are incorporated herein by reference.
- Image display apparatuses include a non-transmissive type that covers the eyes and a transmissive type that does not.
- In the non-transmissive type, only the image is displayed on the display unit, so the user can visually recognize only the image.
- In the transmissive type, the display unit is, for example, a half mirror, so the user can visually recognize both the image and the outside world.
- Patent Document 1 proposes a technique for acquiring information from a vehicle with such an image display device and displaying the acquired information as an image.
- The information from the vehicle is, for example, navigation information or vehicle speed information.
- Patent Document 1 discloses a motorcycle helmet equipped with a head-mounted display in which display of the information image is stopped while the driver rotates his head when driving through a curve, so that the image does not interfere with the field of view. However, since no information is displayed while the head is rotated, the driver cannot acquire vehicle information during that time, and the convenience of displaying information on the image display device may be impaired.
- Moreover, in Patent Document 1, a rectangular non-display area is set at the center of the field of view, and the non-display area is moved according to the posture state of the vehicle detected by a gyro sensor.
- Because the non-display area is rectangular, the displayable area may be excessively limited, particularly when the field of view is already restricted by a helmet.
- In addition, with a rectangular non-display area, information may be displayed to the left and right of the driver's line of sight, making it difficult for the driver to instantly check the surroundings.
- An aspect of the present invention aims to provide an image display device and an image display method capable of displaying a necessary information image without the information image affecting the field of view of the driver of the vehicle.
- An image display apparatus according to an aspect of the present invention is an image display apparatus having a display unit disposed on a mounting body mounted on a user's head, and includes a detection unit that detects the inclination of the user's head, and a control unit that sets a virtual horizon based on the inclination detected by the detection unit and sets a non-display area having a strip shape parallel to the virtual horizon.
- The non-display area may be an area between a first boundary line provided parallel to and above the virtual horizon by a first distance, and a second boundary line provided parallel to and below the virtual horizon by a second distance.
- the first distance may be shorter than the second distance.
- The image display device may include a speed detection unit that detects a movement speed, and the control unit may change the relationship between the first distance and the second distance according to the movement speed detected by the speed detection unit.
- The control unit may shorten the second distance when the moving speed detected by the speed detection unit is fast, and shorten the first distance when the moving speed is slow.
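The speed-dependent sizing of the band can be sketched as follows. This is an illustrative sketch only: the function name, base angles, shrink factor, and speed threshold are assumptions, not values taken from the patent.

```python
def band_distances(speed_kmh, base_first=30.0, base_second=45.0,
                   shrink=0.5, threshold_kmh=30.0):
    """Return (first_distance, second_distance) in degrees.

    first_distance  : extent of the non-display band above the virtual horizon
    second_distance : extent of the band below the virtual horizon

    At baseline the first distance is shorter than the second, as in the
    claims.  When traveling fast the rider looks far ahead, so the lower
    (second) distance is shortened; when traveling slowly the nearby road
    matters more, so the upper (first) distance is shortened instead.
    All numeric values and the threshold are illustrative assumptions."""
    if speed_kmh >= threshold_kmh:
        return base_first, base_second * shrink
    return base_first * shrink, base_second
```

With these placeholder values, high speed yields a slimmer lower band and low speed a slimmer upper band, enlarging the usable information display area on the side the rider checks less often.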
- The control unit may erase the information image when the information image overlaps the area set as the non-display area.
- The control unit may reduce the information image to a size that does not overlap the non-display area and display it when the information image overlaps the area set as the non-display area.
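The erase-or-reduce behavior can be sketched in one dimension (vertical extent only). Coordinates here increase downward like image rows, and the function name and interval convention are illustrative assumptions, not the patent's implementation.

```python
def place_info_image(band_top, band_bottom, img_top, img_bottom, shrink=True):
    """Decide how to render an information image against the strip-shaped
    non-display band.  Intervals are (top, bottom) with values increasing
    downward.

    Returns the image's (top, bottom) unchanged if there is no overlap,
    a reduced interval that no longer overlaps the band when shrinking is
    enabled, or None to erase the image (the alternative embodiment)."""
    overlaps = img_top < band_bottom and img_bottom > band_top
    if not overlaps:
        return (img_top, img_bottom)
    if not shrink:
        return None                      # erase, as in one embodiment
    if img_top < band_top:
        return (img_top, band_top)       # keep the part above the band
    if img_bottom > band_bottom:
        return (band_bottom, img_bottom) # keep the part below the band
    return None                          # image lies fully inside the band
```

An image lying entirely inside the band has nothing left to show after reduction, so it is erased in either mode.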
- The display unit may simulate a three-dimensional space on a virtual sphere centered on the user and project the information image onto the spherical surface of the virtual sphere, and the non-display area may be formed as an annular band on that spherical surface.
- An image display method according to an aspect of the present invention is a method for controlling an image display device having a display unit disposed on a mounting body to be mounted on a user's head.
- The method includes a detection unit detecting the inclination of the user's head, and a control unit setting a virtual horizon based on the detected inclination and setting a non-display area having a strip shape parallel to the virtual horizon.
- Because the non-display area is set in a strip shape based on the virtual horizon, the driver's visual field can be secured in the left-right direction, where the driver's line of sight concentrates.
- According to the configuration of <1> or <9>, by setting the virtual horizon based on the inclination, the driver's view can be ensured even when the vehicle leans while turning through a curve.
- The non-display area is the region between the first boundary line, parallel to and above the virtual horizon by the first distance, and the second boundary line, parallel to and below the virtual horizon by the second distance, and the first distance and the second distance are set appropriately.
- The non-display area can thus be set optimally according to the operating conditions.
- Since the first distance from the virtual horizon to the upper boundary line is shorter than the second distance from the virtual horizon to the lower boundary line, the information display area can be enlarged while still securing the range that needs to be checked during traveling.
- an optimal non-display area can be secured according to the moving speed of the vehicle.
- When the traveling speed is high, the user checks road conditions in the area below the virtual horizon, close to the user, less frequently.
- When the traveling speed is high, therefore, the second distance from the virtual horizon to the lower boundary line is shortened, securing the range that needs to be checked during traveling while greatly enlarging the information display area.
- When the traveling speed is low, the first distance from the virtual horizon to the upper boundary line is shortened, securing the range that needs to be checked during traveling while still making the information display area large.
- When the information image overlaps the non-display area, the driver's field of view can be secured by erasing the information image.
- According to the configuration of <7>, when the information image overlaps the area set as the non-display area, the information image can be reduced and displayed, so that the driver's field of view is secured and the necessary information can still be acquired.
- By forming the non-display area as an annular band on the spherical surface, a non-display area can be set even in a display unit that simulates a three-dimensional space on a virtual sphere centered on the user and projects the information image onto that spherical surface.
- FIG. 1 is a block diagram showing a schematic configuration of a vehicle detection system according to the first embodiment.
- FIG. 2 is a diagram showing an example of the appearance of the HMD (head-mounted display) according to the first embodiment.
- FIG. 3 is a diagram showing an example of the appearance of the vehicle according to the first embodiment.
- FIG. 4 is a diagram showing an example of the information displayed on the display unit according to the first embodiment.
- FIG. 5 is a diagram showing images seen from the driver's viewpoint according to the first embodiment.
- In the following, an eyeglass-type head-mounted display (HMD) is described as an example of the image display device.
- FIG. 1 is a block diagram showing a schematic configuration of a vehicle detection system 1 according to the present embodiment.
- the vehicle detection system 1 includes a vehicle 2 and a head-mounted display 3 (hereinafter also referred to as an HMD 3) (image display device).
- the vehicle 2 and the HMD 3 communicate using, for example, a short-range wireless communication standard.
- the short-range wireless communication standard is, for example, Bluetooth (registered trademark) LE (Low Energy) (hereinafter referred to as BLE) standard communication.
- the vehicle 2 is a straddle-type vehicle such as a motorcycle.
- the vehicle 2 includes a control unit 21, a detection unit 22, a reception unit 23, and a transmission unit 24.
- the detection unit 22 includes an inclination detection unit 221, a rotation angle detection unit 222, a position detection unit 223, and a vehicle speed detection unit 224.
- the detection unit 22 may further include a fuel remaining amount detection unit, a vibration detection unit, a key detection unit, an air pressure detection unit, and the like.
- the control unit 21 performs various processes based on information from the detection unit 22. In addition, the control unit 21 generates a detection signal based on information from the detection unit 22 and outputs the generated signal to the transmission unit 24. In addition, the control unit 21 inputs information from the HMD 3 received by the receiving unit 23 and performs various processes.
- the inclination detection unit 221 is, for example, a triaxial acceleration sensor, detects the inclination of the vehicle 2, and outputs information indicating the detected inclination to the control unit 21.
- the rotation angle detection unit 222 is, for example, a gyro sensor, detects the rotation angle of the vehicle 2, and outputs information indicating the detected rotation angle to the control unit 21.
- the position detection unit 223 is, for example, a GPS (Global Positioning System), detects the current position of the vehicle 2, and outputs information indicating the detected current position to the control unit 21 as vehicle position information.
- the vehicle speed detection unit 224 detects the speed of the vehicle 2 and outputs information indicating the detected speed to the control unit 21.
- the receiving unit 23 and the transmitting unit 24 transmit and receive information wirelessly between the transmitting unit 35 and the receiving unit 31 of the HMD 3 in accordance with the short-range wireless communication standard.
- the receiving unit 23 receives a signal from the HMD 3 and outputs information based on the received signal to the control unit 21.
- the transmission unit 24 generates a transmission signal based on the information output from the control unit 21, and transmits the generated transmission signal to the HMD 3.
- the HMD 3 includes an operation unit 30, a reception unit 31, a detection unit 32, an information generation unit 33, a display unit 34, and a transmission unit 35.
- the information generation unit 33 includes a storage unit 331, a control unit 332, and an image generation unit 333.
- the HMD 3 receives the information transmitted by the vehicle 2 and displays various information on the display unit 34 according to the received information.
- Various information includes vehicle speed information, time information, vehicle position information, navigation information, and the like. Further, the remaining energy information of the vehicle 2, fuel consumption information, information indicating that an abnormality has occurred in the vehicle 2, mail reception information, and the like may be displayed on the display unit 34.
- the operation unit 30 includes, for example, a mechanical switch, a touch panel switch, and the like.
- the operation unit 30 detects the result of operation by the user (in this case, the driver of the vehicle), and outputs the detected operation instruction to the control unit 332.
- the receiving unit 31 receives information from the vehicle 2 and outputs the received information to the image generating unit 333.
- the receiving unit 31 may receive an operation instruction from the vehicle 2 and output the received operation instruction to the control unit 332.
- The operation instructions include, for example, an instruction to turn the power of the HMD 3 on or off, an instruction to display information on the display unit 34, and an instruction specifying which information to display in which state of the HMD 3.
- the detection unit 32 includes a magnetic sensor 321, an acceleration sensor 322, an angular velocity sensor 323, and a speed sensor 324 (speed detection unit).
- the magnetic sensor 321 is, for example, a geomagnetic sensor, detects the orientation of the HMD 3, and outputs the detected detection value to the control unit 332.
- the acceleration sensor 322 is, for example, a triaxial acceleration sensor, detects the triaxial acceleration of the HMD 3, and outputs the detected detection value to the control unit 332.
- the angular velocity sensor 323 is, for example, a triaxial gyro sensor, detects the rotational angular velocity of the HMD 3, and outputs the detected value to the control unit 332.
- the speed sensor 324 detects the moving speed.
- the inclination of the head of the driver is detected by three axes of the roll direction, the pitch direction, and the yaw direction.
- the roll direction is rotation about the front-rear axis of the vehicle 2.
- the pitch direction is rotation about the left-right axis of the vehicle 2.
- the yaw direction is rotation about the top-bottom (vertical) axis of the vehicle 2.
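One common way to realize this tilt detection, sketched here under an assumed axis convention (x forward, y left, z up) that is not taken from the patent, is to derive roll and pitch from the gravity vector reported by the triaxial acceleration sensor. Yaw cannot be recovered from gravity alone and would come from the magnetic sensor 321 or the gyro sensor.

```python
import math

def head_tilt(ax, ay, az):
    """Estimate roll and pitch of the head, in degrees, from a 3-axis
    accelerometer reading at rest, using gravity as the reference.

    Axis convention (assumed): x forward, y left, z up.
    roll  : rotation about the front-rear (x) axis
    pitch : rotation about the left-right (y) axis
    atan2 keeps the correct sign in every quadrant, and hypot(ay, az)
    projects gravity into the plane used for the pitch estimate."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch
```

In motion, vehicle acceleration contaminates the gravity reading, which is one reason the embodiment also carries a gyro: fusing the two (e.g. with a complementary filter) gives a steadier tilt estimate.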
- the information generation unit 33 selects information from the information received by the reception unit 31 according to an operation instruction output by the operation unit 30, and generates display data based on the selected information.
- the storage unit 331 stores information according to various processes of the control unit 332.
- the storage unit 331 stores a control program used by the control unit 332.
- the control unit 332 performs various processes. In the present embodiment, the control unit 332 performs a process of setting a non-display area in accordance with the detection value from the detection unit 32 when displaying the information image on the display unit 34.
- the image generation unit 333 acquires the information output from the reception unit 31. In addition, the image generation unit 333 acquires the determination result output by the control unit 332. The image generation unit 333 selects acquired information according to the acquired determination result, and generates an image to be displayed on the display unit 34 using the selected information.
- the display unit 34 includes a projection unit that projects an image and a transmissive display using, for example, a hologram.
- the display unit 34 transmits the external light and displays the image output from the image generation unit 333 using a hologram.
- the display unit 34 may be provided on both the left and right sides or one of the left and right sides.
- the display unit 34 simulates a three-dimensional space on a virtual sphere centered on the driver, and displays an information image projected on the spherical surface of the sphere.
- FIG. 2 is a diagram illustrating an example of the appearance of the HMD 3 according to the present embodiment.
- In FIG. 2, the coordinate system is defined for the state in which the driver stands upright on the ground surface and wears the HMD 3 on the head.
- the HMD 3 of this embodiment is a glasses type.
- the HMD 3 includes display units 34R and 34L on the left and right, nose pads 302R and 302L, a bridge 303, and temples 301R and 301L.
- the detection unit 32 is mounted in the left and right temples 301R and 301L, and the operation unit 30, the reception unit 31, the detection unit 32, the information generation unit 33, and the transmission unit 35 are mounted in the left temple 301L.
- the configuration shown in FIG. 2 is an example, and the position where each part is attached is not limited to this.
- FIG. 3 is a diagram illustrating an example of the appearance of the vehicle 2 according to the present embodiment.
- an arrow FR indicates the front of the vehicle 2
- an arrow UP indicates the upper side of the vehicle 2.
- the vehicle 2 of the present embodiment is a scooter type straddle-type vehicle.
- the vehicle 2 includes a vehicle body cover 201, a steering handle 202, a key cylinder 203, a front wheel Wf, a rear wheel Wr, a seat 213, and the like.
- the vehicle body cover 201 includes an inclination detection unit 221, a rotation angle detection unit 222, a position detection unit 223, and a vehicle speed detection unit 224. Note that the configuration shown in FIG. 3 is an example, and the position where each part is attached is not limited to this.
- FIG. 4 is an explanatory diagram of a display image of the display unit 34 according to the first embodiment of the present invention.
- the driver R drives the vehicle 2 while wearing the HMD 3 on the head.
- the display unit 34 of the HMD 3 transmits external light and displays various information images using a hologram.
- the information image includes vehicle speed information, time information, vehicle position information, navigation information, and the like.
- the driver R visually recognizes the image of the outside world and the visual information projected by the projection unit as a virtual image through the combiner, and can therefore see information images on the spherical surface S of a sphere extending over the entire field of view of the driver R.
- the driver R can view MAP (map) information by looking toward his or her legs when stopped, for example.
- the driver R can confirm information from various traffic devices by looking overhead.
- the area where the driver R is gazing during driving needs to be a non-display area on the display unit 34.
- the vertical width of the non-display area may be based on the characteristics of the human visual field, for example about 60 degrees on the upper side and about 70 degrees on the lower side.
- the non-display area may be provided in the left-right direction.
- the non-display area may be about 60 degrees on the inside and about 90 degrees on the outside based on the characteristics of the human visual field.
- FIG. 5 is a diagram showing an image viewed from the viewpoint of the driver R when driving the two-wheeled vehicle 2.
- FIG. 5A is an image when traveling on a straight road.
- FIG. 5B is an image when traveling on the right curve.
- a region A indicates a range that the driver R can visually recognize from the display unit 34.
- the driver R who is driving the vehicle 2 keeps driving while watching the information on the road.
- the road curves left and right, and various hazards exist on the road, such as a pedestrian suddenly running out, a vehicle ahead stopping abruptly, or a fallen object. All of these hazards are events on the ground. Therefore, the driver R travels while gazing at objects positioned relative to the horizon.
- FIG. 5B is an image viewed from the viewpoint of the driver R when traveling on the right curve.
- the driver R travels while tilting the vehicle body to the right. Therefore, when viewed from the driver R's line of sight, the horizon appears inclined upward to the right, as shown in FIG. 5B.
- the on-road information within the braking range of the vehicle 2, as seen from the viewpoint of the driver R, lies in the range indicated by region A2 in FIG. 5B.
- the driver R is driving the vehicle 2 while paying attention to the range indicated by the area A2. Therefore, in the case of the right curve, the HMD 3 needs to make the area A2 a non-display area on the display unit 34.
- the inclination of the driver R's head in the roll direction is detected, and a virtual horizon is set based on that inclination.
- the range between a line separated upward from the virtual horizon by a predetermined distance and a line separated downward by a predetermined distance is then set as a band-shaped non-display area on the display unit 34.
- the reason the non-display area is set in a band shape is to secure the left and right fields of view. That is, the driver R of the vehicle 2 often swings his or her head left and right to check the surroundings while traveling, and especially when wearing a full-face helmet the head is often swung widely.
- the non-display area is therefore set in a band shape on the display unit 34 so that driving is not hindered, and the band extends across the left and right areas as well.
- in the present embodiment, a display unit 34 is used that simulates a three-dimensional space on a virtual sphere centered on the driver R and projects the information images onto the spherical surface of that sphere.
- on the spherical surface, the band-shaped non-display area NA joins end to end, forming an annular band.
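The annular band on the spherical display model can be expressed as the set of view directions whose elevation above the tilted virtual horizon plane lies within the band's angular limits; because the condition depends only on that elevation, the band closes on itself all the way around the sphere. A minimal sketch (the function name, the axis convention, and the angular limits are illustrative assumptions, not taken from the patent):

```python
import math

def on_annular_band(azimuth_deg, elevation_deg, roll_deg,
                    upper_deg=10.0, lower_deg=15.0):
    """True if a view direction lies on the annular non-display band of a
    unit sphere whose virtual horizon plane is rolled by roll_deg about
    the forward axis.  Axes: x right, y forward, z up (illustrative)."""
    az, el, roll = map(math.radians, (azimuth_deg, elevation_deg, roll_deg))
    # Unit view vector for the given azimuth/elevation.
    v = (math.cos(el) * math.sin(az), math.cos(el) * math.cos(az), math.sin(el))
    # Normal of the rolled horizon plane (upright normal rotated about y).
    n = (-math.sin(roll), 0.0, math.cos(roll))
    # Elevation of the view direction above the tilted horizon plane.
    tilted_el = math.degrees(math.asin(sum(a * b for a, b in zip(v, n))))
    return -lower_deg <= tilted_el <= upper_deg
```

Because the test depends only on elevation relative to the tilted horizon, every azimuth at zero tilted elevation is inside the band: the band has no left or right end, which is the annular shape described above.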
- FIG. 6 is a diagram illustrating an example of information displayed on the display unit 34 according to the present embodiment.
- FIG. 6A is a display example when the head of the driver R is hardly tilted with respect to the ground.
- FIG. 6B is a display example when the head of the driver R is tilted to the right with respect to the ground.
- a region A indicates a range that the driver R can visually recognize from the display unit 34 as in FIG. 5
- a region NA is a non-display region.
- the displayed image is an image obtained by superimposing information on a region surrounded by chain lines g101, g102, g103, and g104 on an image of the outside world.
- the information of the area surrounded by the chain line g101 is vehicle speed information, for example.
- the information of the area surrounded by the chain line g102 is time information, for example.
- the information on the area surrounded by the chain line g103 is, for example, current location information.
- the information of the area surrounded by the chain line g104 is navigation information, for example.
- a non-display area NA is set on the display unit 34.
- the head of the driver R is hardly tilted with respect to the ground.
- the virtual horizon P is set based on the inclination of the driver R's head in the roll direction
- the virtual horizon P is substantially parallel to the horizontal direction of the display unit 34 as shown in FIG. 6A.
- the non-display area NA is set in a band shape within the range between a line separated upward from the virtual horizon P by a predetermined distance L1 (first distance) and a line separated downward from the virtual horizon P by a predetermined distance L2 (second distance).
- the range between the boundary line Q1 (first boundary line) and the boundary line Q2 (second boundary line) is the non-display area NA.
- as shown in FIG. 6A, the positions of the upper-left chain line g101, the upper-right chain line g102, the lower-left chain line g103, and the lower-right chain line g104 are all outside the non-display area NA. For this reason, even if images are displayed at the chain lines g101 to g104, the driver R's visibility during driving is secured.
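The boundary test implied by the lines Q1 and Q2 can be sketched as a signed-distance check: a point on the display belongs to the non-display band NA when its perpendicular distance from the virtual horizon P, tilted by the head's roll angle, lies between -L2 and +L1. The following is an illustrative sketch; the coordinate convention and function names are assumptions:

```python
import math

def distance_above_horizon(x, y, roll_rad):
    """Signed perpendicular distance of display point (x, y) above a
    virtual horizon through the origin tilted by roll_rad
    (y grows upward; a positive result means above the horizon)."""
    # Rotating the point into horizon-aligned coordinates makes the
    # rotated y-coordinate the perpendicular distance.
    return -x * math.sin(roll_rad) + y * math.cos(roll_rad)

def in_non_display_band(x, y, roll_rad, l1, l2):
    """True if (x, y) lies between boundary line Q2 (L2 below the
    virtual horizon) and boundary line Q1 (L1 above it)."""
    d = distance_above_horizon(x, y, roll_rad)
    return -l2 <= d <= l1
```

With roll_rad = 0 the band is horizontal as in FIG. 6A; a non-zero roll tilts the band together with the virtual horizon as in FIG. 6B.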
- the driver R tilts the vehicle body to the right, and the head of the driver R tilts to the right with respect to the ground.
- the virtual horizon P is set based on the inclination of the driver R's head in the roll direction
- the virtual horizon P is inclined upward to the right relative to the horizontal direction of the display unit 34, as shown in FIG. 6B.
- the non-display area NA is set in a band shape between a line separated upward by the predetermined distance L1 from the virtual horizon P rising to the right and a line separated downward by the predetermined distance L2 from that virtual horizon P.
- the information indicated by the chain line g101 is displayed because it is inside the display area A and outside the non-display area NA. Since the information indicated by the chain line g102 comes within the non-display area NA, it is erased.
- the information indicated by the chain line g103 and the chain line g104 is displayed as it is because it is inside the display area A and outside the non-display area NA. In this way, the information indicated by the chain line g102 that enters the non-display area NA is erased from the display unit 34, so that the visibility of the driver R can be secured.
- the non-display area NA is tilted so as to remain parallel to the virtual horizon P even when the vehicle body is tilted, but the information displayed on the display unit 34 is not tilted. This is because the driver R may not tilt his or her body, that is, the HMD 3 may not tilt, even when the vehicle body is tilted.
- the distance L1 from the virtual horizon P to the upper boundary line Q1 of the non-display area NA is shorter than the distance L2 from the virtual horizon P to the lower boundary line Q2 of the non-display area NA.
- this is to secure a wide area for the information images while allowing for urgency in driving operation. That is, the upper side of the screen of the display unit 34 corresponds to distant scenery as seen from the driver R, and the lower side to nearby scenery. Highly urgent events occur particularly at short distances from the driver R.
- FIG. 7 is a flowchart showing processing when setting a non-display area in the HMD 3 according to the present embodiment.
- (Step S101) The control unit 332 acquires the detection value of the detection unit 32 and advances the process to step S102.
- (Step S102) The control unit 332 obtains the inclination of the driver R in the roll direction from the detection value of the detection unit 32 and advances the process to step S103.
- as described above, the roll direction is a tilt that rotates about the front-rear axis of the vehicle 2.
- the control unit 332 detects gravitational acceleration with the acceleration sensor 322 and the angular velocities about three rotational axes with the angular velocity sensor 323, and can thereby detect the inclination of the driver R's head in three axes: the roll, pitch, and yaw directions.
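For a quasi-static head, the roll angle follows directly from how gravity projects onto the accelerometer's left-right and up-down axes; the angular velocity sensor can then track faster motion. The sketch below shows one common way to combine the two readings (a complementary filter). The patent only states that both sensors are used, so the fusion rule, alpha, and the names are assumptions:

```python
import math

def roll_from_accel(ax, ay, az):
    """Roll angle (rotation about the front-rear y-axis) estimated from a
    quasi-static accelerometer reading, using this embodiment's axis
    convention (x: left-right, y: front-rear, z: up-down).  Radians;
    0 when the head is upright."""
    # As the head rolls, gravity shifts from the z-axis toward the x-axis.
    return math.atan2(ax, az)

def fused_roll(prev_roll, gyro_rate, dt, ax, ay, az, alpha=0.98):
    """Complementary filter: integrate the gyro's roll rate for fast
    motion and pull toward the accelerometer estimate to cancel drift."""
    gyro_estimate = prev_roll + gyro_rate * dt
    return alpha * gyro_estimate + (1 - alpha) * roll_from_accel(ax, ay, az)
```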
- (Step S103) The control unit 332 sets the virtual horizon P based on the inclination of the driver R's head in the roll direction.
- (Step S104) The control unit 332 sets the non-display area NA so that its upper edge is parallel to the virtual horizon P and separated upward by the predetermined distance L1, and its lower edge is parallel to the virtual horizon P and separated downward by the predetermined distance L2.
- Step S105 The control unit 332 determines whether there is an information image in the non-display area NA. If there is no information image in the non-display area NA (step S105: NO), the control unit 332 returns the process to step S101. If there is an information image in the non-display area NA (step S105: YES), the control unit 332 advances the process to step S106. (Step S106) The control unit 332 deletes the information image in the non-display area NA, and returns the process to step S101.
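The loop of steps S101 to S106 can be condensed into a single update function: obtain the roll tilt, let it define the virtual horizon and the band NA, and erase every information image whose anchor falls inside the band. A self-contained sketch with illustrative names and coordinates:

```python
import math

def update_display(images, roll_rad, l1, l2):
    """One pass of the FIG. 7 flow (names and coordinates illustrative).
    S101-S102: roll_rad is the head tilt from the detection unit.
    S103-S104: roll_rad, l1 and l2 define the virtual horizon P and the
               band NA (L1 above it, L2 below it).
    S105-S106: any information image whose anchor lies in NA is erased.
    `images` maps image id -> (x, y) anchor, origin on the horizon."""
    visible = {}
    for name, (x, y) in images.items():
        # Perpendicular distance of the anchor above the tilted horizon.
        d = -x * math.sin(roll_rad) + y * math.cos(roll_rad)
        if -l2 <= d <= l1:
            continue  # inside NA: erase the image (step S106)
        visible[name] = (x, y)
    return visible
```

With roll_rad = 0 and a FIG. 6A-like layout, all four images g101 to g104 stay visible; tilting the horizon can bring one of them into NA, after which it is dropped, as g102 is in FIG. 6B.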
- as described above, according to the present embodiment, a band-shaped non-display area is set within the range between a line separated upward from the virtual horizon by a predetermined distance and a line separated downward from the virtual horizon by a predetermined distance. For this reason, the visibility of the driver R can be secured when the driver R drives the vehicle 2 while tilting the vehicle body.
- the non-display area is set in a band shape that extends across the left and right sides of the screen. For this reason, even when the driver R swings his or her head left and right while traveling, the gazed-at area near the virtual horizon remains non-display and visibility is secured.
- FIG. 8 is a diagram illustrating an example of information displayed on the display unit 34 in the second embodiment of the present invention.
- a region A indicates a range that can be visually recognized by the driver R from the display unit 34 as in FIGS. 5A, 5B, 6A, and 6B
- a region NA is a non-display region.
- the basic configuration of the second embodiment is the same as that of the first embodiment.
- also in this embodiment, a virtual horizon is set based on the inclination of the driver R's head in the roll direction, and the range between a line separated upward from the virtual horizon by a predetermined distance and a line separated downward from the virtual horizon by a predetermined distance is set as a band-shaped non-display area.
- in the first embodiment, an information image that comes within the non-display area NA is erased.
- in the present embodiment, by contrast, an information image that comes within the non-display area NA is reduced and displayed.
- the head of the driver R is hardly inclined with respect to the ground.
- the virtual horizon P is set based on the inclination of the driver R's head in the roll direction
- the virtual horizon P is substantially parallel to the horizontal direction of the display unit 34 as shown in FIG. 8A.
- the non-display area NA is set in a band shape within the range between a line separated upward from the virtual horizon P by the predetermined distance L1 and a line separated downward from the virtual horizon P by the predetermined distance L2. The range between the boundary line Q1 and the boundary line Q2 is the non-display area NA.
- as shown in FIG. 8A, the positions of the upper-left chain line g101, the upper-right chain line g102, the lower-left chain line g103, and the lower-right chain line g104 are all outside the non-display area NA. For this reason, even if images are displayed at the chain lines g101 to g104, the visibility of the driver R can be secured.
- the driver R tilts the vehicle body to the right, and the head of the driver R tilts to the right with respect to the ground.
- the virtual horizon P is set based on the inclination of the driver R's head in the roll direction
- the virtual horizon P is inclined upward to the right relative to the horizontal direction of the display unit 34, as shown in FIG. 8B.
- the non-display area NA is set in a band shape between a line separated upward by the predetermined distance L1 from the virtual horizon P rising to the right and a line separated downward by the predetermined distance L2 from that virtual horizon P.
- the range between the boundary line Q1 and the boundary line Q2 is the non-display area NA.
- the information indicated by the chain line g101, the chain line g103, and the chain line g104 is displayed as it is because it is inside the display area A and outside the non-display area NA. Since the information indicated by the chain line g102 comes within the non-display area NA, the information is reduced and displayed so as to be out of the non-display area NA.
- the information indicated by the chain line g102 entering the non-display area NA is reduced and displayed so as to be out of the non-display area NA. For this reason, according to the present embodiment, necessary information can be obtained while ensuring the visibility of the driver R.
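The reduce-and-display rule of this embodiment can be sketched along the axis perpendicular to the virtual horizon: if an image's extent overlaps the band, shrink it about the edge farther from the horizon until the nearer edge reaches the boundary. The scaling rule below is an assumption; the patent only requires the reduced image not to overlap NA:

```python
def shrink_out_of_band(near, far, band):
    """Scale factor, applied about the far edge, that moves an information
    image's near edge out of the non-display band.  `near` and `far` are
    the distances of the image's two edges from the virtual horizon on
    one side of it (near < far); `band` is the band's extent on that side
    (L1 above the horizon, L2 below)."""
    if near >= band:
        return 1.0  # already clear of NA: display as-is
    if far <= band:
        return 0.0  # entirely inside NA: shrinking cannot clear it
    # Shrink about the far edge until the near edge sits on the boundary.
    return (far - band) / (far - near)
```

For an image spanning 10 to 50 units above the horizon with L1 = 20, the factor is 0.75: the scaled image spans 20 to 50 and just clears the boundary line Q1.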
- FIG. 9 is a flowchart showing processing when setting a non-display area in the HMD according to the third embodiment of the present invention.
- the basic configuration of the third embodiment is the same as that of the first embodiment described above.
- in the first embodiment described above, the distance L1 to the boundary line Q1 and the distance L2 to the boundary line Q2 are fixed at predetermined distances.
- in the present embodiment, the distance L1 and the distance L2 are set appropriately according to the speed of the vehicle 2.
- the upper side on the screen is a landscape at a long distance when viewed from the driver R
- the lower side on the screen is a landscape at a short distance when viewed from the driver R.
- (Step S201) The control unit 332 acquires the detection value of the detection unit 32 and advances the process to step S202.
- (Step S202) The control unit 332 obtains the inclination of the driver R in the roll direction from the detection value of the detection unit 32 and advances the process to step S203.
- (Step S203) The control unit 332 sets the virtual horizon P based on the inclination of the driver R's head in the roll direction and advances the process to step S204.
- (Step S204) The control unit 332 acquires vehicle speed information from the speed sensor 324 of the detection unit 32 and advances the process to step S205.
- Step S205 The control unit 332 determines the distance L1 from the virtual horizon P to the upper boundary line Q1 of the non-display area NA and the lower boundary line Q2 from the virtual horizon P to the non-display area NA according to the vehicle speed information. Distance L2 is calculated, and the process proceeds to step S206.
- (Step S206) The control unit 332 sets, as the non-display area NA, the area between a line parallel to the virtual horizon P and separated upward by the predetermined distance L1 and a line parallel to the virtual horizon P and separated downward by the predetermined distance L2. At this time, the distances L1 and L2 are those calculated according to the vehicle speed in step S205.
- Step S207 The control unit 332 determines whether there is an information image in the non-display area NA. If there is no information image in the non-display area NA (step S207: NO), the control unit 332 returns the process to step S201. If there is an information image in the non-display area NA (step S207: YES), the control unit 332 advances the process to step S208. (Step S208) The control unit 332 deletes the information image in the non-display area NA, and returns the process to step S201.
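Step S205's speed-dependent distances can be sketched as a simple ramp consistent with claim 5: as the speed rises, L2 (the lower, near-side distance) is shortened, and at low speed L1 (the upper, far-side distance) is shortened. The linear form and all numeric values are illustrative assumptions; the patent only states that L1 and L2 are calculated from the vehicle speed:

```python
def band_distances(speed_kmh, l1_max=20.0, l2_max=30.0,
                   l_min=5.0, v_ref=60.0):
    """Speed-dependent boundary distances for step S205, following the
    rule of claim 5: at higher speed the lower distance L2 shrinks (the
    gaze moves to the distance), at lower speed the upper distance L1
    shrinks.  Returns (L1, L2)."""
    t = max(0.0, min(1.0, speed_kmh / v_ref))  # 0 = stopped, 1 = at v_ref
    l1 = l_min + (l1_max - l_min) * t          # upper distance grows with speed
    l2 = l2_max - (l2_max - l_min) * t         # lower distance shrinks with speed
    return l1, l2
```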
- the non-display area NA can be optimally set according to the speed of the vehicle 2.
- FIG. 10 is a diagram illustrating an example of the appearance of the HMD 3 according to the fourth embodiment of the present invention.
- a helmet type is used as the HMD 3.
- the HMD 3 is attached to the helmet 300.
- the helmet 300 includes a cap body (shell) 311, an impact absorbing liner 312, and a shield 313.
- the display unit 34 of the HMD 3 includes a projection unit 341, a lens 342, and a combiner 343.
- the projection unit 341 outputs visual information to the combiner 343 via the lens 342.
- the lens 342 collects the visual information output from the projection unit 341 on the combiner 343.
- the combiner 343 is a half mirror that superimposes the video on the scenery in front of the driver R.
- the driver R can visually recognize the external image and the visual information projected by the projection unit 341 as a virtual image I through the combiner 343.
- the control unit 332 may determine that the vehicle 2 is traveling uphill based on the detection value detected by the detection unit 22, and correct the position of the virtual horizon accordingly.
- likewise, the control unit 332 may determine that the vehicle 2 is traveling downhill based on the detection value detected by the detection unit 22, and correct the position of the virtual horizon accordingly.
- the control unit 332 may correct the position of the virtual horizon according to the road surface condition such as the road surface inclination.
- a program for realizing all or part of the functions of the vehicle detection system 1 may be recorded on a computer-readable recording medium, and the processing of each unit may be performed by loading the program recorded on the recording medium into a computer system and executing it.
- the “computer system” includes an OS and hardware such as peripheral devices. Further, the “computer system” includes a homepage providing environment (or display environment) if a WWW system is used.
- the “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system.
- the “computer-readable recording medium” also includes media that hold a program dynamically for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold the program for a certain period, such as volatile memory inside a computer system serving as a server or a client in that case.
- the program may realize only part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Radar, Positioning & Navigation (AREA)
- Mechanical Engineering (AREA)
- Remote Sensing (AREA)
- Optics & Photonics (AREA)
- Computer Hardware Design (AREA)
- Automation & Control Theory (AREA)
- Controls And Circuits For Display Device (AREA)
- Instrument Panels (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
This application claims priority based on Japanese Patent Application No. 2016-073634 filed on March 31, 2016, the contents of which are incorporated herein by reference.
According to the configuration of <4>, an optimal non-display area can be secured in accordance with the moving speed of the vehicle.
According to the configuration of <7>, when the area set as the non-display area overlaps an information image, the information image is reduced and displayed, so the driver's field of view can be secured and necessary information can be obtained.
<First Embodiment>
In this embodiment, a glasses-type head-mounted display (hereinafter, HMD) is described as an example of the image display apparatus.
The vehicle 2 is a straddle-type vehicle such as a motorcycle, for example. The vehicle 2 includes a control unit 21, a detection unit 22, a receiving unit 23, and a transmitting unit 24. The detection unit 22 includes an inclination detection unit 221, a rotation angle detection unit 222, a position detection unit 223, and a vehicle speed detection unit 224. The detection unit 22 may further include a remaining-fuel detection unit, a vibration detection unit, a key detection unit, an air-pressure detection unit, and the like.
The HMD 3 includes an operation unit 30, a receiving unit 31, a detection unit 32, an information generation unit 33, a display unit 34, and a transmitting unit 35. The information generation unit 33 includes a storage unit 331, a control unit 332, and an image generation unit 333.
FIG. 2 is a diagram showing an example of the appearance of the HMD 3 according to the present embodiment.
Hereinafter, the coordinates for a driver standing upright on the ground with the HMD 3 worn on the head are defined as follows: as seen from the driver, the up-down direction is the z-axis direction, the left-right direction is the x-axis direction, and the front-rear direction is the y-axis direction.
As shown in FIG. 3, the vehicle 2 of this embodiment is a scooter-type straddle vehicle. The vehicle 2 includes a vehicle body cover 201, a steering handle 202, a key cylinder 203, a front wheel Wf, a rear wheel Wr, a seat 213, and the like. For example, the inclination detection unit 221, the rotation angle detection unit 222, the position detection unit 223, and the vehicle speed detection unit 224 are provided inside the vehicle body cover 201. The configuration shown in FIG. 3 is an example, and the positions where these units are attached are not limited to this.
As shown in FIG. 4, the driver R drives the vehicle 2 while wearing the HMD 3 on the head.
The display unit 34 of the HMD 3 transmits external light and displays various information images using a hologram. The information images include vehicle speed information, time information, vehicle position information, navigation information, and the like. The driver R visually recognizes the image of the outside world and the visual information projected by the projection unit as a virtual image through the combiner, and can therefore see the information images on the spherical surface S of a sphere extending over the entire field of view of the driver R. Thus, according to this embodiment, the driver R can, for example, view MAP (map) information by looking toward his or her legs when stopped, and can check information from various traffic devices by looking overhead.
For this reason, the area that the driver R is gazing at while driving needs to be a non-display area on the display unit 34. The vertical width of the non-display area may be based on the characteristics of the human visual field, for example about 60 degrees on the upper side and about 70 degrees on the lower side. Although the example of FIG. 4 describes a non-display area provided in the vertical direction of the field of view, this is not restrictive. The non-display area may also be provided in the left-right direction, for example about 60 degrees on the inner side and about 90 degrees on the outer side, based on the characteristics of the human visual field.
The driver R driving the vehicle 2 keeps driving while gazing at information on the road.
The road curves left and right, and various hazards exist on the road, such as a pedestrian suddenly running out, a vehicle ahead stopping abruptly, or a fallen object. All of these hazards are events on the ground. Therefore, the driver R travels while gazing at objects positioned relative to the horizon.
(Step S101) The control unit 332 acquires the detection value of the detection unit 32 and advances the process to step S102.
(Step S102) The control unit 332 obtains the inclination of the driver R in the roll direction from the detection value of the detection unit 32 and advances the process to step S103. As described above, the roll direction is a tilt that rotates about the front-rear axis of the vehicle 2. The control unit 332 detects gravitational acceleration with the acceleration sensor 322 and the angular velocities about three rotational axes with the angular velocity sensor 323, and can thereby detect the inclination of the driver R's head in three axes: the roll, pitch, and yaw directions.
(Step S103) The control unit 332 sets the virtual horizon P based on the inclination of the driver R's head in the roll direction.
(Step S104) The control unit 332 sets the non-display area NA so that its upper edge is parallel to the virtual horizon P and separated upward by the predetermined distance L1, and its lower edge is parallel to the virtual horizon P and separated downward by the predetermined distance L2.
(Step S106) The control unit 332 erases the information image in the non-display area NA and returns the process to step S101.
Next, a second embodiment of the present invention will be described. FIG. 8 is a diagram showing an example of information displayed on the display unit 34 in the second embodiment of the present invention. In FIGS. 8A and 8B, a region A indicates the range that the driver R can visually recognize on the display unit 34, as in FIGS. 5A, 5B, 6A, and 6B, and a region NA is the non-display area. The basic configuration of this second embodiment is the same as that of the first embodiment described above.
Next, a third embodiment of the present invention will be described. FIG. 9 is a flowchart showing the process of setting the non-display area in the HMD according to the third embodiment of the present invention. The basic configuration of this third embodiment is the same as that of the first embodiment described above.
(Step S201) The control unit 332 acquires the detection value of the detection unit 32 and advances the process to step S202.
(Step S202) The control unit 332 obtains the inclination of the driver R in the roll direction from the detection value of the detection unit 32 and advances the process to step S203.
(Step S203) The control unit 332 sets the virtual horizon P based on the inclination of the driver R's head in the roll direction and advances the process to step S204.
(Step S204) The control unit 332 acquires vehicle speed information from the speed sensor 324 of the detection unit 32 and advances the process to step S205.
(Step S205) The control unit 332 calculates, according to the vehicle speed information, the distance L1 from the virtual horizon P to the upper boundary line Q1 of the non-display area NA and the distance L2 from the virtual horizon P to the lower boundary line Q2 of the non-display area NA, and advances the process to step S206.
(Step S206) The control unit 332 sets, as the non-display area NA, the area between a line parallel to the virtual horizon P and separated upward by the predetermined distance L1 and a line parallel to the virtual horizon P and separated downward by the predetermined distance L2. At this time, the distances L1 and L2 are those calculated according to the vehicle speed in step S205.
(Step S208) The control unit 332 erases the information image in the non-display area NA and returns the process to step S201.
Next, a fourth embodiment of the present invention will be described.
FIG. 10 is a diagram showing an example of the appearance of the HMD 3 according to the fourth embodiment of the present invention. As shown in FIG. 10, a helmet-type HMD 3 is used in this embodiment.
In FIG. 10, the HMD 3 is attached to a helmet 300. As shown in FIG. 10, the helmet 300 includes a cap body (shell) 311, an impact-absorbing liner 312, and a shield 313. The display unit 34 of the HMD 3 includes a projection unit 341, a lens 342, and a combiner 343.
Thus, according to the embodiment, it becomes easy to secure a field of view that matches the road surface conditions.
The “computer system” also includes a homepage providing environment (or display environment) when a WWW system is used.
The “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system. It further includes media that hold a program dynamically for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold the program for a certain period, such as volatile memory inside a computer system serving as a server or a client in that case. The program may realize only part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
Claims (9)
- An image display apparatus having a display unit arranged on a mounting body worn on a user's head, the apparatus comprising:
a detection unit that detects an inclination of the user's head; and
a control unit that sets a virtual horizon based on the inclination detected by the detection unit and sets a non-display area having a band shape parallel to the virtual horizon. - The image display apparatus according to claim 1, wherein the non-display area is an area between a first boundary line provided a first distance above the virtual horizon and parallel to it, and a second boundary line provided a second distance below the virtual horizon and parallel to it.
- The image display apparatus according to claim 2, wherein the first distance is shorter than the second distance.
- The image display apparatus according to claim 2 or claim 3, further comprising a speed detection unit that detects a moving speed,
wherein the control unit changes the relationship between the first distance and the second distance in accordance with the moving speed detected by the speed detection unit. - The image display apparatus according to claim 4, wherein the control unit shortens the second distance when the moving speed detected by the speed detection unit is high, and shortens the first distance when the moving speed is low.
- The image display apparatus according to any one of claims 1 to 5, wherein the control unit erases an information image when the area set as the non-display area and the information image overlap.
- The image display apparatus according to any one of claims 1 to 5, wherein, when the area set as the non-display area and an information image overlap, the control unit reduces the information image to a size that does not overlap the non-display area and displays it.
- The image display apparatus according to claim 6 or claim 7, wherein the display unit simulates a three-dimensional space on a virtual sphere centered on the user, projects the information image onto the spherical surface of the virtual sphere, and forms the non-display portion in an annular band shape on the spherical surface.
- An image control method for controlling an image display apparatus having a display unit arranged on a mounting body worn on a user's head, the method comprising:
detecting, by a detection unit, an inclination of the user's head; and
setting, by a control unit, a virtual horizon based on the detected inclination, and setting a non-display area having a band shape parallel to the virtual horizon,
the image display method comprising the above.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780018201.6A CN109074205B (zh) | 2016-03-31 | 2017-02-15 | 图像显示装置及图像显示方法 |
EP17773778.0A EP3438805B1 (en) | 2016-03-31 | 2017-02-15 | Image display apparatus and image display method |
US16/084,593 US10935789B2 (en) | 2016-03-31 | 2017-02-15 | Image display apparatus and image display method |
JP2018508545A JP6624758B2 (ja) | 2016-03-31 | 2017-02-15 | 画像表示装置および画像表示方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-073634 | 2016-03-31 | ||
JP2016073634 | 2016-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017169230A1 true WO2017169230A1 (ja) | 2017-10-05 |
Family
ID=59962897
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/005508 WO2017169230A1 (ja) | 2016-03-31 | 2017-02-15 | 画像表示装置および画像表示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10935789B2 (ja) |
EP (1) | EP3438805B1 (ja) |
JP (1) | JP6624758B2 (ja) |
CN (1) | CN109074205B (ja) |
WO (1) | WO2017169230A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019061559A (ja) * | 2017-09-27 | 2019-04-18 | 本田技研工業株式会社 | 表示装置、表示制御装置及び車両 |
WO2022209258A1 (ja) * | 2021-03-29 | 2022-10-06 | ソニーグループ株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021098411A (ja) * | 2019-12-20 | 2021-07-01 | Nsウエスト株式会社 | 表示光出射装置 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03255419A (ja) * | 1990-03-06 | 1991-11-14 | Yazaki Corp | 表示装置 |
JP2000284214A (ja) * | 1999-03-30 | 2000-10-13 | Suzuki Motor Corp | ヘルメット搭載用表示手段制御装置 |
JP2002302822A (ja) * | 2001-04-03 | 2002-10-18 | Suzuki Motor Corp | 二輪車用ヘルメットマウントディスプレイシステム及びその表示制御方法 |
WO2006035755A1 (ja) * | 2004-09-28 | 2006-04-06 | National University Corporation Kumamoto University | 移動体ナビゲート情報表示方法および移動体ナビゲート情報表示装置 |
JP2008026075A (ja) * | 2006-07-19 | 2008-02-07 | Sky Kk | 自動二輪車用ナビゲーションシステム |
WO2015114807A1 (ja) * | 2014-01-31 | 2015-08-06 | パイオニア株式会社 | 虚像表示装置、制御方法、プログラム、及び記憶媒体 |
JP2015161930A (ja) * | 2014-02-28 | 2015-09-07 | 三菱電機株式会社 | 表示制御装置、表示制御方法、および表示制御システム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4470930B2 (ja) | 2006-09-21 | 2010-06-02 | ソニー株式会社 | 画像処理装置、画像処理方法、及び、プログラム |
JP2009092809A (ja) * | 2007-10-05 | 2009-04-30 | Nikon Corp | ヘッドマウントディスプレイ装置 |
US20090128938A1 (en) | 2007-11-16 | 2009-05-21 | Carnes Stephen A | Visors and rearview mirrors for helmets |
US20130141520A1 (en) | 2011-12-02 | 2013-06-06 | GM Global Technology Operations LLC | Lane tracking system |
JP2014098564A (ja) * | 2012-11-13 | 2014-05-29 | Panasonic Corp | 情報表示装置 |
US9041741B2 (en) * | 2013-03-14 | 2015-05-26 | Qualcomm Incorporated | User interface for a head mounted display |
FR3018119B1 (fr) * | 2014-02-28 | 2017-07-07 | Thales Sa | Systeme de visualisation de casque comportant des moyens d'affichage et de gestion de documentation |
JP6369106B2 (ja) | 2014-04-16 | 2018-08-08 | 株式会社デンソー | ヘッドアップディスプレイ装置 |
FR3052553B1 (fr) * | 2016-06-13 | 2020-11-27 | Airbus Operations Sas | Systeme et procede d'affichage d'un aeronef |
-
2017
- 2017-02-15 EP EP17773778.0A patent/EP3438805B1/en active Active
- 2017-02-15 JP JP2018508545A patent/JP6624758B2/ja active Active
- 2017-02-15 US US16/084,593 patent/US10935789B2/en active Active
- 2017-02-15 WO PCT/JP2017/005508 patent/WO2017169230A1/ja active Application Filing
- 2017-02-15 CN CN201780018201.6A patent/CN109074205B/zh active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03255419A (ja) * | 1990-03-06 | 1991-11-14 | Yazaki Corp | 表示装置 |
JP2000284214A (ja) * | 1999-03-30 | 2000-10-13 | Suzuki Motor Corp | ヘルメット搭載用表示手段制御装置 |
JP2002302822A (ja) * | 2001-04-03 | 2002-10-18 | Suzuki Motor Corp | 二輪車用ヘルメットマウントディスプレイシステム及びその表示制御方法 |
WO2006035755A1 (ja) * | 2004-09-28 | 2006-04-06 | National University Corporation Kumamoto University | 移動体ナビゲート情報表示方法および移動体ナビゲート情報表示装置 |
JP2008026075A (ja) * | 2006-07-19 | 2008-02-07 | Sky Kk | 自動二輪車用ナビゲーションシステム |
WO2015114807A1 (ja) * | 2014-01-31 | 2015-08-06 | パイオニア株式会社 | 虚像表示装置、制御方法、プログラム、及び記憶媒体 |
JP2015161930A (ja) * | 2014-02-28 | 2015-09-07 | 三菱電機株式会社 | 表示制御装置、表示制御方法、および表示制御システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3438805A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019061559A (ja) * | 2017-09-27 | 2019-04-18 | 本田技研工業株式会社 | 表示装置、表示制御装置及び車両 |
WO2022209258A1 (ja) * | 2021-03-29 | 2022-10-06 | ソニーグループ株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
CN109074205A (zh) | 2018-12-21 |
CN109074205B (zh) | 2021-10-26 |
EP3438805A4 (en) | 2019-03-20 |
US20190079290A1 (en) | 2019-03-14 |
US10935789B2 (en) | 2021-03-02 |
JP6624758B2 (ja) | 2019-12-25 |
EP3438805B1 (en) | 2020-09-09 |
JPWO2017169230A1 (ja) | 2018-11-08 |
EP3438805A1 (en) | 2019-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210223547A1 (en) | Method and apparatus for adjusting motion-based data space manipulation | |
US9703100B2 (en) | Change nature of display according to overall motion | |
JP6280134B2 (ja) | ヘルメットベースのナビゲーション通知方法、装置およびコンピュータプログラム | |
JP5685499B2 (ja) | 表示装置、画像データ生成装置、画像データ生成プログラム及び表示方法 | |
JP4767136B2 (ja) | 情報開示装置 | |
JP2018191322A (ja) | リアルタイム画像分析に基づき機能の有効化および無効化が可能なヘッドセットコンピュータ | |
US10839623B2 (en) | Vehicle, image display device, vehicle control method, and image display method | |
JP6624758B2 (ja) | 画像表示装置および画像表示方法 | |
JP6806914B2 (ja) | 表示システム及び表示方法 | |
JP2000284214A (ja) | ヘルメット搭載用表示手段制御装置 | |
JP2020536331A (ja) | 車両内において乗物酔いを伴わずにデジタルコンテンツを見ること | |
US11110933B2 (en) | Driving support device, wearable device, driving support system, driving support method, and computer-readable recording medium | |
JP2014098564A (ja) | 情報表示装置 | |
US10642033B2 (en) | Image display device and image display method | |
JP2008014783A (ja) | 車両用ナビゲーション装置 | |
CN110297325A (zh) | 增强现实眼镜和系统及增强现实眼镜显示车上信息的方法 | |
JP2020100912A (ja) | 画像表示システム | |
JP2015136953A (ja) | 二輪車両情報提供装置 | |
TW201823803A (zh) | 具擴增實境與虛擬實境的頭盔 | |
WO2023145852A1 (ja) | 表示制御装置、表示システム、及び表示制御方法 | |
JP6503407B2 (ja) | コンテンツ表示プログラム、コンピュータ装置、コンテンツ表示方法、及びコンテンツ表示システム | |
KR20170082757A (ko) | 차량, 웨어러블 장치, 및 신호등 정보 제공 방법 | |
JP2023073663A (ja) | 交通施設表示装置、交通施設表示システム、交通施設表示方法及びプログラム | |
CN115145386A (zh) | 智能摩托头盔 | |
WO2023122459A1 (en) | An odometry using sensor data from a personal vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018508545 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17773778 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017773778 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017773778 Country of ref document: EP Effective date: 20181031 |