US20200124849A1 - Display system, information presentation system, method for controlling display system, recording medium, and mobile body - Google Patents
- Publication number: US20200124849A1 (application US16/724,766)
- Authority: United States
- Prior art keywords: virtual image, display, vibration, display system, threshold value
- Legal status: Granted
Classifications
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by the informative content of the display
- G02B27/017—Head-up displays, head mounted
- B60K35/00—Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
- B60K35/21—Output arrangements using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- B60K35/28—Output arrangements characterised by the type or purpose of the output information, e.g. for attracting the attention of the driver
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
- B60K2360/168—Target or limit values
- B60K2360/1868—Displaying information according to relevancy according to driving situations
- B60R2300/205—Viewing arrangements using cameras and displays, using a head-up display
- G06F3/1454—Digital output to display device, involving copying of display data so that an actual copy is displayed simultaneously on two or more displays
- G06F3/147—Digital output to display device using display panels
- G09G5/00—Control arrangements or circuits for visual indicators
- G09G2380/10—Automotive applications
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- H04N9/3129—Projection devices for colour picture display, scanning a light beam on the display screen
- H04N9/3194—Testing thereof, including sensor feedback
Definitions
- the present disclosure relates to a display system, an information presentation system, a method for controlling the display system, a program, and a mobile body, and more particularly, to a display system which displays a virtual image in a target space, an information presentation system, a method for controlling the display system, a program, and a mobile body.
- a dangerous situation warning device has been proposed which includes a vehicle neighborhood object detection device which detects an object in a front travel area, a danger latent area determination device which determines a danger latent area in which danger is latent, a danger degree processing device, and a warning output device (for example, see Patent Literature (PTL) 1).
- the danger degree processing device checks object information obtained by the vehicle neighborhood object detection device against the danger latent area obtained by the danger latent area determination device to determine a degree of danger with respect to an object which exists in the danger latent area.
- the danger degree processing device sets the degree of danger for each of the obstacles and the warning output device displays the degree of danger for each of the obstacles.
- the warning output device is a head-up display, and displays a red frame surrounding the obstacle determined as having a high degree of danger at a position seen to be overlapping with the obstacle.
- An object of the present disclosure is to provide a display system, an information presentation system, a display method, a program, and a mobile body which are capable of making the deviation of the display position of a virtual image less noticeable.
- a display system projects an image to be viewed by a target person as if a virtual image is being projected onto a target space.
- the display system includes a display unit, a controller, and a vibration information obtaining unit which obtains vibration information.
- the display unit displays the image.
- the controller controls the display of the display unit.
- the vibration information obtaining unit obtains the vibration information about vibration applied to the display unit.
- the controller changes a display mode of at least a portion of the virtual image based on the vibration information.
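The display-mode decision described above can be sketched minimally as follows; the function name, the mode labels, and the use of a simple comparison against a threshold value are illustrative assumptions, not the claimed control logic.

```python
# Hypothetical sketch: when the detected vibration magnitude exceeds a
# threshold value, the display mode of a portion of the virtual image is
# changed so that the positional deviation is less noticeable.
# All names here are illustrative, not taken from the claims.

def select_display_mode(vibration_magnitude: float, threshold: float) -> str:
    """Return a display mode for the virtual-image portion.

    'normal'  -- vibration at or below the threshold value
    'reduced' -- vibration above the threshold value (e.g. the portion is
                 dimmed, its transmittance lowered, or it is hidden)
    """
    return "reduced" if vibration_magnitude > threshold else "normal"
```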
- An information presentation system includes the display system, and a detection system which detects the vibration applied to the display unit.
- the vibration information obtaining unit obtains the vibration information from the detection system.
- a control method is a method for controlling a display system which projects an image to be viewed by a target person as if a virtual image is being projected onto the target space.
- the display system includes a display unit, a controller, and a vibration information obtaining unit which obtains vibration information.
- the display unit displays the image.
- the controller controls the display of the display unit.
- the vibration information obtaining unit obtains the vibration information about vibration applied to the display unit.
- a program according to another aspect of the present disclosure is a program for causing a computer system to execute the method for controlling the display system.
- a mobile body includes: the display system; and a reflection member which has light transmitting properties and reflects the light emitted from the display unit so that the virtual image is viewed by the target person.
- According to the present disclosure, provided are a display system, an information presentation system, a method for controlling the display system, a program, and a mobile body which are capable of making the deviation of the display position of a virtual image less noticeable.
- FIG. 1 is a conceptual diagram of a configuration of an information presentation system according to an embodiment of the present disclosure.
- FIG. 2 is a conceptual diagram of a vehicle which includes a display system according to an embodiment of the present disclosure.
- FIG. 3 is a conceptual diagram of a display example in the information presentation system.
- FIG. 4 is a flowchart of an operation of the information presentation system.
- FIG. 5A is a conceptual diagram of a display example in the information presentation system.
- FIG. 5B is a conceptual diagram of a display example in the information presentation system.
- FIG. 6A is a conceptual diagram of another display example in the information presentation system.
- FIG. 6B is a conceptual diagram of another display example in the information presentation system.
- FIG. 7A is a conceptual diagram of another display example in the information presentation system.
- FIG. 7B is a conceptual diagram of another display example in the information presentation system.
- display system 10 includes display unit 40 which displays (projects) a virtual image in a target space, vibration information obtaining unit 6 which obtains vibration information about vibration applied to display unit 40 , and controller 5 which controls display unit 40 .
- Obtaining unit 6 , display unit 40 , and controller 5 will be described in detail in “(2) Configuration”.
- display system 10 is a head-up display (HUD) for use in vehicle 100 as a mobile body, for example.
- Display system 10 is disposed in a cabin of vehicle 100 to project an image onto windshield 101 (reflection member) of vehicle 100 from below.
- display system 10 is disposed in dashboard 102 installed below windshield 101 .
- virtual image 300 appears to be projected onto target space 400 set in front of vehicle 100 (outside of the vehicle).
- the term “front” mentioned herein refers to the direction in which vehicle 100 moves forward, and the direction in which vehicle 100 moves forward or backward is referred to as a longitudinal direction.
- the term “virtual image” mentioned herein means an image formed by reflected light as if an object were actually present, when light emitted from display system 10 is reflected by a reflecting object such as windshield 101. Since windshield 101 has light transmitting properties, user 200, as a target person, is capable of viewing target space 400 in front of vehicle 100 through windshield 101.
- user 200 is capable of viewing virtual image 300 which is projected by display system 10 , while superimposing virtual image 300 on a real space spreading in front of vehicle 100 .
- various pieces of driving assistance information such as vehicle speed information, navigation information, pedestrian information, front vehicle information, lane departure information and vehicle condition information, can be displayed as virtual image 300 to be viewed by user 200 .
- user 200 is capable of visually obtaining the driving assistance information by only a slight movement of a line of sight from a state of directing the line of sight forward of windshield 101 .
- virtual image 300 formed in target space 400 includes at least two types of virtual images, i.e., first virtual image 301 and second virtual image 302 .
- first virtual image mentioned herein is virtual image 300 ( 301 ) formed on first virtual plane 501 .
- the “first virtual plane” is a virtual plane in which inclination angle α with respect to optical axis 500 of display system 10 is smaller than predetermined value γ (α < γ).
- second virtual image mentioned herein is virtual image 300 ( 302 ) formed on second virtual plane 502 .
- the “second virtual plane” is a virtual plane in which inclination angle β with respect to optical axis 500 of display system 10 is larger than predetermined value γ (β > γ).
- the “optical axis” mentioned herein is an optical axis of an optical system of projection optical system 4 (see FIG. 1 ) to be described later, that is, an axis that passes through a center of target space 400 and goes along an optical path of virtual image 300 .
- An example of predetermined value γ is 45 degrees, and an example of inclination angle β is 90 degrees.
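Under the definitions above (an inclination angle measured with respect to optical axis 500, and predetermined value γ of 45 degrees), the distinction between the first and second virtual planes can be sketched as follows; the function and its return labels are assumptions for illustration.

```python
# Illustrative classification of a virtual plane by its inclination angle
# relative to the optical axis; gamma = 45 degrees is the example
# predetermined value given in the text. Names are illustrative.

GAMMA_DEG = 45.0  # predetermined value gamma

def classify_virtual_plane(inclination_deg: float) -> str:
    """First virtual plane: angle alpha < gamma; second: angle beta > gamma."""
    if inclination_deg < GAMMA_DEG:
        return "first virtual plane"   # holds first virtual image 301
    return "second virtual plane"      # holds second/third virtual images 302, 303
```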
- virtual image 300 formed in target space 400 includes third virtual image 303 (see FIG. 3 ) in addition to first virtual image 301 and second virtual image 302 .
- the term “third virtual image” refers, similarly to second virtual image 302, to virtual image 300 ( 303 ) formed on second virtual plane 502 in which inclination angle β with respect to optical axis 500 is larger than predetermined value γ.
- as will be described later in detail, a virtual image formed by light penetrating movable screen 1 a is second virtual image 302, and a virtual image formed by light penetrating fixed screen 1 b is third virtual image 303.
- in target space 400, optical axis 500 extends along road surface 600 in front of vehicle 100.
- first virtual image 301 is formed on first virtual plane 501 substantially parallel to road surface 600
- second virtual image 302 and third virtual image 303 are formed on second virtual plane 502 substantially perpendicular to road surface 600 .
- when road surface 600 is a horizontal plane, first virtual image 301 is displayed along the horizontal plane, and second virtual image 302 and third virtual image 303 are displayed along a vertical plane.
- FIG. 3 is a conceptual diagram of a visual field of user 200 .
- display system 10 according to the present embodiment is capable of displaying first virtual images 301 viewed with depth along road surface 600 and second virtual images 302 and third virtual image 303 viewed vertically on road surface 600 at a fixed distance from user 200 .
- each first virtual image 301 looks like being presented on a plane substantially parallel to road surface 600
- each second virtual image 302 and third virtual image 303 look like being presented on a plane substantially perpendicular to road surface 600 .
- An example of first virtual image 301 is navigation information indicating a traveling direction of vehicle 100 , which can present an arrow that indicates to turn right or turn left on road surface 600 .
- An example of second virtual image 302 is information indicating a distance to a front vehicle or a pedestrian, which can present a distance to the front vehicle (inter-vehicle distance) on the front vehicle.
- An example of third virtual image 303 is a current time, vehicle speed information, and vehicle condition information, which can present these pieces of information, for example, by letters, numbers, and symbols, or a meter such as a fuel gauge.
- display system 10 includes a plurality of screens 1 a and 1 b , drive unit 2 , irradiator 3 , projection optical system 4 , controller 5 , and obtaining unit 6 .
- projection optical system 4, together with irradiator 3, forms display unit 40 which projects (displays) virtual image 300 (see FIG. 2 ) onto target space 400 (see FIG. 2 ).
- information presentation system 20 includes display system 10 and detection system 7 .
- a plurality of screens 1 a and 1 b include fixed screen 1 b and movable screen 1 a .
- Fixed screen 1 b is fixed to a fixed position of a housing or the like of display system 10 .
- Movable screen 1 a is inclined at a predetermined angle with respect to reference plane 503.
- movable screen 1 a is configured to be movable in movement directions X orthogonal to reference plane 503 .
- the term “reference plane” mentioned herein is not a real plane but a virtual flat plane that defines the movement direction of movable screen 1 a .
- Movable screen 1 a is configured to be movable rectilinearly in movement directions X (directions shown by arrow X 1 -X 2 in FIG. 1 ).
- each of the plurality of screens 1 a and 1 b may be referred to as “screen 1 ” hereinafter.
- Screen 1 (each of movable screen 1 a and fixed screen 1 b ) has translucency and forms an image to form virtual image 300 (see FIG. 2 ) in target space 400 (see FIG. 2 ).
- an image is drawn on screen 1 by light from irradiator 3
- virtual image 300 is formed in target space 400 by the light penetrating screen 1 .
- Screen 1 is made of, for example, a plate-shaped member that has light diffusing properties and is formed into a rectangular shape. Screen 1 is disposed between irradiator 3 and projection optical system 4 .
- Drive unit 2 moves movable screen 1 a in movement directions X.
- drive unit 2 is capable of moving movable screen 1 a both in a direction toward and away from projection optical system 4 along movement directions X.
- drive unit 2 is made of an electric driven actuator, such as a voice coil motor, and operates according to a first control signal output from controller 5 .
- Irradiator 3 is a scanning photoirradiation unit, and irradiates movable screen 1 a or fixed screen 1 b with light.
- Irradiator 3 includes light source 31 and scanner 32 .
- each of light source 31 and scanner 32 operates according to a second control signal output from controller 5 .
- Light source 31 is formed of a laser module that outputs laser light.
- Light source 31 includes a red laser diode that emits a laser light beam of a red color (R), a green laser diode that emits a laser light beam of a green color (G), and a blue laser diode that emits a laser light beam of a blue color (B).
- Scanner 32 scans the light from light source 31, thereby irradiating one surface of movable screen 1 a or fixed screen 1 b with scanning light.
- scanner 32 executes raster scan in which light is scanned two-dimensionally on one side of movable screen 1 a or fixed screen 1 b.
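The two-dimensional raster scan executed by scanner 32 can be illustrated with a minimal generator; the grid resolution and the row-by-row, left-to-right ordering are assumptions for illustration, not taken from the patent.

```python
# Illustrative two-dimensional raster scan: the light spot sweeps each
# row of the screen in turn. Grid size and ordering are assumptions.

def raster_scan(rows: int, cols: int):
    """Yield (row, col) scan positions row by row, left to right."""
    for r in range(rows):
        for c in range(cols):
            yield (r, c)
```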
- the light penetrating screen 1 is incident on projection optical system 4, and projection optical system 4 projects virtual image 300 (see FIG. 2 ) onto target space 400 (see FIG. 2 ) with the incident light.
- Projection optical system 4 is arranged in line with screen 1 along movement directions X of movable screen 1 a.
- projection optical system 4 includes magnifying lens 41 , first mirror 42 , and second mirror 43 .
- Magnifying lens 41 , first mirror 42 , and second mirror 43 are arranged in this order on a route of the light penetrating screen 1 .
- Magnifying lens 41 is disposed on an opposite side to irradiator 3 (a side indicated by first direction X 1 ) in movement directions X as seen from screen 1 so that the light output from screen 1 in movement directions X is incident on magnifying lens 41 .
- Magnifying lens 41 magnifies an image formed on screen 1 by the light emitted from irradiator 3 to output the image to first mirror 42 .
- First mirror 42 reflects the light from magnifying lens 41 toward second mirror 43 .
- Second mirror 43 reflects the light, which is emitted from first mirror 42 , toward windshield 101 (see FIG. 2 ).
- projection optical system 4 magnifies the image formed on screen 1 by the light emitted from irradiator 3 with magnifying lens 41 and projects the image onto windshield 101 , thereby projecting virtual image 300 onto target space 400 .
- An optical axis of magnifying lens 41 corresponds to optical axis 500 of projection optical system 4 .
- Obtaining unit 6 obtains, from detection system 7 , vibration information about vibration applied to main body 110 of vehicle 100 , that is, vibration information about vibration applied to display unit 40 mounted on main body 110 .
- Obtaining unit 6 also obtains the detection information of a detection object that exists in target space 400 , information concerning a position of vehicle 100 (also referred to as “position information”), and information concerning a state of vehicle 100 (also referred to as “vehicle information”).
- An example of the detection object mentioned herein is an object with which vehicle 100 may collide, among objects that exist in target space 400.
- Examples of this kind of object include movable objects, such as a person, an animal, a bicycle, a vehicle, a motorcycle, a wheelchair, and a stroller, and fixed objects, such as a traffic signal, a street light, and a utility pole, as well as other obstacles that exist in target space 400.
- Detection system 7 includes vibration detector 71 which detects vibration applied to the main body of vehicle 100 .
- Vibration detector 71 includes, for example, a gyroscope sensor or an inclination sensor which detects an orientation (inclination) of main body 110 of vehicle 100 .
- Vibration detector 71 detects, based on an output from the gyroscope sensor or the inclination sensor, vibration applied to main body 110 (that is, display unit 40 ) from a change over time in the orientation of main body 110 during a predetermined period (for example, one to a few seconds).
- vibration detector 71 detects the amount of change in orientation (for example, inclination angle) of main body 110 during a predetermined period as a magnitude (for example, amplitude) of vibration applied to display unit 40. Vibration detector 71 then outputs, to display system 10, the detection value of the magnitude of the vibration as the vibration information about vibration applied to display unit 40. Alternatively, vibration detector 71 may obtain the amplitude of vibration from the amount of change in orientation (inclination angle) of main body 110 during the predetermined period, and output, to display system 10, an average value, a maximum value, a minimum value, a median value, or the like of the amplitude during the predetermined period as the vibration information.
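The vibration-magnitude estimate described above can be sketched as follows, assuming the amplitude is taken as the peak-to-peak change in inclination angle during the observation window; the function names and the choice of summary statistics are illustrative.

```python
# Minimal sketch of the vibration-information computation: the change in
# the body's inclination over a window is taken as the vibration
# amplitude, and summary statistics over the period can be reported as
# vibration information. All names are illustrative assumptions.

def vibration_amplitude(inclinations_deg: list[float]) -> float:
    """Amplitude as the peak-to-peak change in inclination angle."""
    return max(inclinations_deg) - min(inclinations_deg)

def vibration_summary(amplitudes: list[float]) -> dict[str, float]:
    """Average, maximum, minimum, and median amplitude over the period."""
    ordered = sorted(amplitudes)
    n = len(ordered)
    if n % 2:
        median = ordered[n // 2]
    else:
        median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2
    return {
        "average": sum(amplitudes) / n,
        "maximum": ordered[-1],
        "minimum": ordered[0],
        "median": median,
    }
```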
- vibration detector 71 is not limited to the one which includes a gyroscope sensor or an inclination sensor which detects the orientation (inclination) of main body 110 .
- the configuration of vibration detector 71 may be appropriately changed as long as the vibration applied to display unit 40 can be detected.
- vibration detector 71 may include an acceleration sensor which detects acceleration applied to main body 110 , and may detect vibration applied to display unit 40 based on an output value of the acceleration sensor.
- vibration detector 71 may include a piezoelectric vibration sensor to detect vibration applied to display unit 40 from an output of the vibration sensor.
- detection system 7 includes at least one sensor among, for example, a camera, a light detection and ranging (LiDAR), a sonar sensor, and a radar, and detects a detection object which exists around vehicle 100 (own vehicle). Detection system 7 obtains information, such as a distance from vehicle 100 to a detection object, a relative coordinate of the detection object to vehicle 100 , or a relative velocity between the detection object and vehicle 100 , as detection information about the detection object.
- Detection system 7 also obtains a current position of vehicle 100 using, for example, a global positioning system (GPS) to detect the position information concerning the position of vehicle 100 based on the current position of vehicle 100 .
- Detection system 7 obtains map information of a neighborhood of the current position based on the current position of vehicle 100 .
- Detection system 7 may obtain the map information of the neighborhood of the current position from a memory storing map information, or obtain the map information from an external server by a mobile communicator included in detection system 7 or vehicle 100 communicating with the external server.
- the position information mentioned herein is, for example, information of a road (traffic route) on which vehicle 100 currently travels.
- the position information includes, for example, information such as the number of lanes on the road, the width of the roadway, the presence or absence of a sidewalk, a gradient, a curvature of a curve, information as to whether the current position is an intersection (such as a crossroads or a T-junction), and information as to whether the road is one way or not.
- Detection system 7 may also obtain the vehicle information concerning a state of vehicle 100 from an advanced driver assistance system (ADAS) or the like.
- the vehicle information is information indicating the local state of vehicle 100 itself, that is, information detectable by a sensor installed on vehicle 100 .
- Specific examples of the vehicle information include the travelling speed (running speed) of vehicle 100 , the acceleration applied to vehicle 100 , the depression amount of the accelerator pedal (degree of accelerator opening), the depression amount of the brake pedal, the steering angle, and the driver's pulse, expression, and line of sight detected by a driver monitor.
- Specific data of vehicle 100 such as vehicle width, vehicle height, overall vehicle length, and eye point, is also included in the vehicle information.
- Vibration detector 71 included in detection system 7 may be shared with the advanced driver assistance system.
- Controller 5 is composed of a microcomputer mainly including, for example, a central processing unit (CPU) and a memory.
- controller 5 is realized by a computer including the CPU and the memory.
- the CPU executes a program stored in the memory, allowing the computer to function as controller 5 .
- the program is recorded in the memory of controller 5 in advance.
- the program may be provided via a telecommunication line such as the Internet or by being recorded in a (non-transitory) recording medium such as a memory card.
- Controller 5 controls display of display unit 40 by controlling drive unit 2 and irradiator 3 .
- Controller 5 controls drive unit 2 with a first control signal, and controls irradiator 3 with a second control signal.
- Controller 5 synchronizes operation of drive unit 2 with operation of irradiator 3 .
- Controller 5 further functions as drive controller 51 and display controller 52 as illustrated in FIG. 1 .
- Drive controller 51 relatively moves movable screen 1 a with respect to a reference position by controlling drive unit 2 .
- the “reference position” mentioned herein is a prescribed position set within the movement area of movable screen 1 a .
- Drive controller 51 moves movable screen 1 a in order to project second virtual image 302 onto target space 400 by the light penetrating movable screen 1 a .
- Drive controller 51 controls drive unit 2 by synchronizing with drawing on movable screen 1 a by irradiator 3 .
- Display controller 52 determines content of virtual image 300 projected onto target space 400 by display unit 40 and a viewing distance in which virtual image 300 is projected, based on the detection information, position information and vehicle information obtained by obtaining unit 6 .
- the viewing distance in which virtual image 300 is projected refers to a distance from an eye (eye point) of user 200 to virtual image 300 .
- the viewing distance of first virtual image 301 , which is viewed with depth, differs between the part farthest from the eye of user 200 and the part nearest to it.
- Obtaining unit 6 may obtain at least one of the detection information, the position information, and the vehicle information, and display controller 52 may determine the content and viewing distance of virtual image 300 based on the information obtained by obtaining unit 6 among the detection information, the position information, and the vehicle information.
- Display controller 52 changes, based on the vibration information obtained by obtaining unit 6 , the display mode of virtual image 300 displayed by display unit 40 , when the vibration applied to display unit 40 exceeds a predetermined first threshold value and the viewing distance of a portion of virtual image 300 (including first virtual image 301 or second virtual image 302 ) exceeds a second threshold value. For example, when main body 110 of vehicle 100 vibrates due to the unevenness or the like of road surface 600 , display unit 40 vibrates according to the vibration of main body 110 . This causes blurring in virtual image 300 projected onto target space 400 by display unit 40 . Such blurring of virtual image 300 becomes more noticeable as the viewing distance from user 200 to virtual image 300 increases.
- display controller 52 changes the display mode of virtual image 300 so as to make blurring of the display position of virtual image 300 caused by the vibration less noticeable.
- changing the display mode of virtual image 300 means visually changing virtual image 300 projected onto target space 400 , that is, changing the state of virtual image 300 as viewed by user 200 .
- display controller 52 changes the display mode of virtual image 300 by changing one or more of the transmittance, the size, the shape, the color, the outline, and the like of virtual image 300 .
- the vibration applied to display unit 40 is, for example, a vibration generated when main body 110 of vehicle 100 swings (pitching) around its left-right axis, and a vibration generated due to, for example, a road undulation or unevenness of the road surface. In the case of vehicle 100 , such a vibration is, for example, a low-frequency vibration generated continually at a cycle of around 1 Hz.
- the first threshold value is a threshold value set in advance for the vibration applied to display unit 40 .
- the first threshold value is, for example, a threshold value for the change over time in the magnitude (for example, amplitude) of the vibration, and is set to a value corresponding to the change over time when the pitch angle of main body 110 changes in the range of ±0.5 degrees.
- the second threshold value is a threshold value set in advance for the viewing distance of virtual image 300 , and may be set to a distance at which the blurring of virtual image 300 is easily noticeable due to the vibration.
- the second threshold value may be set to a distance of, for example, about 50 meters to 100 meters, and is set to 50 meters in the present embodiment.
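- Putting the two thresholds together, the trigger condition of this embodiment can be sketched as follows (using the pitch-angle change as the vibration measure and the 50-meter distance from this embodiment; the constant names are our assumptions):

```python
FIRST_THRESHOLD_DEG = 0.5   # change in pitch angle of main body 110 (±0.5 degrees)
SECOND_THRESHOLD_M = 50.0   # viewing-distance threshold (50 meters in this embodiment)

def should_change_display_mode(pitch_change_deg, viewing_distance_m):
    """True when the vibration exceeds the first threshold AND the viewing
    distance of the virtual-image portion exceeds the second threshold."""
    return (abs(pitch_change_deg) > FIRST_THRESHOLD_DEG
            and viewing_distance_m > SECOND_THRESHOLD_M)
```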
- Controller 5 causes irradiator 3 to irradiate movable screen 1 a with light.
- irradiator 3 emits light scanning on one side of movable screen 1 a .
- an image is formed on (projected onto) movable screen 1 a .
- the light from irradiator 3 penetrates movable screen 1 a , and is emitted to windshield 101 from projection optical system 4 .
- the image formed on movable screen 1 a is projected onto windshield 101 from below windshield 101 in the cabin of vehicle 100 .
- When the image is projected from projection optical system 4 onto windshield 101 , windshield 101 reflects the light from projection optical system 4 toward user 200 (driver) in the cabin. Accordingly, the image reflected by windshield 101 is viewed by user 200 . To user 200 , virtual image 300 (first virtual image 301 or second virtual image 302 ) appears to be projected in front of vehicle 100 (outside of the vehicle). As a result, virtual image 300 (first virtual image 301 or second virtual image 302 ) projected in front of vehicle 100 (outside of the vehicle) is viewed by user 200 , as if virtual image 300 is being viewed through windshield 101 .
- controller 5 scans the light on one side of movable screen 1 a in a state where movable screen 1 a is fixed in movement directions X, so that first virtual image 301 viewed with depth along road surface 600 is formed. Moreover, controller 5 scans the light on one side of movable screen 1 a , while moving movable screen 1 a so that a distance in directions X between a luminescent spot on one side of movable screen 1 a and projection optical system 4 is kept constant. Consequently, second virtual image 302 , which is viewed vertically on road surface 600 that is positioned at a fixed distance from user 200 , is formed.
- drive controller 51 of controller 5 causes drive unit 2 to move movable screen 1 a in movement directions X.
- Assuming that the irradiation position on one side of movable screen 1 a onto which the light is emitted from irradiator 3 , that is, the position of the luminescent spot, is constant, when movable screen 1 a moves toward projection optical system 4 , the distance from an eye (eye point) of user 200 to virtual image 300 (also referred to as the “viewing distance”) becomes shorter. Conversely, when movable screen 1 a moves away from projection optical system 4 , the viewing distance to virtual image 300 becomes longer (more distant). In other words, the viewing distance to virtual image 300 changes according to the position of movable screen 1 a in movement directions X.
- To change the viewing distance, controller 5 moves movable screen 1 a in directions X according to the viewing distance while the light is scanned on one side of movable screen 1 a . To keep the viewing distance constant during drawing, controller 5 scans the light on one side of movable screen 1 a while moving movable screen 1 a so that the distance in directions X between the luminescent spot and projection optical system 4 is kept constant based on the position after the movement.
- Controller 5 also causes irradiator 3 to irradiate fixed screen 1 b with light. At this moment, irradiator 3 emits, to fixed screen 1 b , light scanning on one side of fixed screen 1 b . Accordingly, similarly to the case where movable screen 1 a is irradiated with light, an image is formed on (projected onto) fixed screen 1 b and the image is projected onto windshield 101 . As a result, user 200 is capable of viewing virtual image 300 (third virtual image 303 ), which is projected in front of vehicle 100 (outside of the vehicle), through windshield 101 . Since third virtual image 303 is formed by the light projected onto fixed screen 1 b whose position is fixed, third virtual image 303 is viewed vertically at a predetermined distance (for example, two to three meters) from user 200 on road surface 600 .
- Display system 10 is capable of projecting all of first virtual image 301 , second virtual image 302 , and third virtual image 303 during one cycle in which scanner 32 makes one round trip in a longitudinal direction of movable screen 1 a (in an inclined direction with respect to reference plane 503 of movable screen 1 a ).
- On the “outward way”, in which the light is scanned on movable screen 1 a and fixed screen 1 b in this order, display unit 40 first emits light to movable screen 1 a to project first virtual image 301 , and then emits light to fixed screen 1 b to display third virtual image 303 .
- On the “return way”, in which the light is scanned on fixed screen 1 b and movable screen 1 a in this order, display unit 40 first emits light to fixed screen 1 b to display third virtual image 303 , and then emits light to movable screen 1 a to project second virtual image 302 .
- first virtual image 301 , third virtual image 303 , and second virtual image 302 are projected onto target space 400 during one cycle in which scanner 32 scans in the longitudinal direction. Since irradiator 3 performs the scanning in the longitudinal direction relatively fast, user 200 visually perceives first virtual image 301 , third virtual image 303 , and second virtual image 302 as if they were displayed simultaneously. The frequency of the scanning in the longitudinal direction in irradiator 3 is, for example, greater than or equal to 60 Hz.
- Obtaining unit 6 of display system 10 obtains, on a regular or irregular basis, from detection system 7 , detection information of a detection object present in target space 400 , position information of vehicle 100 , and vehicle information about the state of vehicle 100 .
- Display controller 52 generates content of virtual image 300 projected onto target space 400 by display unit 40 , based on the detection information, position information and vehicle information obtained by obtaining unit 6 (S 1 ), and calculates the viewing distance in which generated virtual image 300 is projected.
- the viewing distance of each pixel in virtual image 300 can be calculated from the angle of view and positions of screens 1 a and 1 b , the irradiation position of light from irradiator 3 , and the like (S 2 ).
- Obtaining unit 6 also obtains, from vibration detector 71 of detection system 7 , vibration information about vibration applied to main body 110 of vehicle 100 that is vibration information about vibration applied to display unit 40 (S 3 ).
- Controller 5 compares the magnitude of the vibration with a first threshold value based on the vibration information obtained by obtaining unit 6 (S 4 ).
- If the magnitude of the vibration is less than or equal to the first threshold value (S 4 : No), drive controller 51 of controller 5 controls drive unit 2 and irradiator 3 so that virtual image 300 generated in Step S 1 is projected onto target space 400 by display unit 40 (S 7 ).
- If the magnitude of the vibration exceeds the first threshold value (S 4 : Yes), controller 5 compares the viewing distance calculated in Step S 2 with a second threshold value. For example, the viewing distance of each pixel forming the virtual image may be compared with the second threshold value. Alternatively, for second virtual image 302 or the like viewed vertically, the viewing distance of at least one pixel may be compared with the second threshold value. Moreover, for first virtual image 301 viewed with depth, only a plurality of representative pixels may be compared with the second threshold value (S 5 ).
- If the viewing distance calculated in Step S 2 is less than or equal to the second threshold value (S 5 : No), drive controller 51 of controller 5 controls drive unit 2 and irradiator 3 so that virtual image 300 generated in Step S 1 is projected onto target space 400 by display unit 40 (S 7 ).
- If the viewing distance calculated in Step S 2 exceeds the second threshold value (S 5 : Yes), display controller 52 of controller 5 changes the content of virtual image 300 generated in Step S 1 (S 6 ).
- FIG. 5A is an example of virtual images 304 and 306 generated in Step S 1 .
- Virtual image 306 is third virtual image 303 for displaying vehicle information of vehicle 100 (own vehicle), and the viewing distance is several meters.
- Virtual image 304 is first virtual image 301 indicating the path of vehicle 100 (own vehicle).
- Virtual image 304 is virtual image 300 for instructing to turn left at the second intersection ahead, and the viewing distance of the top portion of virtual image 304 is, for example, 80 meters. Since the viewing distance of virtual image 306 is less than or equal to the second threshold value, controller 5 causes display unit 40 to project virtual image 306 onto target space 400 without changing the content. In contrast, since the viewing distance of the pixels of the top portion (X 1 portion in FIG. 5A ) of virtual image 304 exceeds the second threshold value, display controller 52 changes the display mode of the pixels of the top portion of virtual image 304 .
- display controller 52 changes the content of virtual image 304 so as to reduce the transmittance of the top portion of virtual image 304 . Accordingly, the visibility of the top portion (X 2 portion in FIG. 5B ) of virtual image 304 A after the change is reduced compared with the top portion of virtual image 304 before the change, and is reduced also compared with portions of virtual image 304 A after the change other than the top portion.
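- As a sketch of this per-pixel change, one could scale down the opacity of only those pixels whose viewing distance exceeds the second threshold, leaving nearer pixels untouched (the RGBA tuple layout and the 0.4 scale factor are illustrative assumptions, not from the disclosure):

```python
def reduce_far_pixel_visibility(pixels, distances, threshold_m, alpha_scale=0.4):
    """Lower the opacity (i.e. raise the transmittance) of virtual-image
    pixels whose viewing distance exceeds the second threshold value,
    e.g. the top portion of virtual image 304."""
    out = []
    for (r, g, b, a), d in zip(pixels, distances):
        if d > threshold_m:
            a = int(a * alpha_scale)   # far portion becomes less visible
        out.append((r, g, b, a))
    return out
```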
- drive controller 51 of controller 5 controls drive unit 2 and irradiator 3 so that virtual image 304 A having a display mode which has been changed in Step S 6 is projected by display unit 40 onto target space 400 (S 7 ).
- Display system 10 projects virtual image 300 onto target space 400 by repeatedly executing the processing of Steps S 1 to S 7 described above at a predetermined time interval (for example, 1/60 seconds).
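- The repeated processing of Steps S 1 to S 7 can be summarized as the following control-loop sketch, where every callable is a placeholder for the corresponding functional block of the embodiment rather than an actual API of the system:

```python
def display_cycle(obtain_info, generate_image, viewing_distances,
                  change_display_mode, project,
                  first_threshold, second_threshold):
    """One pass of Steps S1 to S7; executed repeatedly (e.g. every 1/60 s)."""
    info = obtain_info()
    image = generate_image(info)                      # S1: generate content
    dists = viewing_distances(image)                  # S2: per-pixel viewing distance
    vibration = info["vibration"]                     # S3: vibration information
    if vibration > first_threshold:                   # S4: compare vibration
        if any(d > second_threshold for d in dists):  # S5: compare viewing distance
            image = change_display_mode(
                image, dists, second_threshold)       # S6: change display mode
    return project(image)                             # S7: project virtual image
```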
- controller 5 may be configured to determine whether or not the change over time in the magnitude of the vibration is greater than or equal to the first threshold value. In other words, controller 5 may be configured to determine whether or not the change over time in the magnitude of the vibration is less than the first threshold value.
- controller 5 may be configured to determine whether or not the viewing distance is greater than or equal to the second threshold value. In other words, controller 5 may be configured to determine whether or not the viewing distance is less than the second threshold value.
- When the vibration applied to display unit 40 exceeds the first threshold value, display controller 52 reduces the transmittance of the portion of virtual image 304 having a viewing distance which exceeds the second threshold value.
- the change of the display mode is not limited to reduction of the transmittance.
- Display controller 52 may change the display mode of virtual image 300 by changing one or more of the transmittance, the size, the shape, the color, the outline, and the like of virtual image 304 ( 300 ).
- display controller 52 may change the shape of virtual image 304 so that blurring of virtual image 304 caused by the vibration becomes less noticeable.
- display controller 52 displays, in target space 400 , virtual image 304 B (see FIG. 6B ) having a shape changed so that the viewing distance becomes less than or equal to the second threshold value. Since virtual image 304 before the change is a virtual image instructing to turn left at the second intersection ahead, the viewing distance of the top portion of virtual image 304 is about 80 meters and exceeds the second threshold value.
- virtual image 304 B after the change is a virtual image instructing to go straight at the first intersection ahead, and the viewing distance of the top portion of virtual image 304 B is less than the second threshold value.
- Accordingly, even when the vibration applied to display unit 40 exceeds the first threshold value, blurring of virtual image 304 B becomes less noticeable and the deviation of the display position of virtual image 304 B becomes less noticeable.
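- One simple way to realize such a shape change is to keep only the portion of the guidance graphic that lies within the second threshold value, so the remaining shape satisfies the condition of Step S 5. This is a deliberately naive sketch: the (point, distance) list representation is our assumption, and the embodiment instead substitutes a nearer instruction (go straight at the first intersection) rather than merely clipping.

```python
def clip_shape_to_threshold(shape_points, threshold_m):
    """Drop the parts of a virtual-image shape whose viewing distance
    exceeds the second threshold value.

    `shape_points` is a hypothetical list of (point, distance_m) pairs."""
    return [(p, d) for p, d in shape_points if d <= threshold_m]
```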
- controller 5 may change the size of virtual image 300 so that blurring of virtual image 300 caused by the vibration becomes less noticeable.
- For example, for virtual image 305 displayed so as to surround obstacle 700 , controller 5 changes the size of virtual image 305 .
- display controller 52 makes the size of virtual image 305 A after the change larger than the size of virtual image 305 before the change.
- the range surrounded by virtual image 305 A increases, so that even if the position of virtual image 305 A is blurred due to the vibration applied to display unit 40 , the positions of virtual image 305 A and obstacle 700 are unlikely to deviate, and the deviation of the display position of virtual image 305 A becomes less noticeable.
- display controller 52 may make the size of virtual image 305 smaller than that before the change. By reducing the size of virtual image 305 , the virtual image after the change becomes less noticeable, and the deviation of the display position of the virtual image after the change becomes less noticeable.
- display controller 52 may change the color of the portion of virtual image 300 having a viewing distance exceeding the second threshold value into a color that is less noticeable than a portion of virtual image 300 having a viewing distance less than or equal to the second threshold value.
- display controller 52 may make the color of the portion of virtual image 300 having a viewing distance exceeding the second threshold value lighter (less deep) than that of the portion having a viewing distance less than or equal to the second threshold value.
- display controller 52 may blur the outline of the portion of virtual image 300 having a viewing distance exceeding the second threshold value or reduce the contrast of virtual image 300 .
- Display controller 52 may make virtual image 300 after the change less noticeable by combining and changing two or more of the transmittance, the size, the shape, the color, the outline, and the like of virtual image 300 .
- a method for controlling display system 10 is a method for controlling display system 10 that includes display unit 40 which displays virtual image 300 in target space 400 , controller 5 which controls the display of display unit 40 , and obtaining unit 6 which obtains vibration information about vibration applied to display unit 40 .
- controller 5 when the vibration exceeds the first threshold value and the viewing distance of virtual image 300 exceeds the second threshold value, controller 5 changes, based on the vibration information, the display mode of virtual image 300 displayed by display unit 40 .
- a (computer) program according to an aspect is a program for causing a computer system to execute the method for controlling display system 10 .
- An executing subject of display system 10 , information presentation system 20 , or the method for controlling display system 10 according to the present disclosure includes a computer system.
- the computer system mainly includes a processor as hardware and a memory.
- the processor executes the program stored in the memory of the computer system, so that the function as the executing subject of display system 10 , information presentation system 20 , or the method for controlling display system 10 according to the present disclosure is realized.
- the program may be stored in the memory of the computer system in advance, may be supplied through a telecommunication line, or may be supplied in the state where the program is stored in a non-transitory recording medium which can be read by the computer system. Examples of this type of non-transitory recording medium include a memory card, an optical disk, and a hard disk drive.
- the processor of the computer system is composed of one or a plurality of electronic circuits including a semiconductor integrated circuit (IC) or a large-scale integration (LSI).
- the plurality of electronic circuits may be integrated in one chip, or may be provided on a plurality of chips so as to be dispersed.
- the plurality of chips may be integrated in one device, or may be provided on a plurality of devices so as to be dispersed.
- Functions of obtaining unit 6 , display unit 40 , and controller 5 of display system 10 may be provided on two or more systems so as to be dispersed.
- a function of controller 5 of display system 10 may be realized by, for example, cloud (cloud computing).
- Information presentation system 20 is realized with display system 10 and detection system 7 .
- Information presentation system 20 is not limited to this configuration.
- Information presentation system 20 may be realized with, for example, one of display system 10 and detection system 7 .
- a function of detection system 7 may be integrated in display system 10 .
- Controller 5 in display system 10 may determine the content of virtual image 300 based on information obtained by communication between a communicator included in display system 10 or vehicle 100 and the outside. Controller 5 may determine the content of virtual image 300 based on information obtained by inter-vehicle communication (V2V: Vehicle-to-Vehicle) between vehicle 100 and a peripheral vehicle, vehicle-to-everything communication (V2X: Vehicle-to-Everything) between vehicle 100 and the peripheral vehicle or infrastructure, or the like.
- the content of virtual image 300 projected onto target space 400 may be determined by the infrastructure. In this case, at least a part of controller 5 may not be installed on vehicle 100 .
- display system 10 is not limited to the configuration of projecting virtual image 300 onto target space 400 set in front of vehicle 100 in the traveling direction.
- display system 10 may project virtual image 300 in a side direction, a rear direction, or an upper direction and the like in the traveling direction of vehicle 100 .
- display system 10 is not limited to the head-up display for use in vehicle 100 .
- display system 10 is also applicable for a mobile body other than vehicle 100 .
- Examples of the mobile body other than vehicle 100 include a motorcycle, a train, an aircraft, a construction machine, and a vessel.
- the place of use of display system 10 is not limited to the mobile body.
- display system 10 may be used in an amusement facility.
- Display system 10 may also be used as a wearable terminal such as Head Mounted Display (HMD).
- display system 10 may be used at a medical facility, and may be used as a stationary device.
- Although display unit 40 of display system 10 includes movable screen 1 a and fixed screen 1 b , it is sufficient that display unit 40 includes at least movable screen 1 a.
- Display unit 40 is not limited to the configuration of projecting the virtual image by a laser beam.
- display unit 40 may also be configured so that a projector projects an image (virtual image 300 ) onto a diffuse transmission type screen 1 from behind screen 1 .
- Display unit 40 may project, through projection optical system 4 , virtual image 300 corresponding to an image displayed by a liquid crystal display.
- the reflection member which reflects the light emitted from display unit 40 is composed of windshield 101 of vehicle 100 .
- the reflection member is not limited to windshield 101 .
- the reflection member may be a transparent plate provided separately from windshield 101 .
- the display system ( 10 ) includes, in the first aspect, the display unit ( 40 ), the controller ( 5 ), and the obtaining unit ( 6 ).
- the display unit ( 40 ) displays a virtual image ( 300 to 305 , 304 A, 304 B, 305 A) in the target space ( 400 ).
- the controller ( 5 ) controls the display of the display unit ( 40 ).
- the obtaining unit ( 6 ) obtains vibration information about vibration applied to the display unit ( 40 ).
- When the vibration exceeds the first threshold value and the viewing distance exceeds the second threshold value, the controller ( 5 ) changes the display mode of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) based on the vibration information.
- blurring of the virtual image can be made less noticeable compared with the case where the display mode of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) does not change. Accordingly, the deviation of the display position of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) can be made less noticeable.
- the display mode is transmittance of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A).
- visibility of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) can be changed by changing the transmittance. Accordingly, the deviation of the display position of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) caused by the vibration can be made less noticeable.
- the display mode is the size of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A).
- When the size of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) is increased, the positional deviation between an object which exists in the real space and the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) becomes less noticeable. When the size is reduced, the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) itself becomes less noticeable, so that the positional deviation can also be made less noticeable.
- the display mode is the shape of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A).
- the shape of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) can be changed into such a shape that the positional deviation of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) caused by the vibration is less noticeable.
- the controller ( 5 ) changes the shape of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) so that the viewing distance becomes less than or equal to the second threshold value.
- the positional deviation caused by the vibration becomes less noticeable, compared with the case where the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) having a viewing distance exceeding the second threshold value is displayed.
- the vibration is a change over time in the orientation of the display unit ( 40 ). That is, the display mode is changed when the change over time in the orientation of the display unit ( 40 ), namely the vibration applied to the display unit ( 40 ), exceeds the first threshold value.
- the information presentation system ( 20 ) according to a seventh aspect includes the display system ( 10 ) according to any one of the first to sixth aspects, and the detection system ( 7 ) which detects vibration applied to the display unit ( 40 ).
- the obtaining unit ( 6 ) obtains detection information from the detection system ( 7 ).
- According to the seventh aspect, it is possible to realize the information presentation system ( 20 ) capable of making the deviation of the display position of the virtual image less noticeable.
- the method for controlling the display system ( 10 ) is a method for controlling the display system ( 10 ) including the display unit ( 40 ), the controller ( 5 ), and the obtaining unit ( 6 ).
- the display unit ( 40 ) displays the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) in the target space ( 400 ).
- the controller ( 5 ) controls the display of the display unit ( 40 ).
- the obtaining unit ( 6 ) obtains vibration information about vibration applied to the display unit ( 40 ).
- When the vibration exceeds the first threshold value and the viewing distance of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) exceeds the second threshold value, the controller ( 5 ) changes the display mode of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) based on the vibration information.
- the positional deviation of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) can be made less noticeable, compared with the case where the display mode of the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) does not change.
- the program according to a ninth aspect is a program for causing a computer system to execute the method for controlling the display system ( 10 ) according to the eighth aspect.
- the mobile body ( 100 ) includes: the display system ( 10 ) according to any one of the first to sixth aspects, and the reflection member ( 101 ).
- the reflection member ( 101 ) has light transmitting properties, and reflects the light emitted from the display unit ( 40 ) so that the virtual image ( 300 to 305 , 304 A, 304 B, 305 A) is viewed by the target person ( 200 ).
- According to the tenth aspect, it is possible to realize the mobile body ( 100 ) capable of making the deviation of the display position of the virtual image less noticeable.
- the configurations according to the second to sixth aspects are not essential configurations for the display system ( 10 ), and the configurations may be appropriately omitted.
Description
- This application is a continuation of the PCT International Application No. PCT/JP2018/024285 filed on Jun. 27, 2018, which claims the benefit of foreign priority of Japanese patent application No. 2017-129898 filed on Jun. 30, 2017, the contents all of which are incorporated herein by reference.
- The present disclosure relates to a display system, an information presentation system, a method for controlling the display system, a program, and a mobile body, and more particularly, to a display system which displays a virtual image in a target space, an information presentation system, a method for controlling the display system, a program, and a mobile body.
- A dangerous situation warning device is conventionally known which includes a vehicle neighborhood object detection device which detects an object on a front travel area, a danger latent area determination device which determines a danger latent area in which danger is latent, a danger degree processing device, and a warning output device (for example, see Patent Literature (PTL) 1). The danger degree processing device checks object information obtained by the vehicle neighborhood object detection device against the danger latent area obtained by the danger latent area determination device to determine a degree of danger with respect to an object which exists in the danger latent area. When the vehicle neighborhood object detection device detects a plurality of obstacles on the front travel area, the danger degree processing device sets the degree of danger for each of the obstacles and the warning output device displays the degree of danger for each of the obstacles. The warning output device is a head-up display, and displays a red frame surrounding the obstacle determined as having a high degree of danger at a position seen to be overlapping with the obstacle.
- PTL 1: Japanese Unexamined Patent Application Publication No. 2015-221651
- An object of the present disclosure is to provide a display system, an information presentation system, a method for controlling the display system, a program, and a mobile body which are capable of making the deviation of the display position of a virtual image less noticeable.
- A display system according to an aspect of the present disclosure projects an image to be viewed by a target person as if a virtual image is being projected onto a target space. The display system includes a display unit, a controller, and a vibration information obtaining unit which obtains vibration information. The display unit displays the image. The controller controls the display of the display unit. The vibration information obtaining unit obtains the vibration information about vibration applied to the display unit. When the vibration exceeds a first threshold value and the viewing distance of a portion of the virtual image exceeds a second threshold value, the controller changes the display mode of the portion of the virtual image based on the vibration information.
- An information presentation system according to another aspect of the present disclosure includes the display system, and a detection system which detects the vibration applied to the display unit. The vibration information obtaining unit obtains the vibration information from the detection system.
- A control method according to another aspect of the present disclosure is a method for controlling a display system which projects an image to be viewed by a target person as if a virtual image is being projected onto the target space. The display system includes a display unit, a controller, and a vibration information obtaining unit which obtains vibration information. The display unit displays the image. The controller controls the display of the display unit. The vibration information obtaining unit obtains the vibration information about vibration applied to the display unit. When the vibration exceeds a first threshold value and the viewing distance of a portion of the virtual image exceeds a second threshold value, the method for controlling the display system causes the controller to change the display mode of the virtual image based on the vibration information.
- A program according to another aspect of the present disclosure is a program for causing a computer system to execute the method for controlling the display system.
- A mobile body according to another aspect of the present disclosure includes: the display system; and a reflection member which has light transmitting properties and reflects the light emitted from the display unit so that the virtual image is viewed by the target person.
- According to the present disclosure, it is possible to provide a display system, an information presentation system, a method for controlling the display system, a program, and a mobile body which are capable of making the deviation of the display position of a virtual image less noticeable.
- FIG. 1 is a conceptual diagram of a configuration of an information presentation system according to an embodiment of the present disclosure.
- FIG. 2 is a conceptual diagram of a vehicle which includes a display system according to an embodiment of the present disclosure.
- FIG. 3 is a conceptual diagram of a display example in the information presentation system.
- FIG. 4 is a flowchart of an operation of the information presentation system.
- FIG. 5A is a conceptual diagram of a display example in the information presentation system.
- FIG. 5B is a conceptual diagram of a display example in the information presentation system.
- FIG. 6A is a conceptual diagram of another display example in the information presentation system.
- FIG. 6B is a conceptual diagram of another display example in the information presentation system.
- FIG. 7A is a conceptual diagram of another display example in the information presentation system.
- FIG. 7B is a conceptual diagram of another display example in the information presentation system.
- Prior to the description of an embodiment of the present disclosure, a problem in the conventional system will be briefly described. In the dangerous situation warning device described in PTL 1, when vibration occurs in a vehicle equipped with the dangerous situation warning device, the position of the red frame (virtual image) displayed by the warning output device (display system) varies due to the vibration of the vehicle. This may result in deviation of the display positions of the obstacle and the red frame.
- (1) Overview
- As illustrated in FIG. 1, display system 10 according to the present embodiment includes display unit 40 which displays (projects) a virtual image in a target space, vibration information obtaining unit 6 which obtains vibration information about vibration applied to display unit 40, and controller 5 which controls display unit 40. Obtaining unit 6, display unit 40, and controller 5 will be described in detail in "(2) Configuration".
- As illustrated in FIG. 2, display system 10 according to the present embodiment is a head-up display (HUD) for use in vehicle 100 as a mobile body, for example. Display system 10 is disposed in a cabin of vehicle 100 to project an image onto windshield 101 (reflection member) of vehicle 100 from below. In the example illustrated in FIG. 2, display system 10 is disposed in dashboard 102 installed below windshield 101. When an image is projected from display system 10 onto windshield 101, the image reflected on windshield 101 as the reflection member is viewed by user 200 (driver). - According to
display system 10 as described, user 200 visually recognizes that virtual image 300 looks as if projected onto target space 400 set in front of vehicle 100 (outside of the vehicle). Here, the term "front" refers to the direction in which vehicle 100 moves forward, and the direction in which vehicle 100 moves forward or backward is referred to as a longitudinal direction. The term "virtual image" means an image formed as if an object were actually present when light emitted from display system 10 is reflected by a reflecting object such as windshield 101. Since windshield 101 has light transmitting properties, user 200, as a target person, is capable of viewing target space 400 in front of vehicle 100 through windshield 101. Therefore, user 200 is capable of viewing virtual image 300 projected by display system 10 superimposed on the real space spreading in front of vehicle 100. Hence, according to display system 10, various pieces of driving assistance information, such as vehicle speed information, navigation information, pedestrian information, front vehicle information, lane departure information, and vehicle condition information, can be displayed as virtual image 300 to be viewed by user 200. In this manner, user 200 is capable of visually obtaining the driving assistance information with only a slight movement of the line of sight from a state of directing the line of sight ahead of windshield 101. - In
display system 10 according to the present embodiment, virtual image 300 formed in target space 400 includes at least two types of virtual images, i.e., first virtual image 301 and second virtual image 302. The term "first virtual image" mentioned herein is virtual image 300 (301) formed on first virtual plane 501. The term "first virtual plane" is a virtual plane in which inclination angle α with respect to optical axis 500 of display system 10 is smaller than predetermined value γ (α<γ). Moreover, the term "second virtual image" mentioned herein is virtual image 300 (302) formed on second virtual plane 502. The term "second virtual plane" is a virtual plane in which inclination angle β with respect to optical axis 500 of display system 10 is larger than predetermined value γ (β>γ). The "optical axis" mentioned herein is an optical axis of an optical system of projection optical system 4 (see FIG. 1) to be described later, that is, an axis that passes through a center of target space 400 and goes along an optical path of virtual image 300. An example of predetermined value γ is 45 degrees, and an example of inclination angle β is 90 degrees. - In
display system 10 according to the present embodiment, virtual image 300 formed in target space 400 includes third virtual image 303 (see FIG. 3) in addition to first virtual image 301 and second virtual image 302. The term "third virtual image" is, similarly to second virtual image 302, virtual image 300 (303) formed on second virtual plane 502 in which inclination angle β with respect to optical axis 500 is larger than predetermined value γ. In virtual image 300 formed on second virtual plane 502, a virtual image formed by light penetrating movable screen 1a is second virtual image 302, and a virtual image formed by light penetrating fixed screen 1b is third virtual image 303, as will be described in detail later. - In the present embodiment,
optical axis 500 extends along road surface 600 in target space 400 in front of vehicle 100. Then, first virtual image 301 is formed on first virtual plane 501 substantially parallel to road surface 600, and second virtual image 302 and third virtual image 303 are formed on second virtual plane 502 substantially perpendicular to road surface 600. For example, when road surface 600 is a horizontal plane, first virtual image 301 is displayed along the horizontal plane, and second virtual image 302 and third virtual image 303 are displayed along a vertical plane.
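As an aside for implementers, the angle rule above (first virtual plane when inclination angle α is smaller than predetermined value γ, second virtual plane when inclination angle β is larger than γ) can be sketched as follows. This is an illustrative sketch, not part of the disclosure; the function and constant names are assumptions, and γ is set to the 45-degree example given above.

```python
# Hypothetical sketch: classifying a virtual plane by its inclination
# angle (in degrees) with respect to optical axis 500, using the
# predetermined value gamma = 45 degrees given as an example above.

GAMMA_DEG = 45.0  # predetermined value gamma (example from the description)

def classify_virtual_plane(inclination_deg: float) -> str:
    """Return which virtual plane an image lies on, per the angle rule."""
    if inclination_deg < GAMMA_DEG:
        # alpha < gamma: first virtual plane (roughly along the road surface)
        return "first virtual plane (501)"
    if inclination_deg > GAMMA_DEG:
        # beta > gamma: second virtual plane (roughly upright)
        return "second virtual plane (502)"
    return "boundary case (angle equals gamma)"

print(classify_virtual_plane(5.0))   # a plane nearly along the optical axis
print(classify_virtual_plane(90.0))  # a perpendicular plane, like the second/third virtual images
```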
FIG. 3 is a conceptual diagram of a visual field of user 200. As illustrated in FIG. 3, display system 10 according to the present embodiment is capable of displaying first virtual images 301 viewed with depth along road surface 600, and second virtual images 302 and third virtual image 303 viewed vertically on road surface 600 at a fixed distance from user 200. Hence, for user 200, each first virtual image 301 looks like being presented on a plane substantially parallel to road surface 600, and each second virtual image 302 and third virtual image 303 look like being presented on a plane substantially perpendicular to road surface 600. An example of first virtual image 301 is navigation information indicating a traveling direction of vehicle 100, which can present an arrow that indicates a right or left turn on road surface 600. An example of second virtual image 302 is information indicating a distance to a front vehicle or a pedestrian, which can present a distance to the front vehicle (inter-vehicle distance) on the front vehicle. An example of third virtual image 303 is a current time, vehicle speed information, and vehicle condition information, which can present these pieces of information, for example, by letters, numbers, and symbols, or by a meter such as a fuel gauge. - (2) Configuration
- As illustrated in
FIG. 1, display system 10 according to the present embodiment includes a plurality of screens 1a and 1b, drive unit 2, irradiator 3, projection optical system 4, controller 5, and obtaining unit 6. According to the present embodiment, projection optical system 4, together with irradiator 3, forms display unit 40 that projects (displays) virtual image 300 (see FIG. 2) onto target space 400 (see FIG. 2).
- Further, information presentation system 20 according to the present embodiment includes display system 10 and detection system 7.
- A plurality of screens 1a and 1b includes fixed screen 1b and movable screen 1a. Fixed screen 1b is fixed to a fixed position of a housing or the like of display system 10. Movable screen 1a is inclined at angle θ with respect to reference plane 503. Moreover, movable screen 1a is configured to be movable in movement directions X orthogonal to reference plane 503. The term "reference plane" mentioned herein is not a real plane but a virtual flat plane that defines the movement directions of movable screen 1a. Movable screen 1a is configured to be movable rectilinearly in movement directions X (directions shown by arrow X1-X2 in FIG. 1) while maintaining the orientation inclined at angle θ with respect to reference plane 503. In the case where movable screen 1a and fixed screen 1b are not particularly distinguished from one another, each of the plurality of screens 1a and 1b is simply referred to as "screen 1" hereinafter.
- Screen 1 (each of movable screen 1a and fixed screen 1b) has translucency and forms an image in order to form virtual image 300 (see FIG. 2) in target space 400 (see FIG. 2). In other words, an image is drawn on screen 1 by light from irradiator 3, and virtual image 300 is formed in target space 400 by the light penetrating screen 1. Screen 1 is made of, for example, a plate-shaped member that has light diffusing properties and is formed into a rectangular shape. Screen 1 is disposed between irradiator 3 and projection optical system 4. -
Drive unit 2 moves movable screen 1a in movement directions X. Here, drive unit 2 is capable of moving movable screen 1a both in a direction toward and in a direction away from projection optical system 4 along movement directions X. For example, drive unit 2 is made of an electrically driven actuator, such as a voice coil motor, and operates according to a first control signal output from controller 5.
- Irradiator 3 is a scanning photoirradiation unit, and irradiates movable screen 1a or fixed screen 1b with light. Irradiator 3 includes light source 31 and scanner 32. In irradiator 3, each of light source 31 and scanner 32 operates according to a second control signal output from controller 5.
- Light source 31 is formed of a laser module that outputs laser light. Light source 31 includes a red laser diode that emits a laser light beam of a red color (R), a green laser diode that emits a laser light beam of a green color (G), and a blue laser diode that emits a laser light beam of a blue color (B). The three color laser light beams output from these three kinds of laser diodes are synthesized by, for example, a dichroic mirror, and are incident on scanner 32.
- Scanner 32 irradiates movable screen 1a or fixed screen 1b with light that scans on one side of movable screen 1a or fixed screen 1b by scanning the light from light source 31. Here, scanner 32 executes a raster scan in which light is scanned two-dimensionally on one side of movable screen 1a or fixed screen 1b.
- The light, as incident light, that is output from irradiator 3 and penetrates screen 1 is incident on projection optical system 4. Projection optical system 4 projects virtual image 300 (see FIG. 2) onto target space 400 (see FIG. 2) with the incident light. Projection optical system 4 is arranged in line in movement directions X of movable screen 1a with respect to screen 1. As illustrated in FIG. 1, projection optical system 4 includes magnifying lens 41, first mirror 42, and second mirror 43.
- Magnifying lens 41, first mirror 42, and second mirror 43 are arranged in this order on the route of the light penetrating screen 1. Magnifying lens 41 is disposed on the opposite side to irradiator 3 (the side indicated by first direction X1) in movement directions X as seen from screen 1, so that the light output from screen 1 in movement directions X is incident on magnifying lens 41. Magnifying lens 41 magnifies an image formed on screen 1 by the light emitted from irradiator 3 and outputs the image to first mirror 42. First mirror 42 reflects the light from magnifying lens 41 toward second mirror 43. Second mirror 43 reflects the light emitted from first mirror 42 toward windshield 101 (see FIG. 2). That is, projection optical system 4 magnifies the image formed on screen 1 by the light emitted from irradiator 3 with magnifying lens 41 and projects the image onto windshield 101, thereby projecting virtual image 300 onto target space 400. An optical axis of magnifying lens 41 corresponds to optical axis 500 of projection optical system 4. - Obtaining
unit 6 obtains, from detection system 7, vibration information about vibration applied to main body 110 of vehicle 100, that is, vibration information about vibration applied to display unit 40 mounted on main body 110. Obtaining unit 6 also obtains detection information of a detection object that exists in target space 400, information concerning a position of vehicle 100 (also referred to as "position information"), and information concerning a state of vehicle 100 (also referred to as "vehicle information"). An example of the detection object mentioned herein is an object, among objects that exist in target space 400, with which vehicle 100 may collide. Examples of this kind of object include a movable object, such as a person, an animal, a bicycle, a vehicle, a motorcycle, a wheelchair, and a stroller, a fixed object, such as a traffic signal, a street light, and a utility pole, or a mobile object, such as an obstacle, which exists in target space 400. -
Detection system 7 includes vibration detector 71 which detects vibration applied to main body 110 of vehicle 100. Vibration detector 71 includes, for example, a gyroscope sensor or an inclination sensor which detects an orientation (inclination) of main body 110 of vehicle 100. Vibration detector 71 detects, based on an output from the gyroscope sensor or the inclination sensor, vibration applied to main body 110 (that is, display unit 40) from a change over time in the orientation of main body 110 during a predetermined period (for example, one to a few seconds). In other words, vibration detector 71 detects the amount of change in the orientation (for example, inclination angle) of main body 110 during the predetermined period as a magnitude (for example, amplitude) of vibration applied to display unit 40. Vibration detector 71 then outputs, to display system 10, the detection value of the magnitude of the vibration as vibration information about vibration applied to display unit 40. Vibration detector 71 may instead obtain the amplitude of the vibration from the amount of change in the orientation (inclination angle) of main body 110 during the predetermined period, and output, to display system 10, an average value, a maximum value, a minimum value, a median value, or the like of the amplitude during the predetermined period as the vibration information. - Note that
vibration detector 71 is not limited to one which includes a gyroscope sensor or an inclination sensor which detects the orientation (inclination) of main body 110. The configuration of vibration detector 71 may be changed as appropriate as long as the vibration applied to display unit 40 can be detected. For example, vibration detector 71 may include an acceleration sensor which detects acceleration applied to main body 110, and may detect vibration applied to display unit 40 based on an output value of the acceleration sensor. Moreover, vibration detector 71 may include a piezoelectric vibration sensor to detect vibration applied to display unit 40 from an output of the vibration sensor.
- Moreover, detection system 7 includes at least one sensor among, for example, a camera, a light detection and ranging (LiDAR) sensor, a sonar sensor, and a radar, and detects a detection object which exists around vehicle 100 (own vehicle). Detection system 7 obtains information, such as a distance from vehicle 100 to the detection object, relative coordinates of the detection object with respect to vehicle 100, or a relative velocity between the detection object and vehicle 100, as detection information about the detection object. -
Detection system 7 also obtains a current position of vehicle 100 using, for example, a global positioning system (GPS) to detect the position information concerning the position of vehicle 100 based on the current position of vehicle 100. Detection system 7 obtains map information of the neighborhood of the current position based on the current position of vehicle 100. Detection system 7 may obtain the map information of the neighborhood of the current position from a memory storing map information, or may obtain the map information from an external server by a mobile communicator, included in detection system 7 or vehicle 100, communicating with the external server. The position information mentioned herein is, for example, information on the road (traffic route) on which vehicle 100 currently travels, such as the number of lanes on the road, the width of the roadway, the presence or absence of a sidewalk, a gradient, the curvature of a curve, information as to whether the current position is an intersection (such as a crossroad or a T-junction) or not, or information as to whether the road is one way or not. -
Detection system 7 may also obtain the vehicle information concerning the state of vehicle 100 from an advanced driver-assistance system (ADAS) or the like. The vehicle information is information indicating the local state of vehicle 100 itself, that is, information detectable by a sensor installed on vehicle 100. Specific examples of the vehicle information include the travelling speed (running speed) of vehicle 100, acceleration applied to vehicle 100, the depression amount of an accelerator pedal (degree of accelerator opening), the depression amount of a brake pedal, a steering angle, and a driver's pulse, expression, and line of sight detected by a driver monitor. Data specific to vehicle 100, such as vehicle width, vehicle height, overall vehicle length, and eye point, is also included in the vehicle information.
- Vibration detector 71 included in detection system 7 may be shared with the advanced driver-assistance system.
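The orientation-based scheme of vibration detector 71 described above, in which the amount of change in the inclination angle of main body 110 over a predetermined period is reported as the magnitude of vibration, might be sketched as follows. The sampling window, data layout, and function name are illustrative assumptions rather than part of the disclosure.

```python
# Hypothetical sketch of vibration detector 71's orientation-based scheme:
# the magnitude of vibration applied to display unit 40 is taken as the
# amount of change in the pitch (inclination) angle of main body 110 over
# a predetermined period. Sampling scheme and units are assumptions.

def vibration_magnitude(pitch_samples_deg: list[float]) -> float:
    """Peak-to-peak change in pitch angle over the sampling window."""
    return max(pitch_samples_deg) - min(pitch_samples_deg)

# One second of pitch-angle samples while the body pitches within +/-0.5 deg.
samples = [0.0, 0.3, -0.2, 0.5, -0.4, 0.1]
print(vibration_magnitude(samples))  # 0.9 degrees, peak to peak
```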
Controller 5 is composed of a microcomputer mainly including, for example, a central processing unit (CPU) and a memory. In other words, controller 5 is realized by a computer including the CPU and the memory. The CPU executes a program stored in the memory, allowing the computer to function as controller 5. Herein, the program is recorded in the memory of controller 5 in advance. However, the program may be provided via a telecommunication line such as the Internet, or by being recorded in a (non-transitory) recording medium such as a memory card.
- Controller 5 controls display of display unit 40 by controlling drive unit 2 and irradiator 3. Controller 5 controls drive unit 2 with a first control signal, and controls irradiator 3 with a second control signal. Controller 5 synchronizes operation of drive unit 2 with operation of irradiator 3. Controller 5 further functions as drive controller 51 and display controller 52, as illustrated in FIG. 1.
- Drive controller 51 relatively moves movable screen 1a with respect to a reference position by controlling drive unit 2. The "reference position" mentioned herein is a position set at a prescribed location in the movement area of movable screen 1a. Drive controller 51 moves movable screen 1a in order to project second virtual image 302 onto target space 400 by the light penetrating movable screen 1a. Drive controller 51 controls drive unit 2 in synchronization with drawing on movable screen 1a by irradiator 3. -
Display controller 52 determines content of virtual image 300 projected onto target space 400 by display unit 40, and a viewing distance at which virtual image 300 is projected, based on the detection information, position information, and vehicle information obtained by obtaining unit 6. Here, the viewing distance at which virtual image 300 is projected refers to a distance from an eye (eye point) of user 200 to virtual image 300. In the case of first virtual image 301 displayed along first virtual plane 501 substantially parallel to road surface 600, the viewing distance of first virtual image 301 differs between the part farthest from the eye of user 200 and the part nearest to the eye of user 200. Obtaining unit 6 may obtain at least one of the detection information, the position information, and the vehicle information, and display controller 52 may determine the content and viewing distance of virtual image 300 based on whichever of the detection information, the position information, and the vehicle information is obtained by obtaining unit 6.
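The decision rule of display controller 52 stated in the overview, namely, changing the display mode only when the vibration applied to display unit 40 exceeds a first threshold value and the viewing distance of the portion of virtual image 300 exceeds a second threshold value, can be sketched as follows. The numeric encoding of the first threshold value is an assumption; the 50-meter second threshold value follows the example given in this embodiment.

```python
# Hypothetical sketch of display controller 52's decision rule: the display
# mode of (a portion of) virtual image 300 is changed only when BOTH the
# vibration exceeds the first threshold value AND the viewing distance of
# that portion exceeds the second threshold value.

# Assumed encoding: peak-to-peak pitch change (deg) corresponding to the
# pitch angle of main body 110 changing in the range of +/-0.5 degrees.
FIRST_THRESHOLD = 1.0
SECOND_THRESHOLD_M = 50.0  # viewing distance threshold (50 m in this embodiment)

def should_change_display_mode(vibration: float, viewing_distance_m: float) -> bool:
    return vibration > FIRST_THRESHOLD and viewing_distance_m > SECOND_THRESHOLD_M

print(should_change_display_mode(1.2, 70.0))  # strong vibration, distant portion
print(should_change_display_mode(1.2, 10.0))  # portion viewed nearby
print(should_change_display_mode(0.4, 70.0))  # vibration below first threshold
```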
Display controller 52 changes, based on the vibration information obtained by obtaining unit 6, the display mode of virtual image 300 displayed by display unit 40, when the vibration applied to display unit 40 exceeds a predetermined first threshold value and the viewing distance of a portion of virtual image 300 (including first virtual image 301 or second virtual image 302) exceeds a second threshold value. For example, when main body 110 of vehicle 100 vibrates due to unevenness or the like of road surface 600, display unit 40 vibrates according to the vibration of main body 110. This causes blurring in virtual image 300 projected onto target space 400 by display unit 40. Such blurring of virtual image 300 becomes more noticeable as the viewing distance from user 200 to virtual image 300 increases. When virtual image 300 is displayed at a position overlapping or corresponding to an object which exists in the real space, if the blurring of virtual image 300 increases, the positions of virtual image 300 and the object existing in the real space deviate from each other, and display system 10 may fail to perform the intended display. Therefore, when the vibration applied to display unit 40 exceeds the predetermined first threshold value and the viewing distance of the portion of virtual image 300 exceeds the second threshold value, display controller 52 changes the display mode of virtual image 300 so as to make blurring of the display position of virtual image 300 caused by the vibration less noticeable.
- Here, changing the display mode of virtual image 300 means visually changing virtual image 300 projected onto target space 400, that is, changing the state of virtual image 300 viewed by user 200. In the present embodiment, display controller 52 changes the display mode of virtual image 300 by changing one or more of the transmittance, the size, the shape, the color, the outline, and the like of virtual image 300. Further, the vibration applied to display unit 40 is, for example, vibration generated when main body 110 of vehicle 100 swings (pitches) around the left-right axis due to, for example, a road undulation or unevenness of the road surface. In the case of vehicle 100, such vibration is, for example, low frequency vibration constantly generated in a cycle of around 1 Hz. The first threshold value is a threshold value set in advance for the vibration applied to display unit 40. The first threshold value is, for example, a threshold value for the change over time in the magnitude (for example, amplitude) of the vibration, and is set to a value corresponding to the change over time when the pitch angle of main body 110 changes in the range of ±0.5 degrees. The second threshold value is a threshold value set in advance for the viewing distance of virtual image 300, and may be set to a distance at which blurring of virtual image 300 is easily noticeable due to the vibration. The second threshold value may be set to a distance of, for example, about 50 meters to 100 meters, and is set to 50 meters in the present embodiment.
- (3) Operation
- Hereinafter, a basic operation of information presentation system 20 (display system 10) according to the present embodiment will be described.
Controller 5 causes irradiator 3 to irradiate movable screen 1a with light. At this time, irradiator 3 emits light scanning on one side of movable screen 1a. Accordingly, an image is formed on (projected onto) movable screen 1a. Moreover, the light from irradiator 3 penetrates movable screen 1a, and is emitted to windshield 101 from projection optical system 4. In this manner, the image formed on movable screen 1a is projected onto windshield 101 from below windshield 101 in the cabin of vehicle 100.
- When the image is projected from projection optical system 4 onto windshield 101, windshield 101 reflects the light from projection optical system 4 toward user 200 (driver) in the cabin. Accordingly, the image reflected by windshield 101 is viewed by user 200. To user 200, virtual image 300 (first virtual image 301 or second virtual image 302) looks as if projected in front of vehicle 100 (outside of the vehicle). As a result, virtual image 300 (first virtual image 301 or second virtual image 302) projected in front of vehicle 100 (outside of the vehicle) is viewed by user 200 through windshield 101.
- Specifically, controller 5 scans the light on one side of movable screen 1a in a state where movable screen 1a is fixed in movement directions X, so that first virtual image 301 viewed with depth along road surface 600 is formed. Moreover, controller 5 scans the light on one side of movable screen 1a while moving movable screen 1a so that the distance in movement directions X between a luminescent spot on one side of movable screen 1a and projection optical system 4 is kept constant. Consequently, second virtual image 302, which is viewed vertically on road surface 600 at a fixed distance from user 200, is formed.
- Here, while irradiator 3 irradiates movable screen 1a with light, drive controller 51 of controller 5 causes drive unit 2 to move movable screen 1a in movement directions X. In the case where the irradiation position on one side of movable screen 1a on which the light is emitted from irradiator 3, that is, the position of the luminescent spot, is constant, when movable screen 1a moves in first direction X1, the distance from an eye (eye point) of user 200 to virtual image 300 (also referred to as "viewing distance") becomes shorter. In contrast, in the case where the position of the luminescent spot on one side of movable screen 1a is constant, when movable screen 1a moves in second direction X2, the viewing distance to virtual image 300 becomes longer (more distant). In other words, the viewing distance to virtual image 300 changes according to the position of movable screen 1a in movement directions X.
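The monotonic relation described above, in which moving movable screen 1a in first direction X1 shortens the viewing distance and moving it in second direction X2 lengthens it, can be illustrated with a toy mapping. The linear form and all constants are arbitrary placeholders; the actual relation is determined by the optics of projection optical system 4.

```python
# Toy monotonic mapping from the position of movable screen 1a to the
# viewing distance of virtual image 300. The linear form and constants
# are placeholders; only the direction of the relation follows the text.

def viewing_distance_m(screen_pos_x2_mm: float) -> float:
    """screen_pos_x2_mm grows as the screen moves in second direction X2."""
    return 2.0 + 10.0 * screen_pos_x2_mm

# Moving in X1 (smaller coordinate) gives a shorter viewing distance;
# moving in X2 (larger coordinate) gives a longer one.
print(viewing_distance_m(0.5))  # 7.0
print(viewing_distance_m(5.0))  # 52.0
```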
virtual image 301 is to be changed,controller 5 movesmovable screen 1 a in directions X according to the viewing distance. In the state wheremovable screen 1 a is fixed at a position after the movement, the light is scanned on one side ofmovable screen 1 a. In the case where the viewing distance of secondvirtual image 302 is to be changed,controller 5 movesmovable screen 1 a in directions X according to the viewing distance.Controller 5 scans the light on one side ofmovable screen 1 a, while movingmovable screen 1 a so that the distance in directions X between the luminescent spot and projectionoptical system 4 is kept constant based on the position after the movement. -
Controller 5 also causesirradiator 3 to irradiate fixedscreen 1 b with light. At this moment,irradiator 3 emits, to fixedscreen 1 b, light scanning on one side of fixedscreen 1 b. Accordingly, similarly to the case wheremovable screen 1 a is irradiated with light, an image is formed on (projected onto) fixedscreen 1 b and the image is projected ontowindshield 101. As a result,user 200 is capable of viewing virtual image 300 (third virtual image 303), which is projected in front of vehicle 100 (outside of the vehicle), throughwindshield 101. Since third virtual image 303 is formed by the light projected onto fixedscreen 1 b whose position is fixed, third virtual image 303 is viewed vertically at a predetermined distance (for example, two to three meters) fromuser 200 onroad surface 600. -
Display system 10 according to the present embodiment is capable of projecting all of firstvirtual image 301, secondvirtual image 302, and third virtual image 303 during one cycle in whichscanner 32 makes one round trip in a longitudinal direction ofmovable screen 1 a (in an inclined direction with respect toreference plane 503 ofmovable screen 1 a). Specifically, on “outward way” in which the light is scanned onmovable screen 1 a and fixedscreen 1 b in this order, at first,display unit 40 emits light tomovable screen 1 a to project firstvirtual image 301, and then emits light to fixedscreen 1 b to display third virtual image 303. On “return way” in which the light is scanned on fixedscreen 1 b andmovable screen 1 a in this order, at first,display unit 40 emits light to fixedscreen 1 b to display third virtual image 303, and then emits light tomovable screen 1 a to project secondvirtual image 302. - Therefore, first
virtual image 301, third virtual image 303, and second virtual image 302 are projected onto target space 400 during one cycle in which scanner 32 scans in the longitudinal direction. Scanning in the longitudinal direction is performed in irradiator 3 relatively fast, so that user 200 perceives first virtual image 301, third virtual image 303, and second virtual image 302 as being displayed simultaneously. The frequency of scanning in the longitudinal direction in irradiator 3 is, for example, greater than or equal to 60 Hz. - With reference to
FIG. 4, an operation of information presentation system 20 according to the present embodiment, which is executed for changing the display mode of a virtual image based on the vibration information detected by detection system 7, will be described hereinafter. - When
user 200, as a driver of vehicle 100, operates an ignition switch or the like, electricity is supplied to information presentation system 20 (display system 10 and detection system 7), so that information presentation system 20 starts operating. - Obtaining
unit 6 of display system 10 obtains, on a regular or irregular basis, from detection system 7, detection information on a detection object present in target space 400, position information of vehicle 100, and vehicle information about the state of vehicle 100. -
Display controller 52 generates content of virtual image 300 to be projected onto target space 400 by display unit 40, based on the detection information, position information, and vehicle information obtained by obtaining unit 6 (S1), and calculates the viewing distance at which generated virtual image 300 is projected. Here, the viewing distance of each pixel in virtual image 300 can be calculated from the angle of view, the positions of screens 1a and 1b, irradiator 3, and the like (S2). - Obtaining
unit 6 also obtains, from vibration detector 71 of detection system 7, vibration information about vibration applied to main body 110 of vehicle 100, that is, vibration information about vibration applied to display unit 40 (S3). -
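For illustration only, the quantity compared in the next step can be pictured as the change over time in the orientation of display unit 40, computed from samples supplied by vibration detector 71. The following Python sketch is an editorial assumption, not the patented implementation; the function name, the angle-based representation, and the threshold value are hypothetical.

```python
def vibration_change_over_time(orientation_samples, dt):
    """Largest change in display-unit orientation per unit time.

    orientation_samples: pitch angles of the display unit in degrees,
    sampled every dt seconds (hypothetical representation of the
    vibration information from the vibration detector).
    """
    if len(orientation_samples) < 2:
        return 0.0
    return max(abs(b - a) / dt
               for a, b in zip(orientation_samples, orientation_samples[1:]))

FIRST_THRESHOLD = 5.0  # degrees per second; illustrative value only

# A slowly tilting display unit stays below the first threshold value.
rate = vibration_change_over_time([0.0, 1.0, 4.0, 2.0], dt=1.0)
print(rate)                    # → 3.0
print(rate > FIRST_THRESHOLD)  # → False
```

A real system would obtain such samples from a gyro sensor or acceleration sensor; the comparison against the first threshold value then corresponds to the branch of Step S4.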
Controller 5 compares the magnitude of the vibration with a first threshold value based on the vibration information obtained by obtaining unit 6 (S4). - When the change over time in the magnitude of the vibration is less than or equal to the first threshold value (S4: No),
drive controller 51 of controller 5 controls drive unit 2 and irradiator 3 so that virtual image 300 generated in Step S1 is projected onto target space 400 by display unit 40 (S7). - When the change over time in the magnitude of the vibration exceeds the first threshold value (S4: Yes),
controller 5 compares the viewing distance calculated in Step S2 with a second threshold value. For example, the viewing distance of each pixel forming the virtual image is compared with the second threshold value. For virtual image 301 or the like viewed vertically, the viewing distance of at least one pixel may be compared with the second threshold value. Moreover, for virtual image 301 viewed with depth, only a plurality of representative pixels may be compared with the second threshold value (S5). - If the viewing distance calculated in Step S2 is less than or equal to the second threshold value (S5: No),
drive controller 51 of controller 5 controls drive unit 2 and irradiator 3 so that virtual image 300 generated in Step S1 is projected onto target space 400 by display unit 40 (S7). - In contrast, if the viewing distance calculated in Step S2 exceeds the second threshold value (S5: Yes),
display controller 52 of controller 5 changes the content of virtual image 300 generated in Step S1 (S6). - For example,
FIG. 5A is an example of virtual images 304 and 306. Virtual image 306 is third virtual image 303 for displaying vehicle information of vehicle 100 (own vehicle), and its viewing distance is several meters. Virtual image 304 is first virtual image 301 indicating the path of vehicle 100 (own vehicle). Virtual image 304 is virtual image 300 for instructing the driver to turn left at the second intersection ahead, and the viewing distance of the top portion of virtual image 304 is, for example, 80 meters. Since the viewing distance of virtual image 306 is less than or equal to the second threshold value, controller 5 causes display unit 40 to project virtual image 306 onto target space 400 without changing the content. In contrast, since the viewing distance of the pixels of the top portion (X1 portion in FIG. 5A) of virtual image 304 exceeds the second threshold value, display controller 52 changes the display mode of the pixels of the top portion of virtual image 304 having a viewing distance exceeding the second threshold value. In the example of FIG. 5B, display controller 52 changes the content of virtual image 304 so as to reduce the transmittance of the top portion of virtual image 304. Accordingly, the visibility of the top portion (X2 portion in FIG. 5B) of virtual image 304A after the change is reduced compared with the top portion of virtual image 304 before the change, and is reduced also compared with the portions of virtual image 304A other than the top portion. Therefore, even if the display position of virtual image 304A is blurred due to the vibration applied to display unit 40, user 200 is unlikely to notice the blurring of virtual image 304A. As a result, it is possible to make the deviation of the display position of virtual image 304A less noticeable. - In this manner, when the display mode of
virtual image 304 is changed, drive controller 51 of controller 5 controls drive unit 2 and irradiator 3 so that virtual image 304A, having the display mode changed in Step S6, is projected by display unit 40 onto target space 400 (S7). -
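The decision flow of Steps S1 to S7 can be summarized in a short sketch. This is a schematic reading of the flowchart, not the actual implementation: the data model (a virtual image as a list of per-pixel viewing distances and opacities) and the visibility-reduction factor are assumptions introduced here for illustration.

```python
def process_frame(pixels, vibration, first_threshold, second_threshold):
    """One cycle of the flow of FIG. 4 (Steps S4 to S7, simplified).

    pixels: list of (viewing_distance, opacity) tuples for the virtual
    image generated in Steps S1 and S2. Returns the pixels to project.
    """
    # S4: if the change over time in the magnitude of the vibration does
    # not exceed the first threshold value, project the image unchanged.
    if vibration <= first_threshold:
        return pixels
    # S5/S6: for pixels whose viewing distance exceeds the second
    # threshold value, change the display mode (here: lower the opacity,
    # making that portion less visible); other pixels are left as-is.
    return [(d, opacity * 0.3) if d > second_threshold else (d, opacity)
            for d, opacity in pixels]

# A path-shaped virtual image whose top portion lies about 80 m ahead.
image = [(3.0, 1.0), (20.0, 1.0), (80.0, 1.0)]
print(process_frame(image, vibration=8.0,
                    first_threshold=5.0, second_threshold=50.0))
# → [(3.0, 1.0), (20.0, 1.0), (80.0, 0.3)]  (only the far portion is faded)
```

In a running system this function would be called once per frame (for example, every 1/60 seconds), with the thresholds chosen to suit the vehicle and the optics.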
Display system 10 projects virtual image 300 onto target space 400 by repeatedly executing the processing of Steps S1 to S7 described above at a predetermined time interval (for example, 1/60 seconds). Note that once the content of virtual image 304 has been changed in Step S6, controller 5 need only change the content of virtual image 304 back to the state before the change after a predetermined period has elapsed from when the vibration becomes less than or equal to the first threshold value or from when the viewing distance becomes less than or equal to the second threshold value. This prevents the content of virtual image 304 from being changed frequently. - Note that the flowchart in
FIG. 4 is an example, and the processing flow may be changed as appropriate. The content of virtual image 300 may be generated after the determinations in Step S4 and Step S5 are made. Further, in the comparison of the two values in Step S4, whether or not the case where the two values are equal is included can be changed arbitrarily depending on the setting of the first threshold value. Hence, controller 5 may be configured to determine whether or not the change over time in the magnitude of the vibration is greater than or equal to the first threshold value, in other words, whether or not it is less than the first threshold value. Similarly, in the comparison of the two values in Step S5, controller 5 may be configured to determine whether or not the viewing distance is greater than or equal to the second threshold value, in other words, whether or not the viewing distance is less than the second threshold value. - Moreover, in the example of
FIG. 5B, when the vibration applied to display unit 40 exceeds the first threshold value, display controller 52 reduces the transmittance of the portion of virtual image 304 having a viewing distance which exceeds the second threshold value. However, the change of the display mode is not limited to reduction of the transmittance. Display controller 52 may change the display mode of virtual image 300 by changing one or more of the transmittance, the size, the shape, the color, the outline, and the like of virtual image 304 (300). - For example, when the vibration exceeds the first threshold value and the viewing distance exceeds the second threshold value in the state where
display unit 40 projects virtual image 304 onto target space 400 as illustrated in FIG. 6A, display controller 52 may change the shape of virtual image 304 so that blurring of virtual image 304 caused by the vibration becomes less noticeable. Specifically, display controller 52 displays, in target space 400, virtual image 304B (see FIG. 6B) having a shape changed so that the viewing distance becomes less than or equal to the second threshold value. Since virtual image 304 before the change is a virtual image instructing the driver to turn left at the second intersection ahead, the viewing distance of the top portion of virtual image 304 is about 80 meters and exceeds the second threshold value. In contrast, virtual image 304B after the change is a virtual image instructing the driver to go straight through the first intersection ahead, and the viewing distance of the top portion of virtual image 304B is less than the second threshold value. As a result, even when the vibration applied to display unit 40 exceeds the first threshold value, blurring of virtual image 304B becomes less noticeable, and the deviation of the display position of virtual image 304B becomes less noticeable. - In addition, when the vibration exceeds the first threshold value and the viewing distance of
virtual image 300 exceeds the second threshold value, controller 5 may change the size of virtual image 300 so that blurring of virtual image 300 caused by the vibration becomes less noticeable. For example, as illustrated in FIG. 7A, when the vibration exceeds the first threshold value and the viewing distance of virtual image 305 exceeds the second threshold value in the state where display unit 40 projects circular virtual image 305 surrounding obstacle 700 on road surface 600, controller 5 changes the size of virtual image 305. Specifically, as illustrated in FIG. 7B, display controller 52 makes the size of virtual image 305A after the change larger than the size of virtual image 305 before the change. As a result, the range surrounded by virtual image 305A increases, so that even if the position of virtual image 305A is blurred due to the vibration applied to display unit 40, the positions of virtual image 305A and obstacle 700 are unlikely to deviate from each other, and the deviation of the display position of virtual image 305A becomes less noticeable. Alternatively, if the vibration exceeds the first threshold value and the viewing distance of virtual image 305 exceeds the second threshold value, display controller 52 may make the size of virtual image 305 smaller than that before the change. By reducing the size of virtual image 305, the virtual image after the change becomes less noticeable, and the deviation of its display position becomes less noticeable. - Moreover, when the vibration applied to display
unit 40 exceeds the first threshold value and the viewing distance of virtual image 300 exceeds the second threshold value, display controller 52 may change the color of the portion of virtual image 300 having a viewing distance exceeding the second threshold value into a color that is less noticeable than that of a portion of virtual image 300 having a viewing distance less than or equal to the second threshold value. Likewise, display controller 52 may make the color of the portion of virtual image 300 having a viewing distance exceeding the second threshold value lighter than that of the portion having a viewing distance less than or equal to the second threshold value. Furthermore, display controller 52 may blur the outline of the portion of virtual image 300 having a viewing distance exceeding the second threshold value, or reduce the contrast of virtual image 300. -
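Why enlarging circular virtual image 305 helps can be shown with a small geometric sketch. The sizing rule and the numbers below are hypothetical illustrations, not values from the embodiment: a circle drawn around obstacle 700 keeps surrounding it under positional jitter only if its radius exceeds the obstacle radius plus the largest expected displacement.

```python
def circle_still_surrounds(radius, jitter, obstacle_radius):
    """True if a circle whose centre may be displaced by up to `jitter`
    still encloses an obstacle of radius `obstacle_radius` located at the
    original centre (simple containment condition)."""
    return radius >= jitter + obstacle_radius

# Before the change: a 1.5 m circle barely fits a 1.4 m obstacle,
# so a 0.5 m display-position jitter breaks the containment.
print(circle_still_surrounds(1.5, jitter=0.5, obstacle_radius=1.4))  # → False

# After the change: enlarging the circle by the expected jitter restores it.
print(circle_still_surrounds(2.0, jitter=0.5, obstacle_radius=1.4))  # → True
```

The same containment reasoning explains why the larger circle of FIG. 7B tolerates the blurring of the display position without appearing to leave obstacle 700.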
Display controller 52 may make virtual image 300 after the change less noticeable by changing a combination of two or more of the transmittance, the size, the shape, the color, the outline, and the like of virtual image 300. - (4) Modifications
- The embodiment described above is merely one of various embodiments of the present disclosure. The embodiment described above can be modified in various ways, for example in accordance with a design, as long as the object of the present disclosure can be achieved. The same function as
display system 10 may be embodied by, for example, a method for controlling display system 10, a program, or a non-transitory recording medium storing a program. A method for controlling display system 10 according to an aspect is a method for controlling display system 10 that includes display unit 40 which displays virtual image 300 in target space 400, controller 5 which controls the display of display unit 40, and obtaining unit 6 which obtains vibration information about vibration applied to display unit 40. In the method for controlling display system 10 according to an aspect, when the vibration exceeds the first threshold value and the viewing distance of virtual image 300 exceeds the second threshold value, controller 5 changes, based on the vibration information, the display mode of virtual image 300 displayed by display unit 40. A (computer) program according to an aspect is a program for causing a computer system to execute the method for controlling display system 10. - Modifications of the above embodiment will be described below. The modifications described below can be applied in appropriate combination.
- An executing subject of
display system 10, information presentation system 20, or the method for controlling display system 10 according to the present disclosure includes a computer system. The computer system mainly includes a processor and a memory as hardware. The processor executes the program stored in the memory of the computer system, so that the function as the executing subject of display system 10, information presentation system 20, or the method for controlling display system 10 according to the present disclosure is realized. The program may be stored in the memory of the computer system in advance, may be supplied through a telecommunication line, or may be supplied while stored in a non-transitory recording medium which can be read by the computer system. Examples of this type of non-transitory recording medium include a memory card, an optical disk, and a hard disk drive. The processor of the computer system is composed of one or a plurality of electronic circuits including a semiconductor integrated circuit (IC) or a large-scale integration (LSI). The plurality of electronic circuits may be integrated in one chip, or may be dispersed over a plurality of chips. The plurality of chips may be integrated in one device, or may be dispersed over a plurality of devices. - Functions of obtaining
unit 6, display unit 40, and controller 5 of display system 10 may be dispersed over two or more systems. A function of controller 5 of display system 10 may be realized by, for example, a cloud (cloud computing). -
Information presentation system 20 is realized with display system 10 and detection system 7. However, information presentation system 20 is not limited to this configuration. Information presentation system 20 may be realized with, for example, one of display system 10 and detection system 7. For example, a function of detection system 7 may be integrated in display system 10. -
Controller 5 in display system 10 may determine the content of virtual image 300 based on information obtained by communication between the outside and a communicator included in display system 10 or vehicle 100. Controller 5 may determine the content of virtual image 300 based on information obtained by vehicle-to-vehicle communication (V2V) between vehicle 100 and a peripheral vehicle, vehicle-to-everything communication (V2X) between vehicle 100 and the peripheral vehicle or infrastructure, or the like. The content of virtual image 300 projected onto target space 400 may be determined by the infrastructure. In this case, at least a part of controller 5 may not be installed on vehicle 100. - Moreover,
display system 10 is not limited to the configuration of projecting virtual image 300 onto target space 400 set in front of vehicle 100 in the traveling direction. For example, display system 10 may project virtual image 300 in a side direction, a rear direction, an upper direction, or the like with respect to the traveling direction of vehicle 100. - In addition,
display system 10 is not limited to the head-up display for use in vehicle 100. For example, display system 10 is also applicable to a mobile body other than vehicle 100. Examples of the mobile body other than vehicle 100 include a motorcycle, a train, an aircraft, a construction machine, and a vessel. Moreover, the place of use of display system 10 is not limited to a mobile body. For example, display system 10 may be used in an amusement facility. Display system 10 may also be used as a wearable terminal such as a head-mounted display (HMD). Furthermore, display system 10 may be used at a medical facility, or as a stationary device. - Although
display unit 40 of display system 10 includes movable screen 1a and fixed screen 1b, it is sufficient that display unit 40 includes at least movable screen 1a. -
Display unit 40 is not limited to the configuration of projecting the virtual image by a laser beam. For example, display unit 40 may also be configured so that a projector projects an image (virtual image 300) onto a diffuse-transmission-type screen 1 from behind screen 1. Display unit 40 may project, through projection optical system 4, virtual image 300 corresponding to an image displayed by a liquid crystal display. - The reflection member which reflects the light emitted from
display unit 40 is composed of windshield 101 of vehicle 100. The reflection member is not limited to windshield 101. The reflection member may be a transparent plate provided separately from windshield 101. - (Summary)
- As described above, the display system (10) according to a first aspect includes, in the first aspect, the display unit (40), the controller (5), and the obtaining unit (6). The display unit (40) displays a virtual image (300 to 305, 304A, 304B, 305A) in the target space (400). The controller (5) controls the display of the display unit (40). The obtaining unit (6) obtains vibration information about vibration applied to the display unit (40). When the vibration exceeds the first threshold value and the viewing distance of the virtual image (300 to 305, 304A, 304B, 305A) exceeds the second threshold value, the controller (5) changes the display mode of the virtual image (300 to 305, 304A, 304B, 305A) based on the vibration information.
- According to the first aspect, when the vibration exceeds the first threshold value, and the viewing distance exceeds the second threshold value, blurring of the virtual image (300 to 305, 304A, 304B, 305A) can be made less noticeable compared with the case where the display mode of the virtual image (300 to 305, 304A, 304B, 305A) does not change. Accordingly, the deviation of the display position of the virtual image (300 to 305, 304A, 304B, 305A) can be made less noticeable.
- In the display system (10) according to a second aspect, in the first aspect, the display mode is transmittance of the virtual image (300 to 305, 304A, 304B, 305A).
- According to the second aspect, visibility of the virtual image (300 to 305, 304A, 304B, 305A) can be changed by changing the transmittance. Accordingly, the deviation of the display position of the virtual image (300 to 305, 304A, 304B, 305A) caused by the vibration can be made less noticeable.
- In the display system (10) according to a third aspect, in the first or second aspect, the display mode is the size of the virtual image (300 to 305, 304A, 304B, 305A).
- According to the third aspect, by increasing the size of the virtual image (300 to 305, 304A, 304B, 305A), even when the position deviates due to the vibration, the positional deviation between an object existing in the real space and the virtual image (300 to 305, 304A, 304B, 305A) becomes less noticeable. In contrast, by reducing the size of the virtual image (300 to 305, 304A, 304B, 305A), the virtual image (300 to 305, 304A, 304B, 305A) itself becomes less noticeable. Hence, the positional deviation can be made less noticeable.
- In the display system (10) according to a fourth aspect, in any one of the first to third aspects, the display mode is the shape of the virtual image (300 to 305, 304A, 304B, 305A).
- According to the fourth aspect, the shape of the virtual image (300 to 305, 304A, 304B, 305A) can be changed into such a shape that the positional deviation of the virtual image (300 to 305, 304A, 304B, 305A) caused by the vibration is less noticeable.
- In the display system (10) according to a fifth aspect, in the fourth aspect, the controller (5) changes the shape of the virtual image (300 to 305, 304A, 304B, 305A) so that the viewing distance becomes less than or equal to the second threshold value.
- According to the fifth aspect, by changing the shape of the virtual image (300 to 305, 304A, 304B, 305A) into such a shape having a viewing distance less than or equal to the second threshold value, the positional deviation caused by the vibration becomes less noticeable, compared with the case where the virtual image (300 to 305, 304A, 304B, 305A) having a viewing distance exceeding the second threshold value is displayed.
- In the display system (10) according to a sixth aspect, in any one of the first to fifth aspects, the vibration is a change over time in the orientation of the display unit (40).
- According to the sixth aspect, by using the detection result of the sensor (71) for detecting the orientation of the display unit (40), it can be determined whether or not the vibration applied to the display unit (40) exceeds the first threshold value.
- The information presentation system (20) according to a seventh aspect includes the display system (10) according to any one of the first to sixth aspects, and the detection system (7) which detects vibration applied to the display unit (40). The obtaining unit (6) obtains detection information from the detection system (7).
- According to the seventh aspect, it is possible to realize the information presentation system (20) capable of making the deviation of the display position of the virtual image less noticeable.
- The method for controlling the display system (10) according to an eighth aspect is a method for controlling the display system (10) including the display unit (40), the controller (5), and the obtaining unit (6). The display unit (40) displays the virtual image (300 to 305, 304A, 304B, 305A) in the target space (400). The controller (5) controls the display of the display unit (40). The obtaining unit (6) obtains vibration information about vibration applied to the display unit (40). In the control method according to the eighth aspect, when the vibration exceeds the first threshold value and the viewing distance of the virtual image (300 to 305, 304A, 304B, 305A) exceeds the second threshold value, the controller (5) changes the display mode of the virtual image (300 to 305, 304A, 304B, 305A) based on the vibration information.
- According to the eighth aspect, when the vibration exceeds the first threshold value, and the viewing distance exceeds the second threshold value, the positional deviation of the virtual image (300 to 305, 304A, 304B, 305A) can be made less noticeable, compared with the case where the display mode of the virtual image (300 to 305, 304A, 304B, 305A) does not change.
- The program according to a ninth aspect is a program for causing a computer system to execute the method for controlling the display system (10) according to the eighth aspect.
- The mobile body (100) according to a tenth aspect includes: the display system (10) according to any one of the first to sixth aspects, and the reflection member (101). The reflection member (101) has light transmitting properties, and reflects the light emitted from the display unit (40) so that the virtual image (300 to 305, 304A, 304B, 305A) is viewed by the target person (200).
- According to the tenth aspect, it is possible to realize the mobile body (100) capable of making the deviation of the display position of the virtual image less noticeable.
- The configurations according to the second to sixth aspects are not essential configurations for the display system (10), and the configurations may be appropriately omitted.
- 1 screen
- 1a movable screen
- 1b fixed screen
- 2 drive unit
- 3 irradiator
- 4 projection optical system
- 5 controller
- 6 obtaining unit
- 7 detection system
- 10 display system
- 20 information presentation system
- 40 display unit
- 41 magnifying lens
- 42, 43 mirror
- 52 display controller
- 71 vibration detector
- 100 vehicle (mobile body)
- 101 windshield (reflection member)
- 102 dashboard
- 110 main body
- 200 user (target person)
- 300 to 306, 304A, 304B, 305A virtual image
- 400 target space
- 500 optical axis
- 501, 502 virtual plane
- 503 reference plane
- 600 road surface
- 700 obstacle
Claims (10)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017129898A JP6796806B2 (en) | 2017-06-30 | 2017-06-30 | Display system, information presentation system, display system control method, program, and mobile |
JP2017-129898 | 2017-06-30 | ||
PCT/JP2018/024285 WO2019004244A1 (en) | 2017-06-30 | 2018-06-27 | Display system, information presentation system, method for controlling display system, program, and mobile body |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/024285 Continuation WO2019004244A1 (en) | 2017-06-30 | 2018-06-27 | Display system, information presentation system, method for controlling display system, program, and mobile body |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200124849A1 true US20200124849A1 (en) | 2020-04-23 |
US10649207B1 US10649207B1 (en) | 2020-05-12 |
Family
ID=64742989
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/724,766 Active US10649207B1 (en) | 2017-06-30 | 2019-12-23 | Display system, information presentation system, method for controlling display system, recording medium, and mobile body |
Country Status (5)
Country | Link |
---|---|
US (1) | US10649207B1 (en) |
JP (1) | JP6796806B2 (en) |
CN (1) | CN111033607A (en) |
DE (1) | DE112018003346B4 (en) |
WO (1) | WO2019004244A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11170537B2 (en) * | 2017-08-10 | 2021-11-09 | Nippon Seiki Co., Ltd. | Vehicle display device |
US11481956B2 (en) * | 2019-02-20 | 2022-10-25 | Canon Medical Systems Corporation | Medical image processing apparatus and medical image processing method using depth-dependent transmittance and opacity information |
US20220383567A1 (en) * | 2021-06-01 | 2022-12-01 | Mazda Motor Corporation | Head-up display device |
US20220396149A1 (en) * | 2021-06-10 | 2022-12-15 | Toyota Jidosha Kabushiki Kaisha | Vehicle display device, display method, and storage medium |
US20230093446A1 (en) * | 2020-02-20 | 2023-03-23 | Sony Group Corporation | Information processing device, information processing method, and program |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7135052B2 (en) * | 2020-10-29 | 2022-09-12 | ソフトバンク株式会社 | Control device, program, and control method |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040080541A1 (en) * | 1998-03-20 | 2004-04-29 | Hisashi Saiga | Data displaying device |
JP3695315B2 (en) * | 2000-11-14 | 2005-09-14 | 日産自動車株式会社 | Vehicle display device |
JP2005156480A (en) * | 2003-11-28 | 2005-06-16 | Sony Corp | Information service system |
US7561966B2 (en) * | 2003-12-17 | 2009-07-14 | Denso Corporation | Vehicle information display system |
CN100544404C (en) * | 2003-12-26 | 2009-09-23 | 松下电器产业株式会社 | Image pick-up device, product component and semiconductor integrated circuit |
JP2006142897A (en) | 2004-11-17 | 2006-06-08 | Nissan Motor Co Ltd | Display device for vehicle, and controlling method of display device for vehicle |
JP2010224665A (en) * | 2009-03-19 | 2010-10-07 | Sony Corp | Light-tactility conversion system, and method for providing tactile feedback |
JP2014153645A (en) * | 2013-02-13 | 2014-08-25 | Seiko Epson Corp | Image display device and display control method of image display device |
JP6176478B2 (en) * | 2013-04-26 | 2017-08-09 | 日本精機株式会社 | Vehicle information projection system |
JP5987791B2 (en) | 2013-06-28 | 2016-09-07 | 株式会社デンソー | Head-up display and program |
CN104580966A (en) * | 2013-10-22 | 2015-04-29 | 光宝科技股份有限公司 | Projecting device and projected image processing method thereof |
US20160216521A1 (en) | 2013-10-22 | 2016-07-28 | Nippon Seiki Co., Ltd. | Vehicle information projection system and projection device |
JP6201690B2 (en) | 2013-11-28 | 2017-09-27 | 日本精機株式会社 | Vehicle information projection system |
TWI500966B (en) * | 2014-02-20 | 2015-09-21 | 中強光電股份有限公司 | Head-up display |
JP6273976B2 (en) * | 2014-03-31 | 2018-02-07 | 株式会社デンソー | Display control device for vehicle |
JP6409337B2 (en) * | 2014-05-23 | 2018-10-24 | 日本精機株式会社 | Display device |
JP2015226304A (en) | 2014-05-30 | 2015-12-14 | 日本精機株式会社 | Projection device for vehicle and head-up display system |
JP2016021045A (en) | 2014-06-16 | 2016-02-04 | パナソニックIpマネジメント株式会社 | Display controller, display control method, display control program and display device |
CN105988643A (en) * | 2015-02-16 | 2016-10-05 | 联想(北京)有限公司 | Information processing method and electronic device |
KR20160120458A (en) * | 2015-04-08 | 2016-10-18 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
JP6516223B2 (en) * | 2015-06-30 | 2019-05-22 | パナソニックIpマネジメント株式会社 | Display device |
JP2017013590A (en) | 2015-06-30 | 2017-01-19 | 日本精機株式会社 | Head-up display device |
JP6562079B2 (en) * | 2015-09-18 | 2019-08-21 | 日産自動車株式会社 | Display device for vehicle and display method for vehicle |
BR112018007120B1 (en) * | 2015-10-09 | 2023-05-02 | Nissan Motor Co., Ltd | VEHICLE DISPLAY DEVICE AND VEHICLE DISPLAY METHOD |
JP6570424B2 (en) | 2015-11-05 | 2019-09-04 | アルパイン株式会社 | Electronic equipment |
JP2017094882A (en) | 2015-11-23 | 2017-06-01 | アイシン・エィ・ダブリュ株式会社 | Virtual image generation system, virtual image generation method and computer program |
JP6601441B2 (en) * | 2017-02-28 | 2019-11-06 | 株式会社デンソー | Display control apparatus and display control method |
Also Published As
Publication number | Publication date |
---|---|
DE112018003346T5 (en) | 2020-03-05 |
JP2019012238A (en) | 2019-01-24 |
US10649207B1 (en) | 2020-05-12 |
DE112018003346B4 (en) | 2022-06-23 |
JP6796806B2 (en) | 2020-12-09 |
CN111033607A (en) | 2020-04-17 |
WO2019004244A1 (en) | 2019-01-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10600250B2 (en) | | Display system, information presentation system, method for controlling display system, computer-readable recording medium, and mobile body |
US10649207B1 (en) | | Display system, information presentation system, method for controlling display system, recording medium, and mobile body |
US11951834B2 (en) | | Information provision device, information provision method, and recording medium storing information provision program for a vehicle display |
US10699486B2 (en) | | Display system, information presentation system, control method of display system, storage medium, and mobile body |
US10852818B2 (en) | | Information provision device and information provision method |
US10551619B2 (en) | | Information processing system and information display apparatus |
EP3093194B1 (en) | | Information provision device |
US8536995B2 (en) | | Information display apparatus and information display method |
JP7113259B2 (en) | | Display system, information presentation system including display system, display system control method, program, and mobile object including display system |
JP6883759B2 (en) | | Display systems, display system control methods, programs, and mobiles |
JP2010070117A (en) | | Image irradiation system and image irradiation method |
US10983343B2 (en) | | Display system, moving vehicle, method for controlling the display system, and non-transitory storage medium |
JP7266257B2 (en) | | Display system and method of controlling display system |
JP2020060435A (en) | | Image drawing device, image drawing method, and program |
JP2019174349A (en) | | Movement route guiding device, moving body, and movement route guiding method |
JP7338632B2 (en) | | Display device |
JP7434894B2 (en) | | Vehicle display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| AS | Assignment | Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TSUJI, MASANAGA;SHIBATA, TADASHI;NAKANO, NOBUYUKI;AND OTHERS;SIGNING DATES FROM 20191217 TO 20191218;REEL/FRAME:052143/0608 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
| AS | Assignment | Owner name: PANASONIC AUTOMOTIVE SYSTEMS CO., LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.;REEL/FRAME:066703/0177. Effective date: 20240207 |