WO2021149740A1 - Display apparatus, movable body, display method, program, and non-transitory recording medium - Google Patents

Display apparatus, movable body, display method, program, and non-transitory recording medium Download PDF

Info

Publication number
WO2021149740A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
vehicle
movable body
presentation
display apparatus
Prior art date
Application number
PCT/JP2021/001913
Other languages
French (fr)
Inventor
Yuuki Suzuki
Kazuhiro Takazawa
Masato Kusanagi
Yuki Hori
Shin Sekiya
Original Assignee
Ricoh Company, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2020205061A external-priority patent/JP2021117220A/en
Application filed by Ricoh Company, Ltd. filed Critical Ricoh Company, Ltd.
Priority to EP21704029.4A priority Critical patent/EP4093629A1/en
Publication of WO2021149740A1 publication Critical patent/WO2021149740A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/23
    • B60K35/28
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3632Guidance using simplified or iconic instructions, e.g. using arrows
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/365Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • B60K2360/166
    • B60K2360/177
    • B60K2360/31
    • B60K2360/347
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0141Head-up displays characterised by optical features characterised by the informative content of the display

Definitions

  • the present invention relates to a display apparatus, a movable body, a display method, a program, and a non-transitory recording medium.
  • a display apparatus, such as a head-up display, is known with which various information is superimposed on a forward view of a vehicle by projecting a projection image including the information onto a windshield of the vehicle to allow a driver of the vehicle to see the projection image.
  • a technique is disclosed of displaying, as information displayed by such a display apparatus, an image indicating a planned traveling route (planned moving route) of a vehicle as route guidance information for guiding the driver of the vehicle along the route, and of changing a display mode based on a remaining distance up to a location at which information is to be provided to the driver (see, for example, PTL 1).
  • however, the technique of PTL 1 may require a large projection image angle to display such an image, resulting in an increase in the size and cost of the display apparatus.
  • An object of the present disclosure is to enable presenting route guidance information even with a small projection image angle.
  • a display apparatus displays a projection image, and includes an image generating unit configured to generate a presentation image to be used to present route guidance information; and a projecting unit provided at a movable body and configured to project the projection image including the presentation image.
  • the presentation image presents a planned moving direction of the movable body.
  • route guidance information can be presented even with a small projection image angle.
  • Fig. 1 is a view depicting an example of a configuration in which a display apparatus according to a present embodiment is mounted at a vehicle.
  • Fig. 2 is a view depicting an example of a configuration of a display unit.
  • Fig. 3 is a block diagram depicting an example of a configuration of an on-vehicle system in which the display apparatus is provided in the vehicle.
  • Fig. 4 is a block diagram depicting an example of a configuration of a detecting apparatus.
  • Fig. 5 is a block diagram depicting an example of a functional configuration of an identifying apparatus.
  • Fig. 6 is a diagram depicting a position of an obstacle relative to the vehicle or an indication piece image of the vehicle.
  • Fig. 7 is a block diagram depicting an example of a hardware configuration of a control unit according to the present embodiment.
  • Fig. 8 is a block diagram depicting an example of a functional configuration of the control unit according to the present embodiment.
  • Fig. 9 is a flowchart depicting an example of a process performed by the control unit according to the present embodiment.
  • Fig. 10 is a view depicting an example of displaying at a forward view performed when a remaining distance is greater than a presentation threshold.
  • Fig. 11 is a view depicting an example of displaying at a forward view performed when the remaining distance is smaller than or equal to the presentation threshold.
  • Fig. 12A is a diagram depicting relationships between a size of a projection image and a presentation image in a comparative example.
  • Fig. 12B is a diagram depicting relationships between a size of a projection image and a presentation image according to the present embodiment.
  • Fig. 13 is a diagram depicting an example of relationships between a size of a projection image and a projection image angle.
  • Fig. 14 is a diagram depicting an example of relationships between a display spacing of arrow graphic symbols and a superimposing area.
  • Fig. 15 is a view depicting an example of displaying positions of arrow graphic symbols.
  • a presentation image is generated to present a planned moving direction of a movable body as route guidance information, and a projection image including the presentation image is projected and displayed by a projecting unit provided at the movable body.
  • a movable body means an object capable of motion.
  • a movable body may be a land movable body capable of moving on land, such as an automobile, a vehicle, a train, or a forklift; an aerial movable body capable of moving in the air, such as an airplane, a balloon, or a drone; or a water movable body capable of moving on water, such as a ship, a boat, a vessel, or a steam ship.
  • a forward view is a view in a forward direction of a movable body seen by a driver of the movable body.
  • a forward view is a view in a forward direction of a vehicle seen by a driver through a windshield of the vehicle.
  • (Planned moving route)
  • a planned moving route is a portion of the route along which a movable body is planned to move from a starting point to a destination, the portion being the part through which the movable body moves immediately after the present time.
  • in the present embodiment, because a case where the movable body is a vehicle is used as the example, a planned moving route is referred to as a planned traveling route of the vehicle.
  • (Planned moving direction)
  • a planned moving direction means a direction in which a movable body is planned to move while moving through a planned moving route. For example, in a case where there is an intersection on the planned moving route, a direction in which the movable body will move at the intersection corresponds to a planned moving direction. In the present embodiment, because a case where the movable body is a vehicle is used as the example, a planned moving direction is referred to as a planned traveling direction of the vehicle.
  • (Route guidance information)
  • Route guidance information denotes information to be used to guide a movable body according to a route from its starting point to its destination.
  • Route guidance information includes information concerning a planned moving route (planned traveling route), a planned moving direction (planned traveling direction), and a location at which route guidance information is provided.
  • (Projection image)
  • a projection image denotes an image projected by a projecting unit and visible to a driver of a movable body.
  • a presentation image is an image indicating route guidance information. As a result of being projected by a projecting unit, a presentation image becomes visible to a driver of a movable body.
  • a presentation image according to the present embodiment presents a planned moving direction of a movable body (a planned traveling direction of a vehicle).
  • (Driver)
  • a driver is a person who operates or controls a movable body.
  • Fig. 1 is a diagram depicting an example of a configuration in which a display apparatus 1 according to the present embodiment is provided in a vehicle 8.
  • the display apparatus 1 is embedded inside a dashboard provided in the vehicle 8 and projects a projection image toward a windshield 71 through an emitting window 3 provided at an upper surface of the display apparatus 1.
  • the projection image is displayed as a virtual image I in the Y direction of the windshield 71 and is visible to a driver V of the vehicle 8. By viewing the virtual image I, the driver V can obtain useful information for driving with less line-of-sight movement while keeping the eyes on other vehicles or a road in the Y direction of the vehicle 8.
  • the display apparatus 1 may be installed at the ceiling, a sun visor, or the like, other than the dashboard.
  • the display apparatus 1 includes the projecting unit 10 and a control unit 20.
  • the projecting unit 10 projects a projection image toward the windshield 71.
  • as projection methods of the projecting unit 10, a laser scanning method and a panel method are known.
  • the laser scanning method is a method in which a laser beam emitted from a laser light source is deflected by a two-dimensional scanning device to form an intermediate image (a real image projected onto a screen that will be described later).
  • the panel method is a method of using an imaging device such as a liquid crystal panel, a digital micro-mirror device (DMD) panel, or a vacuum fluorescent display (VFD) to form an intermediate image.
  • the laser scanning method is suitable because it can form a high-contrast image, since light emission can be controlled for each pixel. It has been found that high contrast improves visibility and allows vehicle occupants to view information with fewer attention resources than with a HUD employing the panel method.
  • in the panel method, a display frame may be projected at an area where the HUD is capable of displaying an image (such a phenomenon is called a post card).
  • in the laser scanning method, there is no such phenomenon, and only the content can be projected.
  • in augmented reality (AR), our visible world is virtually augmented by superimposing an image of an object that does not exist in a real landscape on the real landscape.
  • a HUD employing the panel method may be used as long as the HUD can display information in a manner providing visibility with fewer attention resources (to avoid eye fatigue).
  • Fig. 2 is a diagram depicting an example of a configuration of the projecting unit 10.
  • the projecting unit 10 includes a light source unit 101, a light deflector 102, a mirror 103, a screen 104, and a concave mirror 105. It should be noted that Fig. 2 merely depicts an example.
  • the projecting unit 10 may include other elements, and need not include all the elements depicted.
  • the light source unit 101 includes three laser light sources corresponding to red (R), green (G), and blue (B) (hereinafter referred to as laser diodes (LDs)), coupling lenses, apertures, composite elements, and lenses, and combines laser beams emitted from the three LDs and directs the combined laser beam toward a reflective surface of the light deflector 102.
  • the laser beam directed to the reflective surface of the light deflector 102 is deflected two-dimensionally by the light deflector 102.
  • as the light deflector 102, one microscopic mirror oscillating with respect to two orthogonal axes or two microscopic mirrors each oscillating or pivoting with respect to one axis can be used.
  • the light deflector 102 can be a micro electro mechanical systems (MEMS) mirror fabricated through a semiconductor process, or the like.
  • the light deflector 102 can be driven by an actuator that is driven by deformation force of a piezoelectric element.
  • a galvano-mirror, a polygon mirror, or the like also may be used as the light deflector 102.
  • a two-dimensionally deflected laser beam outgoing from the light deflector 102 is incident on the mirror 103 and is bent by the mirror 103 to draw a two-dimensional image (an intermediate image) on a surface (scanned surface) of the screen 104.
  • as the mirror 103, a concave mirror or the like may be used, but a convex mirror or a planar mirror may also be used.
  • the screen 104 is desirably a micro-lens array or a micro-mirror array having a function of diverging a laser beam with a desired divergence angle, but a diffuser plate for diffusing a laser beam, a reflector plate or a transparent plate having a smooth surface, or the like, may also be used instead.
  • the arrangement from the light source unit 101 through the screen 104 is referred to as a HUD device.
  • other elements may also be included in the HUD device.
  • a laser beam outgoing from the screen 104 is reflected by the concave mirror 105 and projected onto the windshield 71.
  • the concave mirror 105 acts like a lens and has a capability of forming an image at a predetermined focal length.
  • the distance R 2 corresponds to the focal length of the concave mirror 105.
  • At least part of the light beam incident on the windshield 71 is reflected toward the point of view E of the driver V of the vehicle 8.
  • the driver V of the vehicle 8 can see the virtual image I obtained from magnifying the intermediate image on the screen 104 via the windshield 71.
  • the virtual image I obtained from magnifying the intermediate image is displayed through the windshield 71.
  • the windshield 71 is slightly curved rather than flat. Therefore, although the imaged position of the virtual image I is determined not only by the focal length of the concave mirror 105 but also by the curved surface of the windshield 71, the distance R is determined approximately by the distance R 1 + R 2 as described above. The distance R 1 or R 2 may be increased in order to image the virtual image I remotely so that a line-of-sight movement of a viewer can be reduced.
  • One method for increasing the distance R 1 is to bend the optical path using a mirror, while a method for increasing the distance R 2 is to adjust the focal length of the concave mirror 105.
  • it is desirable that at least one of the mirror 103 and the concave mirror 105 be designed and disposed to compensate for optical distortion, because an influence of the windshield 71 causes optical distortion resulting in a horizontal line of an intermediate image becoming convex upward or downward.
  • it is also desirable that a projection image be corrected in consideration of such optical distortion.
  • a combiner may be provided as a transmission reflection member on the point-of-view-E side of the windshield 71.
  • in this case, information can be provided in the form of a virtual image I in the same way as in the case where the windshield 71 is irradiated with light outgoing from the concave mirror 105.
  • Fig. 3 is a block diagram depicting an example of a configuration of an on-vehicle system 2 in which the display apparatus 1 is provided in the vehicle 8.
  • the on-vehicle system 2 includes a car navigation system 11, an engine electronic control unit (ECU) 12, the display apparatus 1, a braking ECU 13, a steering ECU 14, an identifying apparatus 15, and a detecting apparatus 16, communicating together via an on-board network NW such as a controller area network (CAN) bus.
  • the car navigation system 11 includes a global navigation satellite system (GNSS) such as a global positioning system (GPS) to detect the current location of the vehicle 8 and display the location of the vehicle 8 on an electronic map.
  • the car navigation system 11 receives a departure position and a destination that have been input, searches for a route from the departure position to the destination, and displays the route on the electronic map, or guides an occupant of the vehicle 8 in a direction of traveling by voice, letters (displayed on a display), or an animation before the course changes.
  • the car navigation system 11 may communicate with a server via a cellular phone network or the like. In this case, the server can send the electronic map to the vehicle 8 or perform a route search.
  • the engine ECU 12 determines an ideal amount of fuel injection, performs advance/lag ignition timing control, controls a valve operating mechanism, and so forth according to information from various sensors and conditions of the vehicle 8.
  • the engine ECU 12 determines whether to make a gearshift with reference to a map where gearshift lines are set with respect to relationships between the current vehicle speed and the accelerator opening degree.
  • the engine ECU 12 performs acceleration and deceleration control while following a preceding vehicle.
  • An electric motor may be used to drive the vehicle together with or without using an engine.
  • the braking ECU 13 controls braking force for each wheel of the vehicle 8 even without the operation of a brake pedal by the driver V of the vehicle 8, such as antilock braking system (ABS) control, braking control while following a preceding vehicle, automatic braking control based on a time to collision (TTC) with respect to an object, and control of keeping a stopped state when starting from a slope.
  • the steering ECU 14 detects the steering direction and the steering amount of the steering wheel operated by the driver V of the vehicle 8 and performs power steering control for applying a steering torque in a steering direction.
  • the steering ECU 14 performs steering to cause the vehicle 8 to travel in a direction of avoiding deviation from the running lane, in a direction of maintaining the running at the center of the running lane, or in a direction of avoiding approaching an obstacle, without the need of operation of the steering wheel by the driver V of the vehicle 8.
  • the detecting apparatus 16 includes a variety of sensors for detecting an object around the vehicle 8 and detecting operation information with respect to the vehicle 8 provided by the driver V.
  • the identifying apparatus 15 identifies an object detected by the detecting apparatus 16, the relative position (direction and distance) of the object relative to the vehicle 8, and the relative position (direction and distance) of the object relative to the indication piece image representing the vehicle 8. Information such as a vehicle speed, and an identification result and relative position of an object, is input to the display apparatus 1.
  • <Example configuration of detecting apparatus 16>
  • Fig. 4 is a block diagram depicting an example of a configuration of the detecting apparatus 16.
  • the detecting apparatus 16 includes a vehicle speed sensor 161 for detecting a vehicle speed to be displayed by the display apparatus 1, a vehicle information sensor 162 for obtaining vehicle information to be displayed by the display apparatus 1, radar sensors 163 for detecting an object, surrounding cameras 164, an occupant state information sensor 165 for obtaining occupant state information, which is information about an occupant, a vehicle information and communication system (VICS, registered trademark) receiving device 166 for receiving traffic jam information, and an external communication device 167 connected to the Internet.
  • the sensors included in the detecting apparatus 16 do not need to be included in the detecting apparatus 16 collectively as long as the sensors are provided in the vehicle 8.
  • the vehicle speed sensor 161 has a configuration in which a magnet rotating with a rotation of the shaft of the drive train system is detected by a sensor unit fixed to the vehicle body, generating a pulse wave proportional to the rotation speed. A vehicle speed can be detected from the number of pulses per unit time.
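  • As an illustration of the pulse-counting principle just described, the following sketch converts a pulse count per unit time into a vehicle speed; the pulses-per-revolution and tire-circumference values are assumed for the example, not figures taken from this disclosure.

```python
# Illustrative sketch of speed detection from wheel-shaft pulses.
# PULSES_PER_REV and TIRE_CIRCUMFERENCE_M are assumed example values.

PULSES_PER_REV = 4          # pulses generated per shaft revolution (assumed)
TIRE_CIRCUMFERENCE_M = 1.9  # effective rolling circumference in meters (assumed)

def vehicle_speed_kmh(pulse_count: int, interval_s: float) -> float:
    """Estimate vehicle speed from pulses counted over interval_s seconds."""
    revolutions_per_s = pulse_count / PULSES_PER_REV / interval_s
    speed_m_per_s = revolutions_per_s * TIRE_CIRCUMFERENCE_M
    return speed_m_per_s * 3.6  # m/s -> km/h

# Example: 20 pulses in 0.5 s -> about 68 km/h with the values above.
print(round(vehicle_speed_kmh(20, 0.5)))
```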
  • the vehicle information sensor 162 includes one or more sensors for detecting vehicle information other than the vehicle speed sensor 161. Examples include a fuel gauge sensor, a shift lever position sensor, an odometer, a trip meter, a turn signal sensor, and a water temperature sensor. These sensors may have common configurations to obtain the respective sorts of vehicle information.
  • the fuel gauge sensor detects the current remaining fuel.
  • the shift lever position sensor detects the position of a shift lever operated by the driver V of the vehicle 8.
  • the odometer cumulates a traveling distance of the vehicle 8 to provide a total traveling distance.
  • the trip meter indicates a section traveling distance from a location at a time at which an initializing operation is performed by the driver V of the vehicle 8 through a location at the present time.
  • the turn signal sensor detects the direction of a turn signal operated by the driver V of the vehicle 8.
  • the water temperature sensor detects the engine cooling water temperature.
  • the surrounding cameras 164 are imaging devices that capture surrounding images of the vehicle.
  • the surrounding cameras 164 are desirably located at a plurality of locations so that images can be taken from the sides through the rear of the vehicle 8.
  • the surrounding cameras 164 may be provided at a left rear corner, a right rear corner, and at a rear section of the vehicle on the roof or bumpers of the vehicle 8.
  • An imaging device located at the rear section is called a back monitor, but such a rear surrounding camera 164 is not limited to a back monitor.
  • the surrounding cameras 164 may be disposed on side mirrors, pillars, side portions of the roof, or doors.
  • the surrounding cameras 164 may include an imaging device for capturing a forward direction image.
  • such a surrounding camera 164 may be mounted on a rear face of or near a rear-view mirror.
  • the surrounding cameras 164 may be monocular cameras or stereo cameras. In a case of monocular cameras or stereo cameras capable of obtaining distance information, the radar sensors 163 are not needed. However, if the radar sensors 163 are provided in addition to the surrounding cameras 164 capable of obtaining distance information, fusion (integration) of the distance information of the surrounding cameras 164 and the distance information of the radar sensors 163 enables compensating for respective disadvantages and obtaining high-precision distance information. In addition to the radar sensors 163 and the surrounding cameras 164, also sonic sensors (ultrasonic sensors) or the like may be provided.
  • the radar sensors 163 transmit radio waves around the vehicle 8, such as forward of, to the sides of, and rearward of the vehicle 8, and receive a radio wave reflected by an object.
  • the installation location of the radar sensors 163 may be locations where an obstacle around the vehicle 8 can be detected.
  • the radar sensors 163 use a time of flight (TOF) method in which a distance to an object is detected according to a time from transmission to reception of a radio wave and a direction of the object is detected according to the radio wave emitting direction of the radar.
  • a light detection and ranging or laser imaging detection and ranging (LIDAR) sensor is known as a TOF type radar sensor.
  • in a radar employing a frequency modulation continuous wave (FMCW) method, a direction of an object is estimated by detecting phase shifts of received waves received with a plurality of receiving antennas.
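  • The TOF relation described above can be written as follows; this is the elementary round-trip relation, with Δt the time from transmission to reception of the wave.

```latex
% Distance from round-trip time of flight \Delta t (TOF method above);
% the factor 1/2 accounts for the wave traveling out and back:
d = \frac{c \,\Delta t}{2}, \qquad c \approx 3 \times 10^{8}\ \mathrm{m/s}
```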
  • the occupant state information sensor 165 is a sensor that detects occupant state information directly or indirectly from an occupant of the vehicle 8.
  • a typical example is a face camera.
  • the face camera captures an image of an occupant of the vehicle 8 and performs face authentication to identify the occupant of the vehicle 8. In addition, it is possible to detect the face direction and the line of sight direction from the thus captured face image.
  • the occupant state information sensor 165 may be, for example, an electrocardiogram sensor, a heart rate sensor, a blood pressure sensor, a body temperature sensor, a pulse detection sensor, a respiration sensor, a perspiration sensor, a blinking sensor, a pupil sensor, an electroencephalogram sensor, or a myoelectric potential sensor.
  • the occupant state information sensor 165 may have a form of, for example, a wristwatch-type wearable terminal (smart watch) worn by an occupant of the vehicle 8.
  • the VICS receiving device 166 receives a radio wave transmitted from a VICS.
  • a VICS is a system that transmits traffic information such as traffic jam information and traffic restriction information on a real time basis to an on-vehicle device using a FM multiplex broadcast or a beacon.
  • the external communication device 167 connects to the Internet or the like through a network such as 3G, 4G, 5G, LTE, or wireless LAN and receives various information. For example, weather information such as rain, snow or fog can be received through the external communication device 167.
  • the external communication device 167 can obtain, for example, traffic signal state information and a time to wait until a change of a traffic signal.
  • the VICS receiving device 166 and the external communication device 167 may perform roadside-to-vehicle communication.
  • the external communication device 167 may obtain information detected by another vehicle 6 through inter-vehicle communication.
  • An advanced driver assistance system (ADAS), which not only displays information and provides warning but may also control the vehicle 8, may also be provided in the detecting apparatus 16.
  • a corresponding ADAS ECU cooperates with the engine ECU 12, the braking ECU 13, and the steering ECU 14 to provide various operational assistance based on distance information with respect to an object detected by the radar sensors 163 or the surrounding cameras 164 or detected by the radar sensors 163 and the surrounding cameras 164.
  • the ADAS ECU performs acceleration/deceleration control during following a preceding vehicle, automatic braking control, control for avoidance of deviation from a traveling lane, lane keeping traveling control, and steering control to avoid collision with an object.
  • the ADAS ECU identifies road paint, such as a white line, from an image taken by the surrounding cameras 164.
  • the ADAS ECU controls driving power and braking power to maintain a target distance depending on the vehicle speed.
  • in automatic braking control, the ADAS ECU performs, according to a TTC, warning, displaying an indication to urge the driver to press the brake pedal down, winding up the seat belt when there is a high possibility of collision, and braking to avoid collision.
  • the ADAS ECU identifies a white line (lane partitioning line) from a captured image and applies steering torque in a direction opposite to a direction of deviation from the traveling lane.
  • in lane keeping traveling control, a center of a traveling lane is set as a target traveling line, and a steering torque proportional to a deviation from the target traveling line is added in the direction opposite to the deviating direction, as sketched below.
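  • The lane-keeping behavior just described amounts to a proportional control law; the notation below, including the gain symbol, is an assumed illustration rather than a formula from this disclosure.

```latex
% Proportional lane-keeping law implied by the description above.
% K_p is an assumed proportional gain and \Delta y the lateral deviation
% from the target traveling line; the minus sign opposes the deviation:
\tau_{\text{steer}} = -K_p \, \Delta y
```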
  • in steering control to avoid collision with an object, a traveling line for avoiding the object is determined, and a steering torque is applied to follow the traveling line.
  • when the radar sensors 163 or the surrounding cameras 164 detect a vehicle traveling in an area (blind area) that is not reflected in a door mirror at an adjacent lane, an occupant is warned by the ADAS ECU. Such assistance is called a blind spot monitor.
  • Fig. 5 is a block diagram depicting an example of a functional configuration of the identifying apparatus 15.
  • the identifying apparatus 15 includes an object determining unit 151 and a relative position determining unit 152.
  • the object determining unit 151 analyzes surrounding image data obtained by the surrounding cameras 164 and determines a type of an object indicated by the thus obtained data. In the present embodiment, the object determining unit 151 determines whether the object is, for example, another vehicle, a pedestrian, a motorcycle, or the like.
  • each pixel or pixel block of surrounding image data includes distance information.
  • Such surrounding image data is called a distance image.
  • the object determining unit 151 can determine an object from not only surrounding image data but also from radar ranging information. For example, if a density of points indicated by LIDAR data is sufficiently high, a shape of the object can be obtained, and thus, the shape can be analyzed to determine the type of the object.
  • An image recognition method using a machine learning technique is one of methods by which the object determining unit 151 determines a type of an object.
  • A machine learning technique is a technique for a computer to acquire learning ability like the learning ability of a human being. More specifically, machine learning is a technique in which a computer autonomously generates an algorithm necessary for implementing data identification from pre-loaded learning data, and then applies the algorithm to new data to make a prediction.
  • a specific learning method with respect to a machine learning technique may be any one of a supervised learning method, an unsupervised learning method, a semi-supervised learning method, a reinforcement learning method, a deep learning method, and any combination of these learning methods; and any other learning method may also be used.
  • Machine learning techniques include a perceptron, deep learning, support vector machine, logistic regression, naive Bayes, decision tree, random forest, and the like.
  • the relative position determining unit 152 determines a relative position (distance and direction) of an object relative to the vehicle 8 or relative to an indication piece image 62 representing the vehicle 8.
  • Fig. 6 is a diagram depicting an example of a position of an object relative to the vehicle 8 or an indication piece image 62 representing the vehicle 8.
  • the indication piece image 62 (the center of a virtual image I) representing the vehicle 8 is displayed at a position of Q with respect to the vehicle width direction and P with respect to the vehicle length direction from the center of the vehicle 8.
  • when an object such as another vehicle 6 is detected at coordinates (A, B) relative to the vehicle 8, the relative position determining unit 152 converts the coordinates (A, B) to coordinates (C, D) based on the indication piece image 62 representing the vehicle 8.
  • the indication piece image 62 representing the vehicle 8 is provided as the virtual image I and thus is at a predetermined position in front of the vehicle 8.
  • the coordinates (C, D) are obtained by subtracting the offset of the indication piece image 62 from the coordinates (A, B), for example, C = A - Q and D = B - P.
  • the relative position determining unit 152 performs the same process for each of the radar sensors or the surrounding cameras provided in the vehicle to determine the position of an object relative to the indication piece image 62 representing the vehicle 8.
  • the distance and direction are also determined by determining the relative positions.
  • the distance from the indication piece image 62 representing the vehicle 8 to the other vehicle 6 is denoted as L 2 , and the direction of the other vehicle 6 is denoted as θ 2 .
  • the reference direction (regarded as a direction at the angle 0 degrees) with respect to the directions θ 1 and θ 2 may be set appropriately.
  • for example, the 9 o'clock direction may be determined as the angle 0 degrees.
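  • A minimal sketch of this conversion and of the resulting polar form (L 2 , θ 2 ) follows; the axis conventions, the offsets Q and P, and all function names are assumptions chosen for illustration, not values or interfaces defined by this disclosure.

```python
import math

# Sketch of the relative-position conversion described above. Axis
# conventions (X rightward, Y forward) and the offsets Q (width direction)
# and P (length direction) of the indication piece image 62 are assumed.

Q = 0.0  # lateral offset of the indication piece image 62 (assumed)
P = 5.0  # forward offset of the indication piece image 62 in meters (assumed)

def to_image_coords(a: float, b: float) -> tuple[float, float]:
    """Convert object coordinates (A, B) relative to the center of the
    vehicle 8 into coordinates (C, D) relative to the indication piece
    image 62, by subtracting the image offset."""
    return a - Q, b - P

def to_polar(c: float, d: float) -> tuple[float, float]:
    """Distance L2 and direction theta2 of the object seen from the
    indication piece image 62, with 9 o'clock taken as 0 degrees."""
    distance = math.hypot(c, d)
    # atan2 measures from the 3 o'clock direction; shift by 180 degrees
    # so that the 9 o'clock direction becomes 0 degrees.
    angle = (math.degrees(math.atan2(d, c)) + 180.0) % 360.0
    return distance, angle

c, d = to_image_coords(2.0, 15.0)  # another vehicle 6 at (A, B) = (2, 15)
l2, theta2 = to_polar(c, d)        # -> L2 ~ 10.2 m, theta2 ~ 258.7 degrees
```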
  • the identifying apparatus 15 outputs type information indicating a type of an object determined by the object determining unit 151, distance information indicating a distance from the vehicle 8 to the object obtained by the relative position determining unit 152, and position information indicating the relative position of the object relative to the vehicle 8 to the control unit 20.
  • Fig. 7 is a block diagram depicting an example of a hardware configuration of the control unit 20.
  • the control unit 20 includes a field-programmable gate array (FPGA) 201, a central processing unit (CPU) 202, a read-only memory (ROM) 203, a random access memory (RAM) 204, an interface (I/F) 205, a bus line 206, a LD driver 207, and a MEMS controller 208.
  • the FPGA 201, the CPU 202, the ROM 203, the RAM 204, and the I/F 205 are interconnected via the bus line 206.
  • the CPU 202 controls each of the functions of the control unit 20.
  • the ROM 203 stores a program 203p executed by the CPU 202 to control each of the functions of the control unit 20.
  • the RAM 204 stores the program 203p, and is used by the CPU 202 as a work area for executing the program 203p.
  • the RAM 204 includes an image memory 209.
  • the image memory 209 is used to generate an image that is projected as a virtual image I.
  • the I/F 205 is an interface for communicating with the identifying apparatus 15 and the detecting apparatus 16 and is connected to a controller area network (CAN) bus or an Ethernet (registered trademark) of the vehicle 8.
  • CAN controller area network
  • Ethernet registered trademark
  • the FPGA 201 controls the LD driver 207 based on an image generated by the CPU 202.
  • the LD driver 207 drives the LDs of the light source unit 101 of the projecting unit 10 to control light emission of the LDs according to the image.
  • the FPGA 201 operates the light deflector 102 of the projecting unit 10 through the MEMS controller 208 so that the laser beam is deflected in a direction corresponding to a pixel position of the image.
  • Fig. 8 is a block diagram depicting an example of a functional configuration of the control unit 20.
  • the control unit 20 includes a remaining distance obtaining unit 21, a determining unit 22, a direction determining unit 23, an image generating unit 24, and an output unit 25.
  • the functions of the remaining distance obtaining unit 21, the determining unit 22, the direction determining unit 23, and the image generating unit 24 are implemented by the CPU 202 of Fig. 7 executing predetermined programs.
  • the function of the output unit 25 is implemented by the I/F 205 depicted in Fig. 7 and so forth.
  • the remaining distance obtaining unit 21 obtains remaining distance information indicating the remaining distance that the vehicle 8 must travel to reach a location at which a presentation image will be provided, based on route guidance information input from the car navigation system 11, and outputs the obtained result to the determining unit 22.
  • the determining unit 22 determines whether the remaining distance is smaller than or equal to a predetermined presentation threshold and outputs the determination result to the direction determining unit 23.
  • the direction determining unit 23 determines a planned traveling direction with respect to a planned traveling route of the vehicle 8 based on the route guidance information input from the car navigation system 11 and outputs the determination result to the image generating unit 24. For example, when there is an intersection in the planned traveling route of the vehicle 8, the direction determining unit 23 determines a direction in which the vehicle 8 is to travel at the intersection.
  • the image generating unit 24 generates a presentation image for providing a planned traveling direction with respect to the planned traveling route of the vehicle 8 according to a planned traveling direction determined by the direction determining unit 23. Image data of the thus generated presentation image is output to the projecting unit 10 through the output unit 25.
  • the projecting unit 10 projects a projection image including the presentation image to the windshield 71 based on the input presentation image data and allows the driver V of the vehicle 8 to see a virtual image of the presentation image.
  • the image generating unit 24 may generate not only the presentation image but also another image such as a remaining distance image for displaying remaining distance information obtained by the remaining distance obtaining unit 21 or a vehicle speed image for displaying vehicle speed information with respect to the vehicle 8 detected by the vehicle speed sensor 161 of Fig. 4. Then, the projection image data where the presentation image, the remaining distance image, and the vehicle speed image, for example, are arranged in one image can be output to the projecting unit 10 through the output unit 25.
  • Fig. 9 is a flowchart depicting an example of a process of the control unit 20.
  • in step S91, the remaining distance obtaining unit 21 and the direction determining unit 23 receive route guidance information input from the car navigation system 11.
  • in step S92, the remaining distance obtaining unit 21 obtains remaining distance information indicating the remaining distance that the vehicle 8 must travel to reach a location at which a presentation image will be provided, based on the route guidance information, and outputs the obtained result to the determining unit 22.
  • in step S93, the determining unit 22 determines whether the remaining distance has become smaller than or equal to the presentation threshold.
  • when it is determined in step S93 that the remaining distance is still greater than the presentation threshold (No in step S93), the process starting from step S91 is repeated.
  • when it is determined in step S93 that the remaining distance has become smaller than or equal to the presentation threshold (Yes in step S93), the direction determining unit 23 determines a planned traveling direction of the vehicle 8 based on the route guidance information in step S94.
  • in step S95, the image generating unit 24 generates a presentation image according to the planned traveling direction determined by the direction determining unit 23.
  • in step S96, the output unit 25 outputs the presentation image data to the projecting unit 10.
  • in this way, the control unit 20 generates a presentation image and outputs the presentation image to the projecting unit 10.
  • the image generating unit 24 may also generate the above-mentioned remaining distance image and vehicle speed image, for example, and output projection image data including such images to the projecting unit 10 through the output unit 25.
  • until the remaining distance becomes smaller than or equal to the presentation threshold, a projection image including a remaining distance image and a vehicle speed image but not including a presentation image is superimposed on a forward view of the vehicle 8.
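  • The decision flow of Fig. 9 can be summarized in the following minimal sketch, using the example thresholds mentioned later in the text (display threshold 500 m, presentation threshold 100 m). The image "generation" here just returns labels; all names are hypothetical stand-ins for the units of the control unit 20, not interfaces defined by this disclosure.

```python
# Minimal sketch of the control flow of Fig. 9 (steps S91-S96).

DISPLAY_THRESHOLD_M = 500.0       # remaining distance image shown below this
PRESENTATION_THRESHOLD_M = 100.0  # presentation image shown at or below this

def control_step(remaining_m: float, planned_direction: str,
                 speed_kmh: float) -> list[str]:
    """One loop iteration: decide which images the projection image holds."""
    images = [f"speed: {speed_kmh:.0f} km/h"]
    if remaining_m <= PRESENTATION_THRESHOLD_M:   # S93: Yes -> S94, S95
        images.append(f"presentation arrows: {planned_direction}")
    elif remaining_m <= DISPLAY_THRESHOLD_M:      # situation of Fig. 10
        images.append(f"remaining: {remaining_m:.0f} m")
    return images                                 # handed to projecting unit (S96)

print(control_step(200.0, "left", 70.0))  # -> speed + remaining distance
print(control_step(80.0, "left", 65.0))   # -> speed + presentation arrows
```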
  • a common car navigation system has a function of guiding a driver of a vehicle at each intersection and at each branch point of the road according to the vehicle's planned traveling route. Therefore, also in the present embodiment, an example where a presentation image is displayed at each intersection or branch point to guide a driver of a vehicle according to a planned traveling route will be described.
  • Fig. 10 is a diagram depicting an example of a display superimposed on a forward view 300 of the vehicle 8 when a remaining distance is greater than a presentation threshold.
  • the forward view 300 includes a preceding vehicle 301, a road 302, and the like. In addition, a vehicle speed image 303 and a remaining distance image 304 are superimposed on the forward view 300.
  • the vehicle speed image 303 is an image depicting a current vehicle speed of the vehicle 8. In the example of Fig. 10, the speed "70 km/h" is displayed as the vehicle speed.
  • the remaining distance image 304 is an image depicting the remaining distance that the vehicle 8 must travel to reach a location at which a presentation image will be provided.
  • the place name "Tsuzuki I.C." is displayed as a location at which a presentation image will be provided, and a remaining distance of 200 m up to the "Tsuzuki I.C." is also displayed.
  • the remaining distance image 304 is displayed when the remaining distance becomes smaller than or equal to a predetermined display threshold (e.g., 500 m) and is displayed until the remaining distance becomes smaller than or equal to a presentation threshold (e.g., 100 m).
  • a presentation image has not yet been displayed because the remaining distance is still greater than the presentation threshold.
  • Fig. 11 is a diagram depicting an example of a display on a forward view 300a of the vehicle 8 when the remaining distance is smaller than or equal to the presentation threshold.
  • a road 305 crossing a road 302 appears at the forward view 300a.
  • the remaining distance image 304 depicted in Fig. 10 is not displayed because the remaining distance has become smaller than or equal to the presentation threshold, and presentation images 306 are newly displayed.
  • the presentation images 306 indicate a planned traveling direction of the vehicle 8 so as to guide the driver of the vehicle to the crossing road 305. As depicted in Fig. 11, the presentation images 306 include images of two arrow graphic symbols 306a and 306b.
  • the arrow graphic symbols 306a and 306b have directionality to indicate the crossing road 305.
  • the direction indicated by the arrow graphic symbols 306a and 306b corresponds to the planned traveling direction of the vehicle 8 (the planned moving direction of the movable body).
  • Each of the arrow graphic symbols 306a and 306b has an image size smaller than a size corresponding to a projection image angle of the projecting unit 10 with respect to the Z direction, and the two symbols are displayed at positions spaced from each other along the Z direction.
  • the Z direction depicted by the arrow Z in Fig. 11 corresponds to a direction along a vertical direction of the forward view 300a and the X direction corresponds to a direction along a horizontal direction of the forward view 300a.
  • The same applies to the X and Z directions depicted in Figs. 12A through 15, described hereinafter. Relationships between a size of a projection image and a projection image angle will be described later with reference to Fig. 13.
  • the driver V of the vehicle 8 can easily know a planned traveling direction with respect to a planned traveling route by viewing the presentation images 306 displayed on the forward view 300a.
  • the arrow graphic symbols 306a and 306b are examples of "a plurality of graphic symbols". In the example of Fig. 11, the two arrows are depicted. However, the number of arrows may be further increased.
  • a graphic symbol is not limited to an arrow, and may be any graphic symbol as long as the graphic symbol indicates a direction.
  • an area 307 depicts an area where a projection image is displayed.
  • a projection image is displayed within the area 307 on the forward view 300a.
  • the presentation image 306 is included in the projection image and thus, is displayed within the area 307.
  • the area 307 is depicted to explain an area in which a projection image is displayed, and the contour line (broken line) of the area 307 is not actually displayed on the forward view 300a of the vehicle 8.
  • Figs. 12A and 12B are diagrams depicting relationships between a size of a projection image and a presentation image.
  • Fig. 12A depicts relationships between a size of a projection image and a presentation image according to a comparative example
  • Fig. 12B depicts relationships between a size of a projection image and a presentation image according to the present embodiment.
  • Figs. 12A and 12B correspond to the view of the superimposed display of the projection image on the forward view of each of Figs. 10 and 11 but are views from the top (in the direction of looking down from the top) of the vehicle 8.
  • in the comparative example depicted in Fig. 12A, a presentation image 306' that is a bold arrow graphic symbol extending along the route is displayed for presenting a planned traveling route for the vehicle 8 traveling along a road 302.
  • the projection image including the presentation image 306' requires a size capable of providing a superimposing area 308'.
  • a "superimposing area” is an area, determined by the projection image angle and an angle of depression (angle of looking down) of the projecting unit 10, where a forward view and a projection image are superimposed.
  • in the present embodiment depicted in Fig. 12B, in contrast, a presentation image 306, which is to provide only a planned traveling direction, does not include the whole route of turning to the left.
  • therefore, the generated projection image has a size corresponding to a smaller superimposing area 308 compared to the superimposing area 308' depicted in Fig. 12A.
  • Fig. 13 is a diagram depicting an example of relationships between a size of a projection image and a projection image angle of the projecting unit 10.
  • a virtual image I' depicts a virtual image with respect to a comparative example.
  • the virtual image I' is a virtual image of a projection image viewed by a driver V when the projection image is projected with a projection image angle θ'.
  • a virtual image I depicts a virtual image according to the present embodiment.
  • the virtual image I is a virtual image of a projection image viewed by a driver V when the projection image is projected with a projection image angle θ.
  • the size h along the Z direction of the virtual image I of the projection image including a presentation image 306 according to the present embodiment is smaller than the size h' along the Z direction of the virtual image I' of the projection image including a presentation image 306' according to the comparative example. Therefore, the presentation image 306 can be provided, and thus route guidance information can be provided, with a smaller projection image angle θ compared to the projection image angle θ'.
  • the projection image angle θ corresponds to the maximum projection image angle of the projecting unit 10 with respect to the Z direction along the vertical direction.
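  • Assuming simple viewing geometry (a virtual image of height h seen at distance R from the point of view E), the required projection image angle can be approximated as follows; this is an illustrative simplification of the optics in Fig. 13, not a formula from this disclosure.

```latex
% Relation between the virtual-image size h, the distance R from the
% point of view E to the virtual image I, and the projection image
% angle \theta, under assumed simple viewing geometry:
\theta = 2\arctan\!\left(\frac{h}{2R}\right) \approx \frac{h}{R}
\quad \text{for small angles,}
% so the smaller size h of the present embodiment allows a smaller
% projection image angle \theta than the comparative size h' allows.
```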
  • Fig. 14 is a diagram depicting an example of relationships between a display distance of the arrow graphic symbols 306a and 306b and a superimposing area in the presentation image 306.
  • a display distance d indicates a display distance between the arrow graphic symbols 306a and 306b that neighbor each other.
  • a superimposing distance e indicates a length (distance) of the superimposing area along the traveling direction of the vehicle 8 (the direction perpendicular to both the X direction and the Z direction).
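  • The following sketch illustrates one possible reading of Fig. 14: the arrow graphic symbols, spaced the display distance d apart along the traveling direction, should fit within the superimposing distance e. The relation d·(n − 1) ≤ e is our assumption for illustration, not a formula stated in this disclosure.

```python
# Illustrative check, under assumed geometry, that n arrow graphic symbols
# spaced d apart stay inside a superimposing area of length e (Fig. 14).

def symbols_fit(n_symbols: int, spacing_d_m: float, superimpose_e_m: float) -> bool:
    """True if n symbols at spacing d span no more than the area length e."""
    return spacing_d_m * (n_symbols - 1) <= superimpose_e_m

print(symbols_fit(2, 8.0, 20.0))  # True: two arrows 8 m apart within 20 m
print(symbols_fit(4, 8.0, 20.0))  # False: span of 24 m exceeds 20 m
```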
  • a presentation image to present a planned traveling direction of the vehicle 8 is generated as route guidance information, and a projection image including the presentation image is projected and displayed by the projecting unit 10 provided at the vehicle 8.
  • a presentation image 306 can be provided with a small projection image angle θ (see Fig. 13) to provide route guidance information. Therefore, the size and the cost of the display apparatus 1 can be prevented from increasing.
  • a presentation image 306 includes arrow graphic symbols 306a and 306b that are a plurality of graphic symbols each having an image size smaller than a size corresponding to the projection image angle of the projecting unit 10.
  • the arrow graphic symbols 306a and 306b are displayed at positions spaced from each other along the Z direction.
  • the area of the presentation image 306 superimposed on the road in the forward view can be made smaller as compared to the above-described case where a presentation image including a bold arrow graphic symbol extending along a planned traveling route is superimposed on the road in the forward view. This eliminates visual complexity with respect to the forward view for the driver V of the vehicle 8.
  • Fig. 15 is a diagram depicting an example of display positions of the arrow graphic symbols 306a and 306b of the presentation image 306.
  • the simplest way is to display the arrow graphic symbols 306a and 306b along a central axis 309 of the driver V.
  • a reference position 310 with respect to an intersection may be determined and the arrow graphic symbols 306a and 306b may be arranged along a straight line connecting the reference position 310 and the driver V.
  • the virtual image of the presentation image 306 presents the planned traveling direction with a plurality of graphic symbols arranged along the vertical direction of the forward view 300a.
  • the plurality of graphic symbols presenting the planned traveling direction are positioned such that, among the plurality of graphic symbols, a graphic symbol nearer to the intersection in the vertical direction of the forward view is positioned on the intersection turning direction side in the horizontal direction of the forward view.
  • the arrow graphic symbol 306a presenting the planned traveling direction at a position nearer to the crossing road 305 in the vertical direction (the Z direction) of the forward view 300a than the arrow graphic symbol 306b, presents the planned traveling direction at a position that is also nearer to the crossing road 305 in the horizontal direction (the X direction) of the forward view 300a than the arrow graphic symbol 306b (i.e., on the intersection turning direction side of the arrow graphic symbol 306b in the horizontal direction (the X direction) of the forward view 300a).
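  • A minimal sketch of the placement of Fig. 15 follows: the symbols are placed at evenly spaced points on the straight line connecting the driver V (or the central axis 309) and the reference position 310, so the symbol nearer the crossing road vertically is also nearer it horizontally. The linear interpolation and the coordinate values are assumptions chosen for illustration.

```python
# Sketch of the symbol placement of Fig. 15 under assumed screen coordinates.

def arrow_positions(n: int, driver_xy: tuple[float, float],
                    reference_xy: tuple[float, float]) -> list[tuple[float, float]]:
    """Place n symbols evenly between the driver axis and reference point."""
    (x0, y0), (x1, y1) = driver_xy, reference_xy
    positions = []
    for i in range(1, n + 1):
        t = i / (n + 1)  # fractional position along the connecting line
        positions.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return positions

# Two arrows between the driver axis (x = 0) and a reference point up-left:
print(arrow_positions(2, (0.0, 0.0), (-3.0, 9.0)))
# -> 306b nearer the driver; 306a nearer (and left, toward) the crossing road
```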
  • the remaining distance image 304 according to the present embodiment described above may also be regarded as a presentation image that is to provide route guidance information because the remaining distance image 304 provides information concerning a location where a presentation image will be provided and information concerning a remaining distance up to the location for the vehicle 8 to reach.
  • because the remaining distance image 304 is text information and does not require a large projection image angle to display, its influence on the projection image angle is small.
  • the control unit 20 may instead have some or all of the functions that the identifying apparatus 15 has.
  • another element such as the identifying apparatus 15, the car navigation system 11, or the detecting apparatus 16 may instead have some of the functions that the control unit 20 has.
  • Embodiments of the present invention include a display method.
  • the display method for displaying a projection image includes generating a presentation image used to present route guidance information; and projecting, by a projecting unit provided at a movable body, the projection image including the presentation image.
  • the presentation image presents a planned moving direction of the movable body.
  • the embodiments of the present invention include a program.
  • the program when executed by a computer included in a display apparatus configured to display a projection image, causes the computer to generate a presentation image used to present route guidance information; and project, using a projecting unit provided at a movable body, the projection image including the presentation image.
  • the presentation image presents a planned moving direction of the movable body.
  • each function described above may be implemented by a processing circuit. The processing circuit may be a processor programmed to perform each function by software, such as a processor implemented by an electronic circuit; or an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), or a conventional circuit module, designed to perform each function described above.

Abstract

A display apparatus for displaying a projection image includes an image generating unit configured to generate a presentation image to be used to present route guidance information; and a projecting unit provided at a movable body and configured to project the projection image including the presentation image. The presentation image presents a planned moving direction of the movable body.

Description

DISPLAY APPARATUS, MOVABLE BODY, DISPLAY METHOD, PROGRAM, AND NON-TRANSITORY RECORDING MEDIUM
The present invention relates to a display apparatus, a movable body, a display method, a program, and a non-transitory recording medium.
A display apparatus, such as a head-up display, is known with which various information is superimposed on a forward view of a vehicle by projecting a projection image including the information onto a windshield of the vehicle to allow a driver of the vehicle to see the projection image.
A technique is disclosed of displaying, as information displayed by such a display apparatus, an image indicating a planned traveling route (planned moving route) of a vehicle as route guidance information for guiding the driver of the vehicle along the route, and of changing a display mode based on a remaining distance up to a location at which information is to be provided to the driver (see, for example, PTL 1).
However, the technique disclosed in PTL 1 may require a large projection image angle to display an image, resulting in an increase in the size and cost of a display apparatus.
An object of the present disclosure is to enable presenting route guidance information even with a small projection image angle.
A display apparatus according to an aspect of the present invention displays a projection image, and includes an image generating unit configured to generate a presentation image to be used to present route guidance information; and a projecting unit provided at a movable body and configured to project the projection image including the presentation image. The presentation image presents a planned moving direction of the movable body.
Effects of Invention
According to the aspect of the present invention, route guidance information can be presented even with a small projection image angle.
Other objects, features and advantages of the present invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings.

Fig. 1 is a view depicting an example of a configuration in which a display apparatus according to a present embodiment is mounted at a vehicle.
Fig. 2 is a view depicting an example of a configuration of a display unit.
Fig. 3 is a block diagram depicting an example of a configuration of an on-vehicle system in which the display apparatus is provided in the vehicle.
Fig. 4 is a block diagram depicting an example of a configuration of a detecting apparatus.
Fig. 5 is a block diagram depicting an example of a functional configuration of an identifying apparatus.
Fig. 6 is a diagram depicting a position of an obstacle relative to the vehicle or an indication piece image of the vehicle.
Fig. 7 is a block diagram depicting an example of a hardware configuration of a control unit according to the present embodiment.
Fig. 8 is a block diagram depicting an example of a functional configuration of the control unit according to the present embodiment.
Fig. 9 is a flowchart depicting an example of a process performed by the control unit according to the present embodiment.
Fig. 10 is a view depicting an example of displaying at a forward view performed when a remaining distance is greater than a presentation threshold.
Fig. 11 is a view depicting an example of displaying at a forward view performed when the remaining distance is smaller than or equal to the presentation threshold.
Fig. 12A is a diagram depicting relationships between a size of a projection image and a presentation image in a comparative example.
Fig. 12B is a diagram depicting relationships between a size of a projection image and a presentation image according to the present embodiment.
Fig. 13 is a diagram depicting an example of relationships between a size of a projection image and a projection image angle.
Fig. 14 is a diagram depicting an example of relationships between a display spacing of arrow graphic symbols and a superimposing area.
Fig. 15 is a view depicting an example of displaying positions of arrow graphic symbols.

Hereinafter, an embodiment of the present invention will be described with reference to the drawings. In each of the drawings, the elements having the same configurations may be given the same reference numerals and overlapping descriptions may be omitted.
In the present embodiment, a presentation image is generated to present a planned moving direction of a movable body as route guidance information, and a projection image including the presentation image is projected and displayed by a projecting unit provided at the movable body. By projecting a small image indicating the planned moving direction of the movable body rather than projecting a large image indicating the planned moving route of the movable body, it is possible to present route guidance information even with a small projection image angle.
<Terms in embodiment>
(Movable body)
A movable body means an object capable of motion. For example, a movable body may be a land movable body capable of moving on land, such as an automobile, a vehicle, a train, or a forklift; an aerial movable body capable of moving in the air, such as an airplane, a balloon, or a drone; or a water movable body capable of moving on water, such as a ship, a boat, a vessel, or a steam ship.
(Forward view)
A forward view is a view in a forward direction of a movable body seen by a driver of the movable body. For example, a forward view is a view in a forward direction of a vehicle seen by a driver through a windshield of the vehicle.
(Planned moving route)
A planned moving route is a portion of a route along which a movable body is planned to move from a starting point to a destination, namely, the portion through which the movable body will move immediately after the present time. In the present embodiment, because a vehicle is used as the example of a movable body, the planned moving route will be referred to as a planned traveling route of the vehicle.
(Planned moving direction)
A planned moving direction means a direction in which a movable body is planned to move while moving through a planned moving route. For example, in a case where there is an intersection on the planned moving route, a direction in which the movable body will move at the intersection corresponds to a planned moving direction. In the present embodiment, because a vehicle is used as the example of a movable body, the planned moving direction will be referred to as a planned traveling direction of the vehicle.
(Route guidance information)
Route guidance information denotes information to be used to guide a movable body according to a route from its starting point to its destination. Route guidance information includes information concerning a planned moving route (planned traveling route), a planned moving direction (planned traveling direction), and a location at which route guidance information is provided.
(Projection image)
A projection image denotes an image projected by a projecting unit and visible to a driver of a movable body.
(Presentation image)
A presentation image is an image indicating route guidance information. As a result of being projected by a projecting unit, a presentation image becomes visible to a driver of a movable body. A presentation image according to the present embodiment presents a planned moving direction of a movable body (a planned traveling direction of a vehicle).
(Driver)
A driver is a person who operates or controls a movable body.
Hereinafter, the present embodiment will be described for a case of a head-up display (HUD) provided in a vehicle, as an example of a display apparatus.
<Example of configuration>
Fig. 1 is a diagram depicting an example of a configuration in which a display apparatus 1 according to the present embodiment is provided in a vehicle 8.
The display apparatus 1 is embedded in the inside of a dashboard provided in the vehicle 8 and projects a projection image toward a windshield 71 through an emitting window 3 provided at an upper surface of the display apparatus 1. The projection image is displayed as a virtual image I beyond the windshield 71 in a Y direction and is visible to a driver V of the vehicle 8. By viewing the virtual image I, the driver V can obtain useful information for driving with less line-of-sight movement while keeping the eyes on other vehicles or the road in the Y direction of the vehicle 8.
As long as the display apparatus 1 is capable of projecting a projection image onto the windshield 71, the display apparatus 1 may be installed at the ceiling, a sun visor, or the like, other than the dashboard.
As depicted in Fig. 1, the display apparatus 1 includes a projecting unit 10 and a control unit 20. The projecting unit 10 projects a projection image toward the windshield 71. As examples of a method for projecting a projection image by the projecting unit 10, a laser scanning method and a panel method are known. The laser scanning method is a method in which a laser beam emitted from a laser light source is deflected by a two-dimensional scanning device to form an intermediate image (a real image projected onto a screen that will be described later). The panel method is a method of using an imaging device such as a liquid crystal panel, a digital micro-mirror device (DMD) panel, or a vacuum fluorescent display (VFD) to form an intermediate image.
Unlike the panel method, in which the full screen emits light and an image is formed through partial shielding of the full-screen light, the laser scanning method can form a high-contrast image because light emission or non-emission is controlled for each pixel, which makes the laser scanning method suitable. It has been found that high contrast improves visibility and allows vehicle occupants to view information with fewer attention resources than a HUD employing the panel method.
In particular, in the panel method, light that cannot be completely shielded may be projected even in an area where no information is provided, and a display frame may be projected in the area where the HUD is capable of displaying an image (such a phenomenon is called a postcard). The laser scanning method is free of such a phenomenon, and only the content can be projected. In particular, in the augmented reality (AR) technology, the realism of superimposing a generated image on a real landscape is improved.
According to the AR technology, our visible world is virtually augmented by superimposing an image of an object that does not exist in a real landscape on the real landscape. However, even a HUD employing the panel method may be used as long as the HUD can display information in a manner providing visibility with fewer attention resources (to avoid eye fatigue).
Fig. 2 is a diagram depicting an example of a configuration of the projecting unit 10. The projecting unit 10 includes a light source unit 101, a light deflector 102, a mirror 103, a screen 104, and a concave mirror 105. It should be noted that Fig. 2 merely depicts an example. The projecting unit 10 may include other elements, and need not include all the elements depicted.
The light source unit 101 includes three laser light sources corresponding to red (R), green (G), and blue (B) (hereinafter referred to as laser diodes (LDs)), coupling lenses, apertures, composite elements, and lenses, and combines laser beams emitted from the three LDs and directs the combined laser beam toward a reflective surface of the light deflector 102. The laser beam directed to the reflective surface of the light deflector 102 is deflected two-dimensionally by the light deflector 102.
As the light deflector 102, one microscopic mirror oscillating with respect to two orthogonal axes or two microscopic mirrors each oscillating or pivoting with respect to one axis can be used. The light deflector 102 can be a micro electro mechanical systems (MEMS) mirror fabricated through a semiconductor process, or the like. The light deflector 102 can be driven by an actuator that is driven by deformation force of a piezoelectric element. However, a galvano-mirror, a polygon mirror, or the like, also may be used as the light deflector 102.
A two-dimensionally deflected laser beam outgoing from the light deflector 102 is incident on the mirror 103 and is bent by the mirror 103 to draw a two-dimensional image (an intermediate image) on a surface (scanned surface) of the screen 104. As the mirror 103, a concave mirror or the like may be used, but a convex mirror or a planar mirror may also be used. By employing such a configuration of deflecting a laser beam with the light deflector 102 and the mirror 103, the size of the projecting unit 10 can be reduced or the arrangement of the elements can be flexibly changed.
The screen 104 is desirably a micro-lens array or a micro-mirror array having a function of diverging a laser beam with a desired divergence angle, but a diffuser plate for diffusing a laser beam, a reflector plate or a transparent plate having a smooth surface, or the like, may also be used instead. Generally, the arrangement from the light source unit 101 through the screen 104 is referred to as a HUD device. However, other elements may also be included in the HUD device.
A laser beam outgoing from the screen 104 is reflected by the concave mirror 105 and projected onto the windshield 71. The concave mirror 105 acts like a lens and has a capability of forming an image at a predetermined focal length. Thus, assuming that the concave mirror 105 is a lens, an image on the screen 104 corresponding to an object is imaged at a distance R2 determined by the focal length of the concave mirror 105. Accordingly, when viewed from an occupant such as the driver V of the vehicle 8, a virtual image I is displayed at a distance R1+R2 from the windshield 71. If the distance from the driver V of the vehicle 8 to the windshield 71 is R3, in Fig. 2, the virtual image I is displayed at a distance R (= R1 + R2 + R3) from the point of view E of the driver V of the vehicle 8.
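As a rough numerical sketch of this relationship, the distance R at which the virtual image I appears can be computed as follows. All path lengths below are assumptions chosen for illustration only and are not values taken from the embodiment:

    # Illustrative only: assumed optical path lengths, not values from the embodiment.
    R1 = 0.3  # m, optical path from the screen 104 to the windshield 71
    R2 = 2.0  # m, imaging distance determined by the focal length of the concave mirror 105
    R3 = 0.7  # m, distance from the point of view E to the windshield 71

    R = R1 + R2 + R3  # distance from the point of view E to the virtual image I
    print(f"Virtual image distance R = {R:.1f} m")  # -> 3.0 m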
At least part of the light beam incident on the windshield 71 is reflected toward the point of view E of the driver V of the vehicle 8. As a result, the driver V of the vehicle 8 can see the virtual image I obtained from magnifying the intermediate image on the screen 104 via the windshield 71. In other words, when being viewed from the point of view E of the driver V of the vehicle 8, the virtual image I obtained from magnifying the intermediate image is displayed through the windshield 71.
In general, the windshield 71 is slightly curved rather than flat. Therefore, although the imaged position of the virtual image I is determined not only by the focal length of the concave mirror 105 but also by the curved surface of the windshield 71, the distance R is determined approximately by the distance R1 + R2 as described above. The distance R1 or R2 may be increased in order to image the virtual image I remotely so that a line-of-sight movement of a viewer can be reduced. One method for increasing the distance R1 is to bend the optical path using a mirror, while a method for increasing the distance R2 is to adjust the focal length of the concave mirror 105.
It is desirable that at least one of the mirror 103 and the concave mirror 105 be designed and disposed to compensate for optical distortion, because an influence of the windshield 71 causes optical distortion resulting in a horizontal line of an intermediate image becoming convex upward or downward. Alternatively, it is desirable that a projection image be corrected in consideration of such optical distortion.
Alternatively, a combiner may be provided as a transmission reflection member on the point-of-view-E side of the windshield 71. As a result of the combiner being irradiated with light outgoing from the concave mirror 105, information can be provided in a form of a virtual image I in the same way as the case where the windshield 71 is irradiated with light outgoing from the concave mirror 105.
<Example configuration of on-vehicle system 2 in which display apparatus 1 is provided>
Next, an on-vehicle system 2 in which a display apparatus 1 is provided in a vehicle 8 will be described with reference to Fig. 3. Fig. 3 is a block diagram depicting an example of a configuration of an on-vehicle system 2 in which a display apparatus 1 is provided in a vehicle 8.
The on-vehicle system 2 includes a car navigation system 11, an engine electronic control unit (ECU) 12, the display apparatus 1, a braking ECU 13, a steering ECU 14, an identifying apparatus 15, and a detecting apparatus 16, communicating together via an on-board network NW such as a controller area network (CAN) bus.
The car navigation system 11 includes a global navigation satellite system (GNSS) such as a global positioning system (GPS) to detect the current location of the vehicle 8 and display the location of the vehicle 8 on an electronic map.
The car navigation system 11 receives a departure position and a destination that have been input, searches for a route from the departure position to the destination, and displays the route on the electronic map, or guides an occupant of the vehicle 8 in a direction of traveling by voice, letters (displayed on a display), or an animation before the course changes. The car navigation system 11 may communicate with a server via a cellular phone network or the like. In this case, the server can send the electronic map to the vehicle 8 or perform a route search.
The engine ECU 12 determines an ideal amount of fuel injection, performs advance/lag ignition timing control, controls a valve operating mechanism, and so forth according to information from various sensors and conditions of the vehicle 8. The engine ECU 12 determines whether to make a gearshift with reference to a map in which gearshift lines are set with respect to relationships between the current vehicle speed and the accelerator opening degree. The engine ECU 12 also performs acceleration and deceleration control while following a preceding vehicle. An electric motor may be used to drive the vehicle together with or instead of an engine.
The braking ECU 13 controls braking force for each wheel of the vehicle 8 even without the operation of a brake pedal by the driver V of the vehicle 8, such as antilock braking system (ABS) control, braking control while following a preceding vehicle, automatic braking control based on a time to collision (TTC) with respect to an object, and control of keeping a stopped state when starting from a slope.
The steering ECU 14 detects the steering direction and the steering amount of the steering wheel operated by the driver V of the vehicle 8 and performs power steering control for applying a steering torque in a steering direction. The steering ECU 14 performs steering to cause the vehicle 8 to travel in a direction of avoiding deviation from the running lane, in a direction of maintaining the running at the center of the running lane, or in a direction of avoiding approaching an obstacle, without the need of operation of the steering wheel by the driver V of the vehicle 8.
The detecting apparatus 16 includes a variety of sensors for detecting an object around the vehicle 8 and detecting operation information with respect to the vehicle 8 provided by the driver V.
The identifying apparatus 15 identifies an object detected by the detecting apparatus 16, the relative position (direction and distance) of the object relative to the vehicle 8, and the relative position (direction and distance) of the object relative to the indication piece image representing the vehicle 8. Information such as a vehicle speed, as well as an identification result and the relative position of an object, is input to the display apparatus 1.
<Example configuration of detecting apparatus 16>
Next, a configuration of the detecting apparatus 16 will be described with reference to Fig. 4. Fig. 4 is a block diagram depicting an example of a configuration of the detecting apparatus 16. The detecting apparatus 16 includes a vehicle speed sensor 161 for detecting a vehicle speed to be displayed by the display apparatus 1, a vehicle information sensor 162 for obtaining vehicle information to be displayed by the display apparatus 1, radar sensors 163 for detecting an object, surrounding cameras 164, an occupant state information sensor 165 for obtaining occupant state information, which is information about an occupant, a vehicle information and communication system (VICS, registered trademark) receiving device 166 for receiving traffic jam information, and an external communication device 167 connected to the Internet.
The sensors included in the detecting apparatus 16 do not need to be included in the detecting apparatus 16 collectively as long as the sensors are provided in the vehicle 8.
The vehicle speed sensor 161 has a configuration in which a magnet rotating with the shaft of the drive train system is detected by a sensor unit fixed to the vehicle body, so that a pulse wave proportional to the rotation speed is generated. A vehicle speed can be detected from the number of pulses per unit time.
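A minimal sketch of this speed computation is given below; the pulses-per-revolution count and the tire circumference are illustrative assumptions, since the actual values depend on the particular sensor and vehicle:

    def vehicle_speed_kmh(pulse_count, interval_s, pulses_per_rev=4, tire_circumference_m=1.9):
        """Estimate vehicle speed from drive-train pulses counted over an interval."""
        revs_per_s = pulse_count / pulses_per_rev / interval_s
        return revs_per_s * tire_circumference_m * 3.6  # m/s -> km/h

    print(vehicle_speed_kmh(pulse_count=41, interval_s=1.0))  # approx. 70 km/h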
The vehicle information sensor 162 includes one or more sensors for detecting vehicle information other than the vehicle speed sensor 161. Examples include a fuel gauge sensor, a shift lever position sensor, an odometer, a trip meter, a turn signal sensor, and a water temperature sensor. These sensors may have common configurations to obtain the respective sorts of vehicle information. The fuel gauge sensor detects the current remaining fuel.
The shift lever position sensor detects the position of a shift lever operated by the driver V of the vehicle 8. The odometer cumulates a traveling distance of the vehicle 8 to provide a total traveling distance. The trip meter indicates a section traveling distance from a location at a time at which an initializing operation is performed by the driver V of the vehicle 8 through a location at the present time.
The turn signal sensor detects the direction of a turn signal operated by the driver V of the vehicle 8. The water temperature sensor detects the engine cooling water temperature. These sorts of information are only examples of information that can be obtained from the vehicle 8. Any other sorts of information that can be obtained from the vehicle 8 can be used as the vehicle information. For example, in a case of an electric vehicle or a hybrid vehicle, the remaining battery amount, the regenerated power amount, or the power consumption amount can be obtained as the vehicle information.
The surrounding cameras 164 are imaging devices that capture surrounding images of the vehicle. The surrounding cameras 164 are desirably located at a plurality of locations so that images can be taken from the sides through the rear of the vehicle 8. For example, the surrounding cameras 164 may be provided at a left rear corner, a right rear corner, and at a rear section of the vehicle on the roof or bumpers of the vehicle 8. An imaging device located at the rear section is called a back monitor, but such a rear surrounding camera 164 is not limited to a back monitor. The surrounding cameras 164 may be disposed on side mirrors, pillars, side portions of the roof, or doors. The surrounding cameras 164 may include an imaging device for capturing a forward direction image. For example, such a surrounding camera 164 may be mounted on a rear face of or near a rear-view mirror.
The surrounding cameras 164 may be monocular cameras or stereo cameras. In a case of monocular cameras or stereo cameras capable of obtaining distance information, the radar sensors 163 are not needed. However, if the radar sensors 163 are provided in addition to the surrounding cameras 164 capable of obtaining distance information, fusion (integration) of the distance information of the surrounding cameras 164 and the distance information of the radar sensors 163 enables compensating for respective disadvantages and obtaining high-precision distance information. In addition to the radar sensors 163 and the surrounding cameras 164, sonic sensors (ultrasonic sensors) or the like may also be provided.
The radar sensors 163 transmit radio waves to the surrounding of the vehicle 8, such as the forward, sides, and rearward of the vehicle 8, and receive a radio wave reflected by an object. The installation location of the radar sensors 163 may be locations where an obstacle around the vehicle 8 can be detected. The radar sensors 163 use a time of flight (TOF) method in which a distance to an object is detected according to a time from transmission to reception of a radio wave and a direction of the object is detected according to the radio wave emitting direction of the radar.
A light detection and ranging or laser imaging detection and ranging (LIDAR) sensor is known as a TOF type radar sensor. In addition, there is a frequency modulation continuous wave (FMCW) method, in which a mixture of the received wave and the transmitted wave is generated while the frequency of the transmitted wave is continuously increased, and the beat frequency of the mixed wave, which arises from the slight frequency difference, is converted into a distance. In the FMCW method, a direction of an object is estimated by detecting phase shifts of received waves received with a plurality of receiving antennas.
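The TOF distance computation described above reduces to halving the round-trip propagation time multiplied by the propagation speed; a minimal sketch follows (the 200 ns example value is an assumption chosen for illustration):

    C = 299_792_458.0  # speed of light in m/s

    def tof_distance_m(round_trip_time_s):
        """Distance to an object from the radio-wave round-trip time (TOF method)."""
        return C * round_trip_time_s / 2.0

    print(tof_distance_m(200e-9))  # a 200 ns round trip corresponds to approx. 30 m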
The occupant state information sensor 165 is a sensor that detects occupant state information directly or indirectly from an occupant of the vehicle 8. A typical example is a face camera. The face camera captures an image of an occupant of the vehicle 8 and performs face authentication to identify the occupant of the vehicle 8. In addition, it is possible to detect the face direction and the line of sight direction from the thus captured face image.
Other than the above-described example, the occupant state information sensor 165 may be, for example, an electrocardiogram sensor, a heart rate sensor, a blood pressure sensor, a body temperature sensor, a pulse detection sensor, a respiration sensor, a perspiration sensor, a blinking sensor, a pupil sensor, an electroencephalogram sensor, or a myoelectric potential sensor. The occupant state information sensor 165 may have a form of, for example, a wristwatch-type wearable terminal (smart watch) worn by an occupant of the vehicle 8.
The VICS receiving device 166 receives a radio wave transmitted from a VICS. A VICS is a system that transmits traffic information such as traffic jam information and traffic restriction information on a real time basis to an on-vehicle device using a FM multiplex broadcast or a beacon. The external communication device 167 connects to the Internet or the like through a network such as 3G, 4G, 5G, LTE, or wireless LAN and receives various information. For example, weather information such as rain, snow or fog can be received through the external communication device 167.
News, music, videos, etc. may also be received through the external communication device 167. The external communication device 167 can obtain, for example, traffic signal state information and a time to wait until a change of a traffic signal. Thus, the VICS receiving device 166 and the external communication device 167 may perform roadside-to-vehicle communication. In addition, the external communication device 167 may obtain information detected by another vehicle 6 through inter-vehicle communication.
An advanced driver assistance system (ADAS), which not only displays information and provides warning but may also control the vehicle 8, may also be provided in the detecting apparatus 16. In this case, a corresponding ADAS ECU cooperates with the engine ECU 12, the braking ECU 13, and the steering ECU 14 to provide various operational assistance based on distance information with respect to an object detected by the radar sensors 163, the surrounding cameras 164, or both. For example, the ADAS ECU performs acceleration/deceleration control during following a preceding vehicle, automatic braking control, control for avoidance of deviation from a traveling lane, lane keeping traveling control, and steering control to avoid collision with an object. For such control, the ADAS ECU identifies road paint, such as a white line, from an image taken by the surrounding cameras 164.
In acceleration/deceleration control during following a preceding vehicle, the ADAS ECU controls driving power and braking power to maintain a target distance depending on the vehicle speed. In automatic braking control, the ADAS ECU performs, according to a TTC, warning the driver, displaying an indication urging the driver to press the brake pedal down, tightening the seat belt when there is a high possibility of collision, and braking to avoid collision. In order to avoid deviation from a traveling lane, the ADAS ECU identifies a white line (lane partitioning line) from a captured image and applies steering torque in a direction opposite to a direction of deviation from the traveling lane.
For lane keeping traveling control, a center of a traveling lane is set as a target traveling line, and a steering torque proportional to a deviation from the target traveling line is added in the direction opposite to the deviating direction. In steering control to avoid collision with an object, when it is determined that a collision cannot be avoided by braking, a traveling line for avoiding the object is determined, and a steering torque is applied to follow the traveling line.
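The proportional relationship described above for lane keeping traveling control can be sketched as follows; the gain value is an illustrative tuning assumption, not a value from the embodiment:

    def lane_keeping_torque(lateral_deviation_m, gain_nm_per_m=5.0):
        """Steering torque proportional to the deviation from the target traveling
        line, applied in the direction opposite to the deviating direction."""
        return -gain_nm_per_m * lateral_deviation_m

    print(lane_keeping_torque(0.4))  # deviating 0.4 m to one side -> -2.0 N*m opposing torque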
When, at a time to change a traveling lane, for example, the radar sensors 163 or the surrounding cameras 164 detect a vehicle traveling in an area (blind area) of an adjacent lane that is not reflected in a door mirror, the occupant is warned by the ADAS ECU. Such assistance is called a blind spot monitor.
<Example of functional configuration of identifying apparatus 15>
Next, a functional configuration of the identifying apparatus 15 will be described with reference to Fig. 5. Fig. 5 is a block diagram depicting an example of a functional configuration of the identifying apparatus 15. The identifying apparatus 15 includes an object determining unit 151 and a relative position determining unit 152.
The object determining unit 151 analyzes surrounding image data obtained by the surrounding cameras 164 and determines a type of an object indicated by the thus obtained data. In the present embodiment, the object determining unit 151 determines whether the object is, for example, another vehicle, a pedestrian, a motorcycle, or the like.
When the surrounding cameras 164 are stereo cameras, each pixel or pixel block of surrounding image data includes distance information. Such surrounding image data is called a distance image. The object determining unit 151 can determine an object from not only surrounding image data but also from radar ranging information. For example, if a density of points indicated by LIDAR data is sufficiently high, a shape of the object can be obtained, and thus, the shape can be analyzed to determine the type of the object.
An image recognition method using a machine learning technique is one of the methods by which the object determining unit 151 determines a type of an object. A machine learning technique is a technique that enables a computer to acquire learning ability like that of a human being. More specifically, machine learning is a technique in which a computer autonomously generates an algorithm necessary for implementing data identification from pre-loaded learning data, and then applies the algorithm to new data to make a prediction. A specific learning method with respect to a machine learning technique may be any one of a supervised learning method, an unsupervised learning method, a semi-supervised learning method, a reinforcement learning method, a deep learning method, and any combination of these learning methods; and any other learning method may also be used. Machine learning techniques include a perceptron, deep learning, support vector machine, logistic regression, naive Bayes, decision tree, random forest, and the like.
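As a hypothetical sketch of such classification, using a random forest (one of the techniques listed above); the two-value feature design below is an assumption made for illustration and is not part of the embodiment:

    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical features extracted from surrounding image data, e.g. object
    # width and length in meters; the feature design is an illustrative assumption.
    train_features = [[1.8, 4.5], [1.7, 0.5], [0.6, 2.0]]
    train_labels = ["vehicle", "pedestrian", "motorcycle"]

    clf = RandomForestClassifier(n_estimators=10, random_state=0)
    clf.fit(train_features, train_labels)
    print(clf.predict([[1.9, 4.2]]))  # -> ['vehicle']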
As depicted in Fig. 6, the relative position determining unit 152 determines a relative position (distance and direction) of an object relative to the vehicle 8 or an indication piece image 62 representing the vehicle 8. Fig. 6 is a diagram depicting an example of a position of an object relative to the vehicle 8 or the indication piece image 62 representing the vehicle 8. As depicted in Fig. 6, the indication piece image 62 (the center of a virtual image I) representing the vehicle 8 is displayed at a position offset by Q in the vehicle width direction and by P in the vehicle length direction from the center of the vehicle 8.
Based on surrounding image data or radar ranging information, it is assumed that another vehicle 6 is detected at coordinates (A, B) from an origin that is the center of the vehicle 8. The distance from the center of the vehicle 8 to the other vehicle 6 is denoted as L1 and the direction is denoted as θ1. In a case where surrounding image data is a distance image, the coordinates (A, B) are directly obtained. In a case where radar ranging information is used, the coordinates (A, B) may be obtained by breaking down the distance and direction. The relative position determining unit 152 converts the coordinates (A, B) to coordinates (C, D) based on the indication piece image 62 representing the vehicle 8.
The indication piece image 62 representing the vehicle 8 is provided as the virtual image I and thus is at a predetermined position in front of the vehicle 8. As described above, when the distance with respect to the vehicle width direction from the center of the vehicle to the indication piece image 62 representing the vehicle 8 is Q and the distance with respect to the vehicle length direction from the center of the vehicle to the indication piece image 62 representing the vehicle 8 is P, the coordinates (C, D) are obtained as follows.
(C, D) = (A+Q, B+P)
The relative position determining unit 152 performs the same process for each of the radar sensors or the surrounding cameras provided in the vehicle to determine the position of an object relative to the indication piece image 62 representing the vehicle 8. The distance and direction are also determined by determining the relative positions. The distance from the indication piece image 62 representing the vehicle 8 to the other vehicle 6 is denoted as L2 and the direction of the other vehicle 6 is denoted as θ2. In the example of Fig. 6, the distance is L2 = (C² + D²)^(1/2), and the direction is θ2 = arctan(D/C). The reference direction (regarded as a direction at the angle 0 degrees) with respect to the directions θ1 and θ2 may be set appropriately. In Fig. 6, the 9 o'clock direction is determined as the angle 0 degrees.
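The coordinate conversion and the distance and direction computations above can be sketched as follows. The function name and the atan2 angular convention are illustrative choices; the embodiment sets the 9 o'clock direction to 0 degrees:

    import math

    def relative_to_indication_piece(a, b, q, p):
        """Convert coordinates (A, B) measured from the center of the vehicle 8
        into coordinates (C, D) measured from the indication piece image 62,
        then derive the distance L2 and the direction theta2."""
        c, d = a + q, b + p                       # (C, D) = (A+Q, B+P)
        l2 = math.hypot(c, d)                     # L2 = (C^2 + D^2)^(1/2)
        theta2 = math.degrees(math.atan2(d, c))   # theta2 = arctan(D/C)
        return (c, d), l2, theta2

    print(relative_to_indication_piece(a=3.0, b=10.0, q=0.5, p=4.0))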
The identifying apparatus 15 outputs type information indicating a type of an object determined by the object determining unit 151, distance information indicating a distance from the vehicle 8 to the object obtained by the relative position determining unit 152, and position information indicating the relative position of the object relative to the vehicle 8 to the control unit 20.
<Example of hardware configuration of control unit 20>
Next, a hardware configuration of the control unit 20 will be described with reference to Fig. 7. Fig. 7 is a block diagram depicting an example of a hardware configuration of the control unit 20. As depicted in Fig. 7, the control unit 20 includes a field-programmable gate array (FPGA) 201, a central processing unit (CPU) 202, a read-only memory (ROM) 203, a random access memory (RAM) 204, an interface (I/F) 205, a bus line 206, a LD driver 207, and a MEMS controller 208. The FPGA 201, the CPU 202, the ROM 203, the RAM 204, and the I/F 205 are interconnected via the bus line 206.
The CPU 202 controls each of the functions of the control unit 20. The ROM 203 stores a program 203p executed by the CPU 202 to control each of the functions of the control unit 20. The RAM 204 stores the program 203p, and is used by the CPU 202 as a work area for executing the program 203p. The RAM 204 includes an image memory 209.
The image memory 209 is used to generate an image that is projected as a virtual image I. The I/F 205 is an interface for communicating with the identifying apparatus 15 and the detecting apparatus 16 and is connected to a controller area network (CAN) bus or an Ethernet (registered trademark) of the vehicle 8.
The FPGA 201 controls the LD driver 207 based on an image generated by the CPU 202. The LD driver 207 drives the LDs of the light source unit 101 of the projecting unit 10 to control light emission of the LDs according to the image. The FPGA 201 operates the light deflector 102 of the projecting unit 10 through the MEMS controller 208 so that the laser beam is deflected in a direction corresponding to a pixel position of the image.
<Example of functional configuration of control unit 20>
Next, a functional configuration of the control unit 20 will be described with reference to Fig. 8. Fig. 8 is a block diagram depicting an example of a functional configuration of the control unit 20.
As depicted in Fig. 8, the control unit 20 includes a remaining distance obtaining unit 21, a determining unit 22, a direction determining unit 23, an image generating unit 24, and an output unit 25. The functions of the remaining distance obtaining unit 21, the determining unit 22, the direction determining unit 23, and the image generating unit 24 are implemented by executing of predetermined programs by the CPU 202 of Fig. 7. The function of the output unit 25 is implemented by the I/F 205 depicted in Fig. 7 and so forth.
The remaining distance obtaining unit 21 obtains remaining distance information indicating a remaining distance up to a location, at which a presentation image will be provided, for the vehicle 8 to travel to reach, based on route guidance information input from the car navigation system 11, and outputs the thus obtained result to the determining unit 22.
The determining unit 22 determines whether the remaining distance is smaller than or equal to a predetermined presentation threshold and outputs the determination result to the direction determining unit 23.
When the remaining distance is smaller than or equal to the presentation threshold, the direction determining unit 23 determines a planned traveling direction with respect to a planned traveling route of the vehicle 8 based on the route guidance information input from the car navigation system 11 and outputs the determination result to the image generating unit 24. For example, when there is an intersection in the planned traveling route of the vehicle 8, the direction determining unit 23 determines a direction in which the vehicle 8 is to travel at the intersection.
The image generating unit 24 generates a presentation image for providing a planned traveling direction with respect to the planned traveling route of the vehicle 8 according to a planned traveling direction determined by the direction determining unit 23. Image data of the thus generated presentation image is output to the projecting unit 10 through the output unit 25.
The projecting unit 10 (see Fig. 1) projects a projection image including the presentation image to the windshield 71 based on the input presentation image data and allows the driver V of the vehicle 8 to see a virtual image of the presentation image.
The image generating unit 24 may generate not only the presentation image but also another image such as a remaining distance image for displaying remaining distance information obtained by the remaining distance obtaining unit 21 or a vehicle speed image for displaying vehicle speed information with respect to the vehicle 8 detected by the vehicle speed sensor 161 of Fig. 4. Then, the projection image data where the presentation image, the remaining distance image, and the vehicle speed image, for example, are arranged in one image can be output to the projecting unit 10 through the output unit 25.
<Example of process of control unit 20>
Next, a flow of a process of the control unit 20 will be described with reference to Fig. 9. Fig. 9 is a flowchart depicting an example of a process of the control unit 20.
First, in step S91, the remaining distance obtaining unit 21 and the direction determining unit 23 receive route guidance information input from the car navigation system 11.
Subsequently, in step S92, the remaining distance obtaining unit 21 obtains remaining distance information indicating a remaining distance up to a location, at which a presentation image will be provided, for the vehicle 8 to travel to reach, based on the route guidance information, and outputs the thus obtained result to the determining unit 22.
Subsequently, in step S93, the determining unit 22 determines whether the remaining distance becomes smaller than or equal to the presentation threshold.
When it is determined in step S93 that the remaining distance is still greater than the presentation threshold (No in step S93), the process starting from step S91 is repeated.
When it is determined in step S93 that the remaining distance becomes smaller than or equal to the presentation threshold (Yes in step S93), the direction determining unit 23 determines a planned traveling direction of the vehicle 8 based on the route guidance information in step S94.
Subsequently, in step S95, the image generating unit 24 generates a presentation image according to the planned traveling direction determined by the direction determining unit 23.
Subsequently, in step S96, the output unit 25 outputs the presentation image data to the projecting unit 10.
Thus, the control unit 20 generates a presentation image and outputs the presentation image to the projecting unit 10. In steps S91-S94, the image generating unit 24 may also generate the above-mentioned remaining distance image and vehicle speed image, for example, and output projection image data including such images to the projecting unit 10 through the output unit 25. In this case, a projection image including a remaining distance image and a vehicle speed image without including a presentation image is superimposed on a forward view of the vehicle 8.
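The overall flow of Fig. 9 can be sketched as follows; the data layout of the route guidance information and the helper names are assumptions made for illustration only:

    PRESENTATION_THRESHOLD_M = 100  # the text gives 100 m as an example value

    def control_unit_step(route_guidance):
        """One pass of the Fig. 9 loop (steps S91-S96)."""
        # S91/S92: obtain the remaining distance to the presentation location
        remaining_m = route_guidance["remaining_distance_m"]

        # S93: while the remaining distance exceeds the threshold, keep looping
        if remaining_m > PRESENTATION_THRESHOLD_M:
            return None

        # S94: determine the planned traveling direction at the intersection
        direction = route_guidance["direction_at_intersection"]  # e.g. "left"

        # S95/S96: generate the presentation image data and output it
        return {"presentation_image_direction": direction}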
(Various display examples)
Next, various display examples displayed by the display apparatus 1 will be described. Hereinafter, an example where an image is superimposed on a forward view of the vehicle 8 after a traveling destination is set using the car navigation system 11 of Fig. 3 will be described. A common car navigation system has a function of guiding a driver of a vehicle at each intersection and at each branch point of the road according to the vehicle's planned traveling route. Therefore, also in the present embodiment, an example where a presentation image is displayed at each intersection or branch point to guide a driver of a vehicle according to a planned traveling route will be described.
Fig. 10 is a diagram depicting an example of a display superimposed on a forward view 300 of the vehicle 8 when a remaining distance is greater than a presentation threshold.
The forward view 300 includes a preceding vehicle 301, a road 302, and the like. In addition, a vehicle speed image 303 and a remaining distance image 304 are superimposed on the forward view 300.
The vehicle speed image 303 is an image depicting a current vehicle speed of the vehicle 8. In the example of Fig. 10, the speed "70 km/h" is displayed as the vehicle speed.
The remaining distance image 304 is an image depicting a remaining distance up to a location, at which a presentation image will be provided, for the vehicle 8 to travel to reach. In the example of Fig. 10, the place name "Tsuzuki I.C." is displayed as a location at which a presentation image will be provided, and a remaining distance of 200 m up to the "Tsuzuki I.C." is also displayed.
The remaining distance image 304 is displayed when the remaining distance becomes smaller than or equal to a predetermined display threshold (e.g., 500 m) and is displayed until the remaining distance becomes smaller than or equal to a presentation threshold (e.g., 100 m). In Fig. 10, a presentation image has not yet been displayed because the remaining distance is still greater than the presentation threshold.
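This two-threshold behavior can be sketched as a simple selection; the image identifiers below are illustrative names only:

    DISPLAY_THRESHOLD_M = 500       # example value from the text
    PRESENTATION_THRESHOLD_M = 100  # example value from the text

    def guidance_images(remaining_m):
        """Select which guidance image to superimpose on the forward view."""
        if remaining_m <= PRESENTATION_THRESHOLD_M:
            return ["presentation_image"]        # e.g., arrow graphic symbols
        if remaining_m <= DISPLAY_THRESHOLD_M:
            return ["remaining_distance_image"]  # e.g., "Tsuzuki I.C. 200 m"
        return []

    print(guidance_images(200))  # -> ['remaining_distance_image']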
Next, Fig. 11 is a diagram depicting an example of a display on a forward view 300a of the vehicle 8 when the remaining distance is smaller than or equal to the presentation threshold.
A road 305 crossing the road 302 appears in the forward view 300a. In the forward view 300a, the remaining distance image 304 depicted in Fig. 10 is no longer displayed because the remaining distance has become smaller than or equal to the presentation threshold, and presentation images 306 are newly displayed.
The presentation images 306 indicate a planned traveling direction of the vehicle 8 so as to guide the driver of the vehicle to the crossing road 305. As depicted in Fig. 11, the presentation images 306 include images of two arrow graphic symbols 306a and 306b.
The arrow graphic symbols 306a and 306b have directionality to indicate the crossing road 305. The direction indicated by the arrow graphic symbols 306a and 306b corresponds to the planned traveling direction of the vehicle 8 (the planned moving direction of the movable body).
Each of the arrow graphic symbols 306a and 306b has an image size smaller than a size corresponding to a projection image angle of the projecting unit 10 with respect to the Z direction, and the two symbols are displayed at positions spaced from each other along the Z direction.
The Z direction depicted by the arrow Z in Fig. 11 corresponds to the vertical direction of the forward view 300a, and the X direction corresponds to the horizontal direction of the forward view 300a. The same applies to the X and Z directions depicted in Figs. 12A through 15 described hereinafter. Relationships between a size of a projection image and a projection image angle will be described later with reference to Fig. 13.
The driver V of the vehicle 8 can easily know a planned traveling direction with respect to a planned traveling route by viewing the presentation images 306 displayed on the forward view 300a.
The arrow graphic symbols 306a and 306b are examples of "a plurality of graphic symbols". In the example of Fig. 11, the two arrows are depicted. However, the number of arrows may be further increased. A graphic symbol is not limited to an arrow, and may be any graphic symbol as long as the graphic symbol indicates a direction.
In Fig. 11, an area 307 depicts an area where a projection image is displayed. A projection image is displayed within the area 307 on the forward view 300a. The presentation image 306 is included in the projection image and thus, is displayed within the area 307. The area 307 is depicted to explain an area in which a projection image is displayed, and the contour line (broken line) of the area 307 is not actually displayed on the forward view 300a of the vehicle 8.
Figs. 12A and 12B are diagrams depicting relationships between a size of a projection image and a presentation image. Fig. 12A depicts the relationships according to a comparative example, and Fig. 12B depicts the relationships according to the present embodiment. Figs. 12A and 12B correspond to the superimposed displays of the projection image on the forward views of Figs. 10 and 11, respectively, but are views from above the vehicle 8 (looking down from the top).
In Fig. 12A, a presentation image 306' that is a bold arrow graphic symbol extending along a route is displayed for presenting a planned traveling route for the vehicle 8 traveling along a road 302. In this case, it is necessary to extend the bold arrow graphic symbol along the planned traveling route, so that the projection image including the presentation image 306' requires a size capable of providing a superimposing area 308'. A "superimposing area" is an area, determined by the projection image angle and an angle of depression (angle of looking down) of the projecting unit 10, where a forward view and a projection image are superimposed.
On the other hand, as depicted in Fig. 12B, in order to present a planned traveling direction, a presentation image 306 according to the present embodiment does not include a whole route of turning to the left. Thus, the generated projection image has a size corresponding to a smaller superimposing area 308 compared to the superimposing area 308' depicted in Fig. 12A.
Fig. 13 is a diagram depicting an example of relationships between a size of a projection image and a projection image angle of the projecting unit 10. In Fig. 13, a virtual image I' depicts a virtual image with respect to a comparative example. The virtual image I' is a virtual image of a projection image viewed by a driver V when the projection image is projected with a projection image angle φ'. On the other hand, a virtual image I depicts a virtual image according to the present embodiment. The virtual image I is a virtual image of a projection image viewed by a driver V when the projection image is projected with a projection image angle φ.
The size h along the Z direction of the virtual image I of the projection image including a presentation image 306 according to the present embodiment is smaller than the size h' in the Z direction of the virtual image I' of the projection image including a presentation image 306' according to the comparative example. Therefore, the presentation image 306 can be provided and thus route guidance information can be provided with a smaller projection image angle φ compared to the projection image angle φ'.
The projection image angle φ corresponds to the maximum projection image angle of the projecting unit 10 with respect to the Z direction along the vertical direction.
Fig. 14 is a diagram depicting an example of relationships between a display distance of the arrow graphic symbols 306a and 306b and a superimposing area in the presentation image 306. In Fig. 14, a display distance d indicates a display distance between the arrow graphic symbols 306a and 306b that neighbor each other, and a superimposing distance e indicates a length (distance) of the superimposing area along the traveling direction of the vehicle 8 (the direction perpendicular to both the X direction and the Z direction). As depicted in Fig. 14, it is desirable that the display distance d be smaller than the superimposing distance e. When the display distance d is smaller than or equal to a size corresponding to the maximum projection image angle of the projecting unit 10 with respect to the Z direction, a plurality of graphic symbols, such as the arrow graphic symbols 306a and 306b, can be displayed on the forward view, and the planned traveling direction can be presented in a more easily understandable manner by such a plurality of graphic symbols. Therefore, it is desirable that the display distance d be smaller than or equal to a size corresponding to the maximum projection image angle of the projecting unit 10 with respect to the Z direction.
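The "size corresponding to a projection image angle" can be related to the virtual image distance by elementary trigonometry; the relation below is a general optics approximation, not a formula quoted from the embodiment, and the numeric values are assumptions for illustration:

    import math

    def size_for_angle_m(projection_angle_deg, virtual_image_distance_m):
        """Size h subtended by a projection image angle phi at distance R:
        h = 2 * R * tan(phi / 2)."""
        phi = math.radians(projection_angle_deg)
        return 2 * virtual_image_distance_m * math.tan(phi / 2)

    # Illustrative: a 4-degree vertical angle at R = 3 m spans about 0.21 m,
    # an upper bound for the display distance d between neighboring arrows.
    print(size_for_angle_m(4.0, 3.0))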
<Advantageous effects of operation of display apparatus 1>
As described above, in the present embodiment, a presentation image to present a planned traveling direction of the vehicle 8 is generated as route guidance information, and a projection image including the presentation image is projected and displayed by the projecting unit 10 provided at the vehicle 8. As a result of projecting a small image indicating a planned traveling direction of the vehicle 8 instead of projecting a large image for providing a planned traveling route of the vehicle 8, a presentation image 306 can be provided with a small projection image angle φ (see Fig. 13) to provide route guidance information. Therefore, the size and the cost of the display apparatus 1 can be prevented from increasing.
A presentation image 306 according to the present embodiment includes arrow graphic symbols 306a and 306b that are a plurality of graphic symbols each having an image size smaller than a size corresponding to the projection image angle of the projecting unit 10. The arrow graphic symbols 306a and 306b are displayed at positions spaced from each other along the Z direction. Thus, the area of the presentation image 306 superimposed on the road in the forward view can be made smaller as compared to the above-described case where a presentation image including a bold arrow graphic symbol extending along a planned traveling route is superimposed on the road in the forward view. This eliminates visual complexity with respect to the forward view for the driver V of the vehicle 8.
The planned moving direction can be indicated also by the displayed positions of the arrow graphic symbols 306a and 306b in the presentation image 306. Fig. 15 is a diagram depicting an example of display positions of the arrow graphic symbols 306a and 306b of the presentation image 306.
The simplest way is to display the arrow graphic symbols 306a and 306b along a central axis 309 of the driver V. However, a reference position 310 with respect to an intersection may be determined and the arrow graphic symbols 306a and 306b may be arranged along a straight line connecting the reference position 310 and the driver V.
Such a display manner will now be described again for a case where the driver V of the movable body 8 views the forward view 300a (see Fig. 11). The virtual image of the presentation image 306 presents the planned traveling direction with a plurality of graphic symbols arranged along the vertical direction of the forward view 300a. The plurality of graphic symbols presenting the planned traveling direction are positioned such that, among the plurality of graphic symbols, a graphic symbol nearer to the intersection in the vertical direction of the forward view is positioned on the intersection turning direction side in the horizontal direction of the forward view.
More specifically, with reference to Fig. 11, of the arrow graphic symbols 306a and 306b included in the presentation image 306, the arrow graphic symbol 306a presents the planned traveling direction at a position nearer to the crossing road 305 than the arrow graphic symbol 306b in the vertical direction (the Z direction) of the forward view 300a. The arrow graphic symbol 306a is also nearer to the crossing road 305 than the arrow graphic symbol 306b in the horizontal direction (the X direction) of the forward view 300a, i.e., it is positioned on the intersection turning direction side of the arrow graphic symbol 306b.
Thus, not only the orientations of the arrow graphic symbols 306a and 306b but also the display positions of the arrow graphic symbols 306a and 306b allow the driver V to identify the planned traveling direction.
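Arranging the symbols along the straight line connecting the driver V and the reference position 310 amounts to plain linear interpolation; a minimal sketch is given below, where the coordinates and the even spacing are illustrative assumptions:

    def arrow_positions(driver_xy, reference_xy, n_arrows=2):
        """Place n_arrows graphic symbols evenly along the straight line
        connecting the driver V and the reference position 310 (Fig. 15)."""
        (x0, y0), (x1, y1) = driver_xy, reference_xy
        ts = [(i + 1) / (n_arrows + 1) for i in range(n_arrows)]
        return [(x0 + t * (x1 - x0), y0 + t * (y1 - y0)) for t in ts]

    print(arrow_positions(driver_xy=(0.0, 0.0), reference_xy=(3.0, 30.0)))
    # -> [(1.0, 10.0), (2.0, 20.0)]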
The remaining distance image 304 according to the present embodiment described above may also be regarded as a presentation image that is to provide route guidance information because the remaining distance image 304 provides information concerning a location where a presentation image will be provided and information concerning a remaining distance up to the location for the vehicle 8 to reach. In this regard, because the remaining distance image 304 is text information and does not require a large projection image angle to display, an influence on a projection image angle is small.
<Other embodiments>
Although the display apparatuses, movable bodies, display methods, programs, and non-transitory recording media have been described with reference to the embodiments, the present invention is not limited to the embodiments, and various modifications and improvements can be made without departing from the scope of the claimed invention.
In the above-described embodiments, the control unit 20 may instead have some or all of the functions that the identifying apparatus 15 has. In a similar way, another element such as the identifying apparatus 15, the car navigation system 11, or the detecting apparatus 16 may instead have some of the functions that the control unit 20 has.
Embodiments of the present invention include a display method. For example, the display method for displaying a projection image includes generating a presentation image used to present route guidance information; and projecting, by a projecting unit provided at a movable body, the projection image including the presentation image. The presentation image presents a planned moving direction of the movable body. Such a display method provides the same advantageous effects as the advantageous effects of the above-described display apparatus.
The embodiments of the present invention include a program. For example, the program, when executed by a computer included in a display apparatus configured to display a projection image, causes the computer to generate a presentation image used to present route guidance information; and project, using a projecting unit provided at a movable body, the projection image including the presentation image. The presentation image presents a planned moving direction of the movable body. Such a program can have the same advantageous effects as the advantageous effects of the display apparatus described above.
Each of the functions of the present embodiments described above may be implemented by one or more processing circuits. As used herein, a "processing circuit" may be a processor programmed to perform each function by software such as a processor implemented by an electronic circuit; or an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), or a conventional circuit module, designed to perform each function described above.
The present application is based on and claims priority to Japanese patent application No. 2020-010358 filed on January 24, 2020, and Japanese patent application No. 2020-205061 filed on December 10, 2020. The entire contents of Japanese patent application No. 2020-010358 and Japanese patent application No. 2020-205061 are hereby incorporated herein by reference.

[PTL 1]  Japanese Unexamined Patent Application Publication No. 2019-207155

1 Display apparatus
2 On-vehicle system
8 Movable body
10 Projecting unit
11 Car navigation system
15 Identifying apparatus
16 Detecting apparatus
164 Surrounding cameras
20 Control unit
21 Remaining distance obtaining unit
22 Determining unit
23 Direction determining unit
24 Image generating unit
25 Output unit
300, 300a Forward views
301 Preceding vehicle
302 Road
303 Vehicle speed image
304 Remaining distance image
305 Crossing road
306 Presentation image
306a, 306b Arrow graphic symbols (an example of a plurality of graphic symbols)
307 Area
308 Superimposing area
d Display distance
e Superimposing distance
h Size of virtual image with respect to Z direction
I Virtual image
V Driver
X Horizontal direction with respect to forward view
Z Vertical direction with respect to forward view
φ Projection image angle

Claims (11)

  1.     A display apparatus for displaying a projection image, the display apparatus comprising:
        an image generating unit configured to generate a presentation image to be used to present route guidance information; and
        a projecting unit provided at a movable body and configured to project the projection image including the presentation image,
        wherein
        the presentation image presents a planned moving direction of the movable body.
  2.     The display apparatus according to claim 1,
        wherein
        the presentation image presents the planned moving direction using a plurality of graphic symbols.
  3.     The display apparatus according to claim 2,
        wherein
        the graphic symbols are images of arrow graphic symbols, directions of which indicate the planned moving direction.
  4.     The display apparatus according to any one of claims 1-3,
        wherein
        the projecting unit is configured to superimpose a virtual image of the presentation image on a forward view of the movable body.
  5.     The display apparatus according to claim 4,
        wherein
        the virtual image of the presentation image presents the planned moving direction at an intersection included in the forward view.
  6.     The display apparatus according to claim 5,
        wherein
        the virtual image of the presentation image presents the planned moving direction using a plurality of graphic symbols arranged along a vertical direction of the forward view, and
        a distance between any neighboring two of the plurality of graphic symbols is smaller than or equal to a size corresponding to a maximum projection image angle of the projecting unit in the vertical direction.
  7.     The display apparatus according to claim 5 or 6,
        wherein
        the virtual image of the presentation image presents the planned moving direction using a plurality of graphic symbols arranged along a vertical direction of the forward view, and
        among the plurality of graphic symbols presenting the planned moving direction, a graphic symbol nearer to the intersection in the vertical direction is positioned on an intersection turning direction side in a horizontal direction.
  8.     A movable body including the display apparatus according to any one of claims 1-7.
  9.     A display method for displaying a projection image, the display method comprising:
        generating a presentation image to be used to present route guidance information; and
        projecting, by a projecting unit provided at a movable body, the projection image including the presentation image,
        wherein
        the presentation image presents a planned moving direction of the movable body.
  10.     A program which, when executed by a computer included in a display apparatus configured to display a projection image, causes the computer to
        generate a presentation image to be used to present route guidance information; and
        project, using a projecting unit provided at a movable body, the projection image including the presentation image,
        wherein
        the presentation image presents a planned moving direction of the movable body.
  11.     A non-transitory recording medium storing a program which, when executed by a computer included in a display apparatus configured to display a projection image, causes the computer to
        generate a presentation image to be used to present route guidance information; and
        project, using a projecting unit provided at a movable body, the projection image including the presentation image,
        wherein
        the presentation image presents a planned moving direction of the movable body.
Patent Citations (10)

US 2015/0185039 A1 (Bayerische Motoren Werke AG): Contact-analogue display, in particular of a lane change
JP 2017-021019 A (Nissan Motor Co., Ltd.): Vehicular display apparatus and vehicular display method
DE 10 2016 203 080 A1 (Robert Bosch GmbH): Method for operating a head-up display, head-up display device
EP 3 248 824 A1 (Ricoh Company, Ltd.): Head-up display
JP 2019-113809 A (Maxell, Ltd.): Head-up display device
WO 2019/189619 A1 (Ricoh Company, Ltd.): Image control apparatus, display apparatus, movable body, and image control method
JP 2019-207155 A (Denso Corporation): Display control device and display control program
WO 2019/230271 A1 (Denso Corporation): Display control device, display control program, and persistent tangible computer-readable recording medium therefor
JP 2020-010358 A (NEC Corporation): Communication method, mobile station, and base station
JP 2020-205061 A (Pioneer Corporation): Display device and head-mount display