WO2021153454A1 - Image display apparatus, image display method, program, and non-transitory recording medium - Google Patents

Image display apparatus, image display method, program, and non-transitory recording medium Download PDF

Info

Publication number
WO2021153454A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
driver
information
danger
view
Prior art date
Application number
PCT/JP2021/002249
Other languages
English (en)
Inventor
Yuki Hori
Yuuki Suzuki
Kazuhiro Takazawa
Masato Kusanagi
Shin Sekiya
Original Assignee
Ricoh Company, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2020212928A (published as JP2021117987A)
Application filed by Ricoh Company, Ltd. filed Critical Ricoh Company, Ltd.
Priority to EP21704031.0A (EP4096957A1)
Publication of WO2021153454A1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • B60K35/28 Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/149 Instrument input by detecting viewing direction not otherwise provided for
    • B60K2360/16 Type of output information
    • B60K2360/177 Augmented reality
    • B60K2360/178 Warnings
    • B60K2360/179 Distances to obstacles or vehicles

Definitions

  • the present invention relates to an image display apparatus, an image display method, a program, and a non-transitory recording medium.
  • HUD: head-up display
  • ADAS: advanced driving assistance system
  • a HUD provided in a vehicle displays images as virtual images a few meters ahead of the driver's point of view.
  • a HUD provided in a vehicle can display virtual images as if the images were present in a real space ahead of the vehicle.
  • Technology that makes it possible to display an image as if the image exists in a real space is called augmented reality (AR), and it is said that the technology enables information to be provided more intuitively to a driver.
  • there is known an information output apparatus which includes a display unit for displaying a warning indication first in the direction of the user's eyes and then moving the warning indication closer to the direction in which an object to be observed is present.
  • the present technique is intended to allow a driver to safely identify an object far from the center of a field of view.
  • an image display apparatus includes a surrounding information obtaining unit configured to obtain surrounding information indicating a surrounding condition of a movable body; a point-of-view information obtaining unit configured to obtain point-of-view information indicating a position of a point of view of a driver of the movable body; an image generating unit configured to calculate a degree of danger with respect to an object outside a predetermined area that is included in the driver's field of view, the degree of danger being calculated on the basis of the surrounding information and the point-of-view information, the image generating unit being further configured to generate an image to be displayed within the predetermined area, the image indicating the degree of danger and a direction in which the object is; and a display control unit configured to output the image.
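The claimed apparatus is essentially a pipeline from the two obtaining units through the image generating unit to the display control unit. The following minimal Python sketch illustrates that data flow; every class, field, and function name is an illustrative assumption, not an identifier from the patent.

```python
# Hypothetical sketch of the claimed data flow; names are illustrative only.
from dataclasses import dataclass

@dataclass
class SurroundingObject:
    kind: str          # e.g. "pedestrian", "preceding vehicle"
    bearing: float     # horizontal direction from the driver, in radians
    danger: float      # degree of danger computed by the image generating unit

@dataclass
class DriverPointOfView:
    x: float           # point-of-view position detected by the
    y: float           # driver monitoring camera
    z: float

def generate_image(objects: list, pov: DriverPointOfView,
                   area: tuple = (-0.6, 0.6)) -> list:
    """Return (danger, bearing) pairs for objects outside the predetermined
    area of the driver's field of view; the display control unit renders an
    indicator for each pair inside that area."""
    lo, hi = area
    return [(o.danger, o.bearing) for o in objects
            if not lo <= o.bearing <= hi]
```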
  • Fig. 1 is a diagram depicting an example of virtual images displayed by a HUD.
  • Fig. 2 is a diagram depicting an example of an outline of the HUD.
  • Fig. 3 is a diagram depicting an example of a configuration of an optical system of the HUD.
  • Fig. 4 is a diagram depicting an example of a configuration of a control system of the HUD.
  • Fig. 5 is a diagram depicting an example of an arrangement of the HUD and peripheral elements.
  • Fig. 6 is a diagram depicting an example of functions of the control system of the HUD.
  • Fig. 7 is a diagram depicting an example of a flowchart of an image output process.
  • Fig. 8 is a diagram depicting an example of an image output result.
  • Fig. 9 is a diagram depicting another example of an image output result.
  • Fig. 10 is a diagram depicting an example of an image output result.
  • Fig. 11 is a diagram depicting another example of an image output result.
  • Fig. 12 is a diagram depicting yet another example of an image output result.
  • the HUD 200 is, for example, provided in a dashboard of a vehicle 301 that is an example of a traveling body or a movable body.
  • Projected light L, which is image light emitted from the HUD 200 in the dashboard, is reflected by a windshield 302 as an example of a light transmission member, and reaches a driver 300, an example of a viewing person.
  • the driver 300 can view a route navigation image or the like as virtual images G (see Fig. 3), which will be described later.
  • a combiner as an example of a light transmission member may be provided on an inner wall of the windshield 302 to allow the driver 300 to view virtual images G through projected light L reflected by the combiner.
  • a forward-view camera 110 and a driver monitoring camera 150 are disposed at an upper part of the windshield 302.
  • the forward-view camera 110 photographs a forward view.
  • the forward view photographed by the forward-view camera 110 includes display information displayed on the windshield 302 by the HUD 200 and a background of the display information, as depicted in Fig. 1, for example.
  • the forward-view camera 110 photographs display information displayed by the HUD 200 and reflected by the windshield 302 and also photographs a background of the display information through the windshield 302.
  • a background that the forward-view camera 110 photographs through the windshield 302 is a forward environment of the vehicle 301 (i.e., a preceding vehicle, the road surface, or the like).
  • the driver monitoring camera 150 photographs the driver 300 to detect a point-of-view position of the driver 300.
  • An optical system 230 and so forth of the HUD 200 are designed so that the distance from the driver 300 to the virtual images G is, for example, 5 meters or more. With the distance to the virtual images G being 5 meters or more, the driver 300 can focus on the virtual images G with almost no ocular convergence movement.
  • the optical system 230 of the HUD 200 depicted in FIG. 3 includes red, green and blue laser light sources 201R, 201G, and 201B, collimator lenses 202, 203 and 204 respectively provided for the laser light sources 201R, 201G, and 201B, and two dichroic mirrors 205 and 206.
  • the optical system 230 further includes a light intensity adjusting unit 207, an optical scanning device 208 as an optical scanning unit, a freeform surface mirror 209, a microlens array 210 as a light diverging member, and a projecting mirror 211 as a light reflecting member.
  • the laser light sources 201R, 201G and 201B, collimator lenses 202, 203, and 204, and dichroic mirrors 205 and 206 are unitized by an optical housing.
  • as the laser light sources 201R, 201G, and 201B, semiconductor laser elements (LDs) may be used.
  • the wavelength of a beam emitted from the red laser light source 201R is, for example, 640 nm
  • the wavelength of a beam emitted from the green laser light source 201G is, for example, 530 nm
  • the wavelength of a beam emitted from the blue laser light source 201B is, for example, 445 nm.
  • intermediate images formed on the microlens array 210 are projected onto the windshield 302 of the vehicle 301 so that magnified images of the intermediate images can be viewed as virtual images G by the driver 300.
  • Laser light of each color emitted from the corresponding one of the laser light sources 201R, 201G, and 201B is substantially collimated by the corresponding one of the collimator lenses 202, 203, and 204 into approximately parallel light, and then, is combined with laser light of the other two colors by the two dichroic mirrors 205 and 206.
  • the intensity of the combined laser light is adjusted by the light intensity adjusting unit 207 and is deflected by a mirror of the optical scanning device 208 to two-dimensionally scan the freeform surface mirror 209. Scanning light L' that is deflected by the optical scanning device 208 and scans two-dimensionally the freeform surface mirror 209 is reflected by the freeform surface mirror 209 to be corrected in distortion, and then is condensed onto the microlens array 210 to form intermediate images G' on the microlens array 210.
  • the microlens array 210 is used as the light diverging member that separately diverges a light beam for each pixel (one point of the intermediate images) of the intermediate images G', but another type of a light diverging member may be used instead.
  • Intermediate images G' may be formed by using a liquid crystal display (LCD) or a vacuum fluorescent display (VFD).
  • the HUD 200 performs a control process of gradually increasing the brightness of only a warning image, from among the various images included in the virtual images G, thereby raising the degree of warning given to the driver, for example.
  • the optical scanning device 208 tilts the mirror in a main-scanning direction and a sub-scanning direction using an actuator drive system such as micro electro mechanical systems (MEMS), deflecting the laser light incident on the mirror so as to two-dimensionally scan the freeform surface mirror 209 (raster scanning).
  • the optical scanning device 208 is not limited to the above-described configuration.
  • the optical scanning device 208 may be an optical scanning device of a mirror system that includes two mirrors that oscillate or rotate about two rotational axes perpendicular to each other.
  • Fig. 4 is a diagram illustrating an example of a configuration of a control system of the HUD.
  • the control system 250 of the HUD 200 is a computer and includes a field programmable gate array (FPGA) 251, a central processing unit (CPU) 252, a read-only memory (ROM) 253, a random access memory (RAM) 254, an interface (I/F) 255, a bus line 256, a LD driver 257, and a MEMS controller 258.
  • the FPGA 251 controls operations of the laser light sources 201R, 201G, and 201B of the light source unit 220 using the LD driver 257 and operations of the MEMS 208a of the optical scanning device 208 using the MEMS controller 258.
  • the CPU 252 implements each function of the HUD 200.
  • the ROM 253 stores various programs, such as an image processing program, that may be executed by the CPU 252 in order to implement each function of the HUD 200.
  • the RAM 254 is used as a work area of the CPU 252.
  • the I/F 255 is an interface for communicating with an external controller or the like.
  • the I/F 255 is connected to a vehicle navigation apparatus 400, various sensors 500, and so forth through a controller area network (CAN) of the vehicle 301.
  • the forward-view camera 110 for photographing a forward view of the vehicle 301 is connected to the I/F 255.
  • the forward-view camera 110 photographs display information displayed by the HUD 200 on the windshield 302 and also photographs a background of the display information through the windshield 302.
  • the I/F 255 is further connected to the driver monitoring camera 150 to detect a point-of-view position of the driver 300.
  • the control system 250 obtains an image viewed from the point-of-view position of the driver 300 by transforming an image of display information and a background photographed by the forward-view camera 110 on the basis of the point-of-view position of the driver 300 through image processing.
  • the control system 250 detects the point-of-view position of the driver 300, for example, by analyzing an image of the head of the driver 300 photographed by the driver monitoring camera 150.
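The patent does not spell out how the photographed image is transformed to the driver's viewpoint. One plausible reading, sketched below under that assumption, is a planar homography that warps the forward-view camera frame onto reference points as they appear from the detected point-of-view position. The OpenCV calls are real; the reference points and where they come from are assumptions.

```python
# Hedged sketch: warp the forward-view camera frame toward the driver's
# viewpoint with a planar homography. The patent states only that the image
# is transformed on the basis of the point-of-view position; the homography
# model and the reference points below are assumptions.
import cv2
import numpy as np

def view_from_driver(frame: np.ndarray,
                     camera_pts: np.ndarray,   # >= 4 reference points seen by the camera
                     driver_pts: np.ndarray) -> np.ndarray:  # same points from the driver's viewpoint
    H, _ = cv2.findHomography(camera_pts.astype(np.float32),
                              driver_pts.astype(np.float32))
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))
```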
  • Fig. 5 is a diagram depicting an example of an arrangement of the HUD and peripheral elements.
  • the vehicle navigation apparatus 400, the sensors 500, and so forth are provided as information obtaining units for obtaining provided-to-driver information to be provided to the driver 300 through virtual images G.
  • the HUD 200 includes the optical system 230 as an example of a display unit and the control system 250 as an example of a control unit.
  • as the vehicle navigation apparatus 400 of the present embodiment, a conventional vehicle navigation apparatus of the kind provided in automobiles and the like can be widely used.
  • the vehicle navigation apparatus 400 outputs information used to generate route navigation images to be displayed as virtual images G, and the information is input to the control system 250.
  • the route navigation images include images depicting information such as the number of lanes (travel lanes) of a road on which the vehicle 301 is traveling, a distance to a point where a next course change (e.g., right-turn, left-turn, branching, or the like) is to be made, a direction in which the next course change is made, and the like.
  • the HUD 200 displays, under the control of the control system 250, route navigation images, such as a travel lane indicating image 711, an inter-vehicle distance indicating image 712, a route indicating image 721, a remaining distance image 722, and an intersection etc. name image 723, and the like as virtual images G at an upper image display area A.
  • the HUD 200 displays images indicating road-specific information (road name, speed limit, and so forth) as virtual images G at a lower image display area B. The road-specific information is also input from the vehicle navigation apparatus 400 to the control system 250.
  • the HUD 200 displays a road name indicating image 701, a speed limit indicating image 702, and a passing prohibition indicating image 703 corresponding to the road-specific information as virtual images at the lower image display area B under the control of the control system 250.
  • the sensors 500 of FIG. 5 include one or more sensors for detecting various sorts of information indicative of the behavior of the vehicle 301, the condition of the vehicle 301, the surrounding condition of the vehicle 301, and the like.
  • the sensors 500 output sensing information used to produce images that are displayed as virtual images G, and the sensing information is input to the control system 250.
  • a vehicle speed indicating image 704 (a text image of "83 km/h" in the example of Fig. 1) indicating the speed of the vehicle 301 is displayed as a virtual image at the lower image display area B by the HUD 200. That is, vehicle speed information included in the CAN information of the vehicle 301 is input to the control system 250 from a sensor 500, and the HUD 200 displays a text image indicating the vehicle speed as a virtual image G at the lower image display area B under the control of the control system 250.
  • the sensors 500 may include, in addition to the sensor that detects the speed of the vehicle 301, for example, (1) a laser radar device or an image capturing device that detects the distance from another vehicle, a pedestrian, or a structure (such as a guardrail or utility pole) around the vehicle 301, and a sensor that detects outside environmental information (such as an ambient temperature, lightness, a weather condition, and the like) of the vehicle, (2) a sensor that detects an operation performed by the driver 300 (such as a brake operation, an accelerator position, and the like), (3) a sensor that detects a remaining amount of fuel in a fuel tank of the vehicle 301, and (4) a sensor that detects statuses of various vehicle-mounted devices, such as an engine, a battery, and so forth.
  • the HUD 200 can display the information as virtual images G and provide the information to the driver 300.
  • provided-to-driver information provided by virtual images G to the driver 300 may be any information useful to the driver 300.
  • provided-to-driver information is broadly classified into passive information and active information.
  • Passive information refers to information that is passively received by the driver 300 when a predetermined information providing condition is satisfied. Accordingly, information provided to the driver 300 at a timing set by the HUD 200 is passive information, as is information for which there is a certain relationship between the timing at which the information is provided and its contents.
  • Examples of passive information include driving safety information and route navigation information.
  • Driving safety information includes information indicating the distance between the vehicle 301 and a preceding vehicle 350 (the inter-vehicle distance indicating image 712), urgent information concerning driving (warning information, such as emergency operation instruction information instructing the driver to perform an emergency operation on the vehicle, or attention-calling information), and the like.
  • Route navigation information refers to information for guiding a driver to a previously set destination through a traveling route and may be information provided to the driver by a conventional vehicle navigation apparatus.
  • Route navigation information includes travel lane instruction information (the travel lane indicating image 711) indicating, immediately before arrival at an upcoming intersection, which travel lane to use, and route change operation instruction information indicating a route change operation to be made at an intersection or branch point where the route departs from the straight-ahead direction.
  • Examples of route change operation instruction information include route indicating information (the route indicating image 721) indicating which route to take at an intersection or the like, information indicating the remaining distance to the intersection at which the route change should be made (the remaining distance image 722), and information indicating the name of the intersection or the like (the intersection etc. name image 723).
  • Active information refers to information that is actively obtained by the driver 300 at a time determined by the driver himself or herself, i.e., information that may be provided to the driver 300 at any desired time. For example, information for which there is little or no relationship between the timing at which the information is provided and its contents may be included in active information.
  • Because active information is obtained by the driver 300 at a desired time, it may continue to be displayed for a certain period or continuously.
  • road-specific information of a road on which a vehicle 301 is traveling may be included in examples of active information.
  • vehicle speed information of the vehicle 301 may be included in examples of active information.
  • Road-specific information may be information useful for the driver 300 as information relating to a road, such as information indicating a name of the road (a road name indicating image 701) and information indicating rules, such as a speed limit of the road and the like (a speed limit indicating image 702 and a passing prohibition indicating image 703).
  • the passive information and active information classified as described above are displayed in the corresponding display areas where virtual images G can be displayed.
  • specifically, two vertically arranged display areas are set for displaying virtual images G by the HUD 200.
  • Passive information images corresponding to passive information are mainly displayed in the upper image display area A, and active information images corresponding to active information are mainly displayed in the lower image display area B.
  • when active information images corresponding to active information are displayed in the upper image display area A, they are displayed in such a manner that, in the upper image display area A, the passive information images are more visible than the active information images.
  • a stereoscopic image expressed using stereovision technology is used as virtual images G displayed by the HUD 200.
  • the inter-vehicle distance indicating image 712 and the lane indicating image 711 displayed at the upper image display area A as virtual images G by the HUD 200 are perspective images expressed in perspective.
  • Fig. 6 is a diagram illustrating an example of functions of the control system of the HUD.
  • the control system 250 includes an operation information obtaining unit 10, a surrounding information obtaining unit 20, a point-of-view information obtaining unit 30, an image generating unit 40, and a display control unit 50.
  • the operation information obtaining unit 10 obtains operation information from various sensors 500.
  • Operation information refers to information indicating an operational detail of an operation performed by a driver on a vehicle, for example, an operational detail of a steering wheel turning operation, an accelerator pedal pressing operation, a brake pedal pressing operation, a direction indicator driving operation, a gear selecting operation, or the like.
  • the surrounding information obtaining unit 20 obtains surrounding information from various sensors 500.
  • Surrounding information refers to information indicating a condition of a surrounding area of a vehicle, for example, the size, location, speed, or the like of another vehicle, a pedestrian, or a building present in the surrounding area (in a forward, lateral, or backward direction).
  • the point-of-view information obtaining unit 30 obtains point-of-view information from the driver monitoring camera 150.
  • Point-of-view information indicates the position of a driver's point of view.
  • the point-of-view information obtaining unit 30 detects a point-of-view position of the driver 300 by analyzing an image of the head of the driver 300 photographed by the driver monitoring camera 150.
  • the image generating unit 40 generates images for displaying the images as virtual images using the optical system 230 on the basis of operation information, surrounding information, and point-of-view information. Specifically, the image generating unit 40 calculates a degree of danger of an object outside a predetermined area that is included in the driver's field of view, and generates an image indicating the magnitude of the calculated degree of danger and the direction in which the object is present. The generated image is displayed within the predetermined area that is included in the driver's field of view.
  • the display control unit 50 controls the optical system 230 to cause the optical system 230 to display an output image thus generated by the image generating unit 40.
  • the control system 250 of the HUD 200 starts an image output process.
  • the control system 250 performs an image output process periodically, for example, once a second.
  • Fig. 7 is a diagram illustrating an example of a flowchart of an image output process.
  • in step S11, the operation information obtaining unit 10 obtains operation information, which may be, for example, an operational detail of an operation performed by the driver 300, such as a steering wheel turning operation, an accelerator pedal pressing operation, a brake pedal pressing operation, a direction indicator driving operation, a gear selecting operation, or the like.
  • in step S12, the surrounding information obtaining unit 20 obtains surrounding information.
  • Surrounding information may be, for example, the speed of a preceding vehicle, the distance between the vehicle 301 and the preceding vehicle, the position and the speed of a pedestrian, the speed and the position of a parallelly-traveling vehicle, or the like.
  • in step S13, the point-of-view information obtaining unit 30 obtains point-of-view information.
  • Point-of-view information refers to information indicating a position of a point of view of the driver 300.
  • in step S14, the image generating unit 40 calculates a degree of danger. Specifically, the image generating unit 40 calculates the degree of danger for each of the surrounding objects indicated in the surrounding information, such as a preceding vehicle, a pedestrian, a parallelly-traveling vehicle, and the like.
  • the image generating unit 40 calculates an estimate of a probability that the pedestrian will suddenly come onto a road and collide with the vehicle 301, on the basis of the pedestrian's position, speed, and acceleration, and the position, speed, and acceleration of the vehicle 301.
  • the image generating unit 40 calculates an estimate of a probability that the preceding vehicle will collide with the vehicle 301 as a result of a driver of the preceding vehicle slamming the brakes on, on the basis of the position, the speed, and the acceleration of the preceding vehicle and the position, the speed, and the acceleration of the vehicle 301.
  • the image generating unit 40 calculates an estimate of a probability that the vehicle 301 will collide with the parallelly-traveling vehicle as a result of changing the lane, on the basis of the position, the speed, and the acceleration of the parallelly-traveling vehicle, the position, the speed, and the acceleration of the vehicle 301, and the operational detail of a direction indicator driving operation at the vehicle 301.
  • the image generating unit 40 calculates a degree of danger according to the thus-calculated estimate. Specifically, the higher the probability of a collision, the greater the calculated degree of danger.
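The patent fixes only this qualitative rule (higher collision probability, greater degree of danger) and gives no formula. As one hedged illustration, a constant-acceleration closing model yields a time-to-collision that can be mapped to a bounded danger score; the model, names, and constants below are all assumptions.

```python
# Illustrative only: constant-acceleration time-to-collision (TTC) mapped to
# a danger degree in [0, 1]. The patent does not specify this model.
import math

def time_to_collision(gap_m: float, closing_speed: float,
                      closing_accel: float) -> float:
    """Earliest t > 0 with gap_m = closing_speed*t + 0.5*closing_accel*t**2."""
    if abs(closing_accel) < 1e-9:
        return gap_m / closing_speed if closing_speed > 0 else math.inf
    disc = closing_speed ** 2 + 2.0 * closing_accel * gap_m
    if disc < 0:
        return math.inf                      # the gap is never closed
    t = (-closing_speed + math.sqrt(disc)) / closing_accel
    return t if t > 0 else math.inf

def danger_degree(gap_m: float, closing_speed: float, closing_accel: float,
                  horizon_s: float = 5.0) -> float:
    """1.0 for an imminent collision, 0.0 at or beyond the look-ahead horizon."""
    ttc = time_to_collision(gap_m, closing_speed, closing_accel)
    return max(0.0, min(1.0, 1.0 - ttc / horizon_s))
```

For example, `danger_degree(10.0, 5.0, 0.0)` (a 10 m gap closing at 5 m/s) gives a TTC of 2 s and a danger degree of 0.6 under the assumed 5 s horizon.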
  • in step S15, the image generating unit 40 generates an image.
  • in step S16, the display control unit 50 outputs the generated image. Specifically, the display control unit 50 controls the optical system 230 to display the generated image through the optical system 230.
  • in step S15 described above, the image generating unit 40 generates an image indicating the magnitude of the calculated degree of danger and the direction in which the object is present, the image being displayed within the predetermined area included in the driver's field of view described above.
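Putting steps S11 to S16 together, the process can be pictured as the periodic loop below, run roughly once a second as stated earlier. The unit objects mirror Fig. 6; the method names are assumptions made for this sketch.

```python
# Schematic of the Fig. 7 flow; all method names are hypothetical.
import time

def image_output_process(op_unit, surr_unit, pov_unit, img_unit, disp_unit):
    operation   = op_unit.obtain()      # S11: operation information
    surrounding = surr_unit.obtain()    # S12: surrounding information
    pov         = pov_unit.obtain()     # S13: point-of-view information
    dangers = img_unit.calculate_danger(operation, surrounding, pov)  # S14
    image   = img_unit.generate_image(dangers, pov)                   # S15
    disp_unit.output(image)                                           # S16

def run_periodically(units, period_s: float = 1.0):
    while True:
        image_output_process(*units)
        time.sleep(period_s)    # e.g. once a second, as in the description
```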
  • Fig. 8 is a diagram illustrating an example of an image output result.
  • when there is no object, the image generating unit 40 generates an output image 100 extending along a curve shaped like a semielliptic arc, i.e., an approximately elliptic shape cut along its major axis, as depicted in Fig. 8.
  • An area defined by the output image 100 of Fig. 8 and the lower edge of the windshield 302 corresponds to a predetermined area that is included in the driver's field of view described above.
  • Fig. 9 is a diagram depicting another example of an image output result.
  • when an object is present, as depicted in Fig. 9, the image generating unit 40 generates an output image 100 that includes an indentation 101 representing the magnitude of the degree of danger with respect to the object and the direction in which the object is present.
  • specifically, the image generating unit 40 generates this output image 100 by modifying the shape of the output image 100 of Fig. 8 described above.
  • the image generating unit 40 generates an indentation 101 in a case where, from the driver's perspective, an object is outside the predetermined area defined by the output image 100 and the lower edge of the windshield 302, i.e., outside the predetermined area included in the field of view of the driver 300.
  • the depth of the resulting indentation 101 is proportional to the magnitude of the degree of danger calculated in step S14. Specifically, the greater the degree of danger, the deeper the generated indentation.
  • the indentation 101 is generated in such a manner that the indentation extends in a direction opposite to the direction in which the object is present from the driver's perspective.
  • when an object is inside the predetermined area, an indentation 101 is not generated, because the object is at a position where it is readily visible to the driver 300.
  • the image generating unit 40 may generate an indentation 101 or may generate an image other than an indentation 101.
  • the indentation 101 may be generated in such a manner that the indentation extends in a direction in which an object is present from the driver's perspective.
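As a concrete reading of this indentation geometry, the sketch below draws the output image 100 as a semielliptic arc and dents it inward by an amount that grows with the degree of danger, at an arc position derived from the object's direction. The mapping from direction to arc parameter, the Gaussian dent profile, and all constants are assumptions; only the qualitative behaviour comes from the text.

```python
# Hedged sketch of output image 100 with indentation 101: deeper indentation
# for higher danger, placed according to the object's direction. Shapes and
# constants are illustrative only.
import numpy as np

def output_curve(danger: float, object_bearing_rad: float,
                 a: float = 1.0, b: float = 0.4,
                 dent_width: float = 0.4, n: int = 256) -> np.ndarray:
    """Return (n, 2) xy points of the arc; danger is in [0, 1]."""
    t = np.linspace(0.0, np.pi, n)               # upper half of an ellipse
    # Map the object's horizontal bearing onto the arc parameter (assumed).
    center = np.clip(np.pi / 2 - object_bearing_rad, 0.0, np.pi)
    dent = danger * np.exp(-((t - center) / dent_width) ** 2)
    r = 1.0 - 0.6 * dent                         # dent depth grows with danger
    return np.stack([a * r * np.cos(t), b * r * np.sin(t)], axis=1)
```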
  • Fig. 10 is a diagram depicting an example of an image output result.
  • when the image generating unit 40 calculates an increased degree of danger with respect to a pedestrian 2, the depth of the indentation 101A corresponding to the pedestrian 2 in the output image 100 is increased accordingly, as depicted in Fig. 10.
  • Fig. 11 is a diagram depicting another example of an image output result.
  • when the image generating unit 40 calculates an increased degree of danger with respect to a preceding vehicle 3, the depth of the indentation 101B corresponding to the preceding vehicle 3 in the output image 100 is increased accordingly, as depicted in Fig. 11.
  • Fig. 12 is a diagram depicting yet another example of an image output result.
  • when the image generating unit 40 calculates an increased degree of danger with respect to a parallelly-traveling vehicle 4, the depth of the indentation 101C corresponding to the parallelly-traveling vehicle 4 in the output image 100 is increased accordingly, as depicted in Fig. 12.
  • the image generating unit 40 may calculate a degree of danger without considering operation information.
  • for example, the calculated degree of danger with respect to the parallelly-traveling vehicle 4 increases when the image generating unit 40 determines, on the basis of operation information including operational details of the direction indicator, that the driver 300 of the vehicle 301 is trying to change lanes.
  • the image generating unit 40 may calculate a degree of danger on the basis of operational details other than operational details with respect to the direction indicator, provided that the operational details to be used are related to a subsequent traveling operation of the vehicle 301. For example, on the basis of a steered angle of a steering wheel, a pressed amount with respect to a brake pedal or an accelerator pedal, or the like, a subsequent traveling operation of the vehicle 301 may be predicted to calculate a degree of danger.
  • Degree-of-danger calculating methods described above are examples, and other methods may be used instead.
  • the image generating unit 40 may calculate scores indicating degrees of danger from standpoints of a surrounding environment, a driver's line of sight, and a driver's operation of a vehicle, and may increase a degree of danger as the sum of the scores increases. In this way, it is possible to calculate an appropriate degree of danger in response to any external environmental changes and to driver's conditions.
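A minimal sketch of that summed-score variant follows; the patent names the three standpoints but specifies no scoring or weighting, so the weights and normalization below are placeholders.

```python
# Minimal sketch; component scores assumed in [0, 1], weights are placeholders.
def total_danger(env_score: float, gaze_score: float, operation_score: float,
                 weights: tuple = (1.0, 1.0, 1.0)) -> float:
    """Degree of danger grows with the weighted sum of per-standpoint scores."""
    s = (weights[0] * env_score + weights[1] * gaze_score
         + weights[2] * operation_score)
    return min(1.0, s / sum(weights))   # clamp to [0, 1]
```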
  • as described above, an image indicating the magnitude of the degree of danger with respect to an object present outside the predetermined area included in the field of view of the driver 300, together with the direction in which the object is present, is displayed within that predetermined area.
  • as a result, the driver 300 can identify a degree of danger outside the peripheral visual field while keeping the eyes on the area within the peripheral visual field.
  • in step S16 of the image output process illustrated in Fig. 7, the display control unit 50 may control the optical system 230 so as to display the output image 100 at a position in accordance with the point of view of the driver 300, on the basis of the point-of-view information.
  • a movable body may be a "train", a "forklift", an "airplane", a "ship", or the like.
  • in the embodiment described above, the image indicating the magnitude of the degree of danger with respect to an object and the direction in which the object is present is an indentation; however, the image is not limited to an indentation. It may instead be, for example, a protrusion, or an ellipse drawn with different line colors, as long as the image can communicate the magnitude of the degree of danger with respect to the object and the direction in which the object is present.
  • each function described above may be implemented by a processing circuit. The processing circuit may be a processor programmed to perform each function by software, such as a processor implemented in an electronic circuit, or hardware designed to perform each function, such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field-programmable gate array (FPGA), or a conventional circuit module.
  • 10 Operation information obtaining unit
  • 20 Surrounding information obtaining unit
  • 30 Point-of-view information obtaining unit
  • 40 Image generating unit
  • 50 Display control unit
  • 100 Output image
  • 101, 101A, 101B, 101C Indentations
  • 110 Forward-view camera
  • 150 Driver monitoring camera
  • 200 HUD
  • 230 Optical system
  • 250 Control system
  • 301 Vehicle
  • 302 Windshield
  • 400 Vehicle navigation apparatus
  • 500 Sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)
  • Traffic Control Systems (AREA)

Abstract

An image display apparatus includes a surrounding information obtaining unit configured to obtain surrounding information indicating a surrounding condition of a movable body; a point-of-view information obtaining unit configured to obtain point-of-view information indicating a position of a point of view of a driver of the movable body; an image generating unit configured to calculate a degree of danger with respect to an object outside a predetermined area that is included in the driver's field of view, the degree of danger being calculated on the basis of the surrounding information and the point-of-view information, the image generating unit being further configured to generate an image to be displayed within the predetermined area, the image indicating the degree of danger and a direction in which the object is; and a display control unit configured to output the image.
PCT/JP2021/002249 2020-01-29 2021-01-22 Image display apparatus, image display method, program, and non-transitory recording medium WO2021153454A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21704031.0A 2020-01-29 2021-01-22 Image display apparatus, image display method, program, and non-transitory recording medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2020012255 2020-01-29
JP2020-012255 2020-06-19
JP2020212928A JP2021117987A (ja) 2020-01-29 2020-12-22 画像表示装置、画像表示方法およびプログラム
JP2020-212928 2020-12-22

Publications (1)

Publication Number Publication Date
WO2021153454A1 (fr)

Family

ID=74561972

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/002249 WO2021153454A1 (fr) 2020-01-29 2021-01-22 Image display apparatus, image display method, program, and non-transitory recording medium

Country Status (2)

Country Link
EP (1) EP4096957A1 (fr)
WO (1) WO2021153454A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06255396A * 1993-03-04 1994-09-13 Mazda Motor Corp Vehicle display device
JP2006224700A * 2005-02-15 2006-08-31 Denso Corp Vehicle blind-spot monitoring device and vehicle driving support system
EP2736028A1 * 2011-07-21 2014-05-28 Toyota Jidosha Kabushiki Kaisha Vehicle information transmission apparatus
JP2018173716A 2017-03-31 2018-11-08 Subaru Corporation Information output device
WO2019123976A1 * 2017-12-21 2019-06-27 Nippon Seiki Co., Ltd. Display control method, display control device, and head-up display device
JP2020012255A 2018-07-13 2020-01-23 Hitachi Construction Machinery Tierra Co., Ltd. Construction machine

Also Published As

Publication number Publication date
EP4096957A1 (fr) 2022-12-07

Similar Documents

Publication Publication Date Title
US10551619B2 (en) Information processing system and information display apparatus
US10890762B2 (en) Image display apparatus and image display method
JP6690657B2 (ja) 画像表示装置及び画像表示方法
US10696159B2 (en) Information providing apparatus
JP2017211366A (ja) 移動体システム、情報表示装置
JP6516151B2 (ja) 情報提供装置、情報提供方法及び情報提供用制御プログラム
JP6504431B2 (ja) 画像表示装置、移動体、画像表示方法及びプログラム
JP7300112B2 (ja) 制御装置、画像表示方法及びプログラム
JP2016107947A (ja) 情報提供装置、情報提供方法及び情報提供用制御プログラム
EP3543769B1 (fr) Appareil d'affichage d'images, objet mobile, procédé d'affichage d'images et moyens de support
WO2021153454A1 (fr) Appareil d'affichage d'image, procédé d'affichage d'image, programme, et support d'enregistrement non transitoire
JP6814416B2 (ja) 情報提供装置、情報提供方法及び情報提供用制御プログラム
JP7037764B2 (ja) 移動ルート案内装置、移動体、移動ルート案内方法及びプログラム
JP7505189B2 (ja) 画像表示装置、画像表示方法およびプログラム
JP2021117987A (ja) 画像表示装置、画像表示方法およびプログラム
WO2021132250A1 (fr) Dispositif d'affichage embarqué dans un véhicule et programme
JP7385834B2 (ja) 画像表示方法及び画像表示装置
JP7054483B2 (ja) 情報提供装置、情報提供方法及び情報提供用制御プログラム
JP6726412B2 (ja) 画像表示装置、移動体、画像表示方法及びプログラム
JP2021117703A (ja) 車載表示装置およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21704031

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021704031

Country of ref document: EP

Effective date: 20220829