WO2016170764A1 - Driving assistance method and driving assistance device, driving control device, vehicle, and driving assistance program using such method - Google Patents


Info

Publication number
WO2016170764A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
behavior
action
unit
Prior art date
Application number
PCT/JP2016/002049
Other languages
French (fr)
Japanese (ja)
Inventor
Koichi Emura (江村 恒一)
Katsunaga Tsuji (勝長 辻)
Toshiya Mori (森 俊也)
Wataru Nakai (渉 仲井)
Original Assignee
Panasonic Intellectual Property Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Priority date
Filing date
Publication date
Priority claimed from JP2015252668A (JP6685008B2)
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Priority to CN201680034946.7A (CN107683237B)
Priority to EP16782788.0A (EP3269610B1)
Priority to US15/565,887 (US10252726B2)
Publication of WO2016170764A1
Priority to US16/255,338 (US11072343B2)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
      • B60: VEHICLES IN GENERAL
        • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
          • B60K31/00: Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
          • B60K35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
        • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
          • B60R16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
            • B60R16/02: Electric or fluid circuits specially adapted for vehicles; electric constitutive elements
        • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
          • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
            • B60W50/08: Interaction between the driver and the control system
              • B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention
    • G: PHYSICS
      • G08: SIGNALLING
        • G08G: TRAFFIC CONTROL SYSTEMS
          • G08G1/00: Traffic control systems for road vehicles
            • G08G1/16: Anti-collision systems

Definitions

  • The present invention relates to a vehicle, a driving support method provided in the vehicle, and a driving support device, a driving control device, and a driving support program using that method.
  • Patent Document 1 discloses a travel control device that allows the driver to visually recognize the operating state of automatic steering control or automatic acceleration/deceleration control when the host vehicle performs such control.
  • The present invention relates to a driving support method, a driving support device, an automatic driving control device, a vehicle, and a program using the driving support method during fully automatic driving or partial automatic driving.
  • The driving support device includes a behavior information input unit, a detection information input unit, a candidate determination unit, an image generation unit, and an image output unit.
  • The behavior information input unit acquires, from the automatic driving control unit that determines the behavior of the vehicle during automatic driving, behavior information indicating the first behavior that the vehicle is to execute.
  • The detection information input unit acquires detection information indicating a detection result from a detection unit that detects the surrounding situation and the running state of the vehicle.
  • The candidate determination unit determines, based on the detection information, a second behavior that the vehicle can execute after the first behavior indicated by the behavior information.
  • The image generation unit generates a first image representing the first behavior indicated by the behavior information and a second image representing the second behavior.
  • The image output unit outputs the first image and the second image to a display unit in the vehicle.
  • The device includes an automatic driving control unit, a detection information input unit, a candidate determination unit, an image generation unit, and an image output unit.
  • The automatic driving control unit determines the behavior of the vehicle during automatic driving.
  • The detection information input unit acquires detection information indicating a detection result from a detection unit that detects the surrounding situation and the running state of the vehicle.
  • The candidate determination unit determines, based on the detection information, a second behavior that the vehicle can execute after the first behavior that the automatic driving control unit causes the vehicle to execute.
  • The image generation unit generates a first image representing the first behavior and a second image representing the second behavior.
  • The image output unit outputs the first image and the second image to a display unit in the vehicle.
  • Still another aspect of the present invention is a vehicle.
  • The vehicle includes an automatic driving control unit, a detection information input unit, a candidate determination unit, an image generation unit, and an image output unit.
  • The automatic driving control unit determines the behavior of the vehicle during automatic driving.
  • A detection unit detects the surrounding situation and the running state of the vehicle.
  • The detection information input unit acquires detection information indicating the detection result.
  • The candidate determination unit determines, based on the detection information, a second behavior that the vehicle can execute after the first behavior that the automatic driving control unit causes the vehicle to execute.
  • The image generation unit generates a first image representing the first behavior and a second image representing the second behavior.
  • The image output unit outputs the first image and the second image to the display unit in the vehicle so that they are displayed within a fixed field of view of the driver.
  • Still another aspect of the present invention is a driving support method.
  • In this method, a computer executes a step of acquiring behavior information, a step of acquiring detection information, a step of determining, a step of generating images, and a step of outputting to a display unit.
  • In the step of acquiring behavior information, behavior information indicating the first behavior that the vehicle is to execute is acquired from the automatic driving control unit that determines the behavior of the vehicle during automatic driving.
  • In the step of acquiring detection information, detection information indicating a detection result is acquired from a detection unit that detects the surrounding situation and the running state of the vehicle.
  • In the determining step, a second behavior that the vehicle can execute after the first behavior indicated by the behavior information is determined based on the detection information.
  • In the step of generating images, a first image representing the first behavior indicated by the behavior information and a second image representing the second behavior are generated.
  • In the step of outputting to the display unit, the first image and the second image are output to the display unit in the vehicle so that they are displayed within a fixed field of view of the driver.
  • According to the present invention, in fully automatic driving or partial automatic driving, information can be transmitted appropriately so that comfortable automatic driving, in which the operations of the vehicle and the intentions of the driver are unlikely to conflict, can be achieved.
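The bullets above describe one processing cycle: acquire the first behavior from the automatic driving controller, acquire detection results, derive selectable second behaviors, and render both as images. A minimal Python sketch of that cycle follows; all names, the `merging_vehicle` flag, and the merging-traffic rule are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class DetectionInfo:
    surroundings: dict   # e.g. flags/positions of nearby vehicles (assumed shape)
    running_state: dict  # e.g. own speed and lane (assumed shape)


def candidate_actions(first_action: str, detection: DetectionInfo) -> list[str]:
    # Hypothetical rule: when a merging vehicle is detected, the vehicle
    # could accelerate, decelerate, or change lanes to the right; the
    # second behaviors are the candidates other than the first behavior.
    if detection.surroundings.get("merging_vehicle"):
        return [a for a in ("accelerate", "decelerate", "change_lane_right")
                if a != first_action]
    return []


def drive_support_step(first_action: str, detection: DetectionInfo) -> dict:
    """One cycle: determine second behaviors, render first and second images."""
    second_actions = candidate_actions(first_action, detection)
    return {
        "first_image": f"[{first_action}]",               # placeholder rendering
        "second_images": [f"({a})" for a in second_actions],
    }


out = drive_support_step(
    "decelerate",
    DetectionInfo({"merging_vehicle": True}, {"speed_kmh": 80}),
)
print(out["first_image"], out["second_images"])
```

In a real system the two render calls would go to the in-vehicle display unit; here they return strings so the flow is easy to inspect.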
  • Embodiment 1: a diagram explaining the operation of the operation unit for the first example of the traveling environment; diagrams showing display examples of the notification unit; a flowchart showing the procedure of the information notification process; and diagrams showing the first through fifth examples of the traveling environment together with the display control (including alternative display controls) for each example.
  • Embodiment 2: a block diagram showing the main configuration of a vehicle including the information notification device, and diagrams explaining the display on the touch panel.
  • Further diagrams explaining the display of the notification unit.
  • Embodiment 4: a diagram showing an example of the travel history; diagrams illustrating the clustering-type driver-model construction method and examples of constructed clustering-type driver models; diagrams illustrating the individually-adaptive driver-model construction method and an example of a constructed individually-adaptive driver model; a diagram showing an example of the operation characteristic model; further diagrams explaining the display of the notification unit; diagrams illustrating how the driver model is used in a modification of the driver model; a block diagram showing an example of the cache arrangement in that modification; and diagrams showing examples of the cache creation method in that modification.
  • Embodiments 5 to 11: a block diagram showing the configuration of the vehicle; a diagram schematically showing the interior of the vehicle; a block diagram showing the detailed configuration of the detection unit; a diagram showing the behavior information input from the automatic driving control unit; block diagrams showing the detailed configurations of the control unit and the storage unit of the driving support device; diagrams showing examples of the automatic-driving information screen; sequence diagrams showing examples of processing related to HMI control of the vehicle; and flowcharts showing examples of processing of the driving support device according to Embodiments 5 through 10.
  • FIG. 1 is a block diagram showing the main configuration of a vehicle 1 including the information notification device according to Embodiment 1 of the present invention.
  • The vehicle 1 is a vehicle that can execute all or part of its driving control automatically, without requiring driver operation.
  • The vehicle 1 includes a brake pedal 2, an accelerator pedal 3, a winker (turn-signal) lever 4, a steering wheel 5, a detection unit 6, a vehicle control unit 7, a storage unit 8, and an information notification device 9.
  • The brake pedal 2 receives a brake operation by the driver and decelerates the vehicle 1.
  • The brake pedal 2 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of deceleration of the vehicle 1.
  • The accelerator pedal 3 receives an accelerator operation by the driver and accelerates the vehicle 1. The accelerator pedal 3 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of acceleration of the vehicle 1.
  • The winker lever 4 receives a lever operation by the driver and turns on a direction indicator (not shown) of the vehicle 1.
  • The winker lever 4 may also receive a control result from the vehicle control unit 7, change to a state corresponding to the indicated direction of the vehicle 1, and turn on the direction indicator (not shown) of the vehicle 1.
  • The steering wheel 5 receives a steering operation by the driver and changes the traveling direction of the vehicle 1. The steering wheel 5 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the change in the traveling direction of the vehicle 1.
  • The steering wheel 5 has an operation unit 51.
  • The operation unit 51 is provided on the front surface of the steering wheel 5 (the surface facing the driver) and receives input operations from the driver.
  • The operation unit 51 is, for example, a device such as a button, a touch panel, or a grip sensor.
  • The operation unit 51 outputs information on the input operation received from the driver to the vehicle control unit 7.
  • The detection unit 6 detects the traveling state of the vehicle 1 and the situation around the vehicle 1, and outputs the detected traveling-state and surrounding-situation information to the vehicle control unit 7.
  • The detection unit 6 includes a position information acquisition unit 61, a sensor 62, a speed information acquisition unit 63, and a map information acquisition unit 64.
  • The position information acquisition unit 61 acquires the position information of the vehicle 1 as traveling-state information by GPS (Global Positioning System) positioning or the like.
  • The sensor 62 detects the positions and lane positions of other vehicles around the vehicle 1, the type of each other vehicle (for example, whether it is a preceding vehicle), and the speeds of the other vehicle and the host vehicle, and from these determines the predicted time to collision (TTC: Time To Collision); it also detects obstacles around the vehicle 1 and other surrounding conditions.
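TTC as referenced above is conventionally the remaining gap divided by the closing speed. The sketch below uses that textbook definition, which is an assumption for illustration and not necessarily the exact computation used by the sensor 62.

```python
def time_to_collision(gap_m: float, own_speed_mps: float,
                      other_speed_mps: float) -> float:
    """TTC in seconds; returns infinity when the gap is not closing.

    For a preceding vehicle, the closing speed is the host speed minus
    the other vehicle's speed (illustrative convention).
    """
    closing = own_speed_mps - other_speed_mps
    if closing <= 0:
        return float("inf")  # falling behind or matching speed: no collision predicted
    return gap_m / closing


# 30 m behind a preceding vehicle, closing at 5 m/s -> 6 s to collision
print(time_to_collision(30.0, 25.0, 20.0))  # 6.0
```

A real implementation would also account for accelerations and lane geometry; the constant-speed form is the usual first approximation.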
  • The speed information acquisition unit 63 acquires, as traveling-state information, information such as the speed and traveling direction of the vehicle 1 from a speed sensor or the like (not shown).
  • The map information acquisition unit 64 acquires, as surrounding-situation information, map information around the vehicle 1, such as the road on which the vehicle 1 is traveling, merging points with other vehicles on that road, the lane currently being traveled, and the positions of intersections.
  • The sensor 62 is constituted by a millimeter-wave radar, a laser radar, a camera, or a combination of these.
  • The storage unit 8 is a storage device such as a ROM (Read Only Memory), a RAM (Random Access Memory), a hard disk drive, or an SSD (Solid State Drive), and stores correspondences between the current traveling environment and the candidates for the behavior that the vehicle 1 can take next (after a first predetermined time has elapsed).
  • The current traveling environment is an environment determined by the position of the vehicle 1, the road on which the vehicle 1 is traveling, the positions and speeds of other vehicles around the vehicle 1, and the like.
  • Depending on the position and speed of another vehicle, it may also be judged, for example, that the other vehicle may cut in during acceleration or deceleration, or that a collision is possible one second later.
  • A behavior candidate is a candidate for the behavior that the vehicle 1 can take next (after the first predetermined time) in the current traveling environment.
  • For example, for a traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, a vehicle is merging from the left side of that lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible, the storage unit 8 stores in advance three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right.
  • As another example, for a traveling environment in which a vehicle traveling ahead in the same lane as the vehicle 1 (hereinafter, the "preceding vehicle") is traveling more slowly than the vehicle 1 and a lane change to the adjacent lane is possible, the storage unit 8 stores in advance three behavior candidates: overtaking the preceding vehicle, changing lanes to the adjacent lane, and decelerating to follow the preceding vehicle.
  • The storage unit 8 may also store a priority for each behavior candidate. For example, the storage unit 8 may store the number of times each behavior was actually adopted in the same traveling environment in the past and set a higher priority for behaviors adopted more frequently.
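The adoption-count priority just described can be sketched as follows. `BehaviorStore`, the environment key, and the behavior names are hypothetical, chosen only to mirror the merging-lane example above.

```python
from collections import Counter, defaultdict


class BehaviorStore:
    """Illustrative sketch of the storage unit 8: behavior candidates per
    traveling environment, prioritized by how often each was adopted."""

    def __init__(self) -> None:
        self.candidates: dict[str, list[str]] = {}
        self.adopted: dict[str, Counter] = defaultdict(Counter)

    def register(self, environment: str, candidates: list[str]) -> None:
        # Pre-stored candidates for a recognized traveling environment.
        self.candidates[environment] = candidates

    def record_adoption(self, environment: str, behavior: str) -> None:
        # Count each behavior actually adopted in this environment.
        self.adopted[environment][behavior] += 1

    def by_priority(self, environment: str) -> list[str]:
        # More frequently adopted behaviors come first.
        counts = self.adopted[environment]
        return sorted(self.candidates[environment], key=lambda b: -counts[b])


store = BehaviorStore()
store.register("merge_ahead", ["accelerate", "decelerate", "change_lane_right"])
store.record_adoption("merge_ahead", "decelerate")
store.record_adoption("merge_ahead", "decelerate")
store.record_adoption("merge_ahead", "change_lane_right")
print(store.by_priority("merge_ahead"))
```

Sorting a pre-registered candidate list (rather than the raw counts) keeps never-adopted candidates visible at the lowest priority, matching the idea that all stored candidates remain selectable.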
  • The vehicle control unit 7 can be realized, for example, as part of an LSI (Large Scale Integration) circuit or an electronic control unit (ECU) that controls the vehicle.
  • The vehicle control unit 7 controls the vehicle based on the traveling-state and surrounding-situation information acquired from the detection unit 6, and controls the brake pedal 2, the accelerator pedal 3, the winker lever 4, and the information notification device 9 in accordance with the vehicle control result.
  • The objects controlled by the vehicle control unit 7 are not limited to these.
  • The vehicle control unit 7 determines the current traveling environment based on the traveling-state and surrounding-situation information. Various conventionally proposed methods can be used for this determination.
  • For example, the vehicle control unit 7 determines from the traveling-state and surrounding-situation information that the current traveling environment is "a traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, a vehicle is merging from the left side of that lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible."
  • As another example, the vehicle control unit 7 determines from the traveling-state and surrounding-situation information that the time series of the traveling environment is "a traveling environment in which a vehicle traveling ahead in the same lane as the vehicle 1 is traveling more slowly than the vehicle 1 and a lane change to the adjacent lane is possible."
  • The vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify information on the traveling environment indicating the traveling state and the surrounding situation. The vehicle control unit 7 also reads from the storage unit 8 the behavior candidates that the vehicle 1 can take next (after the first predetermined time has elapsed) in the determined traveling environment.
  • The vehicle control unit 7 determines which of the read behavior candidates is most suitable for the current traveling environment and sets that behavior as the first behavior.
  • The first behavior may be the same behavior that the vehicle is currently executing, that is, a continuation of the current behavior.
  • The vehicle control unit 7 sets the behavior candidates other than the first behavior, which the driver can select instead, as the second behavior.
  • The vehicle control unit 7 may set the most suitable behavior as the first behavior using a conventional technique that determines the most suitable behavior based on the traveling-state and surrounding-situation information.
  • Alternatively, the vehicle control unit 7 may set a preset behavior among the plurality of behavior candidates as the most suitable behavior, may store information on the previously selected behavior in the storage unit 8 and determine that behavior to be the most suitable, or may store in the storage unit 8 the number of times each behavior has been selected in the past and determine the most frequently selected behavior to be the most suitable.
  • The vehicle control unit 7 then causes the notification unit 92 to notify the driver of the information on the first behavior and the second behavior.
  • The vehicle control unit 7 may cause the notification unit 92 to notify the first-behavior and second-behavior information simultaneously with the traveling-state and surrounding-situation information.
  • The vehicle control unit 7 acquires, from the operation unit 51, information on the operation received from the driver. After notifying the first behavior and the second behavior, the vehicle control unit 7 determines whether the operation unit 51 has accepted an operation within a second predetermined time. This operation is, for example, the selection of one behavior from those included in the second behavior.
  • If the operation unit 51 does not accept an operation within the second predetermined time, the vehicle control unit 7 controls the vehicle to execute the first behavior and controls the brake pedal 2, the accelerator pedal 3, and the winker lever 4 according to the vehicle control result.
  • If the operation unit 51 accepts an operation within the second predetermined time, the vehicle control unit 7 performs control corresponding to the accepted operation.
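The fallback logic around the second predetermined time can be sketched as below. The 5-second default and all names are assumptions for illustration; the patent does not specify a value for the second predetermined time.

```python
from typing import Optional


def resolve_behavior(first_behavior: str,
                     second_behaviors: list[str],
                     driver_selection: Optional[str],
                     elapsed_s: float,
                     second_predetermined_time_s: float = 5.0) -> str:
    """Pick the behavior to execute.

    If the driver selected one of the second behaviors within the allowed
    time, execute the selection; otherwise fall back to the first behavior
    determined by the controller."""
    if (driver_selection in second_behaviors
            and elapsed_s <= second_predetermined_time_s):
        return driver_selection
    return first_behavior


print(resolve_behavior("decelerate", ["accelerate"], "accelerate", 3.0))
print(resolve_behavior("decelerate", ["accelerate"], None, 6.0))
```

The first call returns the driver's timely selection; the second, with no selection and the time expired, falls back to the first behavior.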
  • The information notification device 9 acquires various kinds of information related to the traveling of the vehicle 1 from the vehicle control unit 7 and notifies the acquired information.
  • The information notification device 9 includes an information acquisition unit 91 and a notification unit 92.
  • The information acquisition unit 91 acquires various kinds of information related to the traveling of the vehicle 1 from the vehicle control unit 7. For example, when the vehicle control unit 7 determines that there is a possibility of updating the behavior of the vehicle 1, the information acquisition unit 91 acquires the first-behavior information and the second-behavior information from the vehicle control unit 7.
  • The information acquisition unit 91 may temporarily store the acquired information in a storage unit (not shown) and read the stored information as necessary.
  • The notification unit 92 notifies the driver of information related to the traveling of the vehicle 1.
  • The notification unit 92 may be a display unit that displays information, such as a car navigation system, a head-up display, or a center display installed in the vehicle, or a light emitter such as an LED (Light Emitting Diode) installed in the steering wheel 5 or a pillar.
  • It may also be a speaker that converts information into sound to notify the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel 5).
  • The notification unit 92 may also be a combination of these.
  • The notification unit 92 transmits information and corresponds to the notification device 1002 of FIG. 32 described later.
  • The notification unit 92 includes, for example, a head-up display (HUD), an LCD (Liquid Crystal Display), an HMD (Head-Mounted Display or Helmet-Mounted Display), an eyeglass-type display (smart glasses), or the like.
  • The HUD may use, for example, the windshield of the vehicle 1, or a separately provided glass or plastic surface (for example, a combiner).
  • The glass may be, for example, the front windshield, a side window, or the rear window of the vehicle 1.
  • The HUD may also be a transmissive display installed on the surface of, or inside, the glass.
  • The transmissive display is, for example, a transmissive organic EL (electroluminescence) display, or a transparent display using glass that emits light when irradiated with light of a specific wavelength.
  • The driver can view the display on the transmissive display at the same time as the background.
  • The notification unit 92 may thus be a display medium that transmits light. In either case, an image is displayed on the notification unit 92.
  • the notification unit 92 notifies the driver of information related to travel acquired from the vehicle control unit 7 via the information acquisition unit 91.
  • the notification unit 92 notifies the driver of information on the first behavior and the second behavior acquired from the vehicle control unit 7.
  • FIGS. 2A to 2C are diagrams for explaining a first example of the driving environment, the display of the notification unit 92 and the operation of the operation unit 51 corresponding thereto.
• FIG. 2A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 2A shows a traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of the lane, and a lane change to the right of the lane in which the vehicle 1 travels is possible.
  • the vehicle control unit 7 determines that the traveling environment is a traveling environment as shown in FIG. 2A based on information on the traveling state and the surrounding situation. Note that the vehicle control unit 7 may generate the overhead view shown in FIG. 2A and notify the notification unit 92 of the generated overhead view in addition to the information on the first behavior and the second behavior.
  • FIG. 2B shows an example of the display of the notification unit 92 for the traveling environment shown in FIG. 2A.
• In the display range of the notification unit 92, options for the behavior of the vehicle 1 are displayed on the right side, and information for switching to manual driving is displayed on the left side.
  • the first behavior is “lane change” shown in the highlighted display area 29b among the display areas 29a to 29c, 29g.
  • the second behavior is “acceleration” and “deceleration” shown in the display areas 29a and 29c, respectively.
  • the display area 29g displays “automatic operation end” indicating switching to manual operation.
  • FIG. 2C shows an example of the operation unit 51 provided in the steering wheel 5.
• The operation unit 51 includes operation buttons 51a to 51d provided on the right side of the steering wheel 5 and operation buttons 51e to 51h provided on the left side of the steering wheel 5.
• The number, shape, and the like of the operation buttons provided on the steering wheel 5 are not limited to these.
• The display areas 29a to 29c and the operation buttons 51a to 51c shown in FIG. 2B correspond to each other, and the display area 29g and the operation button 51g correspond to each other.
  • the driver presses an operation button corresponding to each display area when selecting any of the contents displayed in each display area. For example, when the driver selects the behavior “acceleration” displayed in the display area 29a, the driver presses the operation button 51a.
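• The button-to-display-area correspondence described above can be sketched as a simple lookup. All identifiers and contents below are illustrative assumptions, not values taken from the embodiment:

```python
# Sketch of the correspondence between operation buttons and display areas.
# Button and area names follow the figure labels; contents are illustrative.
BUTTON_TO_AREA = {
    "51a": "29a",  # e.g. "acceleration"
    "51b": "29b",  # e.g. "lane change" (first behavior, highlighted)
    "51c": "29c",  # e.g. "deceleration"
    "51g": "29g",  # "end automatic driving"
}

def select_by_button(button: str, area_contents: dict) -> str:
    """Return the behavior shown in the display area paired with the pressed button."""
    area = BUTTON_TO_AREA[button]
    return area_contents[area]

contents = {"29a": "acceleration", "29b": "lane change",
            "29c": "deceleration", "29g": "end automatic driving"}
print(select_by_button("51a", contents))  # acceleration
```

• This one-to-one pairing is what lets the driver select a displayed behavior by pressing the matching button without looking away from the road.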
• In FIG. 2B, only character information is displayed in each display area, but a symbol or icon relating to the driving of the vehicle may be displayed, as described below. This allows the driver to grasp the display contents at a glance.
  • FIG. 3 is a diagram showing another example of display in the notification unit 92. As shown in FIG. 3, both character information and symbols indicating the information are displayed in the display areas 39a to 39c and 39g. Only symbols may be displayed.
  • FIG. 4 is a flowchart showing a processing procedure of information notification processing in the present embodiment.
  • FIG. 5A is a diagram illustrating a first example of a traveling environment
  • FIG. 5B is a diagram illustrating display control for the first example.
  • the detection unit 6 detects the traveling state of the vehicle (step S11). Next, the detection unit 6 detects the situation around the vehicle (step S12). Information on the detected traveling state of the vehicle and the situation around the vehicle is output by the detection unit 6 to the vehicle control unit 7.
• Next, the vehicle control unit 7 determines the current traveling environment based on the information on the traveling state and the surrounding situation (step S13). In the example of FIG. 5A, the vehicle control unit 7 determines that the current traveling environment is "a traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of the lane, and a lane change to the right of the lane in which the vehicle 1 travels is possible".
  • the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the determined traveling environment information (step S14).
  • the vehicle control unit 7 outputs information on the determined traveling environment to the information acquisition unit 91.
  • the notification unit 92 acquires travel environment information from the information acquisition unit 91 and displays it as character information 59.
  • the vehicle control unit 7 may notify the driver of the information on the driving environment as sound through a speaker or the like instead of displaying the information on the driving environment on the notification unit 92.
• Next, the vehicle control unit 7 determines whether there is a possibility that the behavior will be updated in the determined traveling environment; if it determines that there is, it further determines the first behavior and the second behavior (step S15). Whether there is a possibility that the behavior will be updated is determined based on whether the traveling environment has changed.
• Examples of behaviors to be implemented after the update include decelerating when there is a possibility of a collision with another vehicle, changing speed when the preceding vehicle disappears during ACC (Adaptive Cruise Control), and changing lanes when the adjacent lane is empty. Whether to update is determined using conventional technology.
  • the vehicle control unit 7 reads, from the storage unit 8, candidate behaviors that the vehicle 1 can take next (after the first predetermined time has elapsed) with respect to the determined traveling environment. Then, the vehicle control unit 7 determines which behavior is most suitable for the current traveling environment from the behavior candidates, and sets the behavior most suitable for the current traveling environment as the first behavior. Then, the vehicle control unit 7 sets behavior candidates excluding the first behavior to the second behavior.
  • the vehicle control unit 7 reads from the storage unit 8 candidates for three behaviors: acceleration of the vehicle 1, deceleration of the vehicle 1, and lane change to the right of the vehicle 1. Then, the vehicle control unit 7 determines that the rightward lane change of the vehicle 1 is the most suitable behavior based on the speed of the vehicle joining from the left side and the situation of the right lane of the vehicle 1. The behavior is set to the first behavior. Then, the vehicle control unit 7 sets behavior candidates excluding the first behavior to the second behavior.
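• The split into a first behavior and second behaviors can be sketched as follows; the numeric suitability scores are stand-ins, since the embodiment leaves the ranking method to conventional techniques:

```python
# Minimal sketch of splitting behavior candidates into the first behavior
# (most suitable) and the second behaviors (the remaining candidates).
# The suitability scoring is an assumed stand-in for the ranking method.

def split_behaviors(candidates, suitability):
    """Return (first_behavior, second_behaviors) from a score per candidate."""
    first = max(candidates, key=lambda b: suitability[b])
    second = [b for b in candidates if b != first]
    return first, second

candidates = ["accelerate", "decelerate", "change lane right"]
# Assumed scores for the merging scenario of FIG. 5A.
scores = {"accelerate": 0.2, "decelerate": 0.5, "change lane right": 0.9}
first, second = split_behaviors(candidates, scores)
print(first)   # change lane right
print(second)  # ['accelerate', 'decelerate']
```

• Whatever scoring is used, the rule is the same: the top-ranked candidate becomes the first behavior and everything else is offered as the second behavior.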
  • the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the first behavior and the second behavior (step S16).
• As shown in FIG. 5B, the notification unit 92 highlights the character information "lane change", which is the first behavior information, in the display area 59b, and displays "acceleration" and "deceleration", which are the second behavior information, in the display areas 59a and 59c, respectively.
  • the vehicle control unit 7 determines whether or not the operation unit 51 has received an operation from the driver within the second predetermined time (step S17).
• Here, the vehicle control unit 7 sets the first predetermined time as the time from when it determines that the current traveling environment is the one shown in FIG. 5A until the vehicle reaches the merge point. The vehicle control unit 7 then sets a second predetermined time, shorter than the first predetermined time, as the time during which an operation for the next behavior to be performed at the merge point can be accepted.
• When the operation unit 51 receives an operation from the driver within the second predetermined time, the vehicle control unit 7 determines whether the received operation is an operation for ending automatic driving or a behavior selection operation (that is, an update) (step S18).
  • each display area of the notification unit 92 and each operation button of the operation unit 51 correspond to each other.
• To end automatic driving, the driver presses the operation button 51g shown in FIG. 2C; to select a behavior, the driver presses one of the operation buttons 51a to 51c shown in FIG. 2C.
• The vehicle control unit 7 ends automatic driving when the operation received by the operation unit 51 is an operation for ending automatic driving (that is, when it detects that the operation button 51g has been pressed) (step S19).
• When the operation received by the operation unit 51 is a behavior selection operation (that is, when any of the operation buttons 51a to 51c is pressed), the vehicle control unit 7 controls the vehicle 1 to execute the behavior corresponding to the pressed operation button (step S20).
  • the vehicle control unit 7 controls the vehicle 1 to execute the first behavior when the operation unit 51 does not accept the operation from the driver within the second predetermined time (NO in step S17). (Step S21).
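• The flow of steps S17 to S21 can be sketched as a polling loop with a timeout. The operation source is abstracted as a callable, and all names are assumptions rather than elements of the embodiment:

```python
# Sketch of steps S17-S21: wait up to the second predetermined time for a
# driver operation; fall back to the first behavior on timeout.
# get_operation() returns None (no input), "end" (end automatic driving),
# or a behavior name.

def resolve_behavior(get_operation, first_behavior, second_behaviors, ticks):
    """Poll for an operation for `ticks` steps (the second predetermined time)."""
    for _ in range(ticks):
        op = get_operation()
        if op == "end":
            return "manual driving"            # S19: end automatic driving
        if op == first_behavior or op in second_behaviors:
            return op                          # S20: execute selected behavior
    return first_behavior                      # S21: no operation -> first behavior

# No driver input within the window: the first behavior is executed.
print(resolve_behavior(lambda: None, "change lane right", ["accelerate"], ticks=3))
```

• The key design point is the default branch: silence from the driver is treated as consent to the recommended first behavior, so the vehicle never stalls at the decision point.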
  • FIG. 6A is a diagram showing a first example of the driving environment
• FIG. 6B is a diagram showing another display control for the first example. The traveling environment of FIG. 6A is the same as that of FIG. 5A, but the display control of FIG. 6B differs from that of FIG. 5B.
• The vehicle control unit 7 reads, from the storage unit 8, candidates for three behaviors for the traveling environment illustrated in FIG. 6A: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right of the vehicle 1. At that time, it is assumed that the storage unit 8 stores the lane change to the right of the vehicle 1 as the behavior with the highest priority.
  • the vehicle control unit 7 causes the notification unit 92 to notify the traveling environment information and the first behavior information.
• In this case, the vehicle control unit 7 generates the display shown in FIG. 6B and causes the notification unit 92 to display it.
  • the vehicle control unit 7 causes the display areas 69a and 69c to display a display prompting the driver to adopt or reject the first behavior.
  • the vehicle control unit 7 displays a display “automatic driving end” indicating that switching to manual driving is possible in the display area 69g.
• In this case, the vehicle control unit 7 highlights "YES", which corresponds to adopting the first behavior. Which of "YES" and "NO" is highlighted may be determined in advance, the option selected last time may be highlighted, or the number of times each option has been selected in the past may be stored in the storage unit 8 and the notification unit 92 may highlight the option with the larger count.
• In this way, the vehicle control unit 7 can appropriately notify the driver of information while keeping the amount of display presented to the driver small, which reduces the driver's burden.
  • FIG. 7A is a diagram showing a second example of the driving environment
  • FIG. 7B is a diagram showing display control for the second example.
  • FIG. 7A is an overhead view showing a traveling environment.
  • the traveling environment shown in FIG. 7A is similar to FIGS. 5A and 6A in that there is a junction path ahead, but differs from FIGS. 5A and 6A in that a traveling vehicle exists on the right side of the vehicle 1. In such a case, the vehicle control unit 7 determines that the lane change cannot be performed.
• When the vehicle control unit 7 determines that the traveling environment of the vehicle 1 is as shown in FIG. 7A, it causes the notification unit 92 to display the determined traveling environment information as character information 79, as shown in FIG. 7B.
• Among the three behavior candidates read from the storage unit 8, namely acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right of the vehicle 1, the vehicle control unit 7 selects only acceleration and deceleration of the vehicle 1, because the lane change to the right is not possible.
• In addition, the vehicle control unit 7 predicts that the vehicle 1 would get too close to the merging vehicle if it proceeds at the current speed, and determines that deceleration of the vehicle 1 is the most suitable behavior, that is, the first behavior.
• The most suitable behavior is determined using a conventional technique based on information on the traveling state and the surrounding situation. Alternatively, the most suitable behavior may be determined in advance, information on the behavior selected last time may be stored in the storage unit 8 and that behavior determined to be the most suitable, or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the behavior with the largest count determined to be the most suitable.
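• The third policy mentioned above (picking the behavior selected most often in the past) can be sketched as follows, with assumed history data:

```python
# Sketch of the "most frequently selected" policy for choosing the most
# suitable behavior. The history list is illustrative data, not from the
# embodiment; the fallback default stands in for a predetermined choice.
from collections import Counter

def most_frequent_behavior(history, candidates):
    """Pick the candidate chosen most often in past selections."""
    counts = Counter(b for b in history if b in candidates)
    if not counts:
        return candidates[0]  # fall back to a predetermined default
    return counts.most_common(1)[0][0]

history = ["decelerate", "accelerate", "decelerate"]
print(most_frequent_behavior(history, ["accelerate", "decelerate"]))  # decelerate
```

• The same storage-unit counts can also drive which option ("YES"/"NO" or a behavior label) the notification unit highlights.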
  • the vehicle control unit 7 displays “Deceleration” as the first behavior in the display area 79c, and displays “Acceleration” as the second behavior in the display area 79a. Further, the vehicle control unit 7 causes the display area 79g to display “automatic driving end” indicating switching to manual driving.
  • the vehicle control unit 7 can notify the driver of the behavior most suitable for the traveling environment as the first behavior according to the traveling environment.
  • the information on the first behavior may be arranged on the upper side and the information on the second behavior may be arranged on the lower side, and selection functions may be assigned to the operation buttons 51a and 51c, respectively.
• Alternatively, the acceleration behavior information may be arranged above, the deceleration behavior information below, the right lane change behavior information on the right side, and the left lane change behavior information on the left side, with selection functions assigned to the operation buttons 51a, 51c, 51b, and 51d, respectively. These arrangements may be switched, and whether the arrangement is a behavior priority arrangement or an operation priority arrangement may be displayed separately.
  • the display size of the first behavior information may be increased and the display size of the second behavior information may be decreased.
• By displaying the behavior information at positions corresponding to the front, rear, left, and right movements of the vehicle, the driver can recognize and operate it intuitively.
  • FIG. 8A is a diagram showing a third example of the driving environment
  • FIG. 8B is a diagram showing display control for the third example.
  • FIG. 8A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 8A shows a travel environment in which the preceding vehicle travels at a speed slower than that of the vehicle 1 and the lane can be changed to the adjacent lane.
  • the vehicle control unit 7 determines that the traveling environment is a traveling environment as shown in FIG. 8A based on information on the traveling state and the surrounding situation. In this case, the vehicle control unit 7 causes the notification unit 92 to display the determined traveling environment information as character information 89.
• The vehicle control unit 7 reads from the storage unit 8 three behavior candidates corresponding to the determined traveling environment: overtaking the preceding vehicle, changing the lane to the adjacent lane, and decelerating the vehicle 1 to follow the preceding vehicle.
• Since the speed of the preceding vehicle after deceleration is higher than a predetermined value, the vehicle control unit 7 determines that the behavior in which the vehicle 1 decelerates and follows the preceding vehicle is the most suitable, that is, the first behavior.
• The most suitable behavior is determined using a conventional technique based on information on the traveling state and the surrounding situation. Alternatively, the most suitable behavior may be determined in advance, information on the behavior selected last time may be stored in the storage unit 8 and that behavior determined to be the most suitable, or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the behavior with the largest count determined to be the most suitable.
• In this case, the vehicle control unit 7 highlights the character information "follow" indicating the first behavior in the display area 89c, and displays the character information "passing" and "lane change" indicating the second behaviors in the display areas 89a and 89b, respectively. Further, the vehicle control unit 7 causes the display area 89g to display "automatic driving end" indicating switching to manual driving.
  • the information on the first behavior may be arranged on the upper side and the information on the second behavior may be arranged on the lower side, and selection functions may be assigned to the operation buttons 51a and 51c, respectively.
• Alternatively, the passing behavior information may be arranged above, the following behavior information below, the right lane change behavior information on the right side, and the left lane change behavior information on the left side, with selection functions assigned to the operation buttons 51a, 51c, 51b, and 51d, respectively. These arrangements may be switched, and whether the arrangement is a behavior priority arrangement or an operation priority arrangement may be displayed separately.
  • the display size of the first behavior information may be increased and the display size of the second behavior information may be decreased.
  • FIG. 9A is a diagram showing a fourth example of the driving environment
  • FIG. 9B is a diagram showing display control for the fourth example.
• FIG. 9A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 9A shows a traveling environment in which the number of lanes decreases ahead in the lane in which the vehicle 1 travels.
  • the vehicle control unit 7 determines that the traveling environment is a traveling environment as illustrated in FIG. 9A based on information on the traveling state and the surrounding situation. In this case, the vehicle control unit 7 causes the notification unit 92 to display the determined traveling environment information as the character information 99.
• In this case, the vehicle control unit 7 reads from the storage unit 8 two behavior candidates corresponding to the determined traveling environment: changing the lane to the adjacent lane and continuing in the current lane.
• The vehicle control unit 7 determines that changing to the adjacent lane is the most suitable behavior, that is, the first behavior, because the TTC to the lane-reduction point is shorter than a predetermined value.
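• The TTC test can be sketched as follows; the distance, speed, and threshold values are illustrative, not from the embodiment:

```python
# Sketch of the TTC test: if the time to reach the lane-reduction point at
# the current speed is below a threshold, a lane change is chosen as the
# first behavior. All numeric values are assumptions.

def first_behavior_by_ttc(distance_m, speed_mps, ttc_threshold_s):
    """Choose between 'change lane' and 'keep lane' from time-to-point."""
    ttc = distance_m / speed_mps  # seconds until the lane-reduction point
    return "change lane" if ttc < ttc_threshold_s else "keep lane"

print(first_behavior_by_ttc(distance_m=100.0, speed_mps=25.0, ttc_threshold_s=6.0))
# change lane (TTC = 4 s < 6 s)
```

• With a longer remaining distance the TTC exceeds the threshold and continuing in the current lane remains the first behavior.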
• Which of the two behavior candidates is the most suitable is determined using a conventional technique based on information on the traveling state and the surrounding situation. Alternatively, the most suitable behavior may be determined in advance, information on the behavior selected last time may be stored in the storage unit 8 and that behavior determined to be the most suitable, or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the behavior with the largest count determined to be the most suitable.
• In this case, the vehicle control unit 7 highlights the character information "lane change" indicating the first behavior in the display area 99b, and displays the character information "as is" indicating the second behavior in the display area 99c. Further, the vehicle control unit 7 causes the display area 99g to display "automatic driving end" indicating switching to manual driving.
  • the first behavior information may be arranged above, the second behavior information may be arranged below, and a selection function may be assigned to each of the operation buttons 51a and 51c.
• Alternatively, the right lane change behavior information may be arranged on the right side and the left lane change behavior information on the left side, with selection functions assigned to the operation buttons 51c, 51b, and 51d. In addition, whether the arrangement is a behavior priority arrangement or an operation priority arrangement may be displayed separately.
  • the display size of the first behavior information may be increased and the display size of the second behavior information may be decreased.
• In FIGS. 7B, 8B, and 9B, different functions are assigned to the display areas according to the traveling environment, so that information notification and operation acceptance can be performed in a small area.
• In the above description, the vehicle control unit 7 causes the notification unit 92 to notify the behavior in accordance with the information on the traveling environment and the surrounding situation, but the present invention is not limited to this; for example, the notification unit 92 may be caused to notify the behavior when the driver performs a predetermined operation.
  • FIG. 10A is a diagram illustrating a fifth example of the driving environment
  • FIG. 10B is a diagram illustrating display control for the fifth example.
• FIG. 10A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 10A shows a traveling environment in which the vehicle 1 can change lanes to the left or right.
  • the driving environment shown in FIG. 10A is a driving environment in which normal driving is possible without changing lanes or accelerating or decelerating the vehicle, unlike the case of FIGS. 5A to 9A.
  • the vehicle control unit 7 does not have to display the travel environment information on the notification unit 92 as character information.
• In this case, the vehicle control unit 7 reads the behavior candidates for normal travel from the storage unit 8.
• In the storage unit 8, four behavior candidates are stored in association with normal travel: acceleration of the vehicle 1, deceleration of the vehicle 1, a lane change to the right of the vehicle 1, and a lane change to the left of the vehicle 1.
  • the vehicle control unit 7 reads out these and displays them on the display areas 109a to 109d of the notification unit 92, respectively.
• The vehicle control unit 7 also displays "automatic driving end", indicating switching to manual driving, in the display area 109g, and highlights "cancel", indicating cancellation of the behavior update, in the display area 109e.
  • the display in the notification unit 92 has been described as character information, but the present invention is not limited to this.
• For example, the behavior may be presented to the driver visually using symbols. In the following, display using symbols will be described, taking the displays corresponding to FIGS. 5B and 7B as examples.
  • FIG. 11 is a diagram showing another display control for the first example of the traveling environment shown in FIG. 5A.
• In the first example, the first behavior is a lane change to the right of the vehicle 1, and the second behaviors are acceleration of the vehicle 1 and deceleration of the vehicle 1.
• In this case, a symbol 111 indicating "lane change", the first behavior, is displayed large in the center; symbols indicating acceleration and deceleration of the vehicle 1, the second behaviors, are displayed small to the right; and a symbol 114 indicating the end of automatic driving is displayed small on the left.
• If no instruction to change the behavior is received from the driver, the lane change is performed.
  • FIG. 12A and 12B are diagrams showing another display control for the second example of the traveling environment shown in FIG. 7A.
• In the second example, the lane cannot be changed; therefore, for example, "deceleration of the vehicle 1" is set as the first behavior and "acceleration of the vehicle 1" as the second behavior.
  • a symbol 121 indicating “deceleration of the vehicle 1” as the first behavior is displayed large in the center, and a symbol indicating “acceleration of the vehicle 1” as the second behavior. 122 is displayed small to the right. In addition, a symbol 123 indicating the end of automatic driving is displayed small on the left.
• When the operation unit 51 receives an operation selecting "acceleration of the vehicle 1" from the driver, a symbol 122′ indicating "acceleration of the vehicle 1", now the first behavior, is displayed large in the center, and a symbol 121′ indicating "deceleration of the vehicle 1", now the second behavior, is displayed small to the right.
• In this way, the driver can grasp both the behavior the vehicle is about to perform and the other selectable behaviors, and can continue automatic driving with a sense of security. In addition, the driver can give instructions to the vehicle smoothly.
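• The display update of FIGS. 12A and 12B, in which a selected second behavior is promoted to the first behavior, can be sketched as follows (function and variable names are assumptions):

```python
# Sketch of the symbol swap in FIGS. 12A-12B: selecting a second behavior
# promotes it to the first behavior (shown large, center) and demotes the
# previous first behavior (shown small, right). Names are illustrative.

def update_display(first, seconds, selected):
    """Promote `selected` to first behavior; demote the old first behavior."""
    if selected not in seconds:
        return first, seconds          # not a selectable second behavior
    new_seconds = [b for b in seconds if b != selected] + [first]
    return selected, new_seconds

first, seconds = update_display("decelerate", ["accelerate"], "accelerate")
print(first)    # accelerate  (symbol 122' shown large in the center)
print(seconds)  # ['decelerate']  (symbol 121' shown small to the right)
```

• Keeping the demoted behavior visible as a second behavior is what lets the driver reverse the choice later with the same operation.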
• As described above, according to the present embodiment, the options notified by the notification unit 92, that is, the second behavior, can be varied according to the traveling environment.
  • FIG. 13 is a block diagram showing a main configuration of the vehicle 1 including the information notification device according to Embodiment 2 of the present invention.
  • the same components as those in FIG. 1 are denoted by the same reference numerals as those in FIG.
  • a vehicle 1 shown in FIG. 13 is provided with a touch panel 10 instead of the operation unit 51 of the steering wheel 5.
  • the touch panel 10 is a device composed of a liquid crystal panel or the like capable of displaying information and receiving input, and is connected to the vehicle control unit 7.
  • the touch panel 10 includes a display unit 101 that displays information based on control by the vehicle control unit 7 and an input unit 102 that receives an operation from a driver or the like and outputs the received operation to the vehicle control unit 7.
  • display control of the touch panel 10 will be described.
  • display control when the vehicle 1 is traveling in the center of three lanes and the lane can be changed to either the right lane or the left lane will be described.
  • FIGS. 14A to 14C are diagrams illustrating the display on the touch panel 10 according to the second embodiment.
  • FIG. 14A is an initial display of the display unit 101 of the touch panel 10.
  • the vehicle control unit 7 determines that the vehicle 1 can change the lane to either the right lane or the left lane, the vehicle control unit 7 causes the display unit 101 of the touch panel 10 to perform a display as illustrated in FIG. 14A.
  • the display “Touch” in the display area 121 indicates that the touch panel 10 is in a mode in which a touch operation by the driver can be received.
• When the driver touches the display area 121, the input unit 102 receives this operation and outputs information indicating that the operation has been performed to the vehicle control unit 7.
  • the vehicle control unit 7 displays the display shown in FIG. 14B on the display unit 101 and displays the display shown in FIG. 14C on the notification unit 92.
  • FIG. 14B shows a display area 121a on which “Move” indicating an operation for instructing the vehicle 1 to move is displayed.
  • FIG. 14B also shows display areas 121b to 121d indicating that the vehicle 1 can travel in each of the three lanes.
  • the display areas 121b to 121d correspond to traveling in the lane indicated by arrows X, Y, and Z in FIG. 14C, respectively.
  • each display area in FIG. 14B and each arrow in FIG. 14C have the same mode (for example, color or arrangement). As a result, the display is easier to understand for the driver.
• The behavior to be performed by the vehicle as determined by vehicle control may be displayed so that it can be distinguished from the behaviors selectable by the driver.
  • the driver selects the behavior of the vehicle 1 by touching the display area corresponding to the lane to be traveled among the display areas 121b to 121d.
  • the input unit 102 accepts a driver's behavior selection operation and outputs information on the selected behavior to the vehicle control unit 7.
  • the vehicle control unit 7 controls the vehicle 1 to execute the selected behavior.
  • the vehicle 1 travels in the lane that the driver wants to travel.
  • the driver may perform a swipe operation on the touch panel 10 instead of the touch operation.
• For example, when the driver wants to change to the lane indicated by arrow X in FIG. 14C, the driver performs a rightward swipe operation on the touch panel 10.
• The input unit 102 receives the swipe operation and outputs information indicating its content to the vehicle control unit 7. The vehicle control unit 7 then controls the vehicle 1 to execute the selected behavior, that is, the lane change to the lane indicated by arrow X.
• When performing the swipe operation, the driver may also utter "behavior selection" or the like by voice. This makes it possible to operate using only the HUD display, without looking at the touch panel at hand.
• At the time of selection, the display mode of the lane corresponding to the touched display area may be changed so that the driver can confirm which lane is being selected before the selection is finalized. For example, at the moment the display area 121b is touched, the lane X is drawn thicker; if the hand is released immediately, lane X is not selected and its thickness returns to the original. If the display area 121c is touched, the lane Y is drawn thicker, and if that state is maintained for a while, lane Y is selected and blinks to notify that the selection has been determined. This allows a selection or determination operation to be performed without looking at one's hands.
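• The touch-and-hold confirmation described above can be sketched as a small state check; the dwell time and the area-to-lane mapping are assumptions:

```python
# Sketch of the touch-and-hold confirmation: a touch thickens the matching
# lane, an immediate release cancels, and holding past a dwell time confirms
# the lane (which then blinks). Timing and names are illustrative.

AREA_TO_LANE = {"121b": "X", "121c": "Y", "121d": "Z"}
DWELL_S = 1.0  # assumed hold time needed to confirm a selection

def touch_result(area, held_for_s):
    """Return (lane, confirmed) for a touch held for `held_for_s` seconds."""
    lane = AREA_TO_LANE[area]
    confirmed = held_for_s >= DWELL_S  # short tap: thickness reverts, no selection
    return lane, confirmed

print(touch_result("121b", 0.2))  # ('X', False): lane X thickens, then reverts
print(touch_result("121c", 1.5))  # ('Y', True): lane Y selected and blinks
```

• The dwell threshold is the piece that makes eyes-free operation safe: a brief accidental tap previews a lane but never commits it.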
  • vehicle control functions such as acceleration, deceleration, overtaking, and the like may be assigned to the display area according to the driving environment.
  • the driver can perform an intuitive operation.
  • the touch panel can freely change the number, shape, color, and the like of display areas for receiving operations, the degree of freedom of the user interface is improved.
• The configuration according to the present embodiment is the configuration of FIG. 1 described in Embodiment 1, with the operation unit 51 further including a grip sensor that detects whether the driver is gripping the steering wheel 5.
  • 15A to 15D are diagrams for explaining the display of the notification unit 92 according to Embodiment 3 of the present invention.
• Similar to the case shown in FIG. 8A, FIGS. 15A to 15D show examples of display in a traveling environment in which a vehicle traveling ahead in the same lane as the vehicle 1 travels at a slower speed than the vehicle 1 and a lane change to the adjacent lane is possible.
• When the vehicle control unit 7 determines that the traveling environment is the traveling environment illustrated in FIG. 8A, it first causes the notification unit 92 to execute the display illustrated in FIG. 15A.
• In FIG. 15A, a symbol 131 indicating "passing", the first behavior among the behavior candidates to be implemented after the first predetermined time has elapsed, is shown in a first mode (for example, a first color).
  • When the second predetermined time has elapsed after the notification unit 92 executed the display of FIG. 15A, the vehicle control unit 7 changes the symbol 131 from the first mode to a second mode different from the first mode (for example, a second color different from the first color) and causes the notification unit 92 to display it.
  • the second predetermined time is the same as the second predetermined time described in the first embodiment.
  • While the symbol 131 is shown in the first mode, the driver can select the second behavior; once the symbol 131 has changed to the second mode, the driver can no longer select the second behavior.
  • FIG. 15A also shows a steering-wheel-shaped symbol 132 indicating that the second behavior can be selected.
  • While the symbol 132 is displayed, the second behavior is selected when the driver grips the steering wheel 5. The symbol 132 is a display indicating that the second behavior can be selected; alternatively, the fact that the second behavior can be selected may be indicated by the symbol 131 being displayed in the first mode, in which case the symbol 132 need not be displayed.
  • FIG. 15A also shows a symbol 133 indicating that automatic operation is currently in progress.
  • The symbol 133 is an auxiliary display indicating to the driver that the vehicle is traveling in automatic driving mode; the symbol 133 may be omitted.
  • When the driver grips the steering wheel 5, the grip sensor detects the grip and outputs information on the detection result to the vehicle control unit 7.
  • the vehicle control unit 7 causes the notification unit 92 to execute the display illustrated in FIG. 15B.
  • FIG. 15B, similarly to FIG. 15A, shows the symbol 131 indicating “passing”, the first behavior, in the first mode (for example, the first color). It also shows a symbol 134 indicating “lane change” and a symbol 135 indicating “deceleration”, both second behaviors.
  • The driver changes from the first behavior to a second behavior by operating the operation unit 51 of the steering wheel 5. For example, the driver updates the behavior to “lane change” (symbol 134) or “deceleration” (symbol 135) by depressing the operation button 51a or the operation button 51c (see FIG. 2C) of the operation unit 51.
  • FIG. 15B shows a symbol 136 indicating that the vehicle control unit 7 is learning the behavior of the vehicle 1.
  • While the symbol 136 is displayed, the vehicle control unit 7 learns the behavior selected by the driver. The symbol 136 may be omitted, and learning may always be performed.
  • The vehicle control unit 7 stores the behavior selected by the driver in the storage unit 8, and when the same driving environment occurs again, displays the stored behavior on the notification unit 92 as the first behavior.
  • The vehicle control unit 7 may also store the number of times each behavior has been selected in the storage unit 8 and present the most frequently selected behavior as the first behavior.
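  • The learning step described above can be sketched as follows. This is a hypothetical, minimal illustration: the class name, method names, and the environment string are assumptions, not from the patent; the behavior selected by the driver is stored per driving environment, and when the same environment recurs the stored behavior is presented as the first behavior.

```python
# Hypothetical sketch of the learning step: store the driver's selected
# behavior per driving environment; on recurrence of the same environment,
# present the stored behavior as the first behavior.
class BehaviorLearner:
    def __init__(self) -> None:
        self.storage: dict[str, str] = {}  # driving environment -> selected behavior

    def record_selection(self, environment: str, behavior: str) -> None:
        # Corresponds to storing the selected behavior in the storage unit 8.
        self.storage[environment] = behavior

    def first_behavior(self, environment: str, default: str) -> str:
        # Present the previously selected behavior if this environment recurs;
        # otherwise fall back to the default candidate.
        return self.storage.get(environment, default)

learner = BehaviorLearner()
learner.record_selection("slow vehicle ahead, adjacent lane free", "lane change")
```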
  • FIG. 15B also shows a symbol 137 indicating that automatic operation is not being performed.
  • The vehicle control unit 7 then waits until the driver selects the behavior to be performed after the first predetermined time has elapsed.
  • The vehicle control unit 7 receives information on this selection operation and causes the notification unit 92 to execute the display shown in FIG. 15C.
  • FIG. 15C shows a symbol 134 ′ indicating “lane change” in the first mode.
  • When the vehicle control unit 7 receives information on the selection operation for “lane change”, it determines that the selected behavior is the behavior to be performed next and causes the notification unit 92 to display the symbol 134′ indicating “lane change” in the first mode.
  • The symbol 131′ in FIG. 15C is the symbol 131 that was displayed as the first behavior in FIG. 15B, now displayed having traded places with the symbol 134.
  • When the vehicle control unit 7 receives information on an operation of pressing any one of the operation buttons twice in succession, it causes the notification unit 92 to change the display shown in FIG. 15C back to the display shown in FIG. 15B.
  • Based on the driver's operations, the vehicle control unit 7 changes the display of the notification unit 92 between FIG. 15B and FIG. 15C during the period from when the notification unit 92 executes the display shown in FIG. 15A until the second predetermined time elapses. Thereafter, when the second predetermined time has elapsed since the notification unit 92 executed the display shown in FIG. 15A, the vehicle control unit 7 causes the notification unit 92 to show the display illustrated in FIG. 15D. When information indicating that the driver has released the steering wheel 5 is acquired from the grip sensor, the vehicle control unit 7 may switch to the display shown in FIG. 15D before the second predetermined time elapses.
  • In FIG. 15D, the symbol 134′ indicating “lane change” selected by the driver is displayed in the second mode, and the symbol 133 indicating that the vehicle is traveling in automatic driving mode is displayed again.
  • In this way, the vehicle control unit 7 changes the display on the notification unit 92 so that candidates for other behaviors can be checked only when the driver wants to update the next behavior.
  • This reduces the amount of display the driver must check visually, and thus reduces the driver's annoyance.
  • The driver model is obtained by modeling, for each driving environment, the tendency of operations by the driver, based on information such as the frequency of each operation.
  • The driver model is constructed by aggregating the travel histories of a plurality of drivers.
  • The driving history of a driver is, for example, a history in which the frequency of the behaviors actually selected by the driver, among the behavior candidates corresponding to each driving environment, is aggregated for each behavior candidate.
  • FIG. 16 is a diagram showing an example of a travel history.
  • FIG. 16 shows that, in one driving environment, the driver x selected the behavior candidates “decelerate”, “accelerate”, and “lane change” three times, once, and five times, respectively, and that in another driving environment, the driver x selected the candidates “follow”, “passing”, and “lane change” twice, twice, and once, respectively. The same applies to the driver y.
  • The driving history of a driver may aggregate behaviors selected during automatic driving, or behaviors actually performed by the driver during manual driving. This makes it possible to collect travel histories according to the driving state, such as automatic or manual driving.
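  • The travel-history structure of FIG. 16 can be sketched as a frequency table keyed by driver and driving environment. This is a minimal illustration; the environment label "approaching merge" is an assumed placeholder, while the counts 3 / 1 / 5 for driver x are taken from the description above.

```python
# Minimal sketch of the travel history: for each (driver, environment)
# pair, count how many times each behavior candidate was selected.
from collections import defaultdict

travel_history: defaultdict = defaultdict(lambda: defaultdict(int))

def record(driver: str, environment: str, behavior: str) -> None:
    # One selection event adds one to the behavior's frequency.
    travel_history[(driver, environment)][behavior] += 1

# Driver x selected "decelerate" 3 times, "accelerate" once, and
# "lane change" 5 times in one driving environment (per the description).
for behavior, count in [("decelerate", 3), ("accelerate", 1), ("lane change", 5)]:
    for _ in range(count):
        record("x", "approaching merge", behavior)
```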
  • The driver model includes a clustering type, constructed by clustering the driving histories of a plurality of drivers, and an individually adaptive type, in which the driver model of a specific driver (for example, driver x) is constructed from a plurality of driving histories similar to the driving history of that driver.
  • In the clustering-type construction method, the driving histories of a plurality of drivers are aggregated in advance as shown in FIG. 16. A driver model is then constructed by grouping drivers whose travel histories have a high degree of similarity, that is, drivers with similar driving operation tendencies.
  • FIG. 17 is a diagram illustrating a clustering type driver model construction method.
  • FIG. 17 shows the travel histories of the drivers a to f in a table format, and shows that the model A is constructed from the travel histories of the drivers a to c and the model B from those of the drivers d to f.
  • The similarity of travel histories may be determined, for example, by treating each frequency (each numerical value) in the travel histories of the driver a and the driver b as a frequency distribution, calculating a correlation value between the frequency distributions, and using the calculated correlation value as the degree of similarity. In this case, when the correlation value calculated from the travel histories of the driver a and the driver b is higher than a predetermined value, the travel histories of the driver a and the driver b are placed in one group.
  • the calculation of similarity is not limited to this.
  • For example, the degree of similarity may be calculated from the number of driving environments in which the most frequent behavior matches between the travel histories of the driver a and the driver b.
  • the clustering type driver model is constructed by, for example, calculating the average of each frequency in the driving history of drivers in each group.
  • FIG. 18 is a diagram illustrating an example of a built clustering driver model.
  • the average frequency of each group is derived by calculating the average of the respective frequencies.
  • the clustering type driver model is constructed with an average frequency for the behavior determined for each driving environment.
  • FIG. 19 is a diagram illustrating another example of the constructed clustering type driver model. As shown in FIG. 19, the most frequent behavior is selected for each traveling environment, and a driver model is constructed from the selected behavior.
  • the driver model as shown in FIG. 18 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
  • the vehicle control unit 7 calculates the similarity between the driving history of the driver y and the driving history of each model of the driver model, and determines which model is most suitable for the driver y. For example, in the case of the driving history of the driver y shown in FIG. 16 and the driver model shown in FIG. 18, the vehicle control unit 7 determines that the model B is most suitable for the driver y.
  • During actual automatic driving, the vehicle control unit 7 determines that the behavior with the highest frequency in each driving environment of the model B is the behavior most suitable for the driver y, that is, the first behavior.
  • For example, based on the model B shown in FIG. 18, the vehicle control unit 7 can determine “follow” as the first behavior in a driving environment in which “follow” has the highest average frequency.
  • In the individually adaptive construction method, the driving histories of a plurality of drivers are aggregated in advance as shown in FIG. 16, as in the clustering type.
  • the difference from the clustering type is that a driver model is constructed for each driver.
  • The case of constructing a driver model for the driver y is described as an example. First, the driving histories of a plurality of drivers having high similarity to the driving history of the driver y are extracted from the aggregated driving histories. Then, the driver model of the driver y is constructed from the extracted driving histories.
  • FIG. 20 is a diagram showing a method for constructing an individual adaptive driver model.
  • In FIG. 20, the driving histories of the drivers a to f are shown in a table format, as in FIG. 17. FIG. 20 shows that the driver model of the driver y is constructed from the driving history of the driver y shown in FIG. 16 and the driving histories of the drivers c to e, which have high similarity to it.
  • the individual adaptive driver model is constructed by calculating the average of each frequency in the extracted driving history of each driver.
  • FIG. 21 is a diagram illustrating an example of a constructed individual adaptive driver model.
  • the average frequency of each behavior is derived for each driving environment.
  • the individually adaptive driver model for the driver y is constructed with an average frequency of behavior corresponding to each traveling environment.
  • the driver model of the driver y as shown in FIG. 21 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
  • In actual automatic driving, the vehicle control unit 7 determines that the behavior with the highest frequency in each driving environment of the driver model of the driver y is the behavior most suitable for the driver y, that is, the first behavior.
  • For example, based on the driver model shown in FIG. 21, the vehicle control unit 7 can determine “lane change” as the first behavior in a driving environment in which “lane change” has the highest average frequency.
  • Even when drivers select the same behavior (for example, lane change), the actual operation (for example, the magnitude of acceleration or deceleration, or the amount of operation of the steering wheel) differs from driver to driver. Automatic driving that suits each driver can therefore be realized by also learning the driver's driving characteristics.
  • the vehicle control unit 7 extracts a feature amount indicating the driving characteristics of the driver from the operation content of each part of the vehicle 1 of the driver, and stores it in the storage unit 8.
  • the feature amount includes, for example, a feature amount related to speed, a feature amount related to steering, a feature amount related to operation timing, a feature amount related to outside-vehicle sensing, a feature amount related to in-vehicle sensing, and the like.
  • the feature quantity related to speed includes, for example, the speed, acceleration, and deceleration of the vehicle, and these feature quantities are acquired from a speed sensor or the like that the vehicle has.
  • the feature amount related to steering includes, for example, the steering angle, angular velocity, and acceleration of the steering, and these feature amounts are acquired from the steering wheel 5.
  • the feature quantities related to the operation timing include, for example, the operation timing of the brake, accelerator, blinker lever, and steering wheel. These feature quantities are obtained from the brake pedal 2, the accelerator pedal 3, the blinker lever 4, and the steering wheel 5, respectively. Is done.
  • the feature amount related to outside-vehicle sensing includes, for example, a distance between vehicles in front, side, and rear, and these feature amounts are acquired from the sensor 62.
  • the feature amount related to in-vehicle sensing is, for example, personal recognition information indicating who the driver is and who is the passenger, and these feature amounts are acquired from a camera or the like installed in the vehicle.
  • the vehicle control unit 7 detects that the driver has manually changed the lane.
  • This detection is performed by establishing rules in advance for the operation time-series pattern of a lane change and analyzing operation time-series data acquired from CAN (Controller Area Network) information or the like. At that time, the vehicle control unit 7 acquires the feature amounts described above.
  • the vehicle control unit 7 stores the feature amount in the storage unit 8 for each driver, and constructs a driving characteristic model.
  • The vehicle control unit 7 may construct the above-described driver model on the basis of the feature amounts for each driver. That is, the vehicle control unit 7 extracts feature amounts related to speed, steering, operation timing, outside-vehicle sensing, and in-vehicle sensing, stores them in the storage unit 8, and may construct the driver model based on the feature amounts stored in the storage unit 8.
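  • The accumulation of feature amounts described above can be sketched as follows. This is a hypothetical illustration; the class and field names mirror the feature categories listed in the description but are assumptions, and the CAN-based time-series detection of the lane change itself is out of scope here.

```python
# Hypothetical sketch: when a manual lane change is detected, store the
# driver's feature values to build the driving characteristic model.
from dataclasses import dataclass

@dataclass
class DrivingFeatures:
    speed: float           # feature related to speed, from the speed sensor
    steering_angle: float  # feature related to steering, from the steering wheel 5
    brake_timing: float    # operation-timing feature, from the brake pedal 2
    front_gap: float       # outside-vehicle sensing feature, from the sensor 62

# driver id -> accumulated feature samples (the driving characteristic model)
characteristic_model: dict = {}

def on_manual_lane_change(driver_id: str, features: DrivingFeatures) -> None:
    # Store the feature values per driver, as the storage unit 8 would.
    characteristic_model.setdefault(driver_id, []).append(features)
```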
  • FIG. 22 is a diagram showing an example of the driving characteristic model.
  • FIG. 22 shows the feature values in a tabular format for each driver.
  • FIG. 22 also shows the number of times each behavior has been selected in the past for each driver. Although only a part of the feature amount is described, any or all of the above may be described.
  • the numerical value of speed is a numerical value indicating the actual speed in stages.
  • the numerical values for the steering wheel, brake, and accelerator are numerical values that indicate the operation amount in stages. These numerical values are obtained, for example, by calculating an average value of speed, steering wheel, brake, and accelerator operation amounts within a predetermined period in the past and expressing the average value stepwise.
  • For example, in FIG. 22, the speed level is 8, and the operation amount levels of the steering wheel, brake, and accelerator are 4, 6, and 8, respectively.
  • During automatic driving, the vehicle control unit 7 selects, from the driving characteristic models shown in FIG. 22, the driving characteristic model corresponding to the driver, the behavior, and the passenger, according to who the driver is, what behavior is executed, and who the passenger is.
  • The vehicle control unit 7 causes the vehicle 1 to travel at the speed corresponding to the selected driving characteristic model, and controls the vehicle 1 with the combination of the steering wheel, brake, and accelerator operation amounts and their timing. Thereby, automatic driving that reflects the driver's preference can be realized.
  • FIGS. 23A to 23D are diagrams for explaining display of the notification unit 92 according to Embodiment 4 of the present invention.
  • 23A to 23D are displays for the first example of the traveling environment shown in FIG. 5A.
  • FIG. 23A is a display of the notification unit 92 in a state where the vehicle is performing normal travel that does not require lane change or vehicle acceleration / deceleration.
  • FIG. 23A shows a symbol 231 indicating that the driving characteristic of the driver is the “high deceleration” driving characteristic, and a symbol 232 indicating that the driver is currently in automatic driving.
  • the vehicle control unit 7 determines the driving characteristics of the driver based on the number of times each behavior included in the driving characteristics model shown in FIG. 22 has been selected in the past, for example.
  • For example, for a driver whose driving characteristics involve much “deceleration” (that is, for whom the behavior “deceleration” has been selected many times), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 231, as shown in FIGS. 23A to 23D.
  • When the vehicle control unit 7 determines that the driving environment is that of the first example illustrated in FIG. 5A, it determines that the first behavior is “deceleration” based on the driver's driving characteristic of “much deceleration”, and causes the notification unit 92 to execute the display of FIG. 23B.
  • In FIG. 23B, a symbol 233 indicating “deceleration”, the first behavior, is shown in a first mode (for example, a first color). Also shown are a symbol 234 indicating “acceleration” and a symbol 235 indicating “lane change”, both second behaviors.
  • When the driver selects “acceleration”, the vehicle control unit 7 causes the notification unit 92 to execute the display of FIG. 23C.
  • FIG. 23C shows the symbol 234′ indicating “acceleration”, the selected behavior, in the first mode. The symbol 233′ is the symbol 233 that was displayed as the first behavior in FIG. 23B, now displayed having traded places with the symbol 234.
  • the vehicle control unit 7 causes the notification unit 92 to display the display illustrated in FIG. 23D after the second predetermined time has elapsed since the notification unit 92 has executed the display illustrated in FIG. 23A.
  • In FIG. 23D, the symbol 234′ indicating “acceleration” selected by the driver is displayed in the second mode.
  • When it is determined that the behavior to be taken next is “acceleration”, the vehicle control unit 7 reads out the feature amounts corresponding to the “acceleration” behavior included in the driving characteristic model and controls the vehicle 1 so as to perform “acceleration” reflecting those feature amounts.
  • FIGS. 24A to 24D are diagrams illustrating the display of the notification unit 92 according to Embodiment 4 of the present invention.
  • 24A to 24D are displays for the second example of the traveling environment shown in FIG. 7A.
  • 24A to 24D components common to those in FIGS. 23A to 23D are denoted by the same reference numerals as in FIGS. 23A to 23D, and detailed description thereof is omitted.
  • FIGS. 24A to 24D correspond to FIGS. 23A to 23D with the symbol 235 indicating “lane change” removed.
  • As described above, in the second example (FIG. 7A), unlike the first example (FIG. 5A), another vehicle is traveling to the right of the vehicle 1, so the lane cannot be changed. Therefore, “lane change” is not displayed in FIGS. 24B and 24C. In the example of FIG. 24C, “acceleration” is selected as in FIG. 23C, so the vehicle control unit 7, as in FIGS. 23A to 23D, reads out the feature amounts corresponding to the “acceleration” behavior included in the driving characteristic model and controls the vehicle 1 so as to perform “acceleration” reflecting those feature amounts.
  • 25A to 25D are diagrams for explaining the display of the notification unit 92 according to Embodiment 4 of the present invention.
  • 25A to 25D are displays for the third example of the traveling environment shown in FIG. 8A.
  • FIG. 25A is the same as FIG. 23A.
  • When the vehicle control unit 7 determines that the driving environment is that of the third example illustrated in FIG. 8A, it determines that the first behavior is “deceleration” based on the driver's driving characteristic of “much deceleration”, and causes the notification unit 92 to execute the display of FIG. 25B.
  • In FIG. 25B, a symbol 251 indicating “deceleration”, the first behavior, is shown in a first mode (for example, a first color). Also shown are a symbol 252 indicating “passing” and a symbol 253 indicating “lane change”, both second behaviors.
  • When the driver selects “passing”, the vehicle control unit 7 causes the notification unit 92 to execute the display of FIG. 25C.
  • FIG. 25C shows the symbol 252′ indicating “passing”, the selected behavior, in the first mode. The symbol 251′ is the symbol 251 that was displayed as the first behavior in FIG. 25B, now displayed having traded places with the symbol 252.
  • the vehicle control unit 7 causes the notification unit 92 to display the display illustrated in FIG. 25D after the second predetermined time has elapsed since the notification unit 92 has executed the display illustrated in FIG. 25A.
  • In FIG. 25D, the symbol 252′ indicating “passing” selected by the driver is displayed in the second mode.
  • When it is determined that the behavior to be taken next is “passing”, the vehicle control unit 7 reads out the feature amounts corresponding to the “passing” behavior included in the driving characteristic model and controls the vehicle 1 so as to perform “passing” reflecting those feature amounts.
  • FIGS. 26A to 26D are diagrams illustrating the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • 26A to 26D are displays for the first example of the traveling environment shown in FIG. 5A.
  • FIG. 26A shows an example in which the driving characteristic of the driver is “much acceleration”, and FIG. 26B shows an example in which the driving characteristic of the driver is “many lane changes”.
  • FIG. 26A shows a symbol 261 indicating that the driving characteristic of the driver is the “high acceleration” driving characteristic. Further, a symbol 262 indicating “acceleration” which is the first behavior is shown in the first mode (for example, the first color). Further, a symbol 263 indicating “lane change” as the second behavior and a symbol 264 indicating “deceleration” as the second behavior are shown.
  • For a driver whose driving characteristics involve much “acceleration” (that is, for whom the behavior “acceleration” has been selected many times in the past), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 261, as shown in FIG. 26A. The vehicle control unit 7 also determines that the first behavior is “acceleration” based on the driver's driving characteristic of “much acceleration”, and causes the notification unit 92 to execute the display of FIG. 26A.
  • FIG. 26B shows a symbol 265 indicating that the driver's driving characteristic is “many lane changes”. A symbol 266 indicating “lane change”, the first behavior, is shown in the first mode (for example, the first color). Also shown are a symbol 267 indicating “acceleration” and a symbol 268 indicating “deceleration”, both second behaviors.
  • For a driver whose driving characteristics involve many “lane changes” (that is, for whom the behavior “lane change” has been selected many times in the past), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 265, as shown in FIG. 26B. The vehicle control unit 7 also determines that the first behavior is “lane change” based on the driver's driving characteristic of “many lane changes”, and causes the notification unit 92 to execute the display of FIG. 26B.
  • The symbols 231, 261, 265, and the like may instead indicate the type of driver model selected from the driver's operation history.
  • For example, for a driver model applied to drivers who often select “deceleration”, the notification unit 92 executes a display including the symbol 231 as shown in FIGS. 23A to 23D, and the first behavior is determined to be “deceleration”. For a driver model applied to drivers who often select “acceleration”, the notification unit 92 executes a display including the symbol 261 as shown in FIG. 26A, and the first behavior is determined to be “acceleration”. For a driver model applied to drivers who often select “lane change”, the notification unit 92 executes a display including the symbol 265 as shown in FIG. 26B, and the first behavior is determined to be “lane change”.
  • the driver's past driving history can be learned, and the result can be reflected in the determination of the future behavior.
  • the vehicle control unit controls the vehicle, the driving characteristics (driving preference) of the driver can be learned and reflected in the control of the vehicle.
  • This makes it possible to control automatic driving with the operation timing and operation amounts that the driver or occupant prefers, without departing from the feel of the driver's actual manual driving, and to suppress unnecessary operation intervention by the driver during automatic driving.
  • a server device such as a cloud server may execute a function similar to the function executed by the vehicle control unit 7.
  • The storage unit 8 may exist not in the vehicle 1 but in a server apparatus such as a cloud server.
  • the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior with reference to the driver model stored in the storage unit 8.
  • the vehicle control unit 7 acquires feature amount information indicating the driving characteristics of the driver, the storage unit 8 stores the feature amount information, and the vehicle control unit 7 stores the feature amount information. Based on the feature amount information stored in the unit 8, a driver model indicating the tendency of the behavior of the vehicle selected by the driver with the frequency of each selected behavior is constructed for each traveling environment of the vehicle.
  • the vehicle control unit 7 determines a group of drivers who select a similar behavior among a plurality of drivers, and constructs a driver model for each group and for each driving environment of the vehicle.
  • The vehicle control unit 7 calculates, for each group of drivers who perform similar operations, the average value of the frequency of the behaviors selected by each driver, and constructs, for each driving environment of the vehicle, a driver model in which the tendency of the behaviors selected by the drivers is indicated by these average values.
  • Based on the vehicle behaviors selected by other drivers whose tendencies are similar to the tendency of the behaviors selected by a specific driver, the vehicle control unit 7 constructs, for each driving environment of the vehicle, a driver model in which the tendency of the behaviors selected by the specific driver is indicated by the frequency of each selected behavior.
  • the vehicle control unit 7 can construct a driver model more suitable for the driving tendency of the driver, and can perform more appropriate automatic driving for the driver based on the constructed driver model.
  • (Modified example of driver model) The driver model described above models the tendency of operations (behaviors) by the driver for each driving environment based on information such as the frequency of each operation, but the present invention is not limited to this.
  • The driver model may be constructed based on a travel history in which environmental parameters indicating the travel environments (that is, situations) traveled in the past are associated with the operations (behaviors) actually selected by the driver in those travel environments.
  • By incorporating the environmental parameters into the driver model, the options can be determined without separately detecting and classifying the driving environment and inputting (storing) the classification result into the driver model.
  • Specifically, the differences in driving environment as shown in FIGS. 23A to 23D and FIGS. 24A to 24D are acquired as environmental parameters and directly input (stored) into the driver model, whereby “acceleration”, “deceleration”, and “lane change” become options in FIGS. 23A to 23D, while “acceleration” and “deceleration” become options in FIGS. 24A to 24D.
  • the driver model described below may be referred to as a situation database.
  • FIG. 27 is a diagram illustrating an example of a travel history.
  • FIG. 27 shows a travel history in which environmental parameters indicating the travel environments in which the vehicle driven by the driver x traveled in the past are associated with the operations (behaviors) actually selected by the driver in those travel environments.
  • The environmental parameters (a) to (c) of the travel history shown in FIG. 27 indicate the travel environments at the time the behavior of the vehicle was presented to the driver, as shown in, for example, FIGS. 8B, 5B, and 7B.
  • the environmental parameter of the travel history is obtained from sensing information and infrastructure information.
  • Sensing information is information detected by a sensor or radar of the vehicle.
  • the infrastructure information includes GPS information, map information, information acquired through road-to-vehicle communication, and the like.
  • The environmental parameters of the travel history shown in FIG. 27 include “own vehicle information”; “preceding vehicle information” indicating information on a vehicle traveling ahead in the lane in which the host vehicle travels; “side lane information” indicating information on the lanes beside the lane in which the host vehicle travels; “merge lane information” indicating information on a merge lane when there is one at the host vehicle's travel position; and “position information” indicating the position of the host vehicle and its surroundings. Information on a following vehicle may also be included; in that case, the relative speed of the following vehicle with respect to the host vehicle, the inter-head distance, the rate of change of the inter-head distance, and the like may be used. Information on the presence of a vehicle may also be included.
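  • One record of this travel history can be sketched as a single structure using the parameter symbols defined in the surrounding description (Va, Vba, DRba, RSb, Vca, Dca, Rca, DRda). The class names and the pairing of parameters with the selected behavior are illustrative assumptions.

```python
# Sketch of a travel-history record: environment parameters paired with
# the behavior the driver actually selected in that environment.
from dataclasses import dataclass

@dataclass
class EnvironmentParameters:
    Va: float    # speed of the host vehicle
    Vba: float   # relative speed of the preceding vehicle
    DRba: float  # inter-vehicle distance to the preceding vehicle
    RSb: float   # rate of change of the preceding vehicle's size
    Vca: float   # relative speed of the side rear vehicle
    Dca: float   # inter-head distance to the side rear vehicle
    Rca: float   # rate of change of that inter-head distance
    DRda: float  # remaining side-lane length

@dataclass
class TravelRecord:
    environment: EnvironmentParameters
    behavior: str  # operation (behavior) actually selected by the driver
```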
  • “information on own vehicle” includes information on the speed Va of the own vehicle.
  • the “preceding vehicle information” includes information on the relative speed Vba of the preceding vehicle with respect to the own vehicle, the inter-vehicle distance DRba between the preceding vehicle and the own vehicle, and the rate of change RSb of the size of the preceding vehicle.
  • the speed Va of the host vehicle is detected by a speed sensor of the host vehicle.
  • the relative speed Vba and the inter-vehicle distance DRba are detected by a sensor or a radar.
  • “Side lane information” includes information on a side rear vehicle traveling behind the host vehicle in the side lane, information on a side front vehicle traveling ahead of the host vehicle in the side lane, and information on the remaining side-lane length DRda for the host vehicle.
  • the information on the side rear vehicle includes information on the relative speed Vca of the side rear vehicle with respect to the own vehicle, the inter-head distance Dca between the side rear vehicle and the own vehicle, and the change rate Rca of the inter-head distance.
  • The inter-head distance Dca between the side rear vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the side rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the side rear vehicle). The inter-head distance may also be calculated from the inter-vehicle distance and the vehicle length, and the inter-vehicle distance may be substituted for the inter-head distance.
  • the relative speed Vca and the inter-head distance Dca are detected by a sensor or a radar.
  • the information on the side front vehicle includes information on the relative speed Vda of the side front vehicle with respect to the host vehicle, the distance Dda between the head of the side front vehicle and the host vehicle, and the change rate Rda of the head distance.
  • The inter-head distance Dda between the side front vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the side front vehicle, measured along the traveling direction of the host vehicle (and the side front vehicle).
  • The remaining side lane length DRda of the host vehicle is a parameter indicating the possibility of a lane change to the side lane. Specifically, when the distance between the front end portion (vehicle head) of the host vehicle and the rear end portion of the side front vehicle, measured in the direction along the traveling direction of the host vehicle (and the side front vehicle), is longer than the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, DRda is set to the distance between the front end portion (vehicle head) of the host vehicle and the rear end portion of the side front vehicle; when that distance is shorter than DRba, DRda is set to DRba. The remaining side lane length DRda of the host vehicle is detected by a sensor or a radar.
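The selection rule above can be sketched as follows; this is a minimal illustration, the function name and arguments are hypothetical, and only the comparison between the measured distance and DRba comes from the description.

```python
def remaining_side_lane_length(dist_head_to_side_front_rear: float, drba: float) -> float:
    """Remaining side lane length DRda (hypothetical helper).

    dist_head_to_side_front_rear: distance from the host vehicle's front end
    to the rear end of the side front vehicle, along the traveling direction.
    drba: inter-vehicle distance to the preceding vehicle.
    Per the rule above, DRda is the measured distance when it exceeds DRba,
    and DRba otherwise.
    """
    if dist_head_to_side_front_rear > drba:
        return dist_head_to_side_front_rear
    return drba
```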
  • the information on the merging lane includes information on the relative speed Vma of the merging vehicle with respect to the own vehicle, the distance Dma between the merging vehicle and the own vehicle, and the rate of change Rma of the inter-vehicle distance.
  • The inter-head distance Dma between the merging vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the merging vehicle, measured in the direction along the traveling direction of the host vehicle (and the merging vehicle).
  • the relative speed Vma and the inter-head distance Dma are detected by a sensor or a radar.
  • the numerical values of the speed, distance, and change rate described above are classified into a plurality of levels, and numerical values indicating the classified levels are stored. Note that the numerical values of the speed, the distance, and the change rate may be stored as they are without being classified into levels.
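The level classification described above can be sketched as a simple threshold lookup; the threshold values below are illustrative and not from the patent.

```python
def to_level(value: float, upper_bounds: list) -> int:
    """Classify a raw speed/distance/rate value into a level index.

    upper_bounds: ascending upper bounds of each level (illustrative).
    Returns the index of the first bound exceeding value, or the top level.
    """
    for level, upper in enumerate(upper_bounds):
        if value < upper:
            return level
    return len(upper_bounds)

# Example: classify a relative speed (m/s) into 4 levels
SPEED_BOUNDS = [5.0, 10.0, 20.0]
```

With these bounds, 3 m/s falls in level 0 and 25 m/s in level 3; storing the raw value instead, as the text also permits, simply skips this step.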
  • The position information includes information such as "position information of own vehicle", "number of lanes", "travel lane of own vehicle", "distance to start/end points of merging section", "distance to start/end points of branch section", "distance to start/end points of construction section", "distance to start/end points of lane reduction section", and "distance to traffic accident occurrence point".
  • FIG. 27 shows, as examples of the position information, "travel lane of own vehicle" (the travel lane in FIG. 27) and "distance to start/end points of merging section" (shown in FIG. 27 as "distance to merge point").
  • In the "distance to start/end points of merging section" column, when a start/end point of a merging section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. When no start/end point of a merging section exists within the predetermined distance, "0" is stored in this column.
  • In the "distance to start/end points of branch section" column, when a start/end point of a branch section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. When no start/end point of a branch section exists within the predetermined distance, "0" is stored in this column. Likewise, in the "distance to start/end points of construction section" column, when a start/end point of a construction section exists within a predetermined distance, the distance to it is classified into a plurality of predetermined levels and the numerical value of the classified level is stored; otherwise "0" is stored.
  • In the "distance to start/end points of lane reduction section" column, when a start/end point of a lane reduction section exists within a predetermined distance, the distance to it is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. When no start/end point of a lane reduction section exists within the predetermined distance, "0" is stored in this column.
  • In the "distance to traffic accident occurrence point" column, when a traffic accident occurrence point exists within a predetermined distance, the distance to it is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. When no traffic accident occurrence point exists within the predetermined distance, "0" is stored in this column.
  • the position information may include information on which lanes of the road on which the vehicle is traveling are merge lanes, branch lanes, construction lanes, reduced lanes, and accident lanes.
  • the travel history shown in FIG. 27 is merely an example, and the present invention is not limited to this.
  • the travel history may further include “information on the left side lane” on the opposite side.
  • "Left lane information" includes information on a left rear vehicle traveling behind the host vehicle in the left lane, information on a left front vehicle traveling ahead of the host vehicle in the left lane, and information on the remaining left lane length DRda of the host vehicle.
  • the information on the left rear vehicle includes information on the relative speed Vfa of the left rear vehicle with respect to the host vehicle, the head distance Dfa between the left rear vehicle and the host vehicle, and the change rate Rfa of the head head distance.
  • The inter-head distance Dfa between the left rear vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the left rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the left rear vehicle).
  • the information on the left front vehicle includes information on the relative speed Vga of the left front vehicle with respect to the own vehicle, the distance Dga between the left front vehicle and the own vehicle, and the rate of change Rga of the head distance.
  • The inter-head distance Dga between the left front vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the left front vehicle, measured along the traveling direction of the host vehicle (and the left front vehicle).
  • the travel history shown in FIG. 27 may include “rear vehicle information” indicating information on the rear vehicle traveling behind the host vehicle in the travel lane.
  • the information on the rear vehicle includes information on the relative speed Vea of the rear vehicle with respect to the host vehicle, the distance Dea between the rear vehicle and the host vehicle, and the rate of change Rea of the head distance.
  • The inter-head distance Dea between the rear vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the rear vehicle).
  • the relative speed Vea and the inter-head distance Dea are detected by a sensor or a radar.
  • Regardless of whether the inter-head distance can be measured, a measurable inter-vehicle distance, or an approximate value obtained by adding a predetermined vehicle length to the inter-vehicle distance, may be used as an alternative to the inter-head distance. The inter-head distance may also be calculated by adding the vehicle length of each recognized vehicle type to the inter-vehicle distance.
  • the traveling history may include various other information related to the traveling environment of the vehicle.
  • the travel history may include information on the size or type of the preceding vehicle, the side vehicle, the joining vehicle, or the relative position with respect to the host vehicle.
  • The type of a vehicle approaching from behind may be recognized by a camera sensor, and when that vehicle is an emergency vehicle, information indicating that it is an emergency vehicle may be included. This makes it possible to indicate that the information is being reported in order to respond to an emergency vehicle. Alternatively, numerical values representing, in steps, the steering wheel, brake, and accelerator operation amounts, or passenger information and the like, may be included, as demonstrated in FIG.
  • the behaviors selected during the automatic driving may be aggregated, or the behaviors actually performed by the driver during the manual driving may be aggregated. This makes it possible to collect travel histories according to operating conditions such as automatic driving or manual driving.
  • The environmental parameters included in the travel history may indicate the travel environment when the behavior of the vehicle was presented to the driver.
  • Alternatively, they may indicate the travel environment when the driver selected the behavior.
  • Both environmental parameters indicating the travel environment when the behavior of the vehicle was presented to the driver and environmental parameters indicating the travel environment when the driver selected the behavior may be included in the travel history.
  • Further, when the vehicle control unit 7 generates the overhead views shown in FIGS. 2A, 5A, 6A, 7A, 8A, 9A, and 10A, or the display shown in FIG. 14C, it generates, as the notification information, at least one of information on an environmental parameter having a high contribution to the selection of the first behavior and the second behavior and information related to that environmental parameter (for example, an icon).
  • The notification information may be provided by, for example, showing the generated notification information on the overhead view displayed by the notification unit 92.
  • For example, the vehicle control unit 7 may cause the notification unit 92 to display, as the notification information, a region between the preceding vehicle and the host vehicle in the overhead view whose luminance is raised or whose color is changed.
  • the vehicle control unit 7 may display an icon indicating that the contribution degree of the inter-vehicle distance DRba or the change rate RSb is high in the region between the preceding vehicle and the host vehicle as the notification information. Further, the vehicle control unit 7 causes the notification unit 92 to draw a line segment connecting the preceding vehicle and the host vehicle as notification information on the overhead view, or to notify line segments connecting all the surrounding vehicles and the host vehicle. The line segment connecting the preceding vehicle and the host vehicle may be emphasized on the overhead view.
  • Further, the vehicle control unit 7 may realize an AR (Augmented Reality) display by displaying, as the notification information, a region with raised luminance or a different color between the preceding vehicle and the host vehicle in the viewpoint image seen from the driver, rather than in the overhead view.
  • the vehicle control unit 7 may cause the notification unit 92 to display an AR indicating an environmental parameter having a high contribution degree in the region between the preceding vehicle and the host vehicle as notification information in the viewpoint image.
  • Further, the vehicle control unit 7 may display, as the notification information, a line segment connecting the preceding vehicle and the host vehicle in the viewpoint image as an AR display, or may display line segments connecting all the surrounding vehicles and the host vehicle in the viewpoint image as the notification information while emphasizing the line segment connecting the preceding vehicle and the host vehicle.
  • the vehicle control unit 7 may generate, as notification information, an image that highlights a preceding vehicle that is a target of an environmental parameter with a high contribution, and may display the image on the notification unit 92.
  • Further, the vehicle control unit 7 may generate, as the notification information, information indicating the direction of the preceding vehicle or other target of the environmental parameter with a high contribution in the overhead view or the AR display, and display that information at the position of the host vehicle or in its vicinity.
  • Instead of notifying information on an environmental parameter with a high contribution or information related to it, the vehicle control unit 7 may lower the display luminance of a preceding vehicle or other target of an environmental parameter with a low contribution to make it inconspicuous, thereby generating, as the notification information, a display in which the information on the environmental parameter with a high contribution is made relatively conspicuous, and display it on the notification unit 92.
  • The driver model includes a clustering type constructed by clustering the driving histories of a plurality of drivers, and an individually adaptive type in which a driver model for a specific driver (for example, driver x) is constructed from a plurality of driving histories similar to the driving history of driver x.
  • In the clustering type driver model construction method, driving histories of drivers as shown in FIG. 27 are aggregated in advance for each driver. Then, a driver model is constructed by grouping a plurality of drivers whose driving histories have a high degree of similarity, that is, a plurality of drivers having similar driving operation tendencies.
  • The similarity between driving histories can be determined, for example, from the correlation value between vectors whose elements are the numerical values of the environmental parameters and the values of the behaviors.
  • For example, when the correlation value calculated from the driving histories of driver a and driver b is higher than a predetermined value, the driving histories of driver a and driver b are placed in one group. The calculation of the similarity is not limited to this.
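The grouping step can be sketched as follows, using the correlation value between flattened history vectors; the greedy grouping strategy and the 0.8 threshold are assumptions, since the patent leaves the exact clustering method open.

```python
import numpy as np

def group_drivers(histories, threshold=0.8):
    """Greedily group drivers whose travel-history vectors (environmental
    parameter values and behavior values flattened into one vector) have a
    correlation above the threshold with the group's first member."""
    groups = []
    for name, vec in histories.items():
        for group in groups:
            if np.corrcoef(vec, histories[group[0]])[0, 1] > threshold:
                group.append(name)
                break
        else:
            groups.append([name])
    return groups
```

For example, two drivers with proportional histories land in one group, while a driver with an opposite tendency forms a group of their own.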
  • the individual adaptive type driver model construction method aggregates the driving histories of a plurality of drivers as shown in FIG.
  • the difference from the clustering type is that a driver model is constructed for each driver.
  • In the individually adaptive type, the driving history of driver y is compared with the driving histories of the other drivers, and the driving histories of a plurality of drivers with high similarity are extracted.
  • an individually adaptive driver model for the driver y is constructed from the extracted driving histories of the plurality of drivers.
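The extraction of similar drivers for the individually adaptive type could look like the sketch below; selecting a fixed number k of most-correlated drivers is an assumption, as the patent only requires "high similarity".

```python
import numpy as np

def similar_driver_histories(target, others, k=2):
    """Return the names of the k drivers whose history vectors correlate
    best with the target driver's vector (illustrative selection step)."""
    ranked = sorted(others.items(),
                    key=lambda item: np.corrcoef(target, item[1])[0, 1],
                    reverse=True)
    return [name for name, _ in ranked[:k]]
```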
  • The driver model (situation database) based on the travel history shown in FIG. 27 is not limited to the clustering type or the individually adaptive type, and may be configured, for example, to include the travel histories of all drivers.
  • Here, a driver model in which the driving histories of four drivers a to d are aggregated is used for driver x.
  • the driver model is constructed by the vehicle control unit 7.
  • FIGS. 28A and 28B are diagrams showing a method of using the driver model in this modification.
  • FIG. 28A shows environmental parameters indicating the current traveling environment of the vehicle driven by driver x.
  • FIG. 28B is an example of a driver model for the driver x.
  • the behavior (operation) for the environmental parameter indicating the current driving environment is blank.
  • the vehicle control unit 7 acquires environmental parameters at predetermined intervals, and determines the next behavior from the driver model shown in FIG. 28B using any one of the environmental parameters as a trigger.
  • As the trigger, for example, an environmental parameter indicating a situation that requires a change in the operation of the vehicle may be used, such as when the distance to the start point of a merging section becomes a predetermined distance or less, or when the relative speed with respect to the preceding vehicle becomes a predetermined value or less.
  • The vehicle control unit 7 compares the environmental parameters shown in FIG. 28A with the environmental parameters of the respective driving histories of the driver model shown in FIG. 28B, and judges the behavior associated with the most similar environmental parameters to be the first behavior. In addition, some behaviors associated with other similar environmental parameters are determined to be second behaviors.
  • Whether the environmental parameters are similar can be determined from the correlation value of the vectors whose elements are the numerical values of the environmental parameters. For example, when the correlation value calculated from the vector whose element is the numerical value of the environmental parameter shown in FIG. 28A and the vector whose element is the numerical value of the environmental parameter shown in FIG. 28B is higher than a predetermined value, these environmental parameters are It is determined that they are similar. Note that the method for determining whether the environmental parameters are similar is not limited to this.
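Putting the two steps together, the first- and second-behavior decision can be sketched as below; the 0.9 similarity threshold and the row format (behavior, parameter vector) are assumptions for illustration.

```python
import numpy as np

def choose_behaviors(current_params, model_rows, threshold=0.9):
    """Compare the current environmental-parameter vector with each travel
    history row of the driver model; the behavior of the most similar row is
    the first behavior, and behaviors of other rows whose correlation exceeds
    the threshold are second behaviors."""
    scored = [(np.corrcoef(current_params, params)[0, 1], behavior)
              for behavior, params in model_rows]
    scored.sort(key=lambda sb: sb[0], reverse=True)
    first = scored[0][1]
    second = [b for score, b in scored[1:] if score > threshold]
    return first, second
```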
  • the storage unit 8 stores information indicating a safe driving criterion, and the vehicle control unit 7 determines whether or not the driving history satisfies this criterion. Furthermore, the vehicle control unit 7 may register a travel history that satisfies this criterion in the database, and may not register a travel history that does not satisfy this criterion in the database.
  • In this way, the vehicle control unit 7 can accurately determine the next behavior without determining a specific driving environment, that is, without labeling the driving environment.
  • the driver model may be constructed from a travel history in which a behavior selected by the driver during automatic driving and an environment parameter indicating a travel environment when the behavior is presented are associated with each other.
  • the driver model may be constructed from a travel history in which a behavior selected by the driver during automatic driving and an environmental parameter indicating a travel environment when the behavior is performed by the vehicle are associated with each other.
  • Environmental parameters indicating a future driving environment may be predicted from the environmental parameters indicating the current driving environment; the behavior associated with the environmental parameters most similar to the predicted ones may be determined to be the first behavior, and some behaviors associated with other similar environmental parameters may be determined to be second behaviors.
  • the above prediction is performed, for example, by extrapolating environmental parameters at a future time from environmental parameters indicating the driving environment at the current time and a time before the current time.
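One simple way to realize this extrapolation is per-parameter linear prediction from the previous and current samples; this is a sketch, and the patent does not fix the prediction method.

```python
def extrapolate_params(previous, current):
    """Linearly extrapolate each environmental parameter one time step ahead
    from its values at the previous and current times."""
    return [c + (c - p) for p, c in zip(previous, current)]
```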
  • The driver model (situation database) may be constructed from both a travel history that associates a behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment when the behavior was presented, and a travel history that associates a behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment when the vehicle performed the behavior.
  • both travel histories are stored in a format as shown in FIG. 28B, and the vehicle control unit 7 determines the next behavior from them.
  • In that case, the vehicle control unit 7 may give priority between the two, for example, by preferentially determining the next behavior from the travel history that associates the behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment when the vehicle performed the behavior.
  • a server device such as a cloud server may execute a function similar to the function executed by the vehicle control unit 7.
  • In particular, since the storage unit 8 accumulates an enormous amount of data as travel histories are stored, it may be placed in a server device such as a cloud server instead of in the vehicle 1.
  • the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior with reference to the driver model stored in the storage unit 8.
  • In a configuration in which the storage unit 8 is provided in a cloud server, it is desirable to provide a cache in case the storage unit 8 cannot be accessed due to a drop in communication speed or a communication disconnection.
  • FIG. 29 is a block diagram showing an example of cache arrangement.
  • the vehicle control unit 7 stores the travel history in the storage unit 8 through the communication unit 291 and holds a part of the driver model (situation database) stored in the storage unit 8 in the cache 292 through the communication unit 291.
  • the vehicle control unit 7 accesses the driver model of the cache 292.
  • As methods of creating the cache at this time, a method of limiting by the presence or absence of environmental parameters, a method of using position information, and a method of processing the data are conceivable. Each is described below.
  • the vehicle control unit 7 extracts driving environments having only the same environmental parameters from the driving environments stored in the storage unit 8, sorts these, and holds them in the cache 292.
  • the vehicle control unit 7 updates the primary cache at the timing when the environmental parameter obtained from the detected situation is changed. By doing so, the vehicle control unit 7 can extract a similar surrounding situation even if the communication speed decreases.
  • the environmental parameters for determining whether or not there is a change may be all of the environmental parameters listed above, or some of the environmental parameters.
  • a primary cache and a secondary cache may be prepared in the cache 292.
  • In that case, the vehicle control unit 7 holds traveling environments having the same environmental parameters in the primary cache. Further, the vehicle control unit 7 holds in the secondary cache at least one of a traveling environment in which one environmental parameter is added to a traveling environment held in the primary cache and a traveling environment in which one environmental parameter is removed from a traveling environment held in the primary cache.
  • the vehicle control unit 7 can extract a similar situation using only the data in the cache 292 even if a temporary communication interruption occurs.
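A sketch of this primary/secondary split, assuming each stored situation is keyed by the set of environmental parameters that are present (for example, which surrounding vehicles exist); the storage layout and names are assumptions, not the patent's data format.

```python
def build_caches(stored_situations, present_params):
    """Split stored situations into a primary cache (exactly the current
    parameter set) and a secondary cache (the current set with one parameter
    added or one removed), as described above."""
    present = frozenset(present_params)
    primary, secondary = [], []
    for params, behavior in stored_situations:
        diff = frozenset(params) ^ present  # symmetric difference of sets
        if not diff:
            primary.append(behavior)
        elif len(diff) == 1:
            secondary.append(behavior)
    return primary, secondary
```

With only a side front vehicle present, situations that add one more vehicle, or drop the side front vehicle, land in the secondary cache.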
  • For example, the vehicle control unit 7 extracts the traveling environment in which only the side front vehicle 302 exists (the traveling environment in which only the same environmental parameters exist) from the storage unit 8, in which all traveling environments (situations) are stored, and stores it in the primary cache 304.
  • The vehicle control unit 7 also extracts from the storage unit 8 a traveling environment in which one vehicle other than the side front vehicle 302 is added (a traveling environment in which one environmental parameter is added to the same environmental parameters) and a traveling environment without the side front vehicle 302, and stores them in the secondary cache 305.
  • When the detected ambient condition 303 changes, the vehicle control unit 7 copies the traveling environment corresponding to the changed ambient condition 303 from the secondary cache 305 to the primary cache 304, extracts from the storage unit 8 traveling environments in which one environmental parameter is added to, and removed from, the changed ambient condition 303, and stores them in the secondary cache 305, thereby updating the secondary cache 305. As a result, the vehicle control unit 7 can smoothly extract a similar surrounding situation by comparing surrounding situations.
  • When position information is used, the vehicle control unit 7 can extract from the storage unit 8 the traveling environments (situations) whose position information indicates a position within a certain range centered on the vehicle position, and store them in the cache 292.
  • the vehicle control unit 7 updates the cache 292 when the position indicated by the position information corresponding to the traveling environment is out of the certain range. By doing so, the vehicle control unit 7 can extract a similar ambient situation as long as the position is within a certain range even if communication is interrupted for a long time.
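The position-based method can be sketched as a simple radius check around the vehicle; the flat-plane coordinates and the 500 m radius are illustrative assumptions.

```python
import math

def refresh_position_cache(vehicle_pos, stored_situations, radius_m=500.0):
    """Keep only stored situations whose position lies within a fixed range
    centered on the vehicle position (flat-plane approximation)."""
    vx, vy = vehicle_pos
    return [s for s in stored_situations
            if math.hypot(s["pos"][0] - vx, s["pos"][1] - vy) <= radius_m]
```

The cache is rebuilt in this way whenever the position tied to a cached traveling environment falls outside the range.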
  • the storage unit 8 stores operation histories including environmental parameters.
  • In this case, the vehicle control unit 7 divides each environmental parameter into fixed ranges to create a mesh in a multidimensional space. The vehicle control unit 7 then creates a table in which the behaviors contained in each mesh are counted by type.
  • the vehicle control unit 7 maps the environmental parameters included in the operation history in a planar shape as shown in FIG. 31A, and divides each axis in a certain range, thereby dividing the plane into a plurality of blocks. This is called a mesh.
  • the vehicle control unit 7 counts the number of behaviors included in each mesh for each type (for example, types such as acceleration, deceleration, lane change, and overtaking).
  • FIG. 31B shows a table in which the number of behaviors included in each mesh is counted for each type.
  • The vehicle control unit 7 holds this content in the cache 292. Then, when extracting a similar surrounding situation by comparing surrounding situations, the vehicle control unit 7 determines in which mesh the detected environmental parameters are located, selects the behavior with the largest count among the behaviors contained in the determined mesh, and determines to notify the selected behavior.
  • For example, when the vehicle control unit 7 determines that the detected environmental parameters are located in the third mesh, it determines to notify the operation of the behavior with the largest count among the behaviors contained in the third mesh (here, "acceleration").
  • With this method, the cache 292 may be updated at any time, and its capacity can be kept constant.
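The mesh method above can be sketched as follows; the two-parameter example and cell widths are illustrative, while the count-by-type table and the most-frequent-behavior lookup follow the description of FIGS. 31A and 31B.

```python
from collections import Counter

def mesh_index(params, cell_widths):
    """Map an environmental-parameter vector to its mesh cell by dividing
    each axis into fixed-width ranges."""
    return tuple(int(p // w) for p, w in zip(params, cell_widths))

def build_mesh_table(history, cell_widths):
    """Count the behaviors contained in each mesh cell by type."""
    table = {}
    for params, behavior in history:
        table.setdefault(mesh_index(params, cell_widths), Counter())[behavior] += 1
    return table

def most_frequent_behavior(table, detected_params, cell_widths):
    """Select the behavior with the largest count in the cell that contains
    the detected environmental parameters."""
    return table[mesh_index(detected_params, cell_widths)].most_common(1)[0][0]
```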
  • As described above, in this modification, the vehicle control unit 7 acquires feature amount information indicating the driving characteristics of the driver, including information on past driving environments, and the storage unit 8 stores that feature amount information. When the behavior of the vehicle is to be determined, the vehicle control unit 7 determines, from the feature amount information stored in the storage unit 8, information similar to the feature amounts indicating the driving characteristics of the driver including newly acquired driving environment information, and notifies the behavior corresponding to the determined information.
  • The feature amount information indicating the driving characteristics of the driver is at least one of feature amount information obtained when the behavior of the vehicle was presented to the driver and feature amount information obtained when the driver selected the behavior.
  • When the feature amount information indicating the driving characteristics of the driver including past driving environment information comprises both the feature amount information obtained when the behavior of the vehicle was presented to the driver and the feature amount information obtained when the driver selected the behavior, information similar to the feature amounts indicating the driving characteristics of the driver including newly acquired driving environment information is determined from both sets of feature amount information, and the behavior corresponding to the determined information is notified.
  • When the feature amount information indicating the driving characteristics of the driver including past driving environment information comprises both the feature amount information obtained when the behavior of the vehicle was presented to the driver and the feature amount information obtained when the driver selected the behavior, information similar to the feature amounts indicating the driving characteristics of the driver including newly acquired driving environment information may be determined preferentially from the feature amount information obtained when the driver selected the behavior, and the behavior corresponding to the determined information is notified.
  • The feature amount information indicating the driving characteristics of the driver including past driving environment information is feature amount information indicating the driving characteristics of the driver during automatic driving and/or manual driving of the vehicle.
  • the vehicle control unit 7 can construct a driver model more suitable for the driving tendency of the driver, and can perform more appropriate automatic driving for the driver based on the constructed driver model.
  • In addition, by associating parameters indicating the driving environment with behaviors, the next behavior can be determined accurately without requiring processing for determining a specific driving environment, that is, without labeling the driving environment.
  • Level 1 is a driving support system that automatically performs one of acceleration, deceleration, and steering, and Level 2 is a driving support system that automatically performs two or more of acceleration, deceleration, and steering in coordination. In either case, the driver remains involved in the driving operation.
  • the automation level 4 is a fully automatic traveling system that automatically performs all of acceleration, deceleration, and steering, and the driver is not involved in the driving operation.
  • the automation level 3 is a quasi-fully automatic traveling system in which acceleration, deceleration, and steering are all automatically performed, and a driver performs a driving operation as necessary.
  • The fifth to eleventh embodiments mainly relate to an HMI (Human Machine Interface) of a driving support device for automatic driving at automation level 3 or 4.
  • One object of the technologies described in the fifth to eleventh embodiments is to support the realization of safe and comfortable automatic driving by presenting useful information to the driver during automatic driving of the vehicle.
  • The technologies described in the fifth to eleventh embodiments also make it easier for the driver to change a behavior determined on the vehicle side during automatic driving, by presenting information that causes the driver less discomfort.
  • the “behavior” of the vehicle in the following description corresponds to the “behavior” of the vehicle in the description of the first to fourth embodiments.
  • The "behavior" of the vehicle includes the operating state of the vehicle and control content related to automatic driving control, for example, constant-speed traveling, acceleration, deceleration, temporary stop, stop, lane change, course change, right or left turn, and parking.
  • the behavior of the vehicle is classified into an action that is currently being executed (also referred to as “current action”), an action that is executed after the action that is currently being executed (also referred to as “scheduled action”), and the like.
  • The scheduled action includes an action to be executed immediately after the end of the currently executing action, and an action whose execution is scheduled regardless of timing.
  • the scheduled action may include an action that is executed after the currently executed action ends.
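As an illustrative sketch only (the type and field names below are hypothetical, not identifiers from the embodiment), the distinction above between the current action and the scheduled actions can be modeled as follows:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    """Behavior types listed above (operating state / control content)."""
    CONSTANT_SPEED = auto()
    ACCELERATION = auto()
    DECELERATION = auto()
    TEMPORARY_STOP = auto()
    STOP = auto()
    LANE_CHANGE_RIGHT = auto()
    LANE_CHANGE_LEFT = auto()
    COURSE_CHANGE = auto()
    TURN_RIGHT = auto()
    TURN_LEFT = auto()
    PARKING = auto()

@dataclass
class BehaviorState:
    """The vehicle's behavior is split into the action being executed now
    ("current action") and actions planned after it ("scheduled actions")."""
    current: Action
    scheduled: list  # Action items, in planned execution order

state = BehaviorState(current=Action.LANE_CHANGE_RIGHT,
                      scheduled=[Action.ACCELERATION])
```

The `scheduled` list holds single actions in their planned execution order, matching the classification described above.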
  • FIG. 32 is a block diagram showing a configuration of the vehicle 1000, and shows a configuration related to automatic driving.
  • the vehicle 1000 can travel in an automatic driving mode, and includes a notification device 1002, an input device 1004, a wireless device 1008, a driving operation unit 1010, a detection unit 1020, an automatic driving control device 1030, and a driving support device 1040.
  • Each device shown in FIG. 32 may be connected by wired communication such as a dedicated line or CAN (Controller Area Network), or by wired or wireless communication such as USB (Universal Serial Bus), Ethernet (registered trademark), Wi-Fi (registered trademark), or Bluetooth (registered trademark).
  • the notification device 1002 notifies the driver of information related to the traveling of the vehicle 1000.
  • The notification device 1002 may be a display unit that displays information, for example, a car navigation system installed in the vehicle, a head-up display, a center display, or a light emitter such as an LED installed on the steering wheel, a pillar, the dashboard, or around the meter panel. Alternatively, it may be a speaker that converts information into sound to notify the driver, or a vibrator provided at a position that the driver can sense (for example, the driver's seat or the steering wheel). The notification device 1002 may also be a combination of these.
  • Vehicle 1000 corresponds to vehicle 1 in the first to fourth embodiments.
  • the notification device 1002 corresponds to the information notification device 9 in FIGS. 1 and 13
  • The input device 1004 corresponds to the operation unit 51 in FIG. 1 and the input unit 102 in FIG. 13, and the detection unit 1020 corresponds to the detection unit 6 in FIGS. 1 and 13.
  • the automatic driving control device 1030 and the driving support device 1040 correspond to the vehicle control unit 7 in FIGS. 1 and 13.
  • the description of the configuration described in the first to fourth embodiments will be omitted as appropriate.
  • the notification device 1002 is a user interface device that presents information on automatic driving of the vehicle to the occupant.
  • the notification device 1002 may be a head unit such as a car navigation system or display audio, may be a mobile terminal device such as a smartphone or a tablet, or may be a dedicated console terminal device.
  • the notification device 1002 may be a liquid crystal display, an organic EL display, or a head-up display (HUD).
  • the input device 1004 is a user interface device that receives an operation input by an occupant. For example, the input device 1004 receives information related to automatic driving of the host vehicle input by the driver. The input device 1004 outputs the received information as an operation signal to the driving support device 1040.
  • FIG. 33 schematically shows the interior of the vehicle 1000 of FIG. 32.
  • the notification device 1002 may be a head-up display (HUD) 1002a or a center display 1002b.
  • the input device 1004 may be the first operation unit 1004a provided on the steering 1011 or the second operation unit 1004b provided between the driver seat and the passenger seat. Note that the notification device 1002 and the input device 1004 may be integrated, and may be implemented as a touch panel display, for example.
  • the vehicle 1000 may be provided with a speaker 1006 that presents information related to automatic driving to the occupant by voice.
  • the driving support device 1040 may display an image indicating information related to automatic driving on the notification device 1002 and output a sound indicating information related to automatic driving from the speaker 1006 together with or instead of the information.
  • the wireless device 1008 corresponds to a mobile phone communication system, WMAN (Wireless Metropolitan Area Network), and the like, and performs wireless communication with a device (not shown) outside the vehicle 1000.
  • the driving operation unit 1010 includes a steering 1011, a brake pedal 1012, an accelerator pedal 1013, and a blinker switch 1014.
  • The steering 1011 corresponds to the steering wheel in FIGS. 1 and 13, the brake pedal 1012 corresponds to the brake pedal 2 in FIGS. 1 and 13, the accelerator pedal 1013 corresponds to the accelerator pedal 3 in FIGS. 1 and 13, and the winker switch 1014 corresponds to the winker lever 4 in FIGS. 1 and 13.
  • The steering 1011, the brake pedal 1012, the accelerator pedal 1013, and the winker switch 1014 can be electronically controlled by at least one of a steering ECU, a brake ECU, an engine ECU, a motor ECU, and a winker controller.
  • The steering ECU, the brake ECU, the engine ECU, and the motor ECU drive their respective actuators in accordance with control signals supplied from the automatic driving control device 1030.
  • The winker controller turns the winker lamp on or off in accordance with a control signal supplied from the automatic driving control device 1030.
  • The detection unit 1020 detects the surrounding situation and the running state of the vehicle 1000. As partly described in the first to fourth embodiments, the detection unit 1020 detects, for example, the speed of the vehicle 1000, the relative speed of a preceding vehicle with respect to the vehicle 1000, the distance between the vehicle 1000 and the preceding vehicle, the relative speed of a vehicle in an adjacent lane with respect to the vehicle 1000, the distance between the vehicle 1000 and that vehicle, and the position information of the vehicle 1000.
  • the detection unit 1020 outputs various types of detected information (hereinafter referred to as “detection information”) to the automatic driving control device 1030 and the driving support device 1040. Details of the detection unit 1020 will be described later.
  • the automatic driving control device 1030 is an automatic driving controller that implements an automatic driving control function, and determines the behavior of the vehicle 1000 in automatic driving.
  • the automatic operation control apparatus 1030 includes a control unit 1031, a storage unit 1032, and an I / O unit (input / output unit) 1033.
  • the configuration of the control unit 1031 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources.
  • the storage unit 1032 has a nonvolatile recording medium such as a flash memory.
  • the I / O unit (input / output unit) 1033 executes communication control according to various communication formats.
  • the I / O unit (input / output unit) 1033 outputs information related to automatic driving to the driving support device 1040 and inputs a control command from the driving support device 1040.
  • the I / O unit (input / output unit) 1033 inputs detection information from the detection unit 1020.
  • The control unit 1031 applies a control command input from the driving support device 1040 and various information collected from the detection unit 1020 and the various ECUs to an automatic driving algorithm, and calculates control values for the objects of automatic control, such as the traveling direction of the vehicle 1000.
  • the control unit 1031 transmits the calculated control value to each control target ECU or controller. In this embodiment, it is transmitted to the steering ECU, the brake ECU, the engine ECU, and the winker controller. In the case of an electric vehicle or a hybrid car, the control value is transmitted to the motor ECU instead of or in addition to the engine ECU.
  • the driving support device 1040 is an HMI controller that executes an interface function between the vehicle 1000 and the driver, and includes a control unit 1041, a storage unit 1042, and an I / O unit (input / output unit) 1043.
  • the control unit 1041 executes various data processing such as HMI control.
  • the control unit 1041 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources.
  • the storage unit 1042 is a storage area that stores data that is referred to or updated by the control unit 1041. For example, it is realized by a non-volatile recording medium such as a flash memory.
  • the I / O unit (input / output unit) 1043 executes various communication controls corresponding to various communication formats.
  • the I / O unit (input / output unit) 1043 includes an operation input unit 1050, an image output unit 1051, a detection information input unit 1052, a command IF (interface) 1053, and a communication IF 1056.
  • the operation input unit 1050 receives an operation signal from the input device 1004 by the operation of the driver, the passenger, or the user outside the vehicle made to the input device 1004 and outputs the operation signal to the control unit 1041.
  • the image output unit 1051 outputs the image data generated by the control unit 1041 to the notification device 1002 for display.
  • The detection information input unit 1052 receives from the detection unit 1020 information indicating the current surrounding situation and running state of the vehicle 1000 (hereinafter referred to as "detection information"), which is the result of the detection process by the detection unit 1020, and outputs it to the control unit 1041.
  • the command IF 1053 executes an interface process with the automatic driving control apparatus 1030, and includes a behavior information input unit 1054 and a command output unit 1055.
  • the behavior information input unit 1054 receives information regarding the automatic driving of the vehicle 1000 transmitted from the automatic driving control device 1030 and outputs the information to the control unit 1041.
  • the command output unit 1055 receives from the control unit 1041 a control command that instructs the automatic driving control device 1030 to specify the mode of automatic driving, and transmits the control command to the automatic driving control device 1030.
  • the communication IF 1056 executes interface processing with the wireless device 1008.
  • the communication IF 1056 transmits the data output from the control unit 1041 to the wireless device 1008, and transmits the data from the wireless device 1008 to a device outside the vehicle.
  • the communication IF 1056 receives data from a device outside the vehicle transferred by the wireless device 1008 and outputs the data to the control unit 1041.
  • the automatic driving control device 1030 and the driving support device 1040 are configured as separate devices.
  • the automatic driving control device 1030 and the driving support device 1040 may be integrated into one controller.
  • one automatic driving control device may have the functions of both the automatic driving control device 1030 and the driving support device 1040 in FIG.
  • a plurality of ECUs may be provided in the integrated controller, and one ECU may realize the function of the automatic driving control device 1030, and the other ECU may realize the function of the driving support device 1040.
  • Alternatively, one ECU in the integrated controller may execute a plurality of OSs (Operating Systems), with one OS realizing the function of the automatic driving control device 1030 and another OS realizing the function of the driving support device 1040.
  • FIG. 34 is a block diagram showing a detailed configuration of the detection unit 1020 in FIG.
  • the detection unit 1020 includes a position information acquisition unit 1021, a sensor 1022, a speed information acquisition unit 1023, and a map information acquisition unit 1024.
  • the position information acquisition unit 1021 acquires the current position of the vehicle 1000 from the GPS receiver.
  • the sensor 1022 is a general term for various sensors for detecting the situation outside the vehicle and the state of the vehicle 1000.
  • a camera, a millimeter wave radar, a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a temperature sensor, a pressure sensor, a humidity sensor, an illuminance sensor, and the like are mounted as sensors for detecting the situation outside the vehicle.
  • The situation outside the vehicle includes the condition of the road on which the host vehicle travels, including lane information, the environment including weather, the situation around the host vehicle, and other nearby vehicles (such as other vehicles traveling in an adjacent lane); any information outside the vehicle that the sensors can detect may be used.
  • an acceleration sensor, a gyro sensor, a geomagnetic sensor, a tilt sensor, and the like are mounted as sensors for detecting the state of the vehicle 1000.
  • Speed information acquisition unit 1023 acquires the current speed of vehicle 1000 from the vehicle speed sensor.
  • the map information acquisition unit 1024 acquires map information around the current position of the vehicle 1000 from the map database.
  • the map database may be recorded on a recording medium in the vehicle 1000, or may be downloaded from a map server via a network when used.
  • the automatic driving control apparatus 1030 determines both the current action and the scheduled action based on the detection information output by the detection unit 1020.
  • The action information input unit 1054 of the driving support device 1040 acquires, from the automatic driving control device 1030, current action information indicating the current action that the automatic driving control device 1030 causes the vehicle 1000 to execute, and scheduled action information indicating the scheduled action to be executed by the vehicle 1000 after the current action.
  • data including both current behavior information and scheduled behavior information is acquired as behavior information.
  • the current action is an action that the vehicle is currently executing.
  • The scheduled action is an action to be executed after the currently executing action ends; for example, it may be the action to be executed immediately after the current action, or an action to be executed later still.
  • FIG. 35 shows action information input from the automatic driving control apparatus 1030.
  • the action information includes current action information that is identification information of the current action and scheduled action information that is identification information of the scheduled action.
  • the identification information of the current action and the scheduled action may be a code that can uniquely identify the type of action.
  • The action information also includes remaining time information indicating the time from the current time until the scheduled action is executed, in other words, the time from the current time until the current action is switched to the scheduled action.
  • The action information in FIG. 35 indicates that the current action is a lane change to the right, the scheduled action is acceleration, and the remaining time from the end of the current action to the start of the scheduled action is 10 seconds.
  • The "current time" is the current time recognized by the automatic driving control device 1030, and may be, for example, the system time inside the automatic driving control device 1030. It may also be the time acquired from a clock device (not shown) that measures the current time inside the vehicle 1000. Furthermore, it may be the time when the content of the current action was determined, the time when the content of the scheduled action to be executed in the future was determined, or the time when the action information was notified to the driving support device 1040.
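The three fields of the action information in FIG. 35 can be sketched as a simple record; the field names below are illustrative assumptions, not identifiers from the embodiment:

```python
from dataclasses import dataclass

@dataclass
class ActionInformation:
    current_action: str    # identification code of the current action
    scheduled_action: str  # identification code of the scheduled action
    remaining_sec: float   # time until the current action switches to the scheduled action

# The example of FIG. 35: a lane change to the right, then acceleration in 10 seconds.
info = ActionInformation(current_action="lane_change_right",
                         scheduled_action="acceleration",
                         remaining_sec=10.0)
```

The identification codes stand in for any code that can uniquely identify the type of action, as the text allows.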
  • FIG. 36 is a block diagram showing a detailed configuration of the control unit 1041 of the driving support device 1040.
  • the control unit 1041 includes an image generation unit 1060.
  • the image generation unit 1060 generates a current action image representing the current action based on the current action information input from the automatic driving control apparatus 1030, and generates a scheduled action image representing the scheduled action based on the scheduled action information.
  • a current action image having a size larger than the size of the scheduled action image is generated.
  • The image output unit 1051 of the driving support device 1040 outputs the current action image and the scheduled action image to the notification device 1002 so that both are displayed within a fixed field of view of the driver of the vehicle 1000. Displaying within the driver's fixed field of view can also be said to be displaying within the same field of view of the driver. For example, the current action image and the scheduled action image may be displayed simultaneously at nearby positions, or within a predetermined distance of each other, so that the driver can view both at once. These images may also be displayed with a temporal overlap at predetermined positions on the same screen, and both images may be displayed within a nearby range that requires no movement of the line of sight. Furthermore, the image generation unit 1060 may generate image data of a screen in which the current action image and the scheduled action image are arranged within such a nearby range.
  • The image generation unit 1060 further generates a remaining time image representing the time until the scheduled action is executed, updated according to the remaining time information input from the automatic driving control device 1030.
  • the image output unit 1051 further outputs the remaining time image to the notification device 1002, and displays the current action image and the scheduled action image to which the remaining time image is added within a fixed visual field of the driver of the vehicle 1000.
  • the screen including various information related to automatic driving displayed on the notification device 1002 is also referred to as “automatic driving information screen”.
  • 37A and 37B show examples of the automatic driving information screen.
  • FIG. 37A shows an example of the automatic driving information screen 1103 at the first time point
  • FIG. 37B shows an example of the automatic driving information screen 1103 at the second time point after the first time point.
  • the display mode of the current action image 1104 on the automatic driving information screen 1103 is set to be different from the display mode of the scheduled action image 1106. This prevents the driver from confusing the current behavior of the vehicle with the scheduled behavior following the current behavior.
  • the current action image 1104 is displayed with a size larger than the scheduled action image 1106.
  • the current action image 1104 is displayed at the center position within a fixed visual field of the driver of the vehicle 1000, for example, at a position near the center of the automatic driving information screen 1103.
  • the scheduled action image 1106 is displayed at a peripheral position within a fixed visual field of the driver, for example, near the end of the automatic driving information screen 1103.
  • the current action image 1104 is displayed in the main area 1100 on the automatic driving information screen 1103, while the scheduled action image 1106 is displayed in the sub area 1102 smaller than the main area 1100.
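The sizing described above (current action image larger and central, scheduled action image smaller and peripheral) can be sketched as a screen split; the proportions and names below are arbitrary illustrative values, not taken from the embodiment:

```python
def layout(screen_w: int, screen_h: int):
    """Split the automatic driving information screen into a large main
    area for the current action image and a smaller sub area, near the
    screen edge, for the scheduled action image."""
    main_area = {"x": 0, "y": 0, "w": int(screen_w * 0.75), "h": screen_h}
    sub_area = {"x": main_area["w"], "y": 0,
                "w": screen_w - main_area["w"], "h": screen_h // 3}
    return main_area, sub_area

main_area, sub_area = layout(800, 480)
```

Any split satisfies the text as long as the main area is larger than the sub area and the sub area sits at the periphery of the driver's field of view.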
  • The remaining time image 1108 is displayed in association with the scheduled action image 1106; specifically, the scheduled action image 1106 and the remaining time image 1108 are displayed side by side at nearby positions within the same sub area 1102.
  • the remaining time image 1108 includes a plurality of time indicators 1109. Each time indicator 1109 is displayed in a lit state or an unlit state, and the longer the remaining time until the scheduled action is executed, the more time indicators 1109 are displayed in the lit state.
  • the time indicator 1109 in the lit state gradually changes to the unlit state as time elapses, thereby notifying the driver of the remaining time until the scheduled action is executed. For example, 5 seconds may be allocated to one time indicator 1109, and the remaining time of the maximum 25 seconds may be indicated by the five time indicators 1109. An appropriate value for the time until one time indicator 1109 is turned off may be determined by a developer's knowledge or experiment, and a user such as a driver may be able to set an arbitrary time.
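Assuming, as in the example above, that 5 seconds are allocated to one time indicator 1109 and that five indicators cover a maximum of 25 seconds, the number of indicators to show in the lit state could be computed as follows (the per-indicator time is only the example value from the text and may be set differently by the developer or the user):

```python
import math

SECONDS_PER_INDICATOR = 5  # example allocation from the text
NUM_INDICATORS = 5         # five indicators = 25 seconds maximum

def lit_indicators(remaining_sec: float) -> int:
    """Return how many of the time indicators are shown in the lit state.
    Indicators change to the unlit state one by one as time elapses."""
    if remaining_sec <= 0:
        return 0
    return min(NUM_INDICATORS, math.ceil(remaining_sec / SECONDS_PER_INDICATOR))
```

With the FIG. 35 example of 10 seconds remaining, `lit_indicators(10)` yields 2 lit indicators, and the count falls to 0 as the scheduled action's start approaches.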
  • the automatic driving control apparatus 1030 may determine an action plan that is information obtained by continuously combining a plurality of actions (single actions) as the scheduled action.
  • the action plan includes “overtaking”.
  • The action plan "overtaking" is composed of a combination of three single actions, specifically (1) a lane change to the right, (2) acceleration, and (3) a lane change to the left.
  • the action information input unit 1054 of the driving support apparatus 1040 acquires scheduled action information indicating the action plan. Specifically, a plurality of single actions and information indicating the execution order of each single action are set in the scheduled action column (scheduled action information) of the action information shown in FIG.
  • The image generation unit 1060 generates a scheduled action image 1106 in which a plurality of images representing the plurality of single actions included in the action plan are arranged, one image per action, according to the order in which the single actions are executed.
  • FIG. 38 shows an example of the automatic driving information screen.
  • a scheduled action image 1106 representing an action plan indicating overtaking is displayed.
  • the scheduled action image 1106 in the figure includes three images showing three single actions, and the single action image having the earlier execution order is arranged at a lower position.
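For instance, the action plan "overtaking" can be held as a sequence of single actions in execution order, with the on-screen order derived from it so that the earliest action appears lowest; the helper below is an illustrative sketch, not the embodiment's code:

```python
def display_order_bottom_up(plan):
    """Given single actions in execution order, return them in screen order
    from top to bottom, so that the earliest action appears lowest."""
    return list(reversed(plan))

# The "overtaking" action plan: three single actions in execution order.
overtaking = ["lane_change_right", "acceleration", "lane_change_left"]
print(display_order_bottom_up(overtaking))
# → ['lane_change_left', 'acceleration', 'lane_change_right']
```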
  • As another modification, the automatic driving control device 1030 may determine, as the scheduled action, a plurality of action candidates that can be executed next after the current action in automatic driving.
  • The scheduled action in this case may include, for example, two candidates, a first candidate "acceleration" and a second candidate "deceleration", and may further include a third candidate, "lane change to the left".
  • the automatic driving control apparatus 1030 determines the remaining time when each of the plurality of action candidates is executed, and further determines the priority order among the plurality of action candidates.
  • the behavior information input unit 1054 of the driving support device 1040 acquires scheduled behavior information including a plurality of behavior candidates (referred to herein as “candidate information”). Specifically, candidate information indicating a plurality of action candidates and the remaining time and priority for each action candidate is set in the scheduled action column of action information shown in FIG.
  • the image generation unit 1060 generates a current action image 1104 representing the current action indicated by the action information and a plurality of candidate images corresponding to the plurality of action candidates indicated by the action information.
  • a plurality of scheduled action images 1106 representing a plurality of action candidates and a plurality of remaining time images 1108 representing the remaining time of each action candidate are generated as a plurality of candidate images.
  • The image output unit 1051 outputs the current action image 1104 to the notification device 1002, and outputs the plurality of candidate images so that they are displayed in an arrangement reflecting a predetermined ranking. As a result, the plurality of candidate images, arranged by ranking, are displayed together with the current action image 1104 within a fixed field of view of the driver.
  • Ranking among the action candidates is performed according to the priority of each action candidate indicated in the action information; specifically, an action candidate with a higher priority is ranked higher. As a modification, an action candidate with a shorter remaining time until execution may be ranked higher. Further, as described in a later embodiment, an action candidate that better matches the driver's preference or operation pattern under the current surrounding situation and running state of the vehicle 1000 may be ranked higher.
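The ranking rule and its remaining-time variant can be sketched as follows; the `Candidate` fields are hypothetical names standing in for the candidate information carried in the action information:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    action: str
    priority: int        # larger value = higher priority in the action information
    remaining_sec: float # remaining time until this candidate would be executed

def rank_by_priority(candidates):
    """Default rule: higher-priority candidates are ranked higher."""
    return sorted(candidates, key=lambda c: -c.priority)

def rank_by_remaining_time(candidates):
    """Modification: candidates executed sooner are ranked higher."""
    return sorted(candidates, key=lambda c: c.remaining_sec)

cands = [Candidate("deceleration", 1, 20.0),
         Candidate("acceleration", 2, 10.0)]
print([c.action for c in rank_by_priority(cands)])
# → ['acceleration', 'deceleration']
```

In the FIG. 39 example this is why "acceleration", the higher-priority candidate, occupies the higher-ranked (right-side) position.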
  • The display arrangement by ranking may take the form of a non-parametric display in which the plurality of candidate images corresponding to the plurality of action candidates are arranged at predetermined positions (top, bottom, left, or right) in the automatic driving information screen 1103 corresponding to the ranking of each action candidate.
  • This ranking (the same applies in the other embodiments) can be said to be a priority order, or degree of priority, for display on the screen and for proposal to the driver, and can also be said to be a recommendation order or degree of recommendation.
  • For example, the candidate images of higher-ranking action candidates may be arranged closer to the right side (or the left side; the convention may be determined in advance) in the sub area 1102 of the automatic driving information screen 1103.
  • the display arrangement by ranking may be a parametric display in which the ranking is visualized by a histogram or the like.
  • Alternatively, each candidate image may be displayed with an object, generated by the image generation unit 1060, that indicates its rank. This object may be, for example, a histogram image having a shape corresponding to the rank, or a numeric image indicating the rank itself. Examples of parametric display are also shown in FIGS. 63A and 63B, described later.
  • In addition, the candidate images of higher-ranking action candidates may be displayed with a more conspicuous appearance (design); for example, their display size may be increased, or they may be displayed in colors with higher visibility.
  • FIG. 39 shows an example of the automatic driving information screen.
  • a first scheduled action image 1106a representing the first action candidate “acceleration” and a second scheduled action image 1106b representing the second action candidate “deceleration” are displayed.
  • In addition, a first remaining time image 1108a indicating the remaining time until the first action candidate is executed and a second remaining time image 1108b indicating the remaining time until the second action candidate is executed are displayed.
  • the priority of “acceleration” is higher, and the first scheduled action image 1106a representing “acceleration” is displayed on the right side indicating that the ranking is higher.
  • That is, in FIG. 39, the ranking among the candidates is presented in a non-parametric display form by the arrangement of the scheduled action images and the remaining time images.
  • the ranking may be presented in a parametric format in which a histogram having a shape corresponding to the ranking is added to each of the first scheduled behavior image 1106a and the second scheduled behavior image 1106b.
  • This makes it possible for the driver to know in advance a plurality of action candidates that may be executed in the future by automatic driving of the vehicle, giving the driver further reassurance. Further, as described in a later embodiment, when the driver can select the scheduled action, the driver can select the future action of the vehicle in automatic driving from the plurality of candidates. Moreover, showing the ranking among the candidates to the driver supports the driver's selection.
  • FIG. 40 is a sequence diagram illustrating an example of processing related to HMI control of the vehicle 1000.
  • Detection unit 1020 detects the surrounding situation and running state of vehicle 1000, and outputs detection information indicating the detection result to automatic driving control device 1030 (P1).
  • The automatic driving control device 1030 determines the current action and the scheduled action of the vehicle, and the remaining time until the scheduled action is executed, according to the detection information acquired from the detection unit 1020. It then causes the vehicle 1000 to execute the current action by outputting an action instruction instructing execution of the current action to the driving operation unit 1010 (P2). Furthermore, the automatic driving control device 1030 transmits the current action information, the scheduled action information, and the remaining time information to the driving support device 1040 (P3).
  • The driving support device 1040 generates a current action image, a scheduled action image, and a remaining time image based on the current action information, the scheduled action information, and the remaining time information acquired from the automatic driving control device 1030, and displays them on the notification device 1002 (P4). Specifically, the image generation unit 1060 of the driving support device 1040 also determines the display position of each of the current action image, the scheduled action image, and the remaining time image on the screen. The image output unit 1051 of the driving support device 1040 outputs, to the notification device 1002, display position information indicating the display position of each image together with the current action image, the scheduled action image, and the remaining time image, and the notification device 1002 displays the automatic driving information screen in which the images are arranged at the indicated positions, as shown in FIGS. 37A to 37B and elsewhere.
  • Alternatively, the image generation unit 1060 of the driving support device 1040 may generate image data of the entire automatic driving information screen, in which the current action image is arranged at the center position and the scheduled action image and the remaining time image are arranged at peripheral positions.
  • In that case, the image output unit 1051 of the driving support device 1040 may output the generated image data of the automatic driving information screen to the notification device 1002 for display.
  • the driving support device 1040 measures the time elapsed from P4 (P5), and outputs an updated remaining time image reflecting the passage of time to the notification device 1002, thereby updating the appearance of the remaining time image on the automatic driving information screen (P6).
  • the remaining time image after the update is, for example, an image obtained by changing a time indicator that has been in a lit state until then to an unlit state.
  • the driving support device 1040 repeats the remaining time image update processing until new current behavior information, scheduled behavior information, and remaining time information are acquired from the automatic driving control device 1030 (P7 to P8).
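The update processing of P5 to P8 can be sketched as follows, assuming the notified remaining time is divided evenly across a fixed number of time indicators 1109 that are extinguished one by one as time passes; the function name, the number of indicators, and the even division are illustrative assumptions, not terms fixed by the specification.

```python
def indicator_states(remaining_time_s: float, elapsed_s: float,
                     num_indicators: int = 5) -> list[bool]:
    """Return the lit (True) / unlit (False) state of each time indicator
    after elapsed_s seconds have passed since the remaining time was notified."""
    interval = remaining_time_s / num_indicators   # time represented by one indicator
    expired = min(num_indicators, int(elapsed_s // interval))
    # Indicators are changed from the lit state to the unlit state one by one.
    return [i >= expired for i in range(num_indicators)]

# All indicators lit at notification time; one goes dark per interval.
print(indicator_states(10.0, 0.0))   # [True, True, True, True, True]
print(indicator_states(10.0, 2.5))   # [False, True, True, True, True]
print(indicator_states(10.0, 10.0))  # [False, False, False, False, False]
```

When new behavior information arrives (P9 onward), the measurement would simply restart with the new remaining time.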
  • the detection unit 1020 periodically detects the surrounding state and the running state of the vehicle 1000, and outputs detection information indicating the detection result to the automatic driving control device 1030 (P9).
  • the automatic driving control device 1030 newly determines the current action, the scheduled action, and the remaining time until the scheduled action according to the detection information. Then, by outputting an action instruction for instructing execution of the newly determined current action to the driving operation unit 1010, the vehicle 1000 is caused to execute the newly determined current action (P10).
  • the automatic driving control apparatus 1030 transmits new current action information, new scheduled action information, and new remaining time information, indicating the newly determined current action, scheduled action, and remaining time, to the driving support apparatus 1040 (P11).
  • based on the new current behavior information, new scheduled behavior information, and new remaining time information acquired from the automatic driving control device 1030, the driving support device 1040 generates a new current behavior image, a new scheduled behavior image, and a new remaining time image, and displays them on the notification device 1002 (P12).
  • the automatic driving control device 1030 and the driving support device 1040 may be integrated into one automatic driving control device.
  • in other words, one automatic driving control device may execute the functions of both the automatic driving control device 1030 and the driving support device 1040. The same applies to the following embodiments.
  • FIG. 41 is a flowchart showing an example of processing of the driving support apparatus 1040.
  • when the behavior information input unit 1054 acquires the behavior information output from the automatic driving control device 1030 (Y in S100), the image generation unit 1060 determines whether the current behavior indicated by the behavior information matches the current behavior stored in advance in the storage unit 1042. It further determines whether the scheduled action indicated by the action information matches the scheduled action stored in advance in the storage unit 1042.
  • if the current action has been updated (Y in S102), the image generation unit 1060 generates a current action image representing the current action indicated by the action information (S103).
  • the image output unit 1051 outputs and displays the current action image to the notification device 1002 (S104), and the image generation unit 1060 stores the current action indicated by the action information in the storage unit 1042 (S105). If the current action indicated by the action information matches the current action stored in advance in the storage unit 1042, that is, if there is no update of the current action (N in S102), S103 to S105 are skipped.
  • likewise, if the scheduled action has been updated (Y in S106), the image generation unit 1060 generates a scheduled action image representing the scheduled action indicated by the action information.
  • the image generation unit 1060 generates a remaining time image representing the remaining time indicated by the behavior information (S107).
  • the image output unit 1051 outputs and displays the scheduled action image and the remaining time image to the notification device 1002 (S108), and the image generation unit 1060 stores the scheduled action indicated by the action information in the storage unit 1042 (S109).
  • the image generation unit 1060 starts measuring the elapsed time from the start of displaying the scheduled action image and the remaining time image (S110).
  • if a predetermined end condition is satisfied (Y in S111), the flow of this figure is terminated. If the end condition is not satisfied (N in S111), the process returns to S100.
  • the end condition is common to the following embodiments and is satisfied, for example, when the driver terminates the automatic driving mode or when the ignition key or power source of the vehicle is switched off.
  • if the action information has not been acquired (N in S100), the image generation unit 1060 determines whether a predetermined time has elapsed since the start of the elapsed time measurement. Similarly, when the scheduled action indicated by the action information matches the scheduled action stored in advance in the storage unit 1042, that is, when the scheduled action has not been updated (N in S106), the image generation unit 1060 determines whether a predetermined time has elapsed since the start of the elapsed time measurement. When it detects that the predetermined time has elapsed since the start of the elapsed time measurement (Y in S112), the image generation unit 1060 updates the remaining time image (S113).
  • for example, a remaining time image in which one time indicator 1109 is changed from the lit state to the unlit state is generated.
  • the image output unit 1051 outputs and displays the updated remaining time image to the notification device 1002 (S114).
  • alternatively, the entire image of the automatic driving information screen may be updated. If the predetermined time has not elapsed since the start of the elapsed time measurement (N in S112), S113 and S114 are skipped.
  • while a certain current action is being executed, the automatic driving control apparatus 1030 may change the scheduled action from a first scheduled action (for example, a lane change to the right) to a second scheduled action (for example, acceleration).
  • the behavior information input unit 1054 of the driving support device 1040 acquires from the automatic driving control device 1030 behavior information in which the scheduled behavior information and the remaining time information are updated while the current behavior information is not updated. Specifically, the action information updated so that the scheduled action information indicates the second scheduled action and the remaining time information indicates the time until the second scheduled action is executed is acquired.
  • the image generation unit 1060 generates a new scheduled action image representing the second scheduled action and a new remaining time image representing the time until the second scheduled action is executed.
  • the image output unit 1051 outputs the new scheduled action image and the new remaining time image to the notification device 1002. Thereby, within the fixed visual field of the driver, the new scheduled action image representing the second scheduled action and the new remaining time image are displayed together with the unupdated current action image, in place of the previous scheduled action image representing the first scheduled action and the previous remaining time image.
  • each process is executed sequentially, but each process may be executed in parallel as appropriate.
  • the processing of S102 to S105 and the processing of S106 to S110 may be executed in parallel.
  • the determinations in S102 and S106 may be skipped. That is, new generation and output of the current action image, the scheduled action image, and the remaining time image may be executed regardless of whether the current action and the scheduled action are updated.
  • as described above, the driving support apparatus 1040 notifies the vehicle occupants (the driver and the like) of the current action in automatic driving, and also notifies them in advance of the scheduled action, which is the action at a future point in time that follows the current action. Furthermore, the remaining time until the current action is switched to the scheduled action is also notified.
  • during automatic driving of the vehicle, the automatic driving control device 1030 determines the second action (for example, acceleration) scheduled to be executed next to the first action (for example, a lane change to the right) that is the current action, and the remaining time until execution of the second action.
  • the automatic driving control apparatus 1030 may further determine the third action (for example, deceleration) that is scheduled to be executed next to the second action and the remaining time until the third action is executed.
  • the action information input unit 1054 of the driving support device 1040 may acquire, from the automatic driving control device 1030, current action information indicating the first action, scheduled action information indicating the second action and the third action, and remaining time information indicating the remaining time until the second action and the remaining time until the third action.
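For illustration, the action information exchanged here could be encoded as a record carrying the current action plus parallel lists of scheduled actions and their remaining times; the field and action names below are assumptions, not terms from the specification.

```python
from dataclasses import dataclass

@dataclass
class BehaviorInformation:
    current_action: str           # first action, e.g. a lane change to the right
    scheduled_actions: list[str]  # second, third, ... actions in execution order
    remaining_times_s: list[float]  # remaining time until each scheduled action

# Example payload: first action in progress, second and third actions scheduled.
info = BehaviorInformation(
    current_action="lane change right",
    scheduled_actions=["accelerate", "decelerate"],
    remaining_times_s=[10.0, 20.0],
)
# Each scheduled action carries its own remaining time.
assert len(info.scheduled_actions) == len(info.remaining_times_s)
```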
  • the image generation unit 1060 may generate a current action image representing the first action, a scheduled action image representing the second action together with a remaining time image representing the remaining time until the second action, and a scheduled action image representing the third action together with a remaining time image representing the remaining time until the third action.
  • the image output unit 1051 may output these image data to the notification device 1002 and display these images side by side within a fixed visual field of the driver in the manner shown in FIG.
  • the scheduled action image representing the second action and the remaining time image representing the remaining time until the second action may be arranged as the first scheduled action image 1106a and the first remaining time image 1108a in FIG.
  • the scheduled action image representing the third action and the remaining time image representing the remaining time until the third action may be arranged as the second scheduled action image 1106b and the second remaining time image 1108b in FIG.
  • alternatively, the scheduled action image representing the third action and the remaining time image representing the remaining time until the third action may be hidden. For example, processing related to generation, output, or display of these images may be skipped.
  • when the third action is not displayed, the screen may take the form of FIG. 37A.
  • the image output unit 1051 may hide the image representing the first behavior and display, within the same field of view of the driver, an image representing the second behavior and an image representing the third behavior. This is done, for example, when a notification (new behavior information or the like) indicating that the current behavior of the vehicle has switched from the first behavior to the second behavior is received from the automatic driving control device 1030.
  • the image generation unit 1060 generates a current action image indicating the second action as the current action and a scheduled action image indicating the third action as the scheduled action, and the image output unit 1051 outputs these images to the notification device 1002.
  • the contents of the automatic driving information screen 1103 may be switched.
  • the automatic driving information screen 1103 of this modification transitions, for example, from FIG. 37A to FIG. 37B when the third behavior is not displayed during execution of the first behavior, and from FIG. 39 to FIG. 37B when the third behavior is displayed during execution of the first behavior. In either case, the position of the image representing the second action changes from a peripheral position to the center position within the driver's fixed visual field.
  • in other words, the display switches from a relatively small scheduled action image representing the second action to a relatively large current action image representing the second action.
  • the scheduled action image that is an image representing the third action is displayed at a peripheral position within a fixed visual field of the driver with a smaller size than the current action image representing the second action.
  • in the sixth embodiment, a candidate for an action to be immediately executed by the vehicle in place of the current action of the vehicle in automatic driving (hereinafter referred to as a "current action candidate") is presented to the driver.
  • the current action candidate can be said to be a candidate for an alternative action that replaces the current action, and can also be said to be a candidate for an action that the vehicle 1000 can execute in place of the current action.
  • the automatic driving control device 1030 determines the current action, while the driving support device 1040 determines the current action candidate.
  • the driving support device 1040 causes the notification device 1002 in the vehicle 1000 to display both the current action and the current action candidate.
  • although the driver instructs an action to be executed "immediately," in actuality processing within each device and communication between devices take time, so some delay from the current time is of course allowed.
  • FIG. 42 is a block diagram illustrating a detailed configuration of the storage unit 1042 of the driving support apparatus 1040.
  • the storage unit 1042 includes a statistical information storage unit 1070 and a determination criterion holding unit 1071.
  • Statistic information accumulating unit 1070 accumulates statistical information indicating the relevance between the surrounding situation and running state of the vehicle and the behavior of the vehicle.
  • FIG. 43 schematically shows statistical information stored in the statistical information storage unit 1070.
  • the statistical information of the sixth embodiment corresponds to the travel history of FIG. 27 and the driver model of FIGS. 28A and 28B.
  • the statistical information is information including a plurality of records in which values of a plurality of types of environmental parameters indicating the surrounding situation and running state of the vehicle are associated with an action immediately executed by the vehicle (or the result of an action immediately executed by the vehicle).
  • in other words, it is information that associates current behaviors executed in various environmental states with parameter values indicating those environmental states. It may be information modeled and patterned by known statistical processing.
  • the action defined by the statistical information of the sixth embodiment is the current action of the vehicle, in other words, the action that is immediately executed by the vehicle.
  • the behavior defined by the statistical information includes a single behavior as shown by histories (d) and (e), and also includes an action plan combining a plurality of single behaviors as shown by history (f).
  • the environmental parameters include the speed of the vehicle 1000, the relative speed of the preceding vehicle with respect to the vehicle 1000, the distance between the vehicle 1000 and the preceding vehicle, the relative speed of another vehicle in a side lane with respect to the vehicle 1000, the distance between the vehicle 1000 and the other vehicle in the side lane, and the position information of the vehicle 1000.
  • each item of the environmental parameters defined in the statistical information is either included in the detection information input from the detection unit 1020, or its value can be specified by calculation based on the detection information.
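One record of the statistical information of FIG. 43 might be encoded as follows; the field names and values are hypothetical, chosen only to match the environmental parameters listed above.

```python
# Illustrative encoding of one statistical-information record: environmental
# parameter values paired with the action the vehicle executed immediately
# in that environment. All field names and values are assumptions.
record = {
    "speed": 60.0,                 # km/h, speed of the vehicle 1000
    "lead_relative_speed": -5.0,   # km/h, preceding vehicle relative to vehicle 1000
    "lead_distance": 40.0,         # m, distance to the preceding vehicle
    "side_relative_speed": 10.0,   # km/h, other vehicle in the side lane
    "side_distance": 25.0,         # m, distance to the other vehicle in the side lane
    "position": (35.68, 139.69),   # position information of the vehicle 1000
    "action": "decelerate",        # action (or action plan) immediately executed
}
print(record["action"])
```

A modeled driver model would then be built from many such records by known statistical processing.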
  • the determination criterion holding unit 1071 holds data (hereinafter referred to as “determination criterion”) that serves as a reference for determination processing by the determination unit 1062 described later.
  • the determination criterion is data that defines, for each of a plurality of patterns of detection information input from the detection unit 1020, the actions that the vehicle 1000 can (immediately) execute. For example, when a pattern of detection information indicates that there are other vehicles in the forward and right lanes, deceleration and a lane change to the left may be defined as executable action candidates for that pattern. In other words, acceleration and a lane change to the right may be excluded from the executable action candidates.
  • the behavior information input unit 1054 of the driving support device 1040 acquires from the automatic driving control device 1030 behavior information indicating the current behavior that the automatic driving control device 1030 causes the vehicle 1000 to execute.
  • the detection information input unit 1052 of the driving support device 1040 acquires detection information indicating the detection result of the surrounding state and the running state of the vehicle 1000 from the detection unit 1020.
  • FIG. 44 is a block diagram illustrating a detailed configuration of the control unit 1041 of the driving support apparatus 1040.
  • the control unit 1041 includes an image generation unit 1060, a candidate determination unit 1061, a determination unit 1062, and an instruction unit 1063.
  • Candidate determination section 1061 and determination section 1062 determine action candidates that can be executed separately from the current action indicated by the action information acquired from automatic driving control apparatus 1030 based on the detection information acquired from detection section 1020.
  • first, the candidate determining unit 1061 constructs an n-dimensional vector space corresponding to the number (n) of environmental parameters in the statistical information accumulated in the statistical information accumulating unit 1070, and places the actions defined in the statistical information in that vector space. Next, the position in the vector space corresponding to the environmental parameter values indicated by the detection information (hereinafter also referred to as the "current environmental position") is specified. Then, one or more actions existing within a predetermined range from the current environmental position in the vector space (in other words, within a predetermined distance) are determined as provisional current action candidates.
  • that is, the candidate determination unit 1061 determines, as provisional current action candidates, actions associated with environmental parameter values that approximate the detection information, from among the plural types of actions defined in the statistical information.
  • the threshold defining the predetermined range may be determined based on the developer's knowledge or by experiment.
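The provisional candidate determination described above can be sketched as a nearest-neighbour lookup in the environmental-parameter vector space. The three-parameter vectors, records, and threshold below are illustrative assumptions; the specification does not fix a distance metric, so Euclidean distance is used here as one plausible choice.

```python
import math

def provisional_candidates(records, current_env, threshold):
    """Return the actions whose associated environmental positions lie
    within the predetermined range of the current environmental position."""
    candidates = set()
    for env, action in records:
        dist = math.dist(env, current_env)  # Euclidean distance in the vector space
        if dist <= threshold:               # within the predetermined range
            candidates.add(action)
    return candidates

# Hypothetical records: (speed, lead_relative_speed, lead_distance) -> action.
records = [
    ((60.0, -5.0, 40.0), "decelerate"),
    ((61.0, -4.0, 42.0), "keep speed"),
    ((90.0, 20.0, 120.0), "accelerate"),
]
current_env = (60.5, -5.0, 41.0)  # position derived from the detection information
print(provisional_candidates(records, current_env, threshold=5.0))
# {'decelerate', 'keep speed'} — 'accelerate' lies far from the current position
```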
  • the determination unit 1062 refers to the detection information output from the detection unit 1020 and the determination criterion held in the determination criterion holding unit 1071, and determines whether each of the provisional current action candidates determined by the candidate determination unit 1061 can currently (immediately) be executed by the vehicle.
  • the determination unit 1062 determines, from among the one or more provisional current action candidates determined by the candidate determination unit 1061, those candidates that the vehicle can currently execute as the final current action candidates to be presented to the driver. For example, when a provisional current action candidate is a lane change to the right and there is no other vehicle in the right lane, that candidate may be determined as a final current action candidate.
  • conversely, when a provisional current action candidate is a lane change to the right and there is another vehicle in the right lane, that candidate may be excluded from the final current action candidates.
  • the determination process by the determination unit 1062 as to whether the vehicle can currently execute a specific action may be realized by a known method.
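The feasibility check against the determination criterion can be sketched as a table lookup keyed by a detection-information pattern; the patterns, action names, and dictionary encoding are assumptions for illustration, mirroring the forward/right-lane example above.

```python
# Hypothetical determination criterion: each detection-information pattern
# maps to the set of actions the vehicle 1000 can (immediately) execute.
criterion = {
    ("vehicle ahead", "vehicle right"): {"decelerate", "lane change left"},
    ("vehicle ahead",): {"decelerate", "lane change left", "lane change right"},
    (): {"accelerate", "keep speed", "lane change left", "lane change right"},
}

def executable_now(provisional, pattern):
    """Keep only the provisional candidates the vehicle can execute right now."""
    allowed = criterion.get(pattern, set())
    return [c for c in provisional if c in allowed]

print(executable_now(["lane change right", "decelerate"],
                     ("vehicle ahead", "vehicle right")))
# ['decelerate'] — the lane change to the right is excluded
```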
  • in the sixth embodiment, the current action candidates to be presented to the driver are determined by the cooperation of the candidate determination unit 1061 and the determination unit 1062, but the current action candidates may instead be determined by executing only one of the two processes.
  • after the candidate determination unit 1061 or the determination unit 1062 determines one or more current behavior candidates, it determines whether the current behavior indicated by the behavior information matches each candidate. A current action candidate that matches the current action is excluded from the candidates, and only current action candidates that do not match the current action are passed to the image generation unit 1060 as presentation targets for the driver. This prevents an action that matches the current action from being presented as an action that replaces the current action.
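This final filtering step reduces to a single comparison against the current action; the names below are illustrative.

```python
def presentable_candidates(candidates, current_action):
    """Exclude any candidate identical to the action already being executed,
    so only genuine alternatives are presented to the driver."""
    return [c for c in candidates if c != current_action]

print(presentable_candidates(["decelerate", "keep speed"], "keep speed"))
# ['decelerate']
```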
  • the image generation unit 1060 generates a current action image that represents the current action indicated by the action information, and generates a current action candidate image that represents the final current action candidate determined by the candidate determination unit 1061 and the determination unit 1062.
  • the image output unit 1051 outputs the current action image and the current action candidate image to the notification device 1002 so that the current action image and the current action candidate image are displayed within a fixed visual field of the driver of the vehicle.
  • when the driving support device 1040 acquires the scheduled behavior information and the remaining time information from the automatic driving control device 1030 in addition to the current behavior information, the scheduled action image and the remaining time image may be further displayed on the notification device 1002.
  • the notification device 1002 displays an automatic driving information screen including the current behavior image and the current behavior candidate image output from the driving support device 1040.
  • FIG. 45 shows an example of the automatic driving information screen.
  • an automatic driving information screen 1103 in which the current action image 1104 is arranged in the main area 1100 and two current action candidate images 1110 are arranged in the sub area 1102 is shown.
  • the upper current action candidate image 1110 shows "deceleration," which is the first current action candidate, and the lower current action candidate image 1110 shows "advance (speed maintenance)," which is the second current action candidate.
  • the display mode of the current action image 1104 on the automatic driving information screen 1103 is set to be different from the display mode of the current action candidate image 1110. This prevents the driver from confusing the current action already determined by the automatic driving controller with the current action candidate proposed as an action that replaces the current action.
  • the current action image 1104 is displayed with a size larger than the current action candidate image 1110.
  • the current action image 1104 is displayed at the center position within a fixed visual field of the driver of the vehicle 1000, for example, at a position near the center of the automatic driving information screen 1103.
  • the current action candidate image 1110 is displayed at a peripheral position within a fixed visual field of the driver, for example, near the end of the automatic driving information screen 1103.
  • when a predetermined time has elapsed after the current action candidate image 1110 is output (i.e., after it is displayed), the instruction unit 1063 automatically causes the command output unit 1055 to output, to the automatic driving control device 1030, a control command for causing the vehicle to execute the current action candidate.
  • the operation input unit 1050 receives from the input device 1004 a signal (hereinafter referred to as “operation instruction”) that specifies the behavior of the vehicle to be executed during automatic driving.
  • specifically, when no operation instruction is received within a predetermined time while the current action image 1104 and the current action candidate image 1110 are displayed on the notification device 1002, the instruction unit 1063 causes the vehicle to execute the current action candidate represented by the current action candidate image 1110.
  • This situation can also be said to be a case where an operation instruction is not received within a predetermined time after the current action candidate image 1110 is output.
  • the predetermined time for waiting for reception of an operation instruction may be set to an appropriate value based on the developer's knowledge or experiment, for example, 5 to 10 seconds.
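The timeout behavior of the instruction unit 1063 can be sketched as follows; `poll` stands in for the operation input unit 1050 and all names and the polling interval are illustrative assumptions.

```python
import time

def decide_action(candidate, poll, timeout_s=5.0, tick_s=0.1):
    """Wait up to timeout_s for a driver operation instruction; if none
    arrives, fall back to the displayed current action candidate."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        instruction = poll()        # returns the specified action, or None if no input yet
        if instruction is not None:
            return instruction      # the driver-specified action is executed
        time.sleep(tick_s)
    return candidate                # timeout: execute the current action candidate

# Driver selects an action before the timeout:
print(decide_action("decelerate", poll=lambda: "keep speed", timeout_s=0.3))
# No driver input: the candidate is chosen after the timeout:
print(decide_action("decelerate", poll=lambda: None, timeout_s=0.3))
```

In the device itself the resulting action would be sent as a control command from the command output unit 1055 to the automatic driving control device 1030.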
  • when an operation instruction is received, the instruction unit 1063 causes the vehicle to execute the action specified by the operation instruction; specifically, a control command for that action is output from the command output unit 1055 to the automatic driving control device 1030.
  • the action specified by the operation instruction is a current action determined by the automatic driving control apparatus 1030 or a current action candidate determined by the driving support apparatus 1040.
  • when the operation instruction specifies the current action candidate, a control command for causing the vehicle to execute the current action candidate represented by the current action candidate image 1110 is output from the command output unit 1055 to the automatic driving control device 1030.
  • when the operation instruction specifies the current action, a control command for causing the vehicle to execute the current action represented by the current action image 1104 is output from the command output unit 1055 to the automatic driving control device 1030.
  • FIG. 46 is a sequence diagram illustrating an example of processing related to HMI control of the vehicle 1000.
  • the detection unit 1020 periodically detects the surrounding conditions and the running state of the vehicle 1000, and periodically outputs detection information indicating the detection result to the automatic driving control device 1030 (P21).
  • the automatic driving control device 1030 determines the current behavior of the vehicle according to the detection information acquired from the detection unit 1020. Then, by outputting an action instruction instructing execution of the current action to the driving operation unit 1010, the vehicle 1000 is caused to execute the current action (P22). Furthermore, the automatic driving control device 1030 transmits behavior information indicating the current behavior to the driving support device 1040 (P23).
  • the driving support device 1040 generates a current behavior image based on the current behavior information acquired from the automatic driving control device 1030, and displays it on the notification device 1002 (P24).
  • the driving support device 1040 acquires the detection information that the detection unit 1020 periodically outputs (P25). For example, the detection information output by the detection unit 1020 in P21 may be acquired in parallel with (independently of) the automatic driving control device 1030. Alternatively, the detection information forwarded by the automatic driving control device 1030 may be acquired.
  • the driving support device 1040 determines a current action candidate based on the detection information every time the detection information output periodically is acquired (P26).
  • the driving support device 1040 generates a current behavior candidate image representing the current behavior candidate and causes the notification device 1002 to display the current behavior candidate image (P27).
  • when the driver operates the input device 1004 to instruct execution of the current action candidate, an operation instruction is transmitted from the input device 1004 to the driving support device 1040 (P28).
  • the driving support device 1040 transmits a control command having a content for instructing execution of the current action candidate to the automatic driving control device 1030 (P29).
  • the automatic driving control apparatus 1030 identifies the current action candidate specified by the control command as the new current action, and outputs a new action instruction instructing execution of the new current action to the driving operation unit 1010, thereby causing the vehicle 1000 to execute the new current action (P30).
  • the automatic driving control device 1030 transmits new behavior information indicating a new current behavior to the driving support device 1040 (P31).
  • the driving support device 1040 generates a new current behavior image based on the new current behavior information acquired from the automatic driving control device 1030, and displays the new current behavior image on the notification device 1002 (P32).
  • the driving support device 1040 acquires the latest detection information output by the detection unit 1020 (P33), and determines a new current action candidate (P34).
  • the driving support device 1040 displays the new current action candidate image on the notification device 1002 (P35).
  • in this way, after the control command is transmitted to the automatic driving control device 1030, the driving support device 1040 acquires from the automatic driving control device 1030 behavior information indicating the current behavior updated in accordance with the control command. The updated current action is the action specified by the control command, that is, the current action candidate selected by the driver. The driving support device 1040 then updates the contents of the automatic driving information screen 1103 on the notification device 1002 to reflect the latest state.
  • FIG. 47 is a flowchart showing an example of processing of the driving support apparatus 1040.
  • when the behavior information input unit 1054 acquires the behavior information output from the automatic driving control device 1030 (Y in S120), the image generation unit 1060 determines whether the current behavior indicated by the behavior information matches the current behavior stored in advance in the storage unit 1042.
  • the subsequent processes in S121 to S124 are the same as the processes in S102 to S105 in FIG. If the action information has not been acquired (N in S120), S121 to S124 are skipped.
  • when the detection information is acquired (Y in S125), the candidate determination unit 1061 determines one or more current action candidates based on the detection information, the statistical information stored in the statistical information storage unit 1070, and the determination criterion held in the determination criterion holding unit 1071 (S126).
  • the candidate determination unit 1061 determines whether or not the current action candidate determined in S126 matches the current action candidate stored in the storage unit 1042 in advance.
  • further, the candidate determination unit 1061 determines whether each current action candidate determined in S126 matches the current action indicated by the action information, and excludes any current action candidate that matches the current action from the subsequent processing targets (S128).
  • the image generation unit 1060 generates current action candidate images representing the current action candidates that have passed the filtering of S127 and S128 (S129), and the image output unit 1051 outputs the current action candidate images to the notification device 1002 for display (S130).
  • the candidate determining unit 1061 stores information on the current action candidate for which the current action candidate image has been generated in the storage unit 1042 (S131). If the detection information has not been acquired (N in S125), S126 to S131 are skipped. If there is no update of the current action candidate (N of S127), S128 to S131 are skipped. If the predetermined end condition is satisfied (Y in S132), the flow of this figure is ended. If the end condition is not satisfied (N in S132), the process returns to S120.
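The candidate filtering of S127 and S128 described above can be sketched as follows (an illustrative sketch only; the function name and the list-based representation of actions are assumptions, not part of the disclosed apparatus):

```python
# Hypothetical sketch of the candidate filtering in S127/S128 of FIG. 47.
def filter_current_action_candidates(candidates, current_action, previous_candidates):
    """Return the candidates whose images should be generated.

    S127: if the candidate set is unchanged, nothing needs redrawing.
    S128: candidates identical to the current action are excluded.
    """
    if candidates == previous_candidates:   # N in S127 -> skip S128 to S131
        return []
    return [c for c in candidates if c != current_action]  # S128

# Example: "keep lane" is already the current action, so it is excluded.
shown = filter_current_action_candidates(
    ["keep lane", "change lane right", "accelerate"],
    "keep lane",
    previous_candidates=[])
```

The filtered list corresponds to the candidates for which S129 and S130 generate and display images.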
  • When a plurality of current action candidates are determined, the candidate determination unit 1061 may rank higher an action candidate that better matches the driver's preference or operation pattern under the current surrounding situation and running state of the vehicle 1000. Then, as in the case where a plurality of scheduled actions are input in the fifth embodiment, the image generation unit 1060 may generate a plurality of current action candidate images corresponding to the plurality of current action candidates in accordance with the rank of each candidate. In addition, the image output unit 1051 may cause the notification device 1002 to display the plurality of current action candidate images in a manner corresponding to the rank of each candidate.
  • the image generation unit 1060 may generate automatic driving information screen data including both the current action image and the current action candidate image, and the image output unit 1051 may output the screen data to the notification device 1002 for display. That is, both the current action image and the current action candidate image may be output to the notification device 1002 at once.
  • FIG. 48 is also a flowchart showing an example of processing of the driving support device 1040.
  • the instruction unit 1063 starts measuring the elapsed time from the output (S141).
  • When the elapsed time exceeds a predetermined threshold without a driver operation, the instruction unit 1063 outputs a control command instructing execution of the current action candidate from the command output unit 1055 to the automatic driving control device 1030.
  • This threshold is a time value determined in advance from developer knowledge or experiments, and may be, for example, 5 to 10 seconds.
  • the instruction unit 1063 may output a control command instructing execution of the highest-ranked current action candidate according to the predetermined rank of each candidate.
  • the process returns to S142.
  • the instruction unit 1063 outputs a control command instructing execution of the current action or the current action candidate designated by the operation instruction from the command output unit 1055 to the automatic driving control device 1030 (S145).
  • This operation instruction may be a signal indicating that an operation selecting one of the current action image and the current action candidate images on the automatic driving information screen has been input. If the current action candidate image has not been output (N in S140), the processes from S141 onward are skipped and the flow of this figure is terminated.
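The timeout behavior of this flow can be sketched as below (illustrative only; the function name, argument names, and the 10-second default are assumptions based on the threshold range mentioned above):

```python
# Hypothetical sketch of the command decision in the FIG. 48 flow.
def decide_command(elapsed_s, operation, ranked_candidates, threshold_s=10.0):
    """Return the action to command to the automatic driving control device.

    - If the driver designated an action, that action is commanded (S145).
    - If the threshold elapses with no operation, the top-ranked current
      action candidate is commanded automatically.
    - Otherwise no command is issued yet.
    """
    if operation is not None:
        return operation                  # driver's explicit designation wins
    if elapsed_s >= threshold_s and ranked_candidates:
        return ranked_candidates[0]       # highest-ranked candidate by default
    return None
```

A caller would invoke this periodically while measuring elapsed time from the display of the candidate image (S141).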
  • the driving support device 1040 notifies the vehicle occupants (the driver, etc.) of the current behavior in automatic driving and proposes action candidates that can be executed immediately in place of the current behavior. In this way, by presenting the driver with up-to-date action options for the vehicle, automatic driving that better reflects the driver's intention and meets the driver's preferences can be realized.
  • Since the current action candidate that can be selected by the driver is one that can be executed in the current surrounding situation or running state of the vehicle, the driver can instruct a change to the automatic driving with a sense of security.
  • the control command for executing the current action candidate is output to the automatic driving control device 1030. That is, in the sixth embodiment, the behavior determined by the driving support device 1040 is executed by the vehicle in preference to the behavior determined by the automatic driving control device 1030.
  • the instruction unit 1063 may output, from the command output unit 1055 to the automatic driving control device 1030, a control command for causing the vehicle to execute the current action represented by the current action image 1104. That is, the behavior determined by the automatic driving control device 1030 may be executed by the vehicle in preference to the behavior determined by the driving support device 1040.
  • the driving support device 1040 accumulates statistical information indicating the relationship between the vehicle surroundings and the running state and the vehicle behavior in the local storage unit.
  • the statistical information may be stored in an information processing apparatus outside the vehicle, for example, a database server connected via a communication network. That is, the statistical information storage unit 1070 may be provided in a remote place (for example, on a cloud) outside the vehicle.
  • the candidate determination unit 1061 of the driving support device 1040 may access the statistical information stored in the statistical information storage unit outside the vehicle via the communication IF 1056 and the wireless device 1008, and may determine the current action candidate based on the statistical information and the detection information from the detection unit 1020. The same applies to the following embodiments that use the statistical information.
  • the statistical information stored in the statistical information storage unit 1070 may be information (driver model) reflecting the preference or driving pattern of a vehicle occupant (typically a driver). It can be said that the statistical information of this modified example is an accumulation of a combination of past environmental parameter values and results of current actions under the environment.
  • the driving support device 1040 may further have a statistical information recording unit that sequentially records the past environmental parameter values during travel of the vehicle together with the immediate operation by the driver (and the resulting immediate behavior of the vehicle) under those environmental parameter values.
  • the candidate determination unit 1061 of the present modification refers to the statistical information storage unit 1070 and determines a plurality of current action candidates associated with environmental parameter values whose difference from the currently detected environmental parameter values is within a predetermined range. At the same time, it determines the priority order (also referred to as priority) of each of the plurality of current action candidates. For example, when determining a plurality of current action candidates, the candidate determination unit 1061 may rank higher a current action candidate associated with environmental parameter values closer to the currently detected environmental parameter values. That is, among the plurality of current action candidates, a higher priority may be given to a current action candidate that matches the driver's preference or driving pattern.
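The environmental-parameter matching and ranking described in this modification can be sketched as follows (illustrative only; the Euclidean distance metric, the vector representation of environmental parameters, and all names are assumptions):

```python
import math

# Hypothetical sketch of nearest-neighbor candidate ranking over
# statistical records of (environmental parameter vector, action).
def rank_candidates(current_env, records, max_distance=1.0):
    """Rank action candidates by closeness of their recorded environmental
    parameter values to the currently detected values; candidates whose
    distance exceeds `max_distance` (the "predetermined range") are dropped."""
    scored = []
    for env, action in records:
        d = math.dist(current_env, env)   # distance between parameter vectors
        if d <= max_distance:
            scored.append((d, action))
    scored.sort(key=lambda t: t[0])       # closer match -> higher priority
    seen, ranked = set(), []
    for _, action in scored:
        if action not in seen:            # keep the best occurrence of each action
            seen.add(action)
            ranked.append(action)
    return ranked
```

The resulting order corresponds to the priority with which the current action candidate images would be displayed.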
  • the image generation unit 1060 generates a plurality of current action candidate images corresponding to the plurality of current action candidates determined by the candidate determination unit 1061, that is, generates an image representing the contents of each candidate.
  • the image output unit 1051 outputs the current action image and the plurality of current action candidate images to the notification device 1002 so that the current action image and the plurality of current action candidate images are displayed within a certain field of view of the driver of the vehicle 1000.
  • the image output unit 1051 displays a plurality of current action candidate images on the notification device 1002 in a manner corresponding to the priority order of each candidate.
  • the image generation unit 1060 and the image output unit 1051 may display the plurality of current action candidate images in a parametric display form in which priority is visualized by a histogram or the like, or in a non-parametric display form in which the order is visualized by the vertical and horizontal arrangement. Specifically, an image showing the rank itself may be added to each current action candidate image, or a higher-ranked current action candidate image may be displayed in a manner with higher visibility. The current action candidate images may also be arranged in descending order of rank from a predetermined position, and images may be generated and displayed only for current action candidates of a predetermined rank or higher.
  • According to this modification, it is possible to support the occupant's selection of a suitable candidate from among a plurality of current action candidates.
  • In addition, priorities according to the driving pattern of the person modeled by the driver model (for example, the driver, a fellow passenger, or a standard or exemplary driver model) can be presented.
  • This modification can also be applied to the generation and display of current action candidate images or scheduled action candidate images in other embodiments.
  • Suppose the automatic driving control device 1030 decides to execute the current action "change lane to the overtaking lane", and the driving support device 1040 (also referred to as an HMI control ECU) presents that this current action is being executed.
  • In this case, the driving support device 1040 may detect that the right turn signal of the preceding vehicle is lit and determine that the advantage of the own vehicle changing to the overtaking lane is lost because the preceding vehicle will itself change to the overtaking lane. The driving support device 1040 may then recommend and present "maintain lane, then accelerate if possible" as a current action candidate to replace the current action "change lane to the overtaking lane", as an option that can be immediately instructed (in other words, selected).
  • In the seventh embodiment, candidates for actions to be executed by the vehicle in the future in automatic driving are presented to the driver. As in the sixth embodiment, the automatic driving control device 1030 determines the current action, while the driving support device 1040 determines the scheduled action candidate. The driving support device 1040 then causes the notification device 1002 in the vehicle 1000 to display both the current action and the scheduled action candidate.
  • the scheduled action candidate is an action that can be executed after the currently executed action, and can be said to be an action plan that can be selected next.
  • the scheduled action candidate corresponds to the scheduled action in the fifth embodiment, and in the seventh embodiment, it can be said that the driving support apparatus 1040 determines and presents it to the driver as a selectable candidate.
  • the functional block of the driving support device 1040 is the same as that of the sixth embodiment. That is, the control unit 1041 includes an image generation unit 1060, a candidate determination unit 1061, a determination unit 1062, and an instruction unit 1063 as shown in FIG. Further, the storage unit 1042 includes a statistical information storage unit 1070 and a determination criterion holding unit 1071 as shown in FIG.
  • the determination standard held in the determination reference holding unit 1071 is the same as that in the sixth embodiment.
  • the determination criterion is data that defines actions that can be executed (immediately) by the vehicle 1000 for each pattern for a plurality of patterns of detection information input from the detection unit 1020.
  • the statistical information stored in the statistical information storage unit 1070 of the seventh embodiment also corresponds to the travel history of FIG. 27 and the driver model of FIGS. 28A and 28B, and is statistical information indicating the relationship between the vehicle's surrounding situation and running state and the behavior of the vehicle (FIG. 43).
  • the statistical information of the seventh embodiment includes a plurality of records in which values of a plurality of types of environmental parameters indicating the surrounding situation and running state of the vehicle are associated with actions (or action results) to be executed by the vehicle at a future time point. In other words, it is information obtained by accumulating actions executed at a future time point (after a predetermined time) with respect to the current environmental state, in association with the parameter values indicating that current environmental state.
  • the future time point may be 10 seconds to several minutes later.
  • Each action defined by the statistical information is associated with remaining time information (for example, 10 seconds to several minutes) until the action is executed in the future.
  • the statistical information may be accumulated in a device outside the vehicle 1000, and the driving support device 1040 may access the remote statistical information via the communication IF 1056 and the wireless device 1008.
  • the behavior information input unit 1054 of the driving support device 1040 acquires from the automatic driving control device 1030 behavior information indicating the current behavior that the automatic driving control device 1030 causes the vehicle 1000 to execute.
  • the detection information input unit 1052 of the driving support device 1040 acquires detection information indicating the detection result of the surrounding state and the running state of the vehicle 1000 from the detection unit 1020.
  • the candidate determination unit 1061 determines one or more scheduled action candidates, which are behaviors that the vehicle 1000 can execute after the current behavior indicated by the behavior information, based on the detection information output from the detection unit 1020. Specifically, as in the sixth embodiment, the candidate determination unit 1061 extracts as candidates, from among the actions specified in the statistical information, one or more actions associated with environmental parameter values that approximate the detection information. However, unlike the sixth embodiment, the candidates extracted from the statistical information of the seventh embodiment are scheduled action candidates indicating behavior to be executed by the vehicle at a future time point.
  • the scheduled action candidate may be an action scheduled to be executed immediately following the current action of the vehicle 1000, or an action scheduled to be executed tens of seconds or minutes from the present, with another action interposed after the current action of the vehicle 1000 finishes. For both types of action, different remaining times are defined in the statistical information.
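The extraction of scheduled action candidates together with their remaining times can be sketched as follows (illustrative only; the record layout, tolerance value, and names are assumptions):

```python
import math

# Hypothetical sketch: pick scheduled actions whose recorded environment
# approximates the detected one, keeping each action's remaining time.
def extract_scheduled_candidates(current_env, records, tolerance=1.0):
    """`records` holds (env_vector, scheduled_action, remaining_time_s)
    tuples from the statistical information. Returns (action, remaining_s)
    pairs for environments within `tolerance`, soonest action first."""
    out = []
    for env, action, remaining_s in records:
        if math.dist(current_env, env) <= tolerance:
            out.append((action, remaining_s))
    out.sort(key=lambda t: t[1])  # sooner actions first
    return out
```

The remaining time carried with each candidate is what the remaining time image 1108 would visualize.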
  • the image generation unit 1060 generates a current action image representing the current action indicated by the action information, and generates one or more scheduled action candidate images representing one or more scheduled action candidates.
  • the image output unit 1051 outputs the current action image and the scheduled action candidate image to the notification device 1002 so that the current action image and the scheduled action candidate image are displayed within a certain field of view of the driver of the vehicle.
  • the notification device 1002 displays an automatic driving information screen including the current action image and the scheduled action candidate image output from the driving support device 1040.
  • the candidate determining unit 1061 determines the scheduled action candidate
  • the candidate determining unit 1061 further passes the remaining time information associated with the scheduled action candidate in the statistical information to the image generating unit 1060.
  • the image generation unit 1060 further generates a remaining time image representing the remaining time indicated by the remaining time information.
  • the image output unit 1051 outputs the remaining time image to the notification device 1002 in addition to the current action image and the scheduled action candidate image, thereby displaying the scheduled action candidate image with the remaining time image added on the automatic driving information screen.
  • FIG. 49 shows an example of the automatic driving information screen.
  • the figure shows an automatic driving information screen 1103 in which a current action image 1104 is arranged in the main area 1100 and a scheduled action candidate image 1112 representing one scheduled action candidate and a remaining time image 1108 are arranged in the sub area 1102.
  • the scheduled action candidate image 1112 in the figure shows an action plan (overtaking in this example) in which three single actions are combined as one scheduled action candidate.
  • the remaining time image 1108 in the figure shows the length of the remaining time until the scheduled action candidate is executed as a ratio of the shaded area in the sub area 1102.
  • The ratio of the shaded area may be increased as the remaining time decreases, and the display mode of the remaining time image 1108 may be updated each time a predetermined time elapses. For example, the display mode of the remaining time image 1108 may be updated so that the area two-thirds from the bottom of the sub-area 1102 becomes the shaded area.
  • the remaining time may be notified by a change in pattern or color, or the remaining time may be notified by another method such as displaying a timer object for counting the remaining time.
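The growth of the shaded area relative to the remaining time can be sketched as a simple ratio (illustrative only; the linear mapping from elapsed time to shaded area is an assumption):

```python
# Hypothetical sketch of the shaded-area ratio for the remaining time image.
def shaded_ratio(initial_remaining_s, elapsed_s):
    """Fraction of the sub-area to shade: grows from 0 toward 1 as the
    remaining time until scheduled action execution runs out."""
    if initial_remaining_s <= 0:
        return 1.0
    return min(1.0, max(0.0, elapsed_s / initial_remaining_s))
```

For instance, with an initial remaining time of 60 seconds, one third of the area would be shaded after 20 seconds.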
  • the image output unit 1051 may transmit to the notification device 1002 a command instructing the end of the display of the scheduled action candidate image 1112, thereby terminating the display of the scheduled action candidate image 1112. Further, the image generation unit 1060 may generate a new scheduled action candidate image indicating the scheduled action candidate newly determined by the candidate determination unit 1061, and the image output unit 1051 may transmit the new scheduled action candidate image to the notification device 1002 for display.
  • FIGS. 50A to 50F also show examples of the automatic driving information screen.
  • the scheduled action candidate image 1112 is arranged at the center position of the double circle, and the current action image 1104 is arranged in the outer circle.
  • a remaining time display area 1114 is provided in the outer circle, and the shaded area indicated by the remaining time image 1108 is expanded as the remaining time until the scheduled action candidate is executed becomes shorter (FIG. 50A to FIG. 50E). For example, if the initial remaining time is 60 seconds and 20 seconds have elapsed from the start of displaying the scheduled action candidate image 1112, the display mode of the remaining time image 1108 may be updated so that one third of the remaining time display area 1114 is left as a shaded area.
  • FIG. 50F shows the automatic driving information screen 1103 when the remaining time until scheduled action candidate execution becomes zero, in other words, when the elapsed time from the start of displaying the scheduled action candidate image 1112 reaches the remaining time associated with the scheduled action candidate in the statistical information.
  • On this screen, the scheduled action candidate (acceleration) previously shown in the scheduled action candidate image 1112 has been switched to the current action, and accordingly the current action image 1104 showing a lane change to the right has been switched to a current action image 1104 showing acceleration.
  • FIGS. 51A to 51F also show examples of the automatic driving information screen. The automatic driving information screens 1103 of FIGS. 51A to 51F differ from the automatic driving information screen 1103 shown in FIGS. 50A to 50F in the following three points.
  • (2) The scheduled action candidate image 1112 is arranged around the inner circle.
  • the upward triangle indicates acceleration, the downward triangle indicates deceleration, the left triangle indicates lane change to the left, and the right triangle indicates lane change to the right.
  • the remaining time image 1108 is a point indicating the remaining time until the scheduled action represented by the scheduled action image 1106 is executed.
  • While the remaining time until the scheduled action "acceleration" indicated by the scheduled action image 1106 is executed decreases, the scheduled action candidate presented to the driver (the scheduled action candidate image 1112) is updated from time to time.
  • FIG. 51F shows the automatic driving information screen 1103 when the remaining time until scheduled action execution becomes zero, in other words, when the elapsed time from the start of displaying the scheduled action image 1106 reaches the remaining time notified from the automatic driving control device 1030.
  • On this screen, the scheduled action (acceleration) shown until then in the scheduled action image 1106 has been switched to the current action, and accordingly the current action image 1104 showing a lane change to the right has been switched to a current action image 1104 showing acceleration.
  • the driver may press the cross button provided on the input device 1004 to instruct execution of the scheduled action candidate represented by the scheduled action candidate image 1112. For example, the driver may instruct execution of acceleration indicated by the scheduled action candidate image 1112 by selecting the up button of the cross button while the scheduled action candidate image 1112 indicating the upward triangle is displayed.
  • FIG. 52 also shows an example of the automatic driving information screen.
  • two scheduled action candidate images 1112 representing two scheduled action candidates (two action plans) are displayed.
  • a remaining time image 1108 representing the remaining time until each scheduled action candidate is executed is indicated by a plurality of time indicators 1109 as in the fifth embodiment.
  • the automatic driving information screen 1103 displays a selection frame 1116 for allowing the driver to select a specific scheduled action candidate from a plurality of scheduled action candidates represented by a plurality of scheduled action candidate images 1112.
  • the driver inputs an operation for designating a desired scheduled action candidate to the input device 1004 using the selection frame 1116.
  • the image generation unit 1060 of the driving support device 1040 generates an inquiry image whose content asks the driver whether to "execute" or "reserve" the designated scheduled action candidate.
  • “Execution” means that the scheduled action candidate is immediately executed by the vehicle.
  • "Reservation" means that the scheduled action candidate is executed at a timing after a predetermined time has passed from the operation instruction designating the reservation, when the vehicle becomes able to execute it.
  • the image output unit 1051 outputs the inquiry image to the notification device 1002 to be displayed.
  • When an operation designating "execution" is input via the operation input unit 1050 during display of the inquiry image, the instruction unit 1063 outputs to the automatic driving control device 1030 a control command for causing the vehicle 1000 to execute the scheduled action candidate at a first timing. On the other hand, when an operation designating "reservation" is input during display of the inquiry image, the instruction unit 1063 outputs to the automatic driving control device 1030 a control command for causing the vehicle 1000 to execute the scheduled action candidate at a second timing later than the first timing.
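The two timings for "execution" and "reservation" can be sketched as follows (illustrative only; the function name and the 5-second reservation delay are placeholders for the "predetermined time", which the disclosure does not fix):

```python
# Hypothetical sketch of the first/second timing distinction.
def schedule_execution(choice, now_s, reserve_delay_s=5.0):
    """Return the time at which the control command should take effect.

    'execution'   -> the first timing (immediately, at `now_s`).
    'reservation' -> a second timing after the first, once the vehicle
                     becomes able to execute the action.
    """
    if choice == "execution":
        return now_s
    if choice == "reservation":
        return now_s + reserve_delay_s
    raise ValueError(f"unknown choice: {choice!r}")
```

In practice the second timing would also depend on when the determination criterion first permits the action, not on a fixed delay.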
  • the processing sequence of the vehicle 1000 in the seventh embodiment is the same as the processing sequence of FIG. 46 described in the sixth embodiment.
  • the determination of the current action candidate and the display of the current action candidate image are replaced with the determination of the scheduled action candidate and the display of the scheduled action candidate image (P26, P27).
  • the input device 1004 notifies the driving support device 1040 of the scheduled action candidate selection operation (P28).
  • the driving support device 1040 outputs a control command instructing execution of the scheduled action candidate to the automatic driving control device 1030 (P29).
  • After the control command is transmitted to the automatic driving control device 1030, the detection information input unit 1052 of the driving support device 1040 acquires from the automatic driving control device 1030 information indicating the current behavior updated in accordance with the control command.
  • the updated current action is the action specified by the control command, that is, the scheduled action candidate specified by the driver.
  • the driving support device 1040 updates the content of the automatic driving information screen 1103 of the notification device 1002 to reflect the latest behavior of the vehicle.
  • FIG. 53 is a flowchart showing an example of processing of the driving support apparatus 1040.
  • When the behavior information input unit 1054 acquires the behavior information output from the automatic driving control device 1030 (Y in S150), the image generation unit 1060 determines whether the current behavior indicated by the behavior information matches the current behavior stored in advance in the storage unit 1042.
  • the subsequent processing of S151 to S154 is the same as the processing of S102 to S105 in FIG. If the action information has not been acquired (N in S150), S151 to S154 are skipped.
  • the candidate determination unit 1061 determines one or more scheduled action candidates based on the detection information and the statistical information stored in the statistical information storage unit 1070 (S156).
  • the image generation unit 1060 determines whether or not the scheduled action candidate determined in S156 matches the scheduled action candidate stored in the storage unit 1042 in advance.
  • When the scheduled action candidate determined in S156 does not match the scheduled action candidate stored in advance in the storage unit 1042, that is, when the scheduled action candidate has been updated (Y in S157), the image generation unit 1060 generates a scheduled action candidate image representing the scheduled action candidate (S158).
  • the image generation unit 1060 further identifies the remaining time until the execution associated with the scheduled action candidate in advance in the statistical information, and further generates a remaining time image representing the remaining time in S158.
  • the image output unit 1051 outputs the scheduled action candidate image and the remaining time image to the notification device 1002 to display the automatic driving information screen (S159).
  • the image generation unit 1060 stores information indicating the scheduled action candidate that generated the image in the storage unit 1042 (S160), and starts measuring the elapsed time after outputting (starting to display) the scheduled action candidate image (S161). ). If the predetermined end condition is satisfied (Y in S162), the flow of this figure is terminated. If the end condition is not satisfied (N in S162), the process returns to S150.
  • the image generation unit 1060 determines whether or not a predetermined time has elapsed since the start of the elapsed time measurement (S163).
  • the predetermined time is a unit time in which the remaining time image is to be updated, and may be a time assigned to one time indicator 1109, for example.
  • the image generation unit 1060 updates the remaining time image (S164).
  • the image output unit 1051 outputs the updated remaining time image to the notification device 1002 for display (S165). Thereby, for example, the shaded area of the remaining time display area 1114 in FIGS. 50A to 50F is enlarged, and one time indicator 1109 in FIG. 52 is switched from the on state to the off state. If the predetermined time has not elapsed since the start of the elapsed time measurement (N in S163), S164 and S165 are skipped.
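The per-unit-time update of the time indicators 1109 (S163 to S165) can be sketched as follows (illustrative only; names and the discrete-indicator model are assumptions):

```python
# Hypothetical sketch: one indicator is turned off each time the
# predetermined unit time assigned to a single indicator elapses.
def indicators_lit(total_indicators, unit_time_s, elapsed_s):
    """Number of time indicators 1109 that should remain lit."""
    expired = int(elapsed_s // unit_time_s)   # completed unit-time intervals
    return max(0, total_indicators - expired)
```

Redrawing the remaining time image whenever this count changes reproduces the on-to-off switching described for FIG. 52.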
  • FIG. 54 is also a flowchart showing an example of processing of the driving support device 1040.
  • When the image output unit 1051 has output the scheduled action candidate image to the notification device 1002 and the scheduled action candidate image is being displayed on the automatic driving information screen of the notification device 1002 (Y in S170), it is determined whether the scheduled action candidate has been selected by the driver. If it has not been selected (N in S171), the process returns to S170.
  • When the scheduled action candidate is selected (Y in S171), the candidate determination unit 1061 temporarily stops the process of determining a new scheduled action candidate (S172).
  • the determination unit 1062 determines whether or not the vehicle 1000 can immediately execute the selected scheduled action candidate (hereinafter also referred to as the "selected action") (S173). Specifically, the determination unit 1062 determines whether the vehicle can currently execute the selected action with reference to the latest detection information output from the detection unit 1020 and the determination criterion held in the determination criterion holding unit 1071. Even a selected action that cannot be immediately executed by the vehicle may become executable if the surrounding environment or the running state of the vehicle changes; therefore, in the seventh embodiment, a selected action can be reserved. A reservation can be said to be an act of instructing execution of the selected action at a feasible timing within a certain time range.
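The immediate-executability determination of S173 and the resulting presentation choice can be sketched as follows (illustrative only; the set-based representation of the determination criterion and all names are assumptions):

```python
# Hypothetical sketch of S173: decide how to present the selected action.
def classify_selected_action(selected, executable_now):
    """`executable_now` stands in for the actions the determination
    criterion allows in the current detection state. An action that cannot
    be executed now may still be reserved, since conditions can change."""
    if selected in executable_now:
        return ("execute_or_reserve", selected)   # offer both buttons
    return ("reserve_only", selected)             # offer only reservation
```

The "execute_or_reserve" result corresponds to an inquiry image containing both the execution button 1118 and the reservation button 1120.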
  • the image generation unit 1060 generates an inquiry image for inquiring the driver about the immediate execution or reservation of the selected action, and the image output unit 1051 outputs the inquiry image to the notification device 1002 for display (S174).
  • FIGS. 55A and 55B show examples of the automatic driving information screen. On the automatic driving information screen 1103 of FIG. 55A, the scheduled action candidate image 1112 selected on the automatic driving information screen 1103 of FIG. 52 is displayed.
  • an inquiry image including an execution button 1118 and a reservation button 1120 is displayed on the automatic driving information screen 1103 of FIG. 55A.
  • When the determination unit 1062 determines that the selected action can be immediately executed, the image generation unit 1060 generates an inquiry image including both the execution button 1118 and the reservation button 1120.
  • the operation input unit 1050 receives an operation instruction for instructing immediate execution of the selected action from the input device 1004.
  • This operation instruction may be a signal indicating that the scheduled action candidate represented by the scheduled action candidate image is selected in the inquiry image for immediate execution.
  • the instruction unit 1063 transmits a control command for instructing immediate execution of the selected action from the command output unit 1055 to the automatic driving control apparatus 1030 (S176).
  • This control command can be said to be a control command for instructing the vehicle to immediately execute the scheduled action candidate represented by the scheduled action candidate image instead of the current action represented by the current action image on the automatic driving information screen.
  • Thereafter, the current action image on the automatic driving information screen is changed to one showing the selected action by the processing of S150 to S153 in FIG. 53.
  • the candidate determination unit 1061 resumes the new scheduled action candidate determination process (S177). If the scheduled action candidate image is not being displayed (N in S170), the subsequent processing is skipped and the flow of this figure is terminated.
  • the operation input unit 1050 receives an operation instruction for instructing reservation of the selected action from the input device 1004.
  • the image generation unit 1060 generates a cancel time setting image for allowing the driver to set the time allowed until execution of the selected action whose reservation has been instructed (hereinafter also referred to as the "reserved action").
  • the image output unit 1051 outputs and displays the cancel time setting image to the notification device 1002 (S178).
  • FIG. 55B shows an automatic driving information screen 1103 including a cancel time setting image 1122. On the cancel time setting image 1122 in the figure, the cancel time can be set between 30 seconds and 10 minutes. While a reservation is in effect, the process of determining new scheduled action candidates is suspended; because the driver sets a reservation cancel time, new scheduled action candidates can be presented to the driver again once the reservation is canceled.
  • the operation input unit 1050 receives a signal indicating the set cancellation time from the input device 1004 (S179).
  • the instruction unit 1063 starts measuring the elapsed time from the start of reservation (S180).
  • the image generation unit 1060 generates a reservation behavior image that is an image representing the reservation behavior and a time limit image that is an image representing the length of the reservation cancellation time.
  • the image output unit 1051 outputs and displays the reservation action image and the time limit image to the notification device 1002 (S181).
  • the reserved action image is an image whose display mode differs from that of the scheduled action candidate image, and indicates that the specific action selected by the driver is currently reserved. For example, an image to which a predetermined symbol indicating a reservation is added may be used.
  • FIGS. 56A and 56B show examples of the automatic driving information screen.
  • a reserved action image 1124 and a time limit image 1126 are displayed in the sub-region 1102.
  • the time limit image 1126 represents the length of time until reservation cancellation by a plurality of time indicators 1109.
  • the image generation unit 1060 generates the reserved action image 1124 in a mode different from the mode of the scheduled action candidate image 1112.
  • the image output unit 1051 displays the reserved action image 1124 in a mode different from the mode of the scheduled action candidate image 1112. This allows the driver to easily determine whether the reservation action candidate is being proposed or has already been reserved on the automatic driving information screen 1103.
  • the remaining time image 1108 is arranged on the left side of the scheduled action candidate image 1112.
  • the time limit image 1126 is arranged on the right side of the reservation action image 1124.
  • the background color of the scheduled action candidate image 1112 and the reserved action image 1124 may be different, or the display size may be different.
  • an image of a predetermined symbol indicating that a reservation is being made may be added to the reservation action image 1124.
  • the determination unit 1062 determines, based on the latest detection information output from the detection unit 1020 and the determination criterion held in the determination criterion holding unit 1071, whether or not the vehicle can immediately execute the reserved action (S182). If the reserved action can be immediately executed (Y in S183), the instruction unit 1063 causes the command output unit 1055 to transmit a control command instructing immediate execution of the reserved action to the automatic driving control device 1030 (S176). Thereafter, the current action image on the automatic driving information screen changes to one indicating the reserved action through the processing of S150 to S153 in FIG.
  • On the other hand, when the cancel time has elapsed since the start of the reservation (Y in S184), no instruction to execute the reserved action is issued; the candidate determination unit 1061 resumes the process of determining new scheduled action candidates (S177), and the flow of this figure ends. If the cancel time has not elapsed (N in S184) and the predetermined time for updating the time limit image has not yet elapsed (N in S185), the process returns to S182 to determine again whether the reserved action can be immediately executed. When the predetermined time for updating the time limit image has elapsed (Y in S185), the image generation unit 1060 updates the time limit image, the image output unit 1051 outputs the updated time limit image to the notification device 1002 for display (S186), and the process returns to S182 to determine again whether or not the reserved action can be immediately executed.
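The reservation-monitoring loop of S182 to S186 can be sketched as a single decision function. This is a minimal illustration, not the embodiment's implementation: `poll_reservation` and its arguments are hypothetical names, and display updates and command transmission are reduced to returned labels.

```python
def poll_reservation(can_execute_now, elapsed, cancel_time, update_interval, last_update):
    """One pass of the reservation-monitoring loop (S182-S186), reduced to a
    pure decision. Returns the branch the flow takes next:
      'execute' - reserved action is executable now (Y in S183 -> S176)
      'cancel'  - cancel time elapsed; resume candidate determination (Y in S184 -> S177)
      'update'  - refresh the time-limit image (Y in S185 -> S186), then re-check
      'wait'    - nothing to do yet; re-check at S182
    """
    if can_execute_now:                           # S182/S183: judged against latest detection info
        return 'execute'
    if elapsed >= cancel_time:                    # S184: reservation expires without execution
        return 'cancel'
    if elapsed - last_update >= update_interval:  # S185: time to redraw the indicators
        return 'update'
    return 'wait'
```

A caller would invoke this repeatedly, acting on each returned label until it sees `'execute'` or `'cancel'`.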
  • FIG. 56B shows an automatic driving information screen 1103 at a time point after FIG. 56A.
  • In FIG. 56B, the current action of the vehicle represented by the current action image 1104 has switched from deceleration to advance (maintain speed), but the reserved action still cannot be executed, so only the time limit image 1126 in the sub-region 1102 has changed. Specifically, by changing some of the time indicators 1109 from the lit state to the unlit state, the driver is notified of the remaining time until the reservation is canceled. When the elapsed time from the start of the reservation reaches the cancel time (for example, 5 minutes), the display of the reserved action image 1124 ends, and the screen returns to the automatic driving information screen 1103 (FIG. 52) presenting the scheduled action candidates newly determined by the candidate determination unit 1061.
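The countdown display in which time indicators 1109 go out one by one can be sketched as a mapping from remaining time to the number of indicators kept lit. The function name and the five-indicator count are illustrative assumptions, not part of the embodiment.

```python
def lit_indicators(remaining, cancel_time, total=5):
    """Number of time indicators to keep lit: indicators go out one by one
    as the remaining time until reservation cancellation decreases. Uses a
    ceiling so the last indicator stays lit until time actually runs out."""
    if remaining <= 0:
        return 0
    return -(-remaining * total // cancel_time)  # ceil(remaining * total / cancel_time)
```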
  • As described above, according to the seventh embodiment, the driving support device 1040 can notify a vehicle occupant (driver or the like) of the current behavior in automatic driving, and presents the future behavior of the vehicle to the driver so that the driver can instruct a change in automatic driving control. By presenting the driver with options for the future behavior of the vehicle in this way, automatic driving that better reflects the driver's intention and matches the driver's preferences can be realized, and the driver's anxiety about automatic driving can be reduced. In addition, according to the driving support device 1040 of the seventh embodiment, even an action that cannot be executed at present can be reserved for future execution, which makes it easier to realize automatic driving in accordance with the driver's preferences.
  • In the seventh embodiment, the driving support device 1040 determines whether or not the action selected by the driver can be immediately executed.
  • As a modification, the automatic driving control device 1030 may determine whether or not the action selected by the driver can be immediately executed. In this case, the driving support device 1040 need not execute this determination process.
  • In this modification, the driving support device 1040 may generate an inquiry image that allows selection of both immediate execution and reservation, regardless of whether or not the action selected by the driver can be immediately executed, and present it to the driver.
  • When an operation instruction indicating immediate execution of the selected action is received as the driver's operation on the inquiry image, the instruction unit 1063 of the driving support device 1040 may transmit a first control command that causes the vehicle to execute the selected action immediately from the command output unit 1055 to the automatic driving control device 1030.
  • When an operation instruction indicating reservation of the selected action is received, a second control command that causes the vehicle to execute the selected action after a predetermined time may be transmitted from the command output unit 1055 to the automatic driving control device 1030.
  • The second control command may be, for example, a control command instructing that the selected action be executed within the cancel time set by the driver on the cancel time setting image 1122.
  • the control command may be an instruction to cancel execution of the selected action when the cancel time has elapsed.
  • The configuration for causing the vehicle to execute the scheduled action candidate may be the same as the configuration described in the sixth embodiment. That is, the instruction unit 1063 of the driving support device 1040 may transmit a control command for causing the vehicle to execute the scheduled action candidate from the command output unit 1055 to the automatic driving control device 1030 after the current action image 1104 and the scheduled action candidate image 1112 have been displayed on the automatic driving information screen 1103 for a predetermined time.
  • For example, when no operation instruction designating a vehicle action is input within the predetermined time during which the current action image 1104 and the scheduled action candidate image 1112 are displayed on the automatic driving information screen 1103, the instruction unit 1063 may transmit a control command for causing the vehicle to execute the scheduled action candidate from the command output unit 1055 to the automatic driving control device 1030.
  • Alternatively, when the instruction unit 1063 has received an operation instruction selecting the scheduled action candidate (scheduled action candidate image 1112) within the predetermined time during which the current action image 1104 and the scheduled action candidate image 1112 are displayed on the automatic driving information screen 1103, it may transmit a control command for causing the vehicle to execute the scheduled action candidate from the command output unit 1055 to the automatic driving control device 1030.
  • the statistical information stored in the statistical information storage unit 1070 may be information (driver model) reflecting the preference or driving pattern of a vehicle occupant (typically a driver). It can be said that the statistical information of this modification is an accumulation of past environmental parameter values and combinations of scheduled behaviors under the environment.
  • The driving support device 1040 may further include a statistical information recording unit that sequentially records, in the statistical information of the statistical information storage unit 1070, past environmental parameter values during travel of the vehicle together with the operation designated by the driver for a predetermined future time point (the resulting future behavior of the vehicle) under those environmental parameter values.
  • The candidate determination unit 1061 of the present modification refers to the statistical information storage unit 1070 and determines, as a plurality of scheduled action candidates, actions associated with environmental parameter values whose difference from the currently detected environmental parameter values is within a predetermined range. At the same time, it determines the priority order (also referred to as priority) of each of the plurality of scheduled action candidates. For example, when determining the plurality of scheduled action candidates, the candidate determination unit 1061 may assign a higher rank to a scheduled action candidate associated with environmental parameter values closer to the currently detected environmental parameter values. That is, among the plurality of scheduled action candidates, a higher priority may be given to a scheduled action candidate that better matches the driver's preference or driving pattern.
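A minimal sketch of this candidate determination, assuming the statistical information is stored as (environmental-parameter dict, action) pairs and using Euclidean distance as the measure of closeness; the names and the distance measure are illustrative assumptions, not the embodiment's definition.

```python
def rank_candidates(current_env, statistics, max_distance, top_n=3):
    """Pick scheduled-action candidates from driver-model statistics: each
    record pairs past environmental parameter values with the action taken
    under them. Records within max_distance of the current values are kept,
    and closer records rank higher (higher priority)."""
    def distance(env):  # simple Euclidean distance over the current parameters
        return sum((env[k] - current_env[k]) ** 2 for k in current_env) ** 0.5

    scored = [(distance(env), action) for env, action in statistics
              if distance(env) <= max_distance]
    scored.sort(key=lambda pair: pair[0])  # nearest first = highest priority
    seen, ranked = set(), []
    for _, action in scored:               # de-duplicate, keeping each action's best rank
        if action not in seen:
            seen.add(action)
            ranked.append(action)
    return ranked[:top_n]
```

The returned list order can then drive the priority-dependent display of the candidate images.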
  • the image generation unit 1060 generates a plurality of scheduled action candidate images corresponding to the plurality of scheduled action candidates determined by the candidate determination unit 1061, that is, generates an image representing the contents of each candidate.
  • the image output unit 1051 outputs the current action image and the plurality of scheduled action candidate images to the notification device 1002 so that the current action image and the plurality of scheduled action candidate images are displayed within a certain field of view of the driver of the vehicle 1000.
  • the image output unit 1051 displays a plurality of scheduled action candidate images on the notification device 1002 in a manner corresponding to the priority order of each candidate.
  • The image generation unit 1060 and the image output unit 1051 may display the plurality of scheduled action candidate images in a parametric display form in which the priority is visualized by a histogram or the like, or in a non-parametric display form in which the order is visualized by the vertical or horizontal arrangement. Specifically, an image indicating the rank itself may be added to each scheduled action candidate image, a higher-ranked scheduled action candidate image may be displayed with higher visibility, or the scheduled action candidate images may be arranged in descending order of priority from a predetermined position. Further, only scheduled action candidates of a predetermined rank or higher may be generated as images and used as display targets.
  • By presenting the priority according to the driver model in this way, it is possible to assist the occupant in selecting a suitable candidate from among the plurality of scheduled action candidates. The priority may be determined according to the driving pattern of the target person of the driver model (for example, the driver himself or herself, a fellow passenger, or a standard or exemplary driver model).
  • For example, suppose that the preceding vehicle has decelerated and the inter-vehicle distance has been shortened.
  • Also suppose that, since vehicles faster than the own vehicle keep approaching from behind in the overtaking lane, the automatic driving control device 1030 has decided to execute the current action "deceleration", and the driving support device 1040 presents that this current action is being executed. At this time, the driving support device 1040 may determine, based on the detection information from the detection unit 1020, that "the advantage will increase once no vehicle is approaching from behind in the overtaking lane".
  • In that case, the driving support device 1040 may present "change lanes to the overtaking lane" as a scheduled action candidate whose execution can be instructed at an appropriate timing, and may recommend it to the driver as an option whose instruction can be reserved.
  • candidates for actions to be executed by the vehicle before the scheduled action are presented to the driver.
  • the candidates presented to the driver in the eighth embodiment are candidates for actions that the vehicle is to execute immediately (hereinafter referred to as "current action candidates").
  • the automatic driving control device 1030 determines the scheduled behavior
  • the driving support device 1040 determines the current behavior candidate.
  • the driving support device 1040 causes the notification device 1002 in the vehicle 1000 to display both the scheduled behavior and the current behavior candidate.
  • the control unit 1041 includes an image generation unit 1060, a candidate determination unit 1061, a determination unit 1062, and an instruction unit 1063.
  • the storage unit 1042 includes a statistical information storage unit 1070 and a determination criterion holding unit 1071.
  • The statistical information stored in the statistical information storage unit 1070 and the determination criterion held in the determination criterion holding unit 1071 are the same as in the sixth embodiment. That is, statistical information for determining a current action candidate is stored in the statistical information storage unit 1070, and a determination criterion for actions that can be executed immediately according to the environmental parameters is held in the determination criterion holding unit 1071. As described above, the statistical information may be accumulated in a device outside the vehicle 1000, and the driving support device 1040 may access the remote statistical information via the communication IF 1056 and the wireless device 1008.
  • The behavior information input unit 1054 acquires, from the automatic driving control device 1030, behavior information ("scheduled behavior information" in the fifth embodiment) indicating the scheduled behavior, that is, the behavior that the automatic driving control device 1030 plans to have the vehicle execute at a future time.
  • the behavior information input unit 1054 further acquires remaining time information indicating the time from the current time until the scheduled behavior is executed from the automatic driving control device 1030. As shown in FIG. 35, the remaining time information is included in the data set of behavior information acquired from the automatic driving control apparatus 1030.
  • the detection information input unit 1052 acquires detection information indicating the detection result of the surrounding state and the running state of the vehicle 1000 from the detection unit 1020.
  • the candidate determination unit 1061 and the determination unit 1062 of the driving support device 1040 determine an action that can be executed by the vehicle 1000 before the scheduled action indicated by the action information based on the detection information.
  • an action that can be immediately executed by the vehicle 1000 is determined as a current action candidate.
  • Alternatively, an action that is executed before the execution time of the scheduled action indicated by the behavior information but later than the current time, for example an action executed after the current action being executed by the vehicle 1000 ends, may be determined as a current action candidate.
  • Specifically, as in the sixth embodiment, the candidate determination unit 1061 determines, as provisional current action candidates, one or more actions that are associated with environmental parameter values approximating the detection information, from among the plurality of types of actions defined in the statistical information of the statistical information storage unit 1070.
  • The determination unit 1062 then refers to the detection information output from the detection unit 1020 and the determination criterion in the determination criterion holding unit 1071, and determines, for each provisional current action candidate determined by the candidate determination unit 1061, whether the vehicle can currently (immediately) execute it. Candidates that the vehicle can currently execute are determined as the final current action candidates to be presented to the driver.
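This two-stage selection (statistical proposal followed by an executability check) can be sketched as below. Representing the determination criterion as a per-action predicate over the detection information is an assumption made for illustration, not the embodiment's data model.

```python
def current_action_candidates(provisional, detection, criteria):
    """Second stage of candidate selection: the candidate determination unit
    proposes provisional candidates from statistics, and the determination
    unit keeps only those immediately executable under the given detection
    info. `criteria` maps each action to a predicate over the detection info;
    unknown actions are treated as not executable."""
    return [action for action in provisional
            if criteria.get(action, lambda d: False)(detection)]
```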
  • the image generation unit 1060 generates a scheduled behavior image that represents the scheduled behavior indicated by the behavior information, and generates a current behavior candidate image that represents the final current behavior candidate determined by the candidate determination unit 1061 and the determination unit 1062.
  • the image output unit 1051 outputs the scheduled action image and the current action candidate image to the notification device 1002 so that the scheduled action image and the current action candidate image are displayed within a certain field of view of the driver of the vehicle 1000.
  • The image generation unit 1060 further generates a remaining time image representing the time until execution of the scheduled action, which is updated according to the remaining time information input from the automatic driving control device 1030.
  • The image output unit 1051 further outputs the remaining time image to the notification device 1002, and displays the scheduled action image with the remaining time image added, together with the current action candidate image, within a certain field of view of the driver of the vehicle 1000.
  • the notification device 1002 displays an automatic driving information screen including these images.
  • FIGS. 57A to 57E show an example of the automatic driving information screen.
  • the scheduled action image 1106 and the remaining time image 1108 are arranged in the inner circle, and the current action candidate image 1110 is arranged in the outer circle.
  • The inner circle as a whole is the remaining time display area, and the remaining time image 1108 is updated so that its shaded range expands as the remaining time until execution of the scheduled action decreases.
  • the current action candidate image 1110 for example, an upward triangle indicates acceleration, a downward triangle indicates deceleration, a left triangle indicates lane change to the left, and a right triangle indicates lane change to the right.
  • two current action candidates “lane change to the left” and “deceleration” are presented.
  • The current action candidate image 1110 is updated as needed according to changes in the surrounding situation or running state of the vehicle while the remaining time until execution of the scheduled action decreases.
  • the driver may instruct execution of the current action candidate represented by the current action candidate image 1110 by pressing a cross button provided on the input device 1004.
  • the driver may instruct execution of deceleration indicated by the current action candidate image 1110 by selecting the lower button of the cross button while the current action candidate image 1110 indicating the downward triangle is displayed.
  • FIG. 57E shows the result of the driver selecting one of the current action candidates (current action candidate image 1110) before the acceleration indicated by the scheduled action image 1106 of FIG. 57D is executed.
  • When the driver selects the current action candidate "decelerate" on the automatic driving information screen 1103 in FIG. 57D, the current action of the vehicle is switched to deceleration.
  • As a result, in FIG. 57E, both the scheduled behavior of the vehicle determined by the automatic driving control device 1030 (scheduled behavior image 1106) and the current behavior candidate of the vehicle determined by the driving support device 1040 (current behavior candidate image 1110) have changed.
  • Since the processing sequence of the vehicle 1000 in the eighth embodiment is the same as the processing sequence of FIG. 46 described in the sixth embodiment, its description is omitted. However, acquisition of the behavior information indicating the current action (P23, P31) and generation and output of the current action image (P24, P32) are replaced by acquisition of behavior information indicating the scheduled action and the remaining time, and generation and output of the scheduled action image.
  • FIG. 58 is a flowchart showing an example of processing of the driving support apparatus 1040.
  • the processing of S190 to S199 in the figure related to the generation and display of the scheduled action image is the same as the processing of S100, S106 to S110, and S112 to S114 of FIG. 41 described in the fifth embodiment, and thus description thereof is omitted.
  • When the detection information input unit 1052 acquires the detection information output from the detection unit 1020 (Y in S200), the candidate determination unit 1061 determines one or more current action candidates based on the detection information, the statistical information stored in the statistical information storage unit 1070, and the determination criterion held in the determination criterion holding unit 1071 (S201).
  • The image generation unit 1060 determines whether or not the current action candidate determined in S201 matches the current action candidate stored in advance in the storage unit 1042.
  • If they do not match, that is, if the current action candidate has been updated (Y in S202), the image generation unit 1060 generates a current action candidate image representing the current action candidate (S203).
  • the image output unit 1051 outputs and displays the current action candidate image to the notification device 1002 (S204).
  • The image generation unit 1060 then causes the storage unit 1042 to store information on the current action candidate for which the current action candidate image was generated (S205). If the detection information has not been acquired (N in S200), S201 to S205 are skipped. If the current action candidate has not been updated (N in S202), S203 to S205 are skipped. If a predetermined end condition is satisfied (Y in S206), the flow of this figure ends; if not (N in S206), the process returns to S190.
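The update-only-on-change behavior of S202 to S205 can be sketched as follows; the function name and the store layout are hypothetical, and rendering is abstracted to a callback.

```python
def maybe_redraw(new_candidates, store, render):
    """Update the current-action-candidate display only when the candidate
    set actually changed (the S202 check): compare with what was stored,
    render and store on change, skip S203-S205 otherwise. Returns True if
    a redraw happened."""
    if new_candidates == store.get('current_candidates'):
        return False                                  # N in S202: skip S203-S205
    render(new_candidates)                            # S203/S204: generate and output the image
    store['current_candidates'] = new_candidates      # S205: remember for the next pass
    return True
```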
  • the processing of the driving support device 1040 related to the selection of the current action candidate by the driver is the same as that of the sixth embodiment, and the processing flow is the same as the processing flow of FIG.
  • The length of time for accepting selection of a current action candidate may be a predetermined fixed value, or may be a value smaller than the remaining time until execution of the scheduled action determined by the automatic driving control device 1030 (for example, the remaining time minus 5 seconds).
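As a sketch of the two options above, the acceptance window can be taken as the smaller of a fixed value and the remaining time less a margin; the function name, the fixed value, and the 5-second margin are illustrative assumptions.

```python
def acceptance_window(remaining_time, fixed=10.0, margin=5.0):
    """Length of time (in seconds) for accepting selection of a current action
    candidate: the smaller of a fixed value and the remaining time until the
    scheduled action minus a safety margin, clamped so it is never negative."""
    return max(min(fixed, remaining_time - margin), 0.0)
```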
  • As described above, according to the eighth embodiment, the driving support device 1040 notifies a vehicle occupant (driver or the like) of the behavior scheduled in automatic driving, and proposes current action candidates that can be executed immediately. By presenting the driver with options for the vehicle's immediate action in this way, automatic driving that better reflects the driver's intention and matches the driver's preferences can be realized.
  • Moreover, since the current action candidates selectable by the driver are limited to those executable in the current surrounding situation or running state of the vehicle, the driver can instruct a change to the automatic driving with a sense of security.
  • the statistical information stored in the statistical information storage unit 1070 may be information (driver model) reflecting the preference or driving pattern of a vehicle occupant (typically a driver).
  • The candidate determination unit 1061 may refer to the statistical information storage unit 1070 and determine, as a plurality of current action candidates, actions associated with environmental parameter values whose difference from the currently detected environmental parameter values is within a predetermined range. It may also determine the priority of each of the plurality of current action candidates.
  • the image generation unit 1060 may generate a plurality of current action candidate images corresponding to the plurality of current action candidates determined by the candidate determination unit 1061, that is, an image representing the contents of each candidate.
  • the image output unit 1051 may display a plurality of current action candidate images on the notification device 1002 in a manner corresponding to the priority order of each candidate.
  • A scheduled action can also be called a driving control plan for the future, and a current action candidate can also be called a situation-adaptive recommendation for the present.
  • For example, suppose that the automatic driving control device 1030 has determined to execute the scheduled action "keep lane" following the current driving action, and the driving support device 1040 presents that "keep lane" is scheduled to be executed.
  • At this time, the driving support device 1040 may determine that "if a merging vehicle appears just before the merge path is reached, a sudden operation will be required and the disadvantage will increase." The driving support device 1040 may then present "change from the leftmost lane to the right lane" as a current action candidate whose execution can be instructed immediately, before the "keep lane" that the vehicle 1000 is scheduled to execute later, and recommend it to the driver as an option.
  • a scheduled action candidate that is a candidate for an action to be executed by the vehicle at a future time is presented to the driver instead of the scheduled action.
  • the automatic driving control device 1030 determines a scheduled action
  • the driving support device 1040 determines a scheduled action candidate. Then, the driving support device 1040 displays both the scheduled behavior and the scheduled behavior candidate on the notification device 1002 in the vehicle 1000.
  • A scheduled action candidate is an action different from the scheduled action determined by the automatic driving control device, and can be said to be a candidate action whose execution by the vehicle can be scheduled.
  • control unit 1041 includes an image generation unit 1060, a candidate determination unit 1061, a determination unit 1062, and an instruction unit 1063.
  • the storage unit 1042 includes a statistical information storage unit 1070 and a determination criterion holding unit 1071.
  • The statistical information stored in the statistical information storage unit 1070 and the determination criterion held in the determination criterion holding unit 1071 are the same as in the seventh embodiment. That is, statistical information for determining a scheduled action candidate is stored in the statistical information storage unit 1070, and a determination criterion for actions that can be executed immediately according to the environmental parameters is held in the determination criterion holding unit 1071. As described above, the statistical information may be accumulated in a device outside the vehicle 1000, and the driving support device 1040 may access the remote statistical information via the communication IF 1056 and the wireless device 1008.
  • The behavior information input unit 1054 acquires, from the automatic driving control device 1030, behavior information ("scheduled behavior information" in the fifth embodiment) indicating the scheduled behavior, that is, the behavior that the automatic driving control device 1030 plans to have the vehicle execute at a future time.
  • the behavior information input unit 1054 further acquires remaining time information indicating the time from the current time until the scheduled behavior is executed from the automatic driving control device 1030. As shown in FIG. 35, the remaining time information is included in the data set of behavior information acquired from the automatic driving control apparatus 1030.
  • the detection information input unit 1052 acquires detection information indicating the detection result of the surrounding state and the running state of the vehicle 1000 from the detection unit 1020.
  • The candidate determination unit 1061 determines, based on the detection information output from the detection unit 1020, one or more scheduled behavior candidates that differ from the scheduled behavior indicated by the behavior information and whose execution by the vehicle 1000 can be scheduled.
  • the scheduled action candidate can be said to be an action that can be executed by the vehicle 1000 in place of the scheduled action indicated by the action information.
  • Specifically, the candidate determination unit 1061 extracts, as scheduled action candidates, one or more actions associated with environmental parameter values approximating the detection information from among the actions defined in the statistical information.
  • When an extracted scheduled action candidate matches the scheduled behavior indicated by the behavior information, the candidate determination unit 1061 excludes that candidate from those to be presented to the driver. In other words, among the scheduled action candidates extracted from the statistical information, the candidate determination unit 1061 determines only candidates that differ from the scheduled behavior indicated by the behavior information as those to be presented to the driver. This prevents a scheduled action candidate identical to the scheduled action from being presented to the driver.
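The exclusion described above amounts to a simple filter over the extracted candidates; `presentable_candidates` is a hypothetical name used for illustration.

```python
def presentable_candidates(extracted, scheduled_action):
    """Drop any extracted scheduled-action candidate identical to the
    scheduled action already decided by the automatic driving control device,
    so the driver is never shown a duplicate of what is already planned."""
    return [c for c in extracted if c != scheduled_action]
```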
  • The image generation unit 1060 generates a scheduled action image representing the scheduled action indicated by the behavior information, and one or more scheduled action candidate images representing the one or more scheduled action candidates determined by the candidate determination unit 1061 as those to be presented to the driver.
  • the image output unit 1051 outputs the scheduled action image and the scheduled action candidate image to the notification device 1002 so that the scheduled action image and the scheduled action candidate image are displayed within a certain field of view of the driver of the vehicle.
  • the notification device 1002 displays an automatic driving information screen including the scheduled action image and the scheduled action candidate image output from the driving support device 1040.
  • the image generation unit 1060 further generates a first remaining time image representing the time until the scheduled action is executed, which is updated according to the remaining time information input from the automatic driving control device 1030.
  • the image output unit 1051 further outputs the first remaining time image in association with the scheduled action image to the notification device 1002, thereby displaying the scheduled action image with the first remaining time image added on the automatic driving information screen.
  • when the candidate determination unit 1061 determines a scheduled action candidate, it further passes to the image generation unit 1060 the remaining time information associated with that scheduled action candidate in the statistical information.
  • the image generation unit 1060 further generates a second remaining time image representing the remaining time indicated by the remaining time information.
  • the image output unit 1051 further outputs the second remaining time image to the notification device 1002 in association with the scheduled action candidate image, thereby displaying the scheduled action candidate image with the second remaining time image added on the automatic driving information screen.
  • FIG. 59 shows an example of the automatic driving information screen.
  • one scheduled action image 1106 representing one scheduled action and one scheduled action candidate image 1112 representing one scheduled action candidate are displayed.
  • when a plurality of scheduled actions are indicated by the behavior information, a plurality of scheduled action images 1106 may be displayed, and when a plurality of scheduled action candidates are determined, a plurality of scheduled action candidate images 1112 may be displayed.
  • the first remaining time image 1108a is arranged near the scheduled action image 1106, and the second remaining time image 1108b is arranged near the scheduled action candidate image 1112; both indicate the length of the remaining time by the display mode of a plurality of time indicators 1109.
  • each of the scheduled action and the scheduled action candidate shown in FIG. 59 is an action plan in which a plurality of single actions are combined, but each may instead be a single action.
  • the display mode of the scheduled action image 1106 is set to differ from the display mode of the scheduled action candidate image 1112.
  • for example, the pattern, color, or size may differ between the scheduled action image 1106 and the scheduled action candidate image 1112. This prevents the driver from confusing the scheduled action planned by the automatic driving controller with the scheduled action candidate proposed as an alternative to it.
  • the label “scheduled action” is added to the scheduled action image 1106, and the label “recommended action” is added to the scheduled action candidate image 1112.
  • a shaded area is further added to the scheduled action candidate image 1112.
  • the image generation unit 1060 may set a symbol or decoration different from the scheduled action image 1106 in the scheduled action candidate image 1112.
  • the image output unit 1051 may further output to the notification device 1002 display control data that specifies that the display mode of the scheduled action image 1106 and the display mode of the scheduled action candidate image 1112 are different.
  • the automatic driving information screen 1103 also allows the driver to select a specific action from the one or more scheduled actions (scheduled action image 1106) and the one or more scheduled action candidates (scheduled action candidate image 1112).
  • for this purpose, a selection frame 1116 is displayed. Using the selection frame 1116, the driver inputs to the input device 1004 an operation for selecting the desired scheduled action or scheduled action candidate.
  • the image generation unit 1060 generates an inquiry image (for example, FIGS. 55A and 55B) for causing the driver to designate "execution" or "reservation" of the selected action, according to the determination result of the determination unit 1062.
  • the instruction unit 1063 transmits to the automatic driving control device 1030 a control command for causing the vehicle to execute the selected action at the timing corresponding to the designated "execution" or "reservation".
  • since the processing sequence of the vehicle 1000 in the ninth embodiment is the same as the processing sequence of FIG. 46 described in the sixth embodiment, its description is omitted.
  • however, the acquisition of behavior information indicating the current action (P23, P31) and the generation and output of the current action image (P24, P32) are replaced by the acquisition of behavior information indicating the scheduled action and the remaining time, and the generation and output of the scheduled action image.
  • likewise, the generation and output of the current action candidate images (P26, P27, P34, and P35) are replaced by the generation and output of the scheduled action candidate images.
  • FIG. 60 is a flowchart showing an example of processing of the driving support apparatus 1040.
  • the processes in S210 to S218 in FIG. 60 relating to the generation and display of the scheduled action image are the same as the processes in S100, S106 to S110, and S112 to S114 in FIG. 41.
  • when the detection information input unit 1052 acquires the detection information output from the detection unit 1020 (Y in S219), the candidate determination unit 1061 determines one or more scheduled action candidates based on the detection information and the statistical information accumulated in the statistical information accumulation unit 1070.
  • if a determined candidate matches the scheduled action, the candidate determination unit 1061 excludes that action from the scheduled action candidates. That is, the candidate determination unit 1061 determines scheduled action candidates different from the scheduled action (S220).
  • the image generation unit 1060 determines whether the scheduled action candidates determined in S220 match the scheduled action candidates previously stored in the storage unit 1042. When they do not match, that is, when the scheduled action candidates have been updated (Y in S221), the image generation unit 1060 generates a scheduled action candidate image (S222). In S222, the image generation unit 1060 further identifies the remaining time until execution associated with the scheduled action candidate in advance in the statistical information, and generates a remaining time image representing that remaining time.
  • the image output unit 1051 outputs the scheduled action candidate image and the remaining time image to the notification device 1002 to display the automatic driving information screen (S223).
  • the image generation unit 1060 stores information indicating the scheduled action candidate for which the image was generated in the storage unit 1042 (S224), and starts measuring the elapsed time from the output (display start) of the scheduled action candidate image (S225). If a predetermined end condition is satisfied (Y in S229), the flow of this figure is terminated; if the end condition is not satisfied (N in S229), the process returns to S210.
  • the image generation unit 1060 determines whether a predetermined time has elapsed since the start of the elapsed time measurement (S226).
  • the predetermined time is a unit time in which the remaining time image is to be updated, and may be a time assigned to one time indicator 1109, for example.
  • when the predetermined time has elapsed (Y in S226), the image generation unit 1060 updates the remaining time image (S227).
  • the image output unit 1051 outputs and displays the updated remaining time image to the notification device 1002 (S228). Thereby, for example, one time indicator 1109 in FIG. 59 is switched from the lighting state to the extinguishing state. If the predetermined time has not elapsed since the start of the elapsed time measurement (N in S226), S227 and S228 are skipped.
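The time indicator behavior in S225 to S228 can be illustrated with a small sketch. The function name and the specific unit-time/indicator figures are assumptions; the document only states that one indicator may correspond to one unit of the remaining time and is extinguished as that unit elapses.

```python
def lit_indicators(total_indicators, unit_time, elapsed):
    """Each time indicator stands for one unit of remaining time; one
    indicator goes out every time a unit of time elapses (S226-S228)."""
    expired = int(elapsed // unit_time)
    return max(total_indicators - expired, 0)

# Hypothetical figures: 5 indicators at 10 seconds each.
print(lit_indicators(5, 10, 0))    # all lit at display start
print(lit_indicators(5, 10, 25))   # two units elapsed, two indicators out
print(lit_indicators(5, 10, 60))   # remaining time exhausted
```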
  • the processing of the driving support device 1040 (the image generation unit 1060, the determination unit 1062, the instruction unit 1063, etc.) related to the selection of the scheduled action or the scheduled action candidate by the driver is the same as that in the seventh embodiment.
  • the processing shown in the flowchart of FIG. 54 and the user interface shown in the automatic driving information screen 1103 of FIGS. 55A and 55B and FIGS. 56A and 56B are directly applied to the ninth embodiment.
  • the selection target by the driver includes the scheduled action determined by the automatic driving control apparatus 1030 in addition to the scheduled action candidate determined by the driving support apparatus 1040.
  • as described above, the driving support device 1040 in the ninth embodiment notifies the vehicle occupant (driver, etc.) of the scheduled action in automatic driving and proposes candidates that can be executed in place of that scheduled action.
  • by presenting the driver with action options for a future point in time, automatic driving that better reflects the driver's intention and that suits the driver's preferences can be realized.
  • the driver's anxiety about automatic driving can be suppressed.
  • even an action that cannot be executed at the present time can be reserved for future execution, which makes it easier to realize automatic driving in accordance with the driver's preferences.
  • the statistical information stored in the statistical information storage unit 1070 may be information (driver model) reflecting the preference or driving pattern of a vehicle occupant (typically a driver).
  • the candidate determination unit 1061 may refer to the statistical information storage unit 1070 and determine a plurality of scheduled action candidates associated with environmental parameter values whose differences from the currently detected environmental parameter values fall within a predetermined range. It may also determine a priority order for each of the plurality of scheduled action candidates.
  • the image generation unit 1060 may generate a plurality of scheduled behavior candidate images corresponding to the plurality of scheduled behavior candidates determined by the candidate determination unit 1061, that is, an image representing the contents of each candidate.
  • the image output unit 1051 may output the scheduled action image and the plurality of scheduled action candidate images to the notification device 1002 so that they are displayed within a certain field of view of the driver of the vehicle 1000.
  • the image output unit 1051 may display a plurality of scheduled action candidate images on the notification device 1002 in a manner corresponding to the priority order of each candidate.
  • the scheduled action may also be referred to as a driving control plan for the future.
  • a scheduled action candidate may also be referred to as a situation-adaptive recommendation for the future.
  • for example, the driving support device 1040 may determine that "if a merging vehicle appears before the vehicle approaches the merge point, a sudden maneuver will be required and the disadvantage will increase." In that case, as a scheduled action candidate that can be instructed to the travel control before the "lane keeping" that the vehicle 1000 is scheduled to execute later, the driving support device 1040 may recommend "change lanes from the leftmost lane to the right lane" as an option that can be reserved.
  • information adapted to the individual driver is provided as information for supporting selection of the current behavior of the vehicle.
  • the driving support device 1040 determines the current action candidates from both a viewpoint based on the surrounding situation or running state of the vehicle 1000 and a viewpoint based on the individual driver, and displays them on the notification device 1002 in the vehicle 1000.
  • this supports smooth action selection in accordance with the driver's intent, so that the driver can issue instructions for action selection or action change with confidence.
  • FIG. 61 is a block diagram showing a detailed configuration of the storage unit 1042 of the driving support apparatus 1040.
  • the determination criterion holding unit 1071 holds the same determination criterion as in the other embodiments.
  • the statistical information accumulation unit 1070 includes a first statistical information accumulation unit 1072 that accumulates first statistical information and a second statistical information accumulation unit 1073 that accumulates second statistical information.
  • Both the first statistical information and the second statistical information are statistical information indicating the relevance of the surrounding situation and the running state of the vehicle and the behavior of the vehicle, as in the previous embodiments.
  • both are information including a plurality of records in which the values of a plurality of types of environmental parameters indicating the surrounding situation and the running state of the vehicle are associated with actions to be executed immediately by the vehicle (or actions that the vehicle can be scheduled to execute).
  • the range of statistics differs between the first statistical information and the second statistical information.
  • the first statistical information has a wider statistical range than the second statistical information.
  • the first statistical information is a record of operation results and action results in various environmental states for a group of a plurality of people and a large number of vehicles.
  • the history of operation results and behavior results in various environmental states may be modeled as operation patterns and behavior patterns by a known statistical method.
  • the second statistical information is a record of an operation result or an action result in an environmental state so far for an individual driver and a vehicle 1000 alone.
  • a driver's individual operation results and a history of behavior results of the vehicle 1000 alone may be modeled as operation patterns and behavior patterns by a known statistical method.
  • the first statistical information may be information in which operation histories in a large group, in other words, action histories of a plurality of vehicles are sequentially recorded together with environmental parameter values.
  • an average combination of environmental parameter values and behavior in a large group of people may be recorded.
  • the first statistical information can be said to be information indicating a typical operation pattern according to various surrounding environments or running conditions, in other words, a typical behavior pattern of the vehicle.
  • since the first statistical information aggregates the histories of a large group, its coverage of environmental parameter values and behaviors is high.
  • the second statistical information may be an accumulation of an individual driver's operation history, in other words, an action history of the vehicle 1000 alone. Further, a combination of environmental parameter values and actions based on individual driver operations may be sequentially recorded. It can be said that the second statistical information is statistical information that reflects the individual preference or operation pattern of the driver more strongly than the first statistical information. Further, since the second statistical information is obtained by accumulating the individual driver's operation history and the behavior history of the vehicle 1000 alone, the environmental parameter value and the behavior coverage are low.
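As a rough illustration of the two accumulation scopes, the following sketch models the first (population-wide) and second (driver-individual) statistical stores as the same record structure differing only in what is accumulated into them. The class and field names are invented for the example, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    env: dict    # environmental parameter values at the time of the action
    action: str  # action executed under those values

@dataclass
class StatisticStore:
    scope: str                                   # "population" or "individual"
    records: list = field(default_factory=list)

    def accumulate(self, env, action):
        # Sequentially record an (environment, action) pair.
        self.records.append(Record(env, action))

# First statistical information: wide range, many drivers and vehicles.
first = StatisticStore("population")
# Second statistical information: this driver and the vehicle 1000 alone.
second = StatisticStore("individual")

second.accumulate({"speed": 62, "gap": 18}, "lane change right")
print(second.scope, len(second.records))  # individual 1
```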
  • both the first statistical information and the second statistical information are stored locally in the vehicle 1000, but at least one of them may be stored in a device outside the vehicle, for example, in a database on the cloud.
  • the second statistical information may be sequentially accumulated locally in the vehicle 1000 because the range of the statistics is for the individual driver or the vehicle 1000 alone.
  • the first statistical information may be subjected to processing such as aggregation / statistics / accumulation on a server on the cloud.
  • the driving support device 1040 may access remote statistical information via the communication IF 1056 and the wireless device 1008.
  • the behavior information input unit 1054 of the driving support device 1040 acquires from the automatic driving control device 1030 behavior information indicating the current behavior that the automatic driving control device 1030 causes the vehicle 1000 to execute.
  • the detection information input unit 1052 of the driving support device 1040 acquires detection information indicating the detection result of the surrounding state and the running state of the vehicle 1000 from the detection unit 1020.
  • FIG. 62 is a block diagram showing a detailed configuration of the control unit 1041 of the driving support apparatus 1040.
  • the control unit 1041 includes a first determination unit 1080, a second determination unit 1082, an image generation unit 1060, a determination unit 1062, and an instruction unit 1063.
  • the first determination unit 1080 determines one or more first actions that the vehicle 1000 can execute, based on the detection information output from the detection unit 1020 and the first statistical information stored in the first statistical information storage unit 1072.
  • the first determination unit 1080 includes a situation adaptive determination unit 1081.
  • the situation adaptive determination unit 1081 corresponds to the candidate determination unit 1061 of the sixth embodiment.
  • the situation adaptive determination unit 1081 determines one or more action candidates that the vehicle 1000 can execute in place of the current action indicated by the behavior information (hereinafter, "situation-adaptive current action candidates"). Specifically, as in the sixth embodiment, among the actions defined in the first statistical information, it selects actions associated with environmental parameter values approximating the detection information as situation-adaptive current action candidates (corresponding to the current action candidates of the sixth embodiment).
  • the situation adaptation current action candidate can be said to be a typical operation pattern / behavior pattern that is immediately executed in the current ambient situation or running state.
  • if, among the situation-adaptive current action candidates extracted from the first statistical information, there is a candidate identical to the current action indicated by the behavior information, the situation adaptive determination unit 1081 excludes that candidate from presentation to the driver. In other words, the situation adaptive determination unit 1081 determines, as presentation targets for the driver, those situation-adaptive current action candidates extracted from the first statistical information that differ from the current action. This prevents presenting the driver with a candidate identical to the current action.
  • the determination unit 1062 refers to the detection information output from the detection unit 1020 and the determination criterion held in the determination criterion holding unit 1071, and determines, for each of the situation-adaptive current action candidates determined by the situation adaptive determination unit 1081, whether the vehicle can execute it immediately.
  • the situation adaptive determination unit 1081 determines, among the one or more situation-adaptive current action candidates extracted from the first statistical information, those determined by the determination unit 1062 to be immediately executable as presentation targets for the driver.
  • the second determination unit 1082 determines one or more second actions that the vehicle 1000 can execute, based on the detection information and the second statistical information stored in the second statistical information storage unit 1073.
  • specifically, the second determination unit 1082 determines, as the one or more second actions, information indicating a priority for each of the one or more situation-adaptive current action candidates according to the correlation between the second statistical information and the detection information. In other words, the one or more second actions determined by the second determination unit 1082 indicate the priority of each of the one or more first actions.
  • the second determination unit 1082 includes a personal adaptive determination unit 1083 and a priority determination unit 1084.
  • the personal adaptive determination unit 1083 corresponds to the candidate determination unit 1061 of the sixth embodiment.
  • the personal adaptive determination unit 1083 determines one or more actions that the vehicle 1000 can execute in place of the current action indicated by the behavior information (hereinafter, "personal-adaptive current action candidates").
  • specifically, among the actions defined in the second statistical information, actions associated with environmental parameter values approximating the detection information are selected as personal-adaptive current action candidates (corresponding to the current action candidates of the sixth embodiment).
  • the personal adaptation current action candidate can be said to be a driver's individual operation pattern in the current surrounding state or running state, and can also be said to be an action pattern of the vehicle 1000 alone.
  • if, among the personal-adaptive current action candidates extracted from the second statistical information, there is a candidate identical to the current action indicated by the behavior information, the personal adaptive determination unit 1083 excludes that candidate from the ranking targets described later. In other words, the personal adaptive determination unit 1083 determines, as ranking targets, those personal-adaptive current action candidates extracted from the second statistical information that differ from the current action.
  • the determination unit 1062 may determine whether each of the personal-adaptive current action candidates determined by the personal adaptive determination unit 1083 can be executed immediately, with reference to the detection information and the determination criterion, in the same manner as for the situation-adaptive current action candidates.
  • in that case, the personal adaptive determination unit 1083 may narrow the ranking targets down to those of the one or more personal-adaptive current action candidates extracted from the second statistical information that are determined to be immediately executable; a candidate determined not to be immediately executable may be excluded from the personal-adaptive current action candidates.
  • the personal adaptation determination unit 1083 ranks one or more personal adaptation current action candidates.
  • the personal adaptive determination unit 1083 may give a higher rank to a personal-adaptive current action candidate associated with environmental parameter values closer to the environmental parameter values indicated by the latest detection information. For example, in an n-dimensional vector space corresponding to n environmental parameters, actions associated with environmental parameter values within a predetermined range from the position of the environmental parameter values indicated by the latest detection information (the current environment position) may be extracted as personal-adaptive current action candidates. Then, among the extracted candidates, a candidate whose position in the vector space is closer to the current environment position may be given a higher rank.
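The range-limited, distance-ordered ranking described above can be sketched as follows. The radius value, record layout, and function name are assumptions for illustration; the patent does not prescribe a concrete distance measure.

```python
import math

def rank_personal_candidates(history, current_env, radius):
    """Extract actions recorded within `radius` of the current environment
    position and rank them by closeness (index 0 = rank 1, the closest)."""
    def dist(env):
        # Euclidean distance in the n-dimensional environment parameter space.
        return math.sqrt(sum((env[k] - current_env[k]) ** 2 for k in current_env))

    in_range = sorted(
        (dist(r["env"]), r["action"]) for r in history if dist(r["env"]) <= radius
    )
    ranking = []
    for _, action in in_range:
        if action not in ranking:  # keep only the best rank per action
            ranking.append(action)
    return ranking

history = [
    {"env": {"speed": 60, "gap": 20}, "action": "lane change right"},
    {"env": {"speed": 40, "gap": 10}, "action": "deceleration"},
    {"env": {"speed": 80, "gap": 40}, "action": "acceleration"},
]
print(rank_personal_candidates(history, {"speed": 58, "gap": 19}, radius=25.0))
# → ['lane change right', 'deceleration']
```

In this hypothetical history, "acceleration" lies outside the predetermined range and is never extracted as a candidate.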
  • based on the ranking of the one or more personal-adaptive current action candidates determined by the personal adaptive determination unit 1083, the priority determination unit 1084 determines the priority of the one or more situation-adaptive current action candidates determined as presentation targets by the situation adaptive determination unit 1081.
  • for example, the rank of a personal-adaptive current action candidate is given as a provisional rank to the situation-adaptive current action candidate indicating the same action. Then, according to the provisional rank assigned to each of the one or more situation-adaptive current action candidates, a higher provisional rank yields a higher priority on the screen display, in other words, a higher priority of proposal/recommendation to the driver. A situation-adaptive current action candidate for which no personal-adaptive current action candidate indicates the same action, that is, one to which no provisional rank can be assigned, is given the lowest priority.
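The provisional-rank scheme can be illustrated with a short sketch: each situation-adaptive candidate inherits the rank of the matching personal-adaptive candidate, and candidates with no match fall to the lowest priority. Names and data shapes are invented for the example.

```python
def assign_priorities(situation_candidates, personal_ranking):
    """Give each situation-adaptive candidate the provisional rank of the
    matching personal-adaptive candidate; unmatched candidates sink last."""
    def provisional(action):
        try:
            return personal_ranking.index(action)  # lower index = higher rank
        except ValueError:
            return len(personal_ranking)           # no match -> lowest priority

    ordered = sorted(situation_candidates, key=provisional)
    # Priority 1 is the most strongly recommended candidate on the screen.
    return {action: rank + 1 for rank, action in enumerate(ordered)}

situation = ["lane keeping", "lane change right", "deceleration"]
personal = ["lane change right", "deceleration"]
print(assign_priorities(situation, personal))
# → {'lane change right': 1, 'deceleration': 2, 'lane keeping': 3}
```

"lane keeping" has no matching personal-adaptive candidate, so it receives the lowest priority even though the first statistical information proposed it.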
  • the image generation unit 1060 generates a first image representing one or more first actions, and generates a second image representing one or more second actions. Specifically, the image generation unit 1060 generates a current action candidate image representing each of one or more situation-adapted current action candidates as a first image. In addition, the image generation unit 1060 generates a priority image representing the priority set by the priority determination unit 1084 for each of the one or more situation adaptation current action candidates as the second image. The image generation unit 1060 may generate a priority image that represents a priority order given to each of one or more situation-adapted current action candidates by a histogram or a number. The histogram may be an image of a graphic object in which a display size corresponding to the priority is set.
  • the image output unit 1051 outputs the current action candidate image and the priority image to the notification device 1002 so that the current action candidate image and the priority image are displayed within a certain field of view of the driver of the vehicle.
  • the notification device 1002 displays an automatic driving information screen including a current action candidate image and a priority image.
  • FIGS. 63A and 63B show examples of the automatic driving information screen.
  • in FIG. 63A, the priority of each situation-adaptive current action candidate is indicated by a number.
  • the outer circle is the action candidate display area 1128, and four current action candidate images 1110 representing the four situation-adapted current action candidates are displayed.
  • the inner circle is the priority display area 1130, and four priority images 1132 representing the priority order of the four situation-adapted current action candidates are displayed.
  • a priority image 1132 indicating the highest priority "1" is attached to the current action candidate image 1110 indicating a lane change to the right, showing that the lane change to the right is the current action most recommended for the individual driver.
  • in FIG. 63B, the priority of each situation-adaptive current action candidate is indicated by a histogram.
  • four current action candidate images 1110 representing four situation-adapted current action candidates are displayed around the circle.
  • four priority images 1132 representing the priority order of the four situation-adapted current action candidates are displayed, and a priority image 1132 having a larger size is added to a situation-adapted current action candidate having a higher priority.
  • a priority image 1132 of the maximum size, indicating the highest priority, is attached to the current action candidate image 1110 indicating a lane change to the right, showing that the lane change to the right is the current action most recommended for the individual driver.
  • the driver may instruct execution of the current action candidate represented by a current action candidate image 1110 by pressing the cross button provided on the input device 1004.
  • for example, while the current action candidate image 1110 indicating a downward triangle is displayed, the driver may instruct execution of the deceleration indicated by that image by selecting the lower button of the cross button.
  • the processing of the instruction unit 1063 of the driving support device 1040 is the same as that in the sixth embodiment.
  • FIG. 64 is a sequence diagram illustrating an example of processing related to HMI control of the vehicle 1000.
  • P41 to P44 in the figure are the same as P21 to P23 and P25 in the sequence diagram of FIG. 46 described in the sixth embodiment.
  • the driving support device 1040 may further display the current action image on the notification device 1002 based on the behavior information acquired from the automatic driving control device 1030.
  • the driving support device 1040 determines one or more situation-adaptive current action candidates according to the correlation between the detection information and the first statistical information, and determines the priority of each situation-adaptive current action candidate according to the correlation between the detection information and the second statistical information (P45).
  • the driving support device 1040 generates a current action candidate image representing the one or more situation-adaptive current action candidates and a priority image representing the priority of each candidate, and outputs them to the notification device 1002 for display (P46).
  • the subsequent processing of P47 to P49 is the same as P28 to P30 of the sequence diagram of FIG. 46 described in the sixth embodiment.
  • FIG. 65 is a flowchart showing an example of processing of the driving support apparatus 1040.
  • when the behavior information is acquired (Y in S240), the control unit 1041 determines whether the current action indicated by the behavior information matches the current action previously stored in the storage unit 1042 (S241).
  • when the current action has been updated (Y in S241), the control unit 1041 stores the current action indicated by the behavior information in the storage unit 1042 (S242). If the behavior information has not been acquired (N in S240), S241 and S242 are skipped. If the current action has not been updated (N in S241), S242 is skipped.
  • when the detection information is acquired, the situation adaptive determination unit 1081 determines one or more situation-adaptive current action candidates based on the detection information and the first statistical information (S244).
  • the situation adaptive determination unit 1081 excludes from the determined candidates any situation-adaptive current action candidates that match the current action stored in the storage unit 1042 (S245).
  • the personal adaptive determination unit 1083 determines one or more personal-adaptive current action candidates based on the detection information and the second statistical information, and determines the rank of each candidate.
  • like the situation adaptive determination unit 1081, the personal adaptive determination unit 1083 may exclude from the candidates any personal-adaptive current action candidates that match the current action stored in the storage unit 1042.
  • the priority determination unit 1084 determines the priority of one or more situation adaptation current action candidates on the screen display based on the ranking of the one or more individual adaptation current action candidates (S246).
  • the priority determination unit 1084 stores information indicating the situation adaptation current action candidate and the priority in the storage unit 1042.
  • when the situation adaptation current action candidate is updated (Y in S247), the image generation unit 1060 generates a current action candidate image representing the updated situation adaptation current action candidate (S248), and the image output unit 1051 outputs the current action candidate image to the notification device 1002 for display (S249). The image generation unit 1060 generates a priority image representing the priority of each of the one or more situation adaptation current action candidates (S250), and the image output unit 1051 outputs the priority image to the notification device 1002 for display (S251). If the situation adaptation current action candidate is not updated (N in S247) but the priority is updated (Y in S252), the process proceeds to S250; if the priority is not updated (N in S252), the process proceeds to S253.
  • the driving assistance device 1040 determines the situation adaptation current action candidate every time the detection information is input, and determines the priority of the proposal according to the driver's preference. When at least one of the situation adaptation current action candidate and the priority is updated, the display content of the automatic driving information screen 1103 is also updated.
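The update cycle of S240 to S252 above can be sketched as follows. This is an illustrative sketch only: the function name `update_candidates`, the list representations, and the tie-breaking rule for candidates absent from the personal ranking are assumptions not specified in the embodiment.

```python
# Illustrative sketch of S245 (exclusion of the current action) and S246
# (priority from the personal-adaptation ranking). Representations are
# assumed: candidates are action-name strings, and personal_ranking lists
# the personal adaptation candidates from highest to lowest rank.
def update_candidates(current_action, situation_candidates, personal_ranking):
    # S245: exclude candidates that match the action the vehicle is already executing
    filtered = [c for c in situation_candidates if c != current_action]

    # S246: candidates appearing earlier in the personal ranking receive a
    # higher priority; candidates absent from the ranking fall to the lowest
    def priority_key(candidate):
        if candidate in personal_ranking:
            return personal_ranking.index(candidate)
        return len(personal_ranking)

    return sorted(filtered, key=priority_key)
```

For example, with the current action "lane keeping", situation candidates ["lane keeping", "decelerate", "lane change"], and personal ranking ["decelerate"], the sketch yields ["decelerate", "lane change"].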
  • the processing of the driving support device 1040 related to the selection of the current action candidate by the driver is the same as that of the sixth embodiment, and the processing flow is the same as the processing flow of FIG.
  • the length of the waiting time for receiving the selection of the current action candidate may be a predetermined fixed value.
  • the driving support apparatus 1040 presents to the driver an automatic driving information screen 1103 on which one or more situation-adapted current action candidates are assigned priorities adapted to the individual driver or to the vehicle 1000 alone. Useful information can thereby be presented to the driver to assist in selecting the current behavior of the vehicle.
  • because the current action candidates displayed on the automatic driving information screen 1103 are extracted from the first statistical information, in which various environmental parameter values and various actions are comprehensively recorded, it becomes easier to present current action candidates better suited to the vehicle's environmental parameter values, which change from moment to moment.
  • the driver's individual preference or operation pattern can be reflected in the presentation by determining the priority of each candidate based on the second statistical information, which strongly reflects the individual driver.
  • the situation adaptive current action candidate determined by the situation adaptive determination unit 1081 and the priority determined by the priority determination unit 1084 are displayed on the automatic driving information screen 1103.
  • both the situation adaptation current action candidates determined by the situation adaptive determination unit 1081 and the personal adaptation current action candidates determined by the personal adaptive determination unit 1083 may be displayed in parallel on the automatic driving information screen 1103.
  • the situation adaptive determination unit 1081 of the present modification determines a situation adaptive current action candidate different from the current action indicated by the action information acquired from the automatic driving control apparatus 1030.
  • the personal adaptive determination unit 1083 also determines a personal adaptive current behavior candidate that is different from the current behavior indicated by the behavior information. Thereby, it is possible to prevent the same candidate as the current behavior of the vehicle from being presented to the driver.
  • the display mode of the situation adaptation current action candidate and the display mode of the individual adaptation current action candidate are different.
  • the display position, the pattern, the color, the size, and the like may be different so that the driver can easily distinguish the situation adaptation current action candidate and the individual adaptation current action candidate.
  • the image generation unit 1060 may generate an image of the personal adaptation current action candidate in a manner different from that of the situation adaptation current action candidate image.
  • the image output unit 1051 may further transmit display control data instructing the notification device 1002 to display the two kinds of images in different modes.
  • the situation adaptive determination unit 1081 of the present modification may determine the priority order of one or more situation adaptive current action candidates based on the correlation between the detection information and the first statistical information.
  • the personal adaptive determination unit 1083 may also determine the priority order of one or more personal adaptive current action candidates based on the correlation between the detection information and the second statistical information as in the embodiment.
  • the image generation unit 1060 and the image output unit 1051 may display, on the automatic driving information screen 1103, an image representing the one or more situation adaptation current action candidates together with an image representing the priority order of each candidate, and an image representing the one or more personal adaptation current action candidates together with an image representing the priority order of each candidate.
  • one or more status adaptation current action candidates and one or more individual adaptation current action candidates may be displayed side by side in an easily comparable manner.
  • a situation adaptation current action candidate is also referred to as a situation-adapted recommendation for the present, and a personal adaptation current action candidate is also referred to as a personal-adapted recommendation for the present.
  • the automatic driving control device 1030 determines that the current action “lane keeping (arbitrary)” is to be executed, and the driving support apparatus 1040 presents that “lane keeping (arbitrary)” is being executed (see, for example, Embodiments 5 and 6).
  • the driving assistance device 1040 may determine that “a sudden operation will be required and the disadvantage will increase if a merging vehicle appears before the vehicle approaches the merge path.” The driving support device 1040 may then recommend and present “decelerate,” for which a travel control instruction can be given in place of the “lane keeping” being executed by the vehicle 1000, as a situation adaptation current action candidate that can be instructed immediately.
  • based on “the tendency of the driving action taken by the driver when there is a vehicle approaching at a certain speed from the rear of the left lane ahead of the merge path,” the driving support device 1040 may determine “acceleration,” instead of “lane keeping,” as a personal adaptation current action candidate for which travel control can be instructed. The driving support apparatus 1040 may then further recommend and present the personal adaptation current action candidate “acceleration” as an option that can be instructed immediately. In this way, the driving support device 1040 may simultaneously present to the driver, in parallel, the situation adaptation current action candidate and the personal adaptation current action candidate determined at approximately the same timing.
  • information adapted to the individual driver is provided as information for supporting the selection of the scheduled behavior of the vehicle.
  • the driving support device 1040 determines a scheduled action candidate from both the viewpoint based on the surrounding state or the running state of the vehicle 1000 and the viewpoint based on the individual driver, and displays the scheduled action candidate on the notification device 1002 in the vehicle 1000.
  • this supports smooth action selection in accordance with the driver's intention, so that the driver can issue instructions for action selection or action change with peace of mind.
  • the candidate for the current behavior of the vehicle is presented to the driver, but in the present eleventh embodiment, the candidate for the scheduled behavior of the vehicle is presented to the driver.
  • the contents already described in the above embodiments are omitted as appropriate.
  • the structure or operation described in this embodiment can be combined with or replaced with the structure or operation described in another embodiment or modification without departing from the spirit of the present invention.
  • the storage unit 1042 of the driving support device 1040 has the configuration shown in FIG. 61 described in the tenth embodiment, and the control unit 1041 of the driving support device 1040 has the configuration shown in FIG. 62 described in the tenth embodiment.
  • the first statistical information stored in the first statistical information storage unit 1072 and the second statistical information stored in the second statistical information storage unit 1073 are both statistical information indicating the relationship between the vehicle's surrounding situation and running state and the behavior of the vehicle (FIG. 43). Specifically, as in the seventh embodiment, the information includes a plurality of records in which the values of a plurality of types of environmental parameters indicating the surrounding situation and running state of the vehicle are associated with an action (or action result) to be executed by the vehicle at a future point in time. In other words, it is information obtained by accumulating, in association with parameter values indicating the current environmental state, the actions executed at a future point (after a predetermined time) with respect to that state. It thus differs from the tenth embodiment in the actions it defines.
  • the first statistical information and the second statistical information have different statistical ranges, and specifically, the first statistical information has a wider statistical range than the second statistical information.
  • the first statistical information is a record of operation results and action results in various environmental states for a group of a plurality of people and a large number of vehicles.
  • the history of operation results and behavior results in various environmental states may be modeled as operation patterns and behavior patterns by a known statistical method.
  • the second statistical information is a record of an operation result or an action result in an environmental state so far for an individual driver and a vehicle 1000 alone.
  • a driver's individual operation results and a history of behavior results of the vehicle 1000 alone may be modeled as operation patterns and behavior patterns by a known statistical method.
  • both the first statistical information and the second statistical information are stored locally in the vehicle 1000, but at least one of the first statistical information and the second statistical information is a device outside the vehicle, for example, on a cloud It may be stored in a database or the like.
  • the second statistical information may be sequentially accumulated locally in the vehicle 1000 because the range of the statistics is for the individual driver or the vehicle 1000 alone.
  • the first statistical information may be subjected to processing such as aggregation / statistics / accumulation on a server on the cloud.
  • the driving support device 1040 may access remote statistical information via the communication IF 1056 and the wireless device 1008.
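The retrieval of action candidates from such statistical information can be sketched as follows. The record format, the parameter names (`speed`, `gap`), and the Euclidean distance metric are assumptions for illustration; the embodiment only states that actions associated with environmental parameter values approximating the detection information are selected.

```python
import math

# Hypothetical records in the style of FIG. 43: environmental parameter
# values paired with the action (or action result) executed for that state.
RECORDS = [
    ({"speed": 60.0, "gap": 20.0}, "decelerate"),
    ({"speed": 60.0, "gap": 80.0}, "lane keeping"),
    ({"speed": 40.0, "gap": 15.0}, "accelerate"),
]

def candidates(detection, records, k=2):
    """Return up to k distinct actions whose recorded parameters are
    closest to the current detection information (nearest first)."""
    def distance(params):
        return math.sqrt(sum((params[key] - detection[key]) ** 2 for key in detection))

    seen, result = set(), []
    for params, action in sorted(records, key=lambda record: distance(record[0])):
        if action not in seen:  # keep each distinct action once
            seen.add(action)
            result.append(action)
    return result[:k]
```

The same retrieval applies to both kinds of statistical information; only the statistical range (many drivers and vehicles versus the individual driver and the vehicle 1000 alone) differs.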
  • the detection information input unit 1052 of the driving support device 1040 acquires, from the detection unit 1020, detection information indicating the detection result of the surrounding situation and the running state of the vehicle 1000. Based on the detection information output from the detection unit 1020 and the first statistical information stored in the first statistical information storage unit 1072, the first determination unit 1080 determines one or more first actions that can be actions scheduled to be executed by the vehicle 1000.
  • the situation adaptation type determination unit 1081 of the first determination unit 1080 determines, as the first actions, one or more action candidates that can be actions scheduled to be executed by the vehicle 1000 (hereinafter, “situation adaptation scheduled action candidates”). Specifically, as in the seventh embodiment, among the actions defined in the first statistical information, actions associated with environmental parameter values that approximate the detection information are designated as situation adaptation scheduled action candidates (corresponding to the scheduled action candidates of the seventh embodiment).
  • the situation adaptation scheduled action candidate can be said to be a typical operation pattern / behavior pattern executed at a future time with respect to the current surrounding situation or running state.
  • the second determination unit 1082 determines one or more second actions that can be actions scheduled to be executed by the vehicle 1000.
  • the second determination unit 1082 determines, as the second actions, information indicating the priority of each of the one or more situation adaptation scheduled action candidates according to the correlation between the second statistical information and the detection information. In other words, the one or more second actions determined by the second determination unit 1082 indicate the priority of each of the one or more first actions.
  • the personal adaptation determination unit 1083 of the second determination unit 1082 determines one or more action candidates that can be actions scheduled to be executed by the vehicle 1000 (hereinafter, “personal adaptation scheduled action candidates”). Specifically, as in the seventh embodiment, among the actions defined in the second statistical information, actions associated with environmental parameter values that approximate the detection information are designated as personal adaptation scheduled action candidates (corresponding to the scheduled action candidates of the seventh embodiment).
  • the personal adaptation scheduled action candidate can be said to be a driver's future operation pattern with respect to the current surrounding state or running state, and can also be said to be a future action pattern of the vehicle 1000 alone.
  • as in the tenth embodiment, the personal adaptation determination unit 1083 ranks the one or more personal adaptation scheduled action candidates extracted from the second statistical information, and the priority determination unit 1084 determines, based on the ranks determined by the personal adaptation determination unit 1083, the priorities of the one or more situation adaptation scheduled action candidates determined by the situation adaptation determination unit 1081.
  • the image generation unit 1060 generates a first image representing one or more first actions, and generates a second image representing one or more second actions. Specifically, the image generation unit 1060 generates a scheduled action candidate image representing each of one or more situation adaptation scheduled action candidates as a first image. In addition, the image generation unit 1060 generates a priority image representing the priority set by the priority determination unit 1084 for each of the one or more situation adaptation scheduled action candidates as the second image. The image generation unit 1060 may generate a priority image that represents a priority level given to each of one or more situation adaptation scheduled action candidates by a histogram or a number. The histogram may be an image of a graphic object in which a display size corresponding to the priority is set.
  • the image output unit 1051 outputs the scheduled action candidate image and the priority image to the notification device 1002 so that the scheduled action candidate image and the priority image are displayed within a certain field of view of the driver of the vehicle.
  • the notification device 1002 displays an automatic driving information screen including a scheduled action candidate image and a priority image.
  • FIG. 66 shows an example of the automatic driving information screen.
  • a first scheduled action candidate image 1112a and a second scheduled action candidate image 1112b are displayed as the scheduled action candidate images.
  • the scheduled action candidates (situation adaptation scheduled action candidates) in the figure are action plans that combine a plurality of single actions, but if a single action is determined as a situation adaptation scheduled action candidate, a scheduled action candidate image representing that single action is displayed.
  • the priority of each scheduled action candidate is shown as a histogram. Specifically, the priority of the scheduled action candidate indicated by the first scheduled action candidate image 1112a is indicated by a first priority image 1132a (shaded), and the priority of the scheduled action candidate indicated by the second scheduled action candidate image 1112b is indicated by a second priority image 1132b (shaded). In the figure, a priority image 1132 indicating a relatively high priority is attached to the first scheduled action candidate image 1112a; accordingly, the behavior indicated by the first scheduled action candidate image 1112a has a higher degree of recommendation for the individual driver. Note that, as shown in FIG. 63A, a priority image representing the priority rank of each scheduled action candidate may also be displayed.
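The histogram representation of the priority images can be backed by mapping each priority to a display size that grows with it. The linear scale and pixel values below are assumptions for illustration; the embodiment only states that a display size corresponding to the priority is set on a graphic object.

```python
# Map a priority level to the height of a histogram bar in pixels.
# max_priority is the highest priority level in use; the 40 px maximum
# height is an arbitrary assumption for illustration.
def bar_height(priority, max_priority, max_height_px=40):
    if max_priority <= 0:
        return 0
    return round(max_height_px * priority / max_priority)
```

A candidate with the highest priority thus receives the tallest bar, as with the first priority image 1132a in FIG. 66.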
  • since the processing sequence of the vehicle 1000 in the eleventh embodiment is the same as the processing sequence of FIG. 64 described in the tenth embodiment, its description is omitted. However, the input of action information from the automatic driving control device 1030 to the driving support device 1040, indicated by P43 in FIG. 64, is assumed not to occur in the eleventh embodiment; it is described in a modification below. Also, the current action candidates and current action candidate images described with reference to FIG. 64 are replaced with scheduled action candidates (specifically, situation adaptation scheduled action candidates and personal adaptation scheduled action candidates) and scheduled action candidate images, respectively.
  • FIG. 67 is a flowchart showing an example of processing of the driving support apparatus 1040.
  • the situation adaptive determination unit 1081 determines one or more situation adaptation scheduled action candidates based on the detection information and the first statistical information (S261).
  • the personal adaptation type determination unit 1083 determines one or more personal adaptation scheduled action candidates based on the detection information and the second statistical information, and determines the rank of each candidate.
  • the priority determination unit 1084 determines the priority of one or more situation adaptation scheduled action candidates based on the ranking of one or more individual adaptation scheduled action candidates (S262).
  • the priority determination unit 1084 stores information indicating the situation adaptation scheduled action candidate and the priority in the storage unit 1042.
  • when the situation adaptation scheduled action candidate is updated (Y in S263), the image generation unit 1060 generates a scheduled action candidate image representing the updated situation adaptation scheduled action candidate (S264), and the image output unit 1051 outputs the scheduled action candidate image to the notification device 1002 for display (S265). The image generation unit 1060 generates a priority image representing the priority of each of the one or more situation adaptation scheduled action candidates (S266), and the image output unit 1051 outputs the priority image to the notification device 1002 for display (S267). If the situation adaptation scheduled action candidate is not updated (N in S263) but the priority is updated (Y in S268), the process proceeds to S266; if the priority is not updated (N in S268), the process proceeds to S269.
  • the driving support device 1040 determines the situation adaptation scheduled action candidate every time the detection information is input, and determines the priority of the proposal according to the driver's preference. When at least one of the situation adaptation scheduled action candidate and the priority is updated, the display content of the automatic driving information screen 1103 is also updated.
  • the processing of the driving support device 1040 (image generation unit 1060, determination unit 1062, instruction unit 1063, etc.) related to the selection of the scheduled action candidate by the driver is the same as that in the seventh embodiment.
  • the process shown in the flowchart of FIG. 54 and the user interface shown in the automatic driving information screen 1103 of FIGS. 55A and 55B and FIGS. 56A and 56B are applied to the eleventh embodiment as they are.
  • the driving support apparatus 1040 presents to the driver an automatic driving information screen 1103 on which one or more situation-adapted scheduled action candidates are assigned priorities adapted to the individual driver or to the vehicle 1000 alone. Useful information can thereby be presented to the driver to assist in selecting the scheduled behavior of the vehicle.
  • because the scheduled action candidates displayed on the automatic driving information screen 1103 are extracted from the first statistical information, in which various environmental parameter values and various actions are comprehensively recorded, it becomes easier to present scheduled action candidates better suited to the vehicle's environmental parameter values, which change from moment to moment.
  • the driver's individual preference or operation pattern can be reflected in the presentation by determining the priority of each candidate based on the second statistical information, which strongly reflects the individual driver.
  • a situation adaptation scheduled action candidate is also referred to as a situation-adapted recommendation for the future, and a personal adaptation scheduled action candidate is also referred to as a personal-adapted recommendation for the future.
  • the driving support device 1040 may determine that “the distance from the vehicle approaching from ahead in the oncoming lane is sufficient to secure the time required for the host vehicle to turn right.” The driving support apparatus 1040 may then recommend and present “turn right,” for which a travel control instruction can be given at the estimated timing after the current action “pause,” as a situation adaptation scheduled action candidate whose instruction can be reserved.
  • based on “the tendency of the driving action taken by the driver when there is a vehicle approaching from ahead in the oncoming lane,” the driving support device 1040 may determine a personal adaptation scheduled action candidate for which a travel control instruction can be given at the estimated timing after the current action “pause,” and may further recommend and present that candidate as an option whose instruction can be reserved.
  • the tendency of the driving behavior taken by the driver may be as follows.
  • the driving support device 1040 may simultaneously present each of the situation-adapted scheduled action candidate and the individual-adapted scheduled action candidate determined at the approximate timing to the driver.
  • the behavior information input unit 1054 may acquire behavior information indicating the scheduled behavior determined by the automatic driving control device 1030 from the automatic driving control device 1030.
  • the image generation unit 1060 may further generate a scheduled action image indicating the scheduled action.
  • the image output unit 1051 may further output the scheduled action image to the notification device 1002, and simultaneously display the scheduled action image, the scheduled action candidate image, and the priority image within the driver's fixed visual field.
  • the scheduled behavior determined by the automatic driving control device 1030 may also be a selection target for the driver, that is, a candidate action whose change the driving support device 1040 can instruct to the automatic driving control device 1030.
  • if the situation adaptation scheduled action candidates extracted from the first statistical information include a candidate identical to the scheduled action indicated by the action information, the situation adaptation type determination unit 1081 may exclude that candidate from the situation adaptation scheduled action candidates (for example, from the objects presented to the driver). Likewise, if the personal adaptation scheduled action candidates extracted from the second statistical information include a candidate identical to the scheduled action indicated by the action information, the personal adaptation determination unit 1083 may exclude that candidate from the personal adaptation scheduled action candidates (for example, from the ranking targets). This prevents the same candidate as the scheduled action planned by the automatic driving control device 1030 from being presented to the driver.
  • as in the seventh or ninth embodiment, the time until the scheduled action defined by the statistical information (for example, the first statistical information) is executed may be further displayed on the automatic driving information screen 1103. For example, a remaining time image indicating the remaining time until the scheduled action candidate (for example, the situation adaptation scheduled action candidate) indicated by the image is executed may be added to the scheduled action candidate image.
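The remaining-time image described above can be backed by a simple computation. The function below is a hedged sketch in which the determination time and lead time of the scheduled action candidate are assumed to be known; none of these names appear in the embodiment.

```python
import time

# Remaining time (in seconds) until a scheduled action candidate is executed.
# determined_at: wall-clock time at which the candidate was determined;
# lead_time_s: time from determination until scheduled execution.
def remaining_seconds(determined_at, lead_time_s, now=None):
    now = time.time() if now is None else now
    return max(0.0, determined_at + lead_time_s - now)
```

The returned value can then be rendered, for example, as the shrinking time indicator used in the seventh and ninth embodiments.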
  • when the seat in the vehicle 1000 can be switched between a forward-facing state during manual driving and a face-to-face state in which the front row and the rear row face each other during automatic driving, the driving support device 1040 includes a seat sensor (not shown) that detects the seat state, and if the seat is determined to be in the face-to-face state based on the detection information, all notifications may be made by voice. The notification method may also be switched from display to voice. In addition, the driving support device 1040 may notify the current action by voice and notify the scheduled action by display. In other words, the means (medium) for notifying some of the information among the current action, the current action candidates, the scheduled action, and the scheduled action candidates may differ from the means (medium) for notifying the other kinds of information.
  • for example, the action information determined by the automatic driving control device 1030 may be notified by voice while the action candidates determined by the driving support device 1040 are notified by screen display, or the reverse combination may be used.
  • by switching the notification method according to the state inside the vehicle 1000, including that of the driver and occupants, the receiving side is less likely to find the notification bothersome, and the reliability of information transmission can be improved.
  • the driving support device 1040 may notify the driver by vibration that the notification device 1002 (here, a display device such as a head-up display) should be viewed (in automatic driving, the driver does not always need to watch the notification device 1002), and an explanatory voice may be output from the speaker around the timing at which the driver views the display of the notification device 1002. Humans respond quickly to tactile stimuli (vibration) and to hearing (sound, which is akin to a tactile stimulus as vibration of the eardrum), but react more slowly to vision (display). On the other hand, tactile stimuli and single tones convey little meaning by themselves; speech conveys meaning through hearing, but transmitting that meaning takes time, whereas vision can transmit information that expresses meaning.
  • the configuration of the present modified example uses a combination of such features of each sense.
  • the situations to be notified may arrive not one at a time but overlapping one another.
  • for example, “ ⁇ ” may be displayed, “ ⁇ ” may be added while waiting for a while, “ ⁇ ” may disappear while “ ⁇ ” remains, and then both may disappear to indicate that departure is allowed; this may be transmitted to the driver and passengers by a combination of touch, hearing, and vision.
  • an explanatory voice may be output around the timing at which the driver views the display of the notification device 1002.
  • when the display content of the notification device 1002 is changed or added to, the fact that the situation has changed may be notified by a predetermined notification sound.
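The modality selection discussed in this modification can be sketched as a small decision rule. The function name and the information-kind labels are assumptions, illustrating only the idea that the medium changes with the seat state and the kind of information.

```python
# Choose notification media from the seat state and the kind of information.
# In the face-to-face seat state the display may be outside the driver's
# view, so every notification falls back to voice; otherwise the current
# action is voiced and scheduled actions / candidates use the display.
def choose_media(seat_face_to_face, info_kind):
    if seat_face_to_face:
        return ("voice",)
    if info_kind == "current_action":
        return ("voice",)
    return ("display",)
```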
  • a computer that realizes the above-described functions by a program includes an input device such as a keyboard, mouse, or touch pad; an output device such as a display or speaker; a CPU (Central Processing Unit); storage devices such as a ROM (Read Only Memory), a RAM (Random Access Memory), and a hard disk drive or SSD (Solid State Drive); a reader that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disk Read Only Memory) or USB (Universal Serial Bus) memory; and a network card that communicates via a network; these parts are connected to one another.
  • the reading device reads the program from the recording medium on which the program is recorded, and stores the program in the storage device.
  • alternatively, the network card communicates with a server apparatus connected to the network and stores, in the storage device, a program that realizes the functions of the respective devices and is downloaded from the server apparatus.
  • the CPU copies the program stored in the storage device to the RAM, and sequentially reads out and executes the instructions included in the program from the RAM, thereby realizing the functions of the respective devices.
  • a behavior information input unit that acquires behavior information indicating a first behavior to be executed by the vehicle from an automatic driving control unit that determines the behavior of the vehicle in automatic driving of the vehicle;
  • a detection information input unit that acquires detection information indicating a detection result from a detection unit that detects a surrounding situation and a running state of the vehicle;
  • a candidate determination unit that determines, based on the detection information, a second action that can be executed by the vehicle after the first action indicated by the action information;
  • An image generation unit that generates a first image representing the first behavior indicated by the behavior information and generates a second image representing the second behavior;
  • An image output unit for outputting the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle;
  • a driving support device comprising the above units (Item 1).
  • the driving support device according to Item 1, wherein the candidate determination unit determines the second action based on statistical information and the detection information.
  • a communication interface capable of communicating, via a communication network, with an accumulation unit outside the vehicle that accumulates statistical information indicating the relationship between the surrounding situation and running state of a vehicle and the behavior of the vehicle;
  • the driving support device according to item 1, wherein the candidate determination unit accesses the statistical information in the accumulation unit via the communication interface and determines the second action based on the statistical information and the detection information.
  • the image generation unit generates an additional image representing a time until the second action is executed,
  • the driving support device according to item 1, wherein the image output unit outputs the additional image to the display unit in the vehicle so that the second image is displayed together with the additional image.
  • the driving support device further including a command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the second action.
  • An operation signal input unit for receiving an operation instruction for specifying the behavior of the vehicle;
  • a command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the second action when the operation instruction is not received within a predetermined time during which the first image and the second image are displayed on the display unit of the vehicle;
  • the driving support device according to any one of items 1 to 4, further comprising:
  • an operation signal input unit that receives an operation instruction specifying a behavior of the vehicle, and a command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the second action when an operation instruction selecting the second image is received within a predetermined time during which the first image and the second image are displayed on the display unit of the vehicle;
  • the driving support device according to any one of items 1 to 4, further comprising:
  • a command output unit, wherein, when an operation specifying the second action is input, the image output unit outputs to the display unit an inquiry image for designating execution or reservation of the second action, and the command output unit outputs a command for causing the vehicle to execute the second action to the automatic driving control unit at a first timing when an operation designating execution is input while the inquiry image is displayed;
  • the driving support device according to any one of items 1 to 4, wherein, when an operation designating reservation is input while the inquiry image is displayed, the command is output to the automatic driving control unit at a second timing later than the first timing.
  • An operation input unit that receives an operation instruction that specifies an action of the vehicle to be executed during automatic driving of the vehicle;
  • wherein, when an operation instruction selecting the second image is received within a predetermined time during which the first image and the second image are displayed on the display unit of the vehicle, the image generation unit generates an inquiry image asking the driver of the vehicle whether the second action represented by the selected second image should be executed immediately or after a certain time;
  • a command output unit that, when the operation input unit receives, on the inquiry image, an operation instruction indicating that the second action represented by the second image is to be executed immediately, transmits to the automatic driving control unit a control command for causing the vehicle to immediately execute the second action represented by the second image in place of the first action represented by the first image;
  • the driving support device according to item 11.
  • a command output unit that, when the operation input unit receives, on the inquiry image, an operation instruction indicating that execution of the second action represented by the second image after a predetermined time is selected, transmits to the automatic driving control unit a control command for causing the vehicle to execute the second action represented by the second image after the predetermined time.
  • An automatic driving control unit for determining the behavior of the vehicle in the automatic driving of the vehicle;
  • a detection information input unit that acquires detection information indicating a detection result from a detection unit that detects a surrounding situation and a running state of the vehicle;
  • a candidate determining unit that determines, based on the detection information, a second action that can be executed by the vehicle after the first action that the automatic driving control unit causes the vehicle to execute;
  • An image generation unit for generating a first image representing the first action and generating a second image representing the second action;
  • An image output unit for outputting the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle;
  • a driving control device that determines the behavior of the vehicle in the automatic driving of the vehicle, comprising the above units.
  • An automatic driving control unit for determining the behavior of the vehicle in the automatic driving of the vehicle;
  • a detection information input unit for acquiring detection information indicating a detection result from a detection unit for detecting the surrounding state and the running state of the vehicle;
  • a candidate determining unit that determines, based on the detection information, a second action that can be executed by the vehicle after the first action that the automatic driving control unit causes the vehicle to execute;
  • An image generation unit for generating a first image representing the first action and generating a second image representing the second action;
  • an image output unit that outputs the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle; and a vehicle comprising the above units.
  • According to the present invention, it is possible to present, to the driver during automatic driving of the vehicle, information that is useful or that causes the driver less discomfort.
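The command-output variants described in the items above (execute the second action either when the driver selects it within a predetermined time, or when no operation instruction arrives within that time) can be sketched roughly as follows. This is an illustrative sketch of the selection-window logic, not the patented implementation; the window length, action names, and function signature are all hypothetical.

```python
PREDETERMINED_TIME_S = 5.0  # hypothetical selection window (seconds)

def command_to_output(first_action, second_action, selected_action, elapsed_s):
    """Return the behavior command to send to the automatic driving
    control unit while the first and second images are displayed.

    selected_action: the action the driver selected via the operation
    signal input unit, or None if no instruction was received.
    elapsed_s: seconds elapsed since the two images were displayed.
    """
    if selected_action == second_action and elapsed_s <= PREDETERMINED_TIME_S:
        # Driver explicitly selected the candidate behavior in time
        # (the item-8 variant).
        return second_action
    if selected_action is None and elapsed_s > PREDETERMINED_TIME_S:
        # No operation instruction within the predetermined time:
        # the item-6 variant outputs the second action automatically.
        return second_action
    # Otherwise keep the behavior determined by the automatic
    # driving control unit (the first action).
    return first_action
```

Either branch ends with a single command being sent to the automatic driving control unit, which matches the claims' insistence that the command output unit, not the display, is what changes vehicle behavior.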

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

This driving assistance device acquires movement information indicating a first movement that a vehicle is caused to execute, the information obtained from an automatic driving control device that determines the movement of the vehicle in automatic driving. The driving assistance device acquires detection information, indicating detection results, from a detection unit that detects the status of the surroundings of the vehicle and the traveling state of the vehicle. The driving assistance device determines a second movement that can be executed in place of the first movement on the basis of the detection information. The driving assistance device generates a first image that represents the first movement and generates a second image that represents the second movement. The driving assistance device outputs the first image and the second image to a notification device so that the first image and the second image are displayed within a fixed field of view of the driver of the vehicle.

Description

Driving support method, and driving support device, driving control device, vehicle, and driving support program using the same
 The present invention relates to a vehicle, a driving support method provided in the vehicle, and a driving support device, a driving control device, and a driving support program using the method.
 In recent years, various technologies have been proposed and put into practical use relating to vehicles capable of traveling by manual driving, in which the driver performs driving operations based on the surrounding conditions and running state of the vehicle (for example, the vehicle's speed and control information for the steering, accelerator, brake, direction indicators, and actuators), by automatic driving, in which some or all driving operations are performed automatically, or by fully automatic driving.
 For example, Patent Document 1 discloses a travel control device that, when the host vehicle is under automatic steering control or automatic acceleration/deceleration control, allows the driver to visually recognize the operating state of that control.
Japanese Patent Laid-Open No. 2005-67483
 The present invention relates to a driving support method, and to a driving support device, an automatic driving control device, a vehicle, and a program using the method, for use during fully automatic or partially automatic driving.
 A driving support device according to one aspect of the present invention includes a behavior information input unit, a detection information input unit, a candidate determination unit, an image generation unit, and an image output unit. The behavior information input unit acquires, from an automatic driving control unit that determines the behavior of a vehicle during automatic driving, behavior information indicating a first behavior to be executed by the vehicle. The detection information input unit acquires detection information indicating a detection result from a detection unit that detects the surrounding situation and running state of the vehicle. The candidate determination unit determines, based on the detection information, a second behavior that the vehicle can be caused to execute after the first behavior indicated by the behavior information. The image generation unit generates a first image representing the first behavior indicated by the behavior information and a second image representing the second behavior. The image output unit outputs the first image and the second image to a display unit in the vehicle.
 Another aspect of the present invention is a driving control device. This device includes an automatic driving control unit, a detection information input unit, a candidate determination unit, an image generation unit, and an image output unit. The automatic driving control unit determines the behavior of a vehicle during automatic driving. The detection information input unit acquires detection information indicating a detection result from a detection unit that detects the surrounding situation and running state of the vehicle. The candidate determination unit determines, based on the detection information, a second behavior that the vehicle can be caused to execute after the first behavior that the automatic driving control unit causes the vehicle to execute. The image generation unit generates a first image representing the first behavior and a second image representing the second behavior. The image output unit outputs the first image and the second image to a display unit in the vehicle.
 Still another aspect of the present invention is a vehicle. This vehicle includes an automatic driving control unit, a detection unit, a detection information input unit, a candidate determination unit, an image generation unit, and an image output unit. The automatic driving control unit determines the behavior of the vehicle during automatic driving. The detection unit detects the surrounding situation and running state of the vehicle. The detection information input unit acquires detection information indicating a detection result. The candidate determination unit determines, based on the detection information, a second behavior that the vehicle can be caused to execute after the first behavior that the automatic driving control unit causes the vehicle to execute. The image generation unit generates a first image representing the first behavior and a second image representing the second behavior. The image output unit outputs the first image and the second image to a display unit in the vehicle so that the first image and the second image are displayed within a fixed visual field of the driver of the vehicle.
 Still another aspect of the present invention is a driving support method. In this method, a computer executes a step of acquiring behavior information, a step of acquiring detection information, a determining step, a step of generating images, and a step of outputting to a display unit. The step of acquiring behavior information acquires, from an automatic driving control unit that determines the behavior of a vehicle during automatic driving, behavior information indicating a first behavior to be executed by the vehicle. The step of acquiring detection information acquires detection information indicating a detection result from a detection unit that detects the surrounding situation and running state of the vehicle. The determining step determines, based on the detection information, a second behavior that the vehicle can be caused to execute after the first behavior indicated by the behavior information. The step of generating images generates a first image representing the first behavior indicated by the behavior information and a second image representing the second behavior. The step of outputting to the display unit outputs the first image and the second image to a display unit in the vehicle so that the first image and the second image are displayed within a fixed visual field of the driver of the vehicle.
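The sequence of steps in the driving support method above can be sketched as a minimal pipeline. This is an illustrative sketch only: the detection fields, the candidate-selection rule, and the image stubs are assumptions for the example, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class DetectionInfo:
    # Illustrative fields standing in for "surrounding situation and
    # running state"; the patent does not fix a concrete format.
    speed_kmh: float
    obstacle_ahead: bool

def determine_second_action(first_action: str, info: DetectionInfo) -> str:
    """Candidate-determination step: choose an executable alternative
    behavior from the detection information (toy rule)."""
    if info.obstacle_ahead:
        return "change_lane"
    return "accelerate" if info.speed_kmh < 60.0 else "keep_speed"

def generate_image(action: str) -> str:
    """Image-generation step, stubbed here as a text label."""
    return f"<image:{action}>"

def driving_support_step(first_action: str, info: DetectionInfo):
    """One cycle of the method: take the first behavior decided by the
    automatic driving control unit, determine a second behavior from
    the detection information, render both images, and return what
    would be output to the in-vehicle display unit."""
    second_action = determine_second_action(first_action, info)
    return generate_image(first_action), generate_image(second_action)
```

For example, with an obstacle detected ahead, `driving_support_step("keep_lane", ...)` would pair the planned first behavior with a lane-change candidate, and both labels would be shown together within the driver's fixed field of view.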
 Note that any combination of the above components, and any conversion of the expression of the present invention among a device, a system, a method, a program, a recording medium on which the program is recorded, a vehicle equipped with the device, and the like, are also effective as aspects of the present invention.
 According to the present invention, in fully automatic or partially automatic driving, information can be transmitted appropriately so that comfortable automatic driving is achieved in which the operations of the vehicle and the driver are unlikely to conflict.
本発明の実施の形態1に係る情報報知装置を含む車両の要部構成を示すブロック図である。It is a block diagram which shows the principal part structure of the vehicle containing the information alerting device which concerns on Embodiment 1 of this invention. 実施の形態1に係る走行環境の第1の例について説明する図である。It is a figure explaining the 1st example of the travel environment concerning Embodiment 1. FIG. 実施の形態1に係る走行環境の第1の例に対する報知部の表示について説明する図である。It is a figure explaining the display of the alerting | reporting part with respect to the 1st example of the driving environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第1の例に対する操作部の操作について説明する図である。FIG. 10 is a diagram for explaining the operation of the operation unit for the first example of the traveling environment according to the first embodiment. 実施の形態1に係る報知部における表示の別の例を示す図である。6 is a diagram showing another example of display in the notification unit according to Embodiment 1. FIG. 実施の形態1に係る情報報知処理の処理手順を示すフローチャートである。3 is a flowchart showing a processing procedure of information notification processing according to Embodiment 1; 実施の形態1に係る走行環境の第1の例を示す図である。It is a figure which shows the 1st example of the driving | running | working environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第1の例に対する表示制御を示す図である。It is a figure which shows the display control with respect to the 1st example of the driving environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第1の例を示す図である。It is a figure which shows the 1st example of the driving | running | working environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第1の例に対する別の表示制御を示す図である。It is a figure which shows another display control with respect to the 1st example of the driving environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第2の例を示す図である。It is a figure which shows the 2nd example of the driving environment which concerns on Embodiment 1. FIG. 
実施の形態1に係る走行環境の第2の例に対する表示制御を示す図である。It is a figure which shows the display control with respect to the 2nd example of the driving environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第3の例を示す図である。FIG. 10 is a diagram showing a third example of a travel environment according to the first embodiment. 実施の形態1に係る走行環境の第3の例に対する表示制御を示す図である。It is a figure which shows the display control with respect to the 3rd example of the driving environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第4の例を示す図である。It is a figure which shows the 4th example of the traveling environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第4の例に対する表示制御を示す図である。It is a figure which shows the display control with respect to the 4th example of the traveling environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第5の例を示す図である。It is a figure which shows the 5th example of the running environment which concerns on Embodiment 1. FIG. 実施の形態1に係る走行環境の第5の例に対する表示制御を示す図である。It is a figure which shows the display control with respect to the 5th example of the driving environment which concerns on Embodiment 1. FIG. 図5Aに示した走行環境の第1の例に対する別の表示制御を示す図である。It is a figure which shows another display control with respect to the 1st example of the driving | running | working environment shown to FIG. 5A. 図7Aに示した走行環境の第2の例に対する別の表示制御を示す図である。It is a figure which shows another display control with respect to the 2nd example of the driving | running | working environment shown to FIG. 7A. 図7Aに示した走行環境の第2の例に対する別の表示制御を示す図である。It is a figure which shows another display control with respect to the 2nd example of the driving | running | working environment shown to FIG. 7A. 本発明の実施の形態2に係る情報報知装置を含む車両の要部構成を示すブロック図である。It is a block diagram which shows the principal part structure of the vehicle containing the information alerting device which concerns on Embodiment 2 of this invention. 実施の形態2に係るタッチパネルの表示を説明する図である。10 is a diagram for explaining display on a touch panel according to Embodiment 2. FIG. 
実施の形態2に係るタッチパネルの表示を説明する図である。10 is a diagram for explaining display on a touch panel according to Embodiment 2. FIG. 実施の形態2に係るタッチパネルの表示を説明する図である。10 is a diagram for explaining display on a touch panel according to Embodiment 2. FIG. 本発明の実施の形態3に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 3 of this invention. 実施の形態3に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 3. FIG. 実施の形態3に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 3. FIG. 実施の形態3に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 3. FIG. 本発明の実施の形態4に係る走行履歴の一例を示す図である。It is a figure which shows an example of the travel history which concerns on Embodiment 4 of this invention. 実施の形態4に係るクラスタリング型のドライバモデルの構築方法を示す図である。FIG. 10 is a diagram illustrating a clustering type driver model construction method according to a fourth embodiment. 実施の形態4に係る構築されたクラスタリング型のドライバモデルの一例を示す図である。FIG. 10 is a diagram illustrating an example of a clustered driver model constructed according to a fourth embodiment. 実施の形態4に係る構築されたクラスタリング型のドライバモデルの別の一例を示す図である。FIG. 10 is a diagram illustrating another example of a built clustering driver model according to the fourth embodiment. 実施の形態4に係る個別適応型のドライバモデルの構築方法を示す図である。FIG. 10 is a diagram illustrating a method of constructing an individual adaptive driver model according to a fourth embodiment. 実施の形態4に係る構築された個別適応型のドライバモデルの一例を示す図である。FIG. 10 is a diagram illustrating an example of a constructed individual adaptive driver model according to a fourth embodiment. 実施の形態4に係る運転特性モデルの一例を示す図である。FIG. 10 is a diagram illustrating an example of an operation characteristic model according to a fourth embodiment. 
実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part which concerns on Embodiment 4. FIG. 実施の形態4に係る走行履歴の一例を示す図である。It is a figure which shows an example of the travel history which concerns on Embodiment 4. FIG. 
実施の形態4に係るドライバモデルの変形例におけるドライバモデルの使用方法を示す図である。FIG. 16 is a diagram illustrating a method of using a driver model in a modified example of the driver model according to the fourth embodiment. 実施の形態4に係るドライバモデルの変形例におけるドライバモデルの使用方法を示す図である。FIG. 16 is a diagram illustrating a method of using a driver model in a modified example of the driver model according to the fourth embodiment. 実施の形態4に係るドライバモデルの変形例におけるキャッシュの配置の一例を示すブロック図である。FIG. 20 is a block diagram illustrating an example of cache arrangement in a variation of the driver model according to the fourth embodiment. 実施の形態4に係るドライバモデルの変形例におけるキャッシュの作成方法の一例を示す図である。FIG. 20 is a diagram illustrating an example of a cache creation method in a variation of the driver model according to the fourth embodiment. 実施の形態4に係るドライバモデルの変形例におけるキャッシュの作成方法の一例を示す図である。FIG. 20 is a diagram illustrating an example of a cache creation method in a variation of the driver model according to the fourth embodiment. 実施の形態4に係るドライバモデルの変形例におけるキャッシュの作成方法の一例を示す図である。FIG. 20 is a diagram illustrating an example of a cache creation method in a variation of the driver model according to the fourth embodiment. 本発明の実施の形態5~11に係る車両の構成を示すブロック図である。FIG. 6 is a block diagram showing a configuration of a vehicle according to Embodiments 5 to 11 of the present invention. 図32の車両の室内を模式的に示す図である。It is a figure which shows typically the interior of the vehicle of FIG. 図32の検出部の詳細な構成を示すブロック図である。It is a block diagram which shows the detailed structure of the detection part of FIG. 本発明の実施の形態5に係る自動運転制御装置から入力される行動情報を示す図である。It is a figure which shows the action information input from the automatic driving | operation control apparatus which concerns on Embodiment 5 of this invention. 実施の形態5に係る運転支援装置の制御部の詳細な構成を示すブロック図である。FIG. 10 is a block diagram illustrating a detailed configuration of a control unit of a driving support apparatus according to a fifth embodiment. 
実施の形態5に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 5. FIG. 実施の形態5に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 5. FIG. 実施の形態5に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 5. FIG. 実施の形態5に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 5. FIG. 実施の形態5に係る車両のHMI制御に係る処理の例を示すシーケンス図である。FIG. 10 is a sequence diagram illustrating an example of processing related to HMI control of a vehicle according to a fifth embodiment. 実施の形態5に係る運転支援装置の処理の例を示すフローチャートである。10 is a flowchart illustrating an example of processing of the driving support device according to the fifth embodiment. 本発明の実施の形態6に係る運転支援装置の記憶部の詳細な構成を示すブロック図である。It is a block diagram which shows the detailed structure of the memory | storage part of the driving assistance apparatus which concerns on Embodiment 6 of this invention. 実施の形態6に係る統計情報蓄積部に蓄積される統計情報を模式的に示す図である。It is a figure which shows typically the statistical information accumulate | stored in the statistical information storage part which concerns on Embodiment 6. FIG. 実施の形態6に係る運転支援装置の制御部の詳細な構成を示すブロック図である。FIG. 10 is a block diagram illustrating a detailed configuration of a control unit of a driving assistance apparatus according to a sixth embodiment. 実施の形態6に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 6. FIG. 実施の形態6に係る車両のHMI制御に係る処理の例を示すシーケンス図である。FIG. 16 is a sequence diagram illustrating an example of processing related to HMI control of a vehicle according to a sixth embodiment. 
実施の形態6に係る運転支援装置の処理の例を示すフローチャートである。14 is a flowchart illustrating an example of processing of a driving assistance device according to a sixth embodiment. 実施の形態6に係る運転支援装置の処理の例を示すフローチャートである。14 is a flowchart illustrating an example of processing of a driving assistance device according to a sixth embodiment. 本発明の実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7 of this invention. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 
実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る運転支援装置の処理の例を示すフローチャートである。18 is a flowchart illustrating an example of processing of the driving support device according to the seventh embodiment. 実施の形態7に係る運転支援装置の処理の例を示すフローチャートである。18 is a flowchart illustrating an example of processing of the driving support device according to the seventh embodiment. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 実施の形態7に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 7. FIG. 本発明の実施の形態8に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 8 of this invention. 実施の形態8に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 8. FIG. 
実施の形態8に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 8. FIG. 実施の形態8に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 8. FIG. 実施の形態8に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 8. FIG. 実施の形態8に係る運転支援装置の処理の例を示すフローチャートである。20 is a flowchart illustrating an example of processing of the driving support device according to the eighth embodiment. 本発明の実施の形態9に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 9 of this invention. 実施の形態9に係る運転支援装置の処理の例を示すフローチャートである。25 is a flowchart illustrating an example of processing of the driving support device according to the ninth embodiment. 本発明の実施の形態10に係る運転支援装置の記憶部の詳細な構成を示すブロック図である。It is a block diagram which shows the detailed structure of the memory | storage part of the driving assistance apparatus which concerns on Embodiment 10 of this invention. 実施の形態10に係る運転支援装置の制御部の詳細な構成を示すブロック図である。FIG. 22 is a block diagram illustrating a detailed configuration of a control unit of a driving support apparatus according to Embodiment 10. 実施の形態5に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 5. FIG. 実施の形態5に係る自動運転情報画面の一例を示す図である。It is a figure which shows an example of the automatic driving | operation information screen which concerns on Embodiment 5. FIG. 本発明の実施の形態10に係る車両のHMI制御に係る処理の例を示すシーケンス図である。It is a sequence diagram which shows the example of the process which concerns on the HMI control of the vehicle which concerns on Embodiment 10 of this invention. 
実施の形態10に係る運転支援装置の処理の例を示すフローチャートである。A flowchart showing an example of processing of the driving support device according to Embodiment 10. 本発明の実施の形態11に係る自動運転情報画面の一例を示す図である。A diagram showing an example of the automatic driving information screen according to Embodiment 11 of the present invention. 実施の形態11に係る運転支援装置の処理の例を示すフローチャートである。A flowchart showing an example of processing of the driving support device according to Embodiment 11.
 本発明の実施の形態の説明に先立ち、従来の装置における問題点を簡単に説明する。自動運転(完全自動運転及び一部自動運転の両方を含む)中、車に運転を任せるということで、車と運転者の間の信頼関係が極めて大事であり、車両と運転者(乗員)との間に適切な情報を伝達することが必要となる。特許文献1には、運転者に対して現在の作動状態のみを報知している。 Prior to describing the embodiments of the present invention, problems with conventional devices will be briefly described. During automatic driving (including both fully automatic driving and partially automatic driving), the driver entrusts driving to the vehicle, so a relationship of trust between the vehicle and the driver is extremely important, and appropriate information must be conveyed between the vehicle and the driver (occupant). In Patent Document 1, only the current operating state is reported to the driver.
 自動運転中、車両の現在の挙動(作動状態)を報知されるだけで、これから実施する挙動(例えば、特に合流前、交差点進入の前や緊急車両が近くにいた場合や周囲の他車が何らかの作動をした/しそうなときに、車両が実施しようとする車線変更、加速、減速といった挙動)について何も知らされていない状態だと、運転者が非常に不安感を抱えてしまうという第1の問題があった。 There is a first problem: during automatic driving, if the driver is notified only of the vehicle's current behavior (operating state) and is told nothing about the behavior the vehicle is about to perform (for example, a lane change, acceleration, or deceleration that the vehicle intends to carry out before merging, before entering an intersection, when an emergency vehicle is nearby, or when a surrounding vehicle has taken or is about to take some action), the driver feels very anxious.
 また、完全自動運転中だと、運転者が運転監視以外の他の行動を取っている可能性が高く、いきなり現在の作動状態のみ表示されても、現在の車両の周囲状況や車両の走行状態も把握できないし、運転者の意思で運転を指示しようとしてもすぐに対応できなく、運転者がスムーズに車へ指示を与えることができないという第2の問題があった。 There is also a second problem: during fully automatic driving, the driver is likely to be engaged in activities other than monitoring the driving, so even if only the current operating state is suddenly displayed, the driver cannot grasp the current situation around the vehicle or its traveling state, cannot respond immediately when trying to issue a driving instruction of his or her own will, and therefore cannot give instructions to the vehicle smoothly.
 また、運転者に現在の作動状態のみを報知しているだけで、運転者が車に対して直接手動運転を行おうとしても、すぐに切り替えられないという第3の問題があった。 There is also a third problem: because only the current operating state is reported to the driver, the vehicle cannot be switched over immediately even if the driver attempts to take direct manual control.
 また、運転者若しくは乗員によって、車が同じ動作を取るとしても、動作のタイミングや操作量は人によって異なり、実際運転者が手動運転する場合の感覚と乖離する可能性が高く、最悪な場合、自動運転中に運転者による不要な操作介入を誘発してしまうことがあるという第4の問題があった。 There is also a fourth problem: even when the vehicle performs the same maneuver, the preferred timing and amount of operation differ from driver to driver (or occupant to occupant), so the maneuver is likely to diverge from the feel the driver would have when actually driving manually; in the worst case, this may induce unnecessary operational intervention by the driver during automatic driving.
 以下、本発明の実施の形態について、図面を参照して詳細に説明する。なお、以下に説明する各実施の形態は一例であり、本発明はこれらの実施の形態により限定されるものではない。 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. Each embodiment described below is an example, and the present invention is not limited to these embodiments.
 (実施の形態1)
 図1は、本発明の実施の形態1に係る情報報知装置を含む車両1の要部構成を示すブロック図である。車両1は、運転者の操作を必要とせずに、運転制御の全てまたは一部を自動で行うことができる車両である。
(Embodiment 1)
FIG. 1 is a block diagram showing a main configuration of a vehicle 1 including an information notification device according to Embodiment 1 of the present invention. The vehicle 1 is a vehicle that can automatically perform all or part of the driving control without requiring the operation of the driver.
 車両1は、ブレーキペダル2と、アクセルペダル3と、ウィンカレバー4と、ステアリングホイール5と、検出部6と、車両制御部7と、記憶部8と、情報報知装置9とを有する。 The vehicle 1 includes a brake pedal 2, an accelerator pedal 3, a winker lever 4, a steering wheel 5, a detection unit 6, a vehicle control unit 7, a storage unit 8, and an information notification device 9.
 ブレーキペダル2は、運転者によるブレーキ操作を受けつけ、車両1を減速させる。またブレーキペダル2は、車両制御部7による制御結果を受けつけ、車両1の減速の度合いに対応した量変化してもよい。アクセルペダル3は、運転者によるアクセル操作を受けつけ、車両1を加速させる。またアクセルペダル3は、車両制御部7による制御結果を受けつけ、車両1の加速の度合いに対応した量変化してもよい。ウィンカレバー4は、運転者によるレバー操作を受けつけ、車両1の図示しない方向指示器を点灯させる。またウィンカレバー4は、車両制御部7による制御結果を受けつけ、車両1の方向指示方向に対応する状態にウィンカレバー4を変化させ、車両1の図示しない方向指示器を点灯させてもよい。 The brake pedal 2 accepts a brake operation by the driver and decelerates the vehicle 1. The brake pedal 2 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of deceleration of the vehicle 1. The accelerator pedal 3 accepts an accelerator operation by the driver and accelerates the vehicle 1. The accelerator pedal 3 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of acceleration of the vehicle 1. The winker lever 4 accepts a lever operation by the driver and turns on a direction indicator (not shown) of the vehicle 1. The winker lever 4 may also receive a control result from the vehicle control unit 7, change to a state corresponding to the direction in which the vehicle 1 indicates a turn, and turn on the direction indicator (not shown) of the vehicle 1.
 ステアリングホイール5は、運転者によるステアリング操作を受けつけ、車両1の走行する方向を変更する。またステアリングホイール5は、車両制御部7による制御結果を受けつけ、車両1の走行する方向の変更に対応した量変化してもよい。ステアリングホイール5は、操作部51を有する。 Steering wheel 5 receives the steering operation by the driver and changes the traveling direction of the vehicle 1. Further, the steering wheel 5 may receive a control result by the vehicle control unit 7 and may change by an amount corresponding to a change in the traveling direction of the vehicle 1. The steering wheel 5 has an operation unit 51.
 操作部51は、ステアリングホイール5の前面(運転者と対向する面)に設けられ、運転者からの入力操作を受け付ける。操作部51は、例えば、ボタン、タッチパネル、グリップセンサ等の装置である。操作部51は、運転者から受けつけた入力操作の情報を車両制御部7へ出力する。 The operation unit 51 is provided on the front surface (surface facing the driver) of the steering wheel 5 and receives an input operation from the driver. The operation unit 51 is a device such as a button, a touch panel, or a grip sensor, for example. The operation unit 51 outputs information on the input operation received from the driver to the vehicle control unit 7.
 検出部6は、車両1の走行状態、及び、車両1の周囲の状況を検出する。そして、検出部6は、検出した走行状態、及び、周囲の状況の情報を車両制御部7へ出力する。 The detection unit 6 detects the traveling state of the vehicle 1 and the situation around the vehicle 1. Then, the detection unit 6 outputs information on the detected traveling state and surrounding conditions to the vehicle control unit 7.
 検出部6は、位置情報取得部61と、センサ62と、速度情報取得部63と、地図情報取得部64とを有する。 The detection unit 6 includes a position information acquisition unit 61, a sensor 62, a speed information acquisition unit 63, and a map information acquisition unit 64.
 位置情報取得部61は、GPS(Global Positioning System)測位等により車両1の位置情報を走行状態の情報として取得する。 The position information acquisition unit 61 acquires the position information of the vehicle 1 as traveling state information by GPS (Global Positioning System) positioning or the like.
 センサ62は、車両1の周囲に存在する他車両の位置および車線位置情報から、他車両の位置および先行車両かどうかという種別、他車両の速度と自車両の速度から衝突予測時間(TTC:Time To Collision)、車両1の周囲に存在する障害物など、車両1の周囲の状況を検出する。 The sensor 62 detects the situation around the vehicle 1: from the positions of other vehicles present around the vehicle 1 and lane position information, it detects the position of each other vehicle and its classification (such as whether it is a preceding vehicle); from the speed of the other vehicle and the speed of the host vehicle, it detects the predicted time to collision (TTC: Time To Collision); and it also detects obstacles present around the vehicle 1.
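The TTC mentioned above can be sketched as a simple closing-speed calculation. This is a minimal illustration assuming a straight-line following situation; the function name, units, and the `None` convention for a non-closing gap are assumptions for illustration, not taken from the patent:

```python
def time_to_collision(gap_m, own_speed_mps, other_speed_mps):
    """Predicted time to collision (TTC): the current gap divided by the
    closing speed. Returns None when the gap is not closing."""
    closing_speed = own_speed_mps - other_speed_mps
    if closing_speed <= 0:
        return None  # the other vehicle is pulling away or holding the gap
    return gap_m / closing_speed
```

For example, a 50 m gap closed at 5 m/s gives a TTC of 10 s; a real sensor 62 would derive the gap and speeds by fusing radar, laser radar, or camera measurements.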
 速度情報取得部63は、走行状態の情報として、図示しない速度センサ等から車両1の速度或いは走行方向などの情報を取得する。 The speed information acquisition unit 63 acquires information such as the speed or the traveling direction of the vehicle 1 from a speed sensor or the like (not shown) as the traveling state information.
 地図情報取得部64は、車両1が走行する道路、道路における他車両との合流ポイント、現在走行中の車線、交差点の位置などの車両1の周辺の地図情報を、車両1の周囲の状況の情報として取得する。 The map information acquisition unit 64 acquires map information around the vehicle 1, such as the road on which the vehicle 1 is traveling, merging points with other vehicles on that road, the lane in which the vehicle is currently traveling, and the positions of intersections, as information on the situation around the vehicle 1.
 なお、センサ62は、ミリ波レーダ、レーザレーダ或いはカメラなど、またそれらの組合せから構成される。 The sensor 62 is constituted by a millimeter wave radar, a laser radar, a camera, or a combination thereof.
 記憶部8は、ROM(Read Only Memory)、RAM(Random Access Memory)、ハードディスク装置或いはSSD(Solid State Drive)などの記憶装置であり、現時点の走行環境と、次に(第1の所定時間経過後に)とり得る挙動の候補との間の対応関係を記憶する。 The storage unit 8 is a storage device such as a ROM (Read Only Memory), a RAM (Random Access Memory), a hard disk drive, or an SSD (Solid State Drive), and stores correspondences between the current driving environment and candidates for the behavior that may be taken next (after a first predetermined time has elapsed).
 現時点の走行環境とは、車両1の位置、車両1が走行している道路、車両1の周囲に存在する他車両の位置および速度等によって判定される環境である。なお、瞬間的なデータのみならず、その時点の前後のデータまでを基に、例えば、他車両の位置或いは速度により加速中、減速中、他車両が割込んできて1秒後には衝突する可能性まで判定してもよい。これにより、他車両の行動を予測することができ、走行環境をより詳細かつ正確に把握することが可能である。挙動の候補とは、現時点の走行環境に対して、車両1が次に(第1の所定時間経過後に)とり得る挙動の候補である。 The current driving environment is an environment determined from the position of the vehicle 1, the road on which the vehicle 1 is traveling, the positions and speeds of other vehicles present around the vehicle 1, and the like. Note that the determination may be based not only on instantaneous data but also on data before and after that point in time; for example, from the position or speed of another vehicle, it may be determined that the other vehicle is accelerating or decelerating, or even that it is cutting in and will collide one second later. This makes it possible to predict the behavior of other vehicles and to grasp the driving environment in more detail and more accurately. A behavior candidate is a candidate for the behavior that the vehicle 1 may take next (after the first predetermined time has elapsed) in the current driving environment.
 例えば、記憶部8は、車両1が走行する車線の前方に合流路があり、車線の左方から合流する車両が存在し、かつ、車両1が走行する車線の右方への車線変更が可能な走行環境に対応付けて、車両1の加速、車両1の減速、及び、車両1の右方への車線変更の3通りの挙動の候補を予め記憶する。 For example, in association with a driving environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, a vehicle is merging from the left of that lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible, the storage unit 8 stores in advance three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right.
 また、記憶部8は、車両1と同一車線の前方を走行する車両(以下、「先行車両」と記載)が車両1よりも遅い速度で走行し、かつ、隣の車線への車線変更が可能な走行環境に対応付けて、先行車両を追い越す走行、隣の車線へ車線変更を行う走行、車両1を減速させて先行車両に追従する走行の3通りの挙動の候補を予め記憶する。 In addition, in association with a driving environment in which a vehicle traveling ahead in the same lane as the vehicle 1 (hereinafter referred to as a "preceding vehicle") is traveling at a slower speed than the vehicle 1 and a lane change to the adjacent lane is possible, the storage unit 8 stores in advance three behavior candidates: overtaking the preceding vehicle, changing lanes to the adjacent lane, and decelerating the vehicle 1 to follow the preceding vehicle.
 さらに、記憶部8は、それぞれの挙動の候補に対する優先順位を記憶してもよい。例えば、記憶部8は、過去の同一の走行環境において実際に採用された挙動の回数を記憶し、採用された回数の多い挙動ほど高く設定された優先順位を記憶してもよい。 Furthermore, the storage unit 8 may store priorities for the respective behavior candidates. For example, the storage unit 8 may store the number of behaviors actually adopted in the same driving environment in the past, and may store the priority set higher for the behaviors that are adopted more frequently.
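The correspondence table and adoption-count priorities that the storage unit 8 is described as holding can be sketched as follows. This is a minimal illustration; the environment keys, behavior names, and counts are all assumptions, not values from the patent:

```python
# Behavior candidates stored per driving environment (illustrative keys).
CANDIDATES = {
    "merge_ahead_right_lane_open": ["accelerate", "decelerate", "change_lane_right"],
    "slow_preceding_vehicle_adjacent_lane_open": ["overtake", "change_lane", "follow_decelerated"],
}

# Number of times each behavior was actually adopted in the same past
# environment; a higher count means a higher priority (example values).
ADOPTION_COUNTS = {
    ("merge_ahead_right_lane_open", "accelerate"): 3,
    ("merge_ahead_right_lane_open", "decelerate"): 5,
    ("merge_ahead_right_lane_open", "change_lane_right"): 12,
}

def candidates_by_priority(environment):
    """Return the stored behavior candidates, most frequently adopted first."""
    candidates = CANDIDATES.get(environment, [])
    return sorted(candidates,
                  key=lambda b: ADOPTION_COUNTS.get((environment, b), 0),
                  reverse=True)
```

With the example counts above, `candidates_by_priority("merge_ahead_right_lane_open")` ranks the rightward lane change first, matching the idea that more frequently adopted behaviors receive higher priority.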
 車両制御部7は、例えば、LSI(Large Scale Integration)回路、または、車両を制御する電子制御ユニット(Electronic Control Unit:ECU)の一部として実現可能である。車両制御部7は、検出部6から取得する走行状態および周囲の状況の情報に基づいて、車両を制御し、車両制御結果に対応してブレーキペダル2、アクセルペダル3、ウィンカレバー4、情報報知装置9を制御する。なお、車両制御部7が制御する対象は、これらに限定されない。 The vehicle control unit 7 can be realized, for example, as an LSI (Large Scale Integration) circuit or as part of an electronic control unit (ECU: Electronic Control Unit) that controls the vehicle. The vehicle control unit 7 controls the vehicle based on the traveling state and surrounding situation information acquired from the detection unit 6, and controls the brake pedal 2, the accelerator pedal 3, the winker lever 4, and the information notification device 9 in accordance with the vehicle control result. Note that the targets controlled by the vehicle control unit 7 are not limited to these.
 まず、車両制御部7は、走行状態および周囲の状況の情報に基づいて、現時点の走行環境を判定する。この判定には、従来提案されている様々な方法が利用され得る。 First, the vehicle control unit 7 determines the current driving environment based on information on the driving state and surrounding conditions. For this determination, various conventionally proposed methods can be used.
 例えば、車両制御部7は、走行状態および周囲の状況の情報に基づいて、現時点の走行環境が、「車両1が走行する車線の前方に合流路があり、車線の左方から合流する車両が存在し、かつ、車両1が走行する車線の右方への車線変更が可能な走行環境」であると判定する。 For example, based on the information on the traveling state and the surrounding situation, the vehicle control unit 7 determines that the current driving environment is "a driving environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, a vehicle is merging from the left of that lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible."
 また、例えば、車両制御部7は、走行状態および周囲の状況の情報に基づいて、走行環境の時系列が、「車両1と同一車線の前方を走行する車両が車両1よりも遅い速度で走行し、かつ、隣の車線への車線変更が可能な走行環境」であると判定する。 Also, for example, based on the information on the traveling state and the surrounding situation, the vehicle control unit 7 determines from the time series of the driving environment that it is "a driving environment in which a vehicle traveling ahead in the same lane as the vehicle 1 is traveling at a slower speed than the vehicle 1 and a lane change to the adjacent lane is possible."
 車両制御部7は、走行状態および周囲の状況を示す走行環境に関する情報を情報報知装置9の報知部92に報知させる。また、車両制御部7は、判定した走行環境に対して、車両1が次に(第1の所定時間経過後に)とり得る挙動の候補を記憶部8から読み出す。 The vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify information related to the traveling environment indicating the traveling state and the surrounding situation. Further, the vehicle control unit 7 reads, from the storage unit 8, behavior candidates that the vehicle 1 can take next (after the first predetermined time has elapsed) with respect to the determined traveling environment.
 車両制御部7は、読み出した挙動の候補から、現在の走行環境に最も適した挙動がどれかを判定し、現在の走行環境に最も適した挙動を第1の挙動に設定する。なお、第1の挙動は車両が現在実施している挙動と同じ挙動、即ち現在実施している挙動を継続することであってもよい。そして、車両制御部7は、現在の走行環境において第1の挙動を除く他に運転者が実施可能な挙動の候補を第2の挙動(いわゆる実施する挙動とは異なる挙動)に設定する。 The vehicle control unit 7 determines, from the read behavior candidates, which behavior is most suitable for the current driving environment, and sets the behavior most suitable for the current driving environment as the first behavior. The first behavior may be the same as the behavior the vehicle is currently performing, that is, continuation of the current behavior. The vehicle control unit 7 then sets, as second behaviors (behaviors different from the behavior to be performed), the candidates other than the first behavior that the driver can have the vehicle perform in the current driving environment.
 例えば、車両制御部7は、走行状態および周囲の状況の情報に基づいて最も適した挙動を判定する従来技術を用いて、最も適した挙動を第1の挙動に設定することとしてもよい。 For example, the vehicle control unit 7 may set the most suitable behavior as the first behavior using a conventional technique that determines the most suitable behavior based on information on the running state and the surrounding situation.
 または、車両制御部7は、複数の挙動の候補のうち、予め設定された挙動を最も適した挙動として設定してもよいし、前回選択された挙動の情報を記憶部8に記憶しておき、その挙動を最も適した挙動と判定してもよいし、過去に各挙動が選択された回数を記憶部8に記憶しておき、回数が最も多い挙動を最も適した挙動と判定してもよい。 Alternatively, the vehicle control unit 7 may set a preset behavior among the plurality of behavior candidates as the most suitable behavior; it may store information on the previously selected behavior in the storage unit 8 and determine that behavior to be the most suitable; or it may store in the storage unit 8 the number of times each behavior has been selected in the past and determine the most frequently selected behavior to be the most suitable.
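One of the selection policies just listed (treating the behavior selected most often in the past as the most suitable, with the remaining candidates becoming the second behaviors) can be sketched as follows; the function and behavior names are illustrative assumptions:

```python
from collections import Counter

def first_and_second(candidates, past_selections):
    """Pick the first behavior as the candidate selected most often in the
    past; the remaining candidates become the second behaviors."""
    counts = Counter(s for s in past_selections if s in candidates)
    first = max(candidates, key=lambda b: counts[b])  # ties keep candidate order
    second = [b for b in candidates if b != first]
    return first, second
```

A preset-behavior or last-selected policy would replace only the `max(...)` line; the split into one first behavior and the remaining second behaviors stays the same.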
 そして、車両制御部7は、第1の挙動と第2の挙動の情報を情報報知装置9の報知部92に報知させる。なお、車両制御部7は第2の挙動が無いと判定した場合、第1の挙動のみを報知部92に報知させる。 The vehicle control unit 7 then causes the notification unit 92 of the information notification device 9 to report the information on the first behavior and the second behaviors. When the vehicle control unit 7 determines that there is no second behavior, it causes the notification unit 92 to report only the first behavior.
 なお、車両制御部7は、第1の挙動と第2の挙動の情報と、走行状態および周囲の状況の情報とを、同時に、報知部92に報知させてもよい。 Note that the vehicle control unit 7 may cause the notification unit 92 to simultaneously notify the information on the first behavior and the second behavior, and information on the running state and the surrounding situation.
 さらに、車両制御部7は、操作部51が運転者から受けつけた操作の情報を取得する。車両制御部7は、第1の挙動と第2の挙動を報知してから、第2の所定時間内に操作部51が操作を受けつけたか否かを判定する。この操作は、例えば、第2の挙動に含まれる挙動の中から1つの挙動を選択する操作である。 Furthermore, the vehicle control unit 7 acquires information on the operation received by the operation unit 51 from the driver. After notifying the first behavior and the second behavior, the vehicle control unit 7 determines whether or not the operation unit 51 has accepted the operation within the second predetermined time. This operation is, for example, an operation for selecting one behavior from behaviors included in the second behavior.
 車両制御部7は、第2の所定時間内に操作部51が操作を受けつけなかった場合、第1の挙動を実行するように車両を制御し、車両制御結果に対応してブレーキペダル2、アクセルペダル3、ウィンカレバー4を制御する。 When the operation unit 51 does not accept an operation within the second predetermined time, the vehicle control unit 7 controls the vehicle so as to execute the first behavior, and controls the brake pedal 2, the accelerator pedal 3, and the winker lever 4 in accordance with the vehicle control result.
 車両制御部7は、第2の所定時間内に操作部51が操作を受けつけた場合、受けつけた操作に対応する制御を行う。 The vehicle control unit 7 performs control corresponding to the accepted operation when the operation unit 51 accepts the operation within the second predetermined time.
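The accept-or-fall-back logic of the two preceding paragraphs can be sketched as follows. `poll_operation` is a hypothetical stand-in for reading the operation unit 51; a real controller would be event-driven rather than polling, so this is only an illustration of the timeout behavior:

```python
import time

def decide_behavior(first_behavior, second_behaviors, poll_operation, t2_seconds):
    """Wait up to the second predetermined time for the driver to select one
    of the second behaviors; fall back to the first behavior on timeout."""
    deadline = time.monotonic() + t2_seconds
    while time.monotonic() < deadline:
        selected = poll_operation()  # returns a behavior name or None
        if selected in second_behaviors:
            return selected          # execute the driver's selection
        time.sleep(0.001)
    return first_behavior            # no operation accepted in time
```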
 情報報知装置9は、車両制御部7から車両1の走行に関する種々の情報を取得し、取得した情報を報知する。情報報知装置9は、情報取得部91と報知部92とを有する。 The information notification device 9 acquires various information related to the traveling of the vehicle 1 from the vehicle control unit 7 and notifies the acquired information. The information notification device 9 includes an information acquisition unit 91 and a notification unit 92.
 情報取得部91は、車両制御部7から車両1の走行に関する種々の情報を取得する。例えば、情報取得部91は、車両制御部7が車両1の挙動を更新する可能性があると判定した場合に、車両制御部7から第1の挙動の情報と第2の挙動の情報を取得する。 The information acquisition unit 91 acquires various information related to the traveling of the vehicle 1 from the vehicle control unit 7. For example, when the vehicle control unit 7 determines that the behavior of the vehicle 1 may be updated, the information acquisition unit 91 acquires the information on the first behavior and the information on the second behaviors from the vehicle control unit 7.
 そして、情報取得部91は、取得した情報を図示しない記憶部に一時的に記憶し、必要に応じて記憶した情報を記憶部から読み出して報知部92へ出力する。 The information acquisition unit 91 then temporarily stores the acquired information in a storage unit (not shown), reads the stored information from the storage unit as needed, and outputs it to the notification unit 92.
 報知部92は、車両1の走行に関する情報を運転者に報知する。報知部92は、例えば、車内に設置されているカーナビゲーションシステム、ヘッドアップディスプレイ、センターディスプレイ、ステアリングホイール5或いはピラーに設置されているLED(Light Emitting Diode)などの発光体などのような情報を表示する表示部であってもよい。あるいは、情報を音声に変換して運転者に報知するスピーカであってもよいし、運転者が感知できる位置(例えば、運転者の座席、ステアリングホイール5など)に設けられる振動体であってもよい。また、報知部92は、これらの組み合わせであってもよい。 The notification unit 92 notifies the driver of information related to the traveling of the vehicle 1. The notification unit 92 may be, for example, a display unit that displays information, such as a car navigation system installed in the vehicle, a head-up display, a center display, or a light emitter such as an LED (Light Emitting Diode) installed on the steering wheel 5 or on a pillar. Alternatively, it may be a speaker that converts information into sound and notifies the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel 5). The notification unit 92 may also be a combination of these.
 以下の説明では、報知部92が情報を伝達するためのものであり、後述する図32の報知装置1002に相当する。 In the following description, the notification unit 92 transmits information and corresponds to the notification device 1002 of FIG. 32 described later.
 この場合、報知部92とは、例えば、ヘッドアップディスプレイ(Head Up Display:HUD)、LCD(Liquid Crystal Display)、HMD(Head-Mounted DisplayまたはHelmet-Mounted Display)、眼鏡型ディスプレイ(Smart Glasses)、その他の専用のディスプレイなどである。HUDは、例えば、車両1のウインドシールドであってもよいし、別途設けられるガラス面、プラスチック面(例えば、コンバイナ)などであってもよい。また、ウインドシールドは、例えば、フロントガラスであってもよいし、車両1のサイドガラスまたはリアガラスであってもよい。 In this case, the notification unit 92 is, for example, a head-up display (HUD: Head Up Display), an LCD (Liquid Crystal Display), an HMD (Head-Mounted Display or Helmet-Mounted Display), an eyeglass-type display (Smart Glasses), or another dedicated display. The HUD may be, for example, the windshield of the vehicle 1, or a separately provided glass surface or plastic surface (for example, a combiner). The windshield may be, for example, the front windscreen, or a side window or the rear window of the vehicle 1.
 さらに、HUDは、ウインドシールドの表面または内側に設置された透過型ディスプレイであってもよい。ここで、透過型ディスプレイとは、例えば、透過型の有機EL(electro-luminescence)ディスプレイ、または、特定の波長の光を照射した際に発光するガラスを用いた透明なディスプレイである。運転者は、背景を視認すると同時に、透過型ディスプレイ上の表示を視認することができる。このように報知部92は、光を透過する表示媒体であってもよい。いずれの場合も、画像が報知部92に表示される。 Further, the HUD may be a transmissive display installed on the surface or inside of the windshield. Here, the transmissive display is, for example, a transmissive organic EL (electro-luminescence) display or a transparent display using glass that emits light when irradiated with light of a specific wavelength. The driver can view the display on the transmissive display at the same time as viewing the background. Thus, the notification unit 92 may be a display medium that transmits light. In either case, an image is displayed on the notification unit 92.
 報知部92は、車両制御部7から情報取得部91を介して取得した走行に関する情報を運転者に報知する。例えば、報知部92は、車両制御部7から取得した第1の挙動、及び、第2の挙動の情報を運転者に報知する。 The notification unit 92 notifies the driver of information related to travel acquired from the vehicle control unit 7 via the information acquisition unit 91. For example, the notification unit 92 notifies the driver of information on the first behavior and the second behavior acquired from the vehicle control unit 7.
 ここで、具体的な表示内容、及び、操作部51に対してなされる操作について説明する。 Here, specific display contents and operations performed on the operation unit 51 will be described.
 図2A~2Cは、走行環境の第1の例と、それに対する報知部92の表示、及び、操作部51の操作について説明する図である。 2A to 2C are diagrams for explaining a first example of the driving environment, the display of the notification unit 92 and the operation of the operation unit 51 corresponding thereto.
 図2Aは、車両1の走行環境を示す俯瞰図である。具体的には、図2Aは、車両1が走行する車線の前方に合流路があり、車線の左方から合流する車両が存在し、かつ、車両1が走行する車線の右方への車線変更が可能な走行環境であることを示している。 FIG. 2A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 2A shows that there is a merge path ahead of the lane in which the vehicle 1 travels, there is a vehicle that merges from the left side of the lane, and the lane change to the right side of the lane in which the vehicle 1 travels. Indicates a possible driving environment.
 車両制御部7は、走行状態および周囲の状況の情報に基づき、走行環境が、図2Aに示すような走行環境であると判定する。なお、車両制御部7は、図2Aに示す俯瞰図を生成し、第1の挙動、及び、第2の挙動の情報に加えて、生成した俯瞰図を報知部92に報知させてもよい。 The vehicle control unit 7 determines that the traveling environment is a traveling environment as shown in FIG. 2A based on information on the traveling state and the surrounding situation. Note that the vehicle control unit 7 may generate the overhead view shown in FIG. 2A and notify the notification unit 92 of the generated overhead view in addition to the information on the first behavior and the second behavior.
 図2Bは、図2Aに示した走行環境に対する報知部92の表示の一例を示している。報知部92の表示範囲のうち、右側には、車両1の挙動に関わる選択肢が表示され、左側には、手動運転に切り替えるための情報が表示される。 FIG. 2B shows an example of the display of the notification unit 92 for the traveling environment shown in FIG. 2A. Among the display range of the notification unit 92, options on the behavior of the vehicle 1 are displayed on the right side, and information for switching to manual driving is displayed on the left side.
 第1の挙動は、表示領域29a~29c、29gのうち、強調されている表示領域29bに示されている「車線変更」である。第2の挙動は、表示領域29a、29cにそれぞれ示されている「加速」、「減速」である。また、表示領域29gには、手動運転に切替えることを示す「自動運転終了」が表示されている。 The first behavior is “lane change” shown in the highlighted display area 29b among the display areas 29a to 29c, 29g. The second behavior is “acceleration” and “deceleration” shown in the display areas 29a and 29c, respectively. The display area 29g displays “automatic operation end” indicating switching to manual operation.
 図2Cは、ステアリングホイール5に設けられる操作部51の一例を示している。操作部51は、ステアリングホイール5の右側に設けられる操作ボタン51a~51dと、ステアリングホイール5の左側に設けられる操作ボタン51e~51hとを有する。なお、ステアリングホイール5に設けられる操作部51の数や形状等はこれらに限定されない。 FIG. 2C shows an example of the operation unit 51 provided in the steering wheel 5. The operation unit 51 includes operation buttons 51 a to 51 d provided on the right side of the steering wheel 5 and operation buttons 51 e to 51 h provided on the left side of the steering wheel 5. In addition, the number, shape, etc. of the operation part 51 provided in the steering wheel 5 are not limited to these.
 本実施の形態では、図2Bに示す表示領域29a~29cと操作ボタン51a~51cがそれぞれ対応し、表示領域29gと操作ボタン51gとが対応する。 In the present embodiment, display areas 29a to 29c and operation buttons 51a to 51c shown in FIG. 2B correspond to each other, and display area 29g and operation buttons 51g correspond to each other.
 この構成において、運転者は、各表示領域に表示される内容のいずれかを選択する際に、各表示領域に対応する操作ボタンを押下する。例えば、運転者が表示領域29aに表示される「加速」という挙動を選択する場合、運転者は、操作ボタン51aを押下する。 In this configuration, the driver presses an operation button corresponding to each display area when selecting any of the contents displayed in each display area. For example, when the driver selects the behavior “acceleration” displayed in the display area 29a, the driver presses the operation button 51a.
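The pairing between the operation buttons and the display areas described above can be sketched as a simple lookup. The button/area pairing follows the embodiment (51a–51c with 29a–29c, 51g with 29g); the behavior strings are illustrative stand-ins for the contents of FIG. 2B:

```python
# Button 51a-51c pair with display areas 29a-29c; 51g pairs with 29g.
BUTTON_TO_AREA = {"51a": "29a", "51b": "29b", "51c": "29c", "51g": "29g"}
AREA_CONTENT = {"29a": "accelerate", "29b": "change_lane",
                "29c": "decelerate", "29g": "end_automatic_driving"}

def on_button_press(button_id):
    """Return the behavior shown in the display area paired with the pressed
    button, or None for an unmapped button."""
    return AREA_CONTENT.get(BUTTON_TO_AREA.get(button_id))
```

So pressing the operation button 51a selects the "acceleration" behavior shown in the display area 29a, as in the example of the text.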
 なお、図2Bには、各表示領域に文字の情報のみが表示されているが、次に説明するように、車両の駆動に関する記号或いはアイコンを表示してもよい。これにより、運転者に表示内容を一目瞭然に把握できる。 In FIG. 2B, only character information is displayed in each display area, but a symbol or icon relating to driving of the vehicle may be displayed as described below. As a result, the driver can grasp the display contents at a glance.
 図3は、報知部92における表示の別の例を示す図である。図3に示すように、表示領域39a~39c、39gに文字の情報とその情報を示す記号の両方が表示される。なお、記号のみが表示されてもよい。 FIG. 3 is a diagram showing another example of display in the notification unit 92. As shown in FIG. 3, both character information and symbols indicating the information are displayed in the display areas 39a to 39c and 39g. Only symbols may be displayed.
 次に、具体的な走行環境を例に挙げて、表示制御の流れについて説明する。 Next, the flow of display control will be described using a specific driving environment as an example.
 図4は、本実施の形態における情報報知処理の処理手順を示すフローチャートである。図5Aは、走行環境の第1の例と、図5Bは、それに対する表示制御を示す図である。 FIG. 4 is a flowchart showing a processing procedure of information notification processing in the present embodiment. FIG. 5A is a diagram illustrating a first example of a traveling environment, and FIG. 5B is a diagram illustrating display control for the first example.
 図4に示すように、検出部6は、車両の走行状態を検出する(ステップS11)。次に、検出部6は、車両の周囲の状況を検出する(ステップS12)。検出された車両の走行状態、及び、車両の周囲の状況の情報は、検出部6により車両制御部7へ出力される。 As shown in FIG. 4, the detection unit 6 detects the traveling state of the vehicle (step S11). Next, the detection unit 6 detects the situation around the vehicle (step S12). Information on the detected traveling state of the vehicle and the situation around the vehicle is output by the detection unit 6 to the vehicle control unit 7.
 つぎに、車両制御部7は、走行状態および周囲の状況の情報に基づいて、現時点の走行環境を判定する(ステップS13)。図5Aの例の場合、車両制御部7は、現時点の走行環境が、「車両1が走行する車線の前方に合流路があり、車線の左方から合流する車両が存在し、かつ、車両1が走行する車線の右方への車線変更が可能な走行環境」であると判定する。 Next, the vehicle control unit 7 determines the current driving environment based on the information on the traveling state and the surrounding situation (step S13). In the example of FIG. 5A, the vehicle control unit 7 determines that the current driving environment is "a driving environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, a vehicle is merging from the left of that lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible."
 その後、車両制御部7は、判定した走行環境の情報を情報報知装置9の報知部92に報知させる(ステップS14)。図5Bの例の場合、車両制御部7は、判定した走行環境の情報を情報取得部91へ出力する。報知部92は、情報取得部91から走行環境の情報を取得し、文字情報59として表示させる。なお、車両制御部7は、走行環境の情報を報知部92に表示させる代わりに、スピーカ等で音声として走行環境の情報を運転者に報知してもよい。これにより、運転者がディスプレイ或いはモニターを見ていない、もしくは見落としている場合でも、運転者に確実に情報を伝達できる。 Thereafter, the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the determined traveling environment information (step S14). In the case of the example of FIG. 5B, the vehicle control unit 7 outputs information on the determined traveling environment to the information acquisition unit 91. The notification unit 92 acquires travel environment information from the information acquisition unit 91 and displays it as character information 59. In addition, the vehicle control unit 7 may notify the driver of the information on the driving environment as sound through a speaker or the like instead of displaying the information on the driving environment on the notification unit 92. Thus, even when the driver is not looking at or overlooking the display or monitor, information can be reliably transmitted to the driver.
 次に、車両制御部7は、判定した走行環境が挙動の更新の可能性があるとするか否かを判定し、更新する可能性があるとすると判定された場合、さらに第1の挙動、及び、第2の挙動の判定を行う(ステップS15)。走行環境が挙動の更新の可能性があるとするか否かの判定は、走行環境が変更したか否かによって判定される。更新後実施する挙動とは、例えば、他の車両等と衝突が発生する可能性がある場合に減速する、ACC(Adaptive Cruise Control)において先行車両が消えた場合に速度変更する、隣の車線が空いた場合に車線変更するなどが考えられる。更新するか否かを判定するときは従来技術を用いてなされる。 Next, the vehicle control unit 7 determines whether or not the determined driving environment is one in which the behavior may be updated; if it determines that an update is possible, it further determines the first behavior and the second behaviors (step S15). Whether the driving environment is one in which the behavior may be updated is judged by whether the driving environment has changed. Behaviors performed after such an update include, for example, decelerating when a collision with another vehicle or the like may occur, changing speed when the preceding vehicle disappears during ACC (Adaptive Cruise Control), and changing lanes when the adjacent lane becomes clear. Whether or not to update is determined using conventional techniques.
In this case, the vehicle control unit 7 reads from the storage unit 8 the candidate behaviors that the vehicle 1 can take next (after a first predetermined time has elapsed) in the determined traveling environment. The vehicle control unit 7 then determines which of the candidates is most suitable for the current traveling environment and sets the most suitable behavior as the first behavior. It then sets the candidates other than the first behavior as the second behaviors.
In the example of FIG. 5B, the vehicle control unit 7 reads three candidate behaviors from the storage unit 8: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right. Based on the speed of the vehicle merging from the left and the conditions in the lane to the right of the vehicle 1, the vehicle control unit 7 determines that a lane change to the right is the most suitable behavior, and sets that behavior as the first behavior. It then sets the candidates other than the first behavior as the second behaviors.
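The selection of the first and second behaviors in step S15 can be sketched as follows. This is a minimal illustration, not the patent's actual algorithm: the candidate names and the scoring values standing in for the "most suitable behavior" judgment are assumptions.

```python
def select_behaviors(candidates, score):
    """Pick the highest-scoring candidate as the first behavior;
    all remaining candidates become the second behaviors."""
    first = max(candidates, key=score)
    second = [c for c in candidates if c != first]
    return first, second

# Hypothetical suitability scores for the merge scenario of FIG. 5A:
# the right lane is clear, so a lane change to the right scores highest.
scores = {"accelerate": 0.2, "decelerate": 0.5, "change_lane_right": 0.9}
first, second = select_behaviors(list(scores), scores.get)
print(first)   # change_lane_right
print(second)  # the two remaining candidates
```

In the embodiment the scoring role is filled by conventional techniques that judge suitability from the traveling state and surrounding conditions; any monotone ranking of candidates slots into the same structure.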
Next, the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to report the first behavior and the second behaviors (step S16). In the example of FIG. 5B, the notification unit 92 displays the character information "lane change", which is the first behavior information, with emphasis in display area 59b, and displays "accelerate" and "decelerate", which are the second behavior information, in display areas 59a and 59c, respectively.
Next, the vehicle control unit 7 determines whether the operation unit 51 has received an operation from the driver within a second predetermined time (step S17).
For example, the vehicle control unit 7 sets the time from when it determines that the current traveling environment is the one shown in FIG. 5A until the merge point is reached as the first predetermined time. It then sets a second predetermined time, shorter than the first, as the period during which an operation for the next behavior to be executed before the merge point can be accepted.
If the operation unit 51 receives an operation from the driver within the second predetermined time (YES in step S17), the vehicle control unit 7 determines whether the received operation is an operation to end autonomous driving or a behavior selection operation (a so-called update) (step S18).
As described with reference to FIG. 2C, each display area of the notification unit 92 corresponds to an operation button of the operation unit 51. To select the end of autonomous driving in FIG. 5B, the driver presses operation button 51g shown in FIG. 2C. To select a behavior, the driver presses one of operation buttons 51a to 51c shown in FIG. 2C.
If the operation received by the operation unit 51 is an operation to end autonomous driving (that is, if pressing of operation button 51g is detected), the vehicle control unit 7 ends autonomous driving (step S19). If the received operation is a behavior selection operation (that is, one of operation buttons 51a to 51c is pressed), the vehicle control unit 7 controls the vehicle 1 so as to execute the behavior corresponding to the pressed operation button (step S20).
If the operation unit 51 receives no operation from the driver within the second predetermined time (NO in step S17), the vehicle control unit 7 controls the vehicle 1 so as to execute the first behavior (step S21).
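The branching of steps S17 through S21 can be summarized in a small decision function. This is an illustrative sketch: the string tokens standing in for driver operations are assumptions, not identifiers from the embodiment.

```python
def resolve_action(driver_input, first_behavior):
    """Resolve what the vehicle should do when the second predetermined
    time elapses. driver_input is None on timeout, the token
    'end_autonomous' for the end-of-autonomous-driving button, or the
    behavior string the driver selected."""
    if driver_input is None:                  # NO in step S17
        return ("execute", first_behavior)    # step S21: default to first behavior
    if driver_input == "end_autonomous":      # operation button 51g
        return ("end_autonomous", None)       # step S19
    return ("execute", driver_input)          # step S20: driver-selected behavior

print(resolve_action(None, "change_lane_right"))
print(resolve_action("decelerate", "change_lane_right"))
```

The key property is that a timeout is never ambiguous: absent driver input, the first behavior is executed.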
FIG. 6A shows the first example of the traveling environment, and FIG. 6B shows another display control for it. FIG. 6A is the same as FIG. 5A, but the display control in FIG. 6B differs from that in FIG. 5B.
As in the case described with reference to FIG. 5B, the vehicle control unit 7 reads from the storage unit 8, for the traveling environment shown in FIG. 6A, the three candidate behaviors: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right. Here, it is assumed that the storage unit 8 stores the lane change of the vehicle 1 to the right as the highest-priority behavior.
In this case, the vehicle control unit 7 causes the notification unit 92 to report the traveling environment information and the first behavior information. In the case of FIG. 6B, the vehicle control unit 7 generates character information 69 indicating the traveling environment information and the first behavior information, and causes the notification unit 92 to display the character information 69.
The vehicle control unit 7 then causes display areas 69a and 69c to show prompts asking the driver whether to adopt the first behavior, and causes display area 69g to show "end autonomous driving", indicating that switching to manual driving is possible.
Here, the vehicle control unit 7 displays "YES", which corresponds to adopting the first behavior, with emphasis. Which of "YES" and "NO" is emphasized may be predetermined; the previously selected option may be emphasized; or the number of times each option was selected in the past may be stored in the storage unit 8, and the notification unit 92 may emphasize the more frequently selected one.
By learning the behaviors selected in the past in this way, the vehicle control unit 7 can report information to the driver appropriately. The amount displayed by the notification unit 92 can also be reduced compared with the case of FIG. 5B, reducing annoyance to the driver.
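The "emphasize the more frequently chosen option" policy described above can be sketched with a simple selection counter. This is a minimal illustration; the in-memory `Counter` is an assumed stand-in for the persistent storage unit 8.

```python
from collections import Counter

history = Counter()  # persisted in storage unit 8 in the embodiment

def record_choice(option):
    """Record one past selection of the given option."""
    history[option] += 1

def option_to_highlight(options, default="YES"):
    """Emphasize the historically most frequent option; fall back to a
    predetermined default when there is no history yet."""
    if not history:
        return default
    return max(options, key=lambda o: history[o])

record_choice("NO")
record_choice("YES")
record_choice("YES")
print(option_to_highlight(["YES", "NO"]))  # YES
```

Replacing the count-based key with "last selected option" gives the other learning variant mentioned in the text.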
FIG. 7A shows a second example of the traveling environment, and FIG. 7B shows the display control for it. FIG. 7A is an overhead view of the traveling environment. The environment shown in FIG. 7A is similar to FIGS. 5A and 6A in that there is a merge lane ahead, but differs from them in that a vehicle is traveling to the right of the vehicle 1. In such a case, the vehicle control unit 7 determines that a lane change cannot be performed.
When the vehicle control unit 7 determines that the traveling environment of the vehicle 1 is as shown in FIG. 7A, it causes the notification unit 92 to display the determined traveling environment information as character information 79, as shown in FIG. 7B.
Furthermore, of the three candidate behaviors read from the storage unit 8, namely acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right, the vehicle control unit 7 selects only acceleration and deceleration, since the lane change to the right is not possible.
The vehicle control unit 7 also predicts that proceeding at the current speed would bring the vehicle too close to the merging vehicle, and determines that deceleration of the vehicle 1 is the most suitable behavior, that is, the first behavior.
Here, which of the three candidates is the most suitable behavior is determined using conventional techniques that judge the most suitable behavior from information on the traveling state and the surrounding conditions. Alternatively, the most suitable behavior may be predetermined; the previously selected behavior may be stored in the storage unit 8 and judged to be the most suitable; or the number of times each behavior was selected in the past may be stored in the storage unit 8, and the most frequently selected behavior judged to be the most suitable.
Thereafter, the vehicle control unit 7 displays "decelerate" as the first behavior in display area 79c, and "accelerate" as the second behavior in display area 79a. It also displays "end autonomous driving", indicating a switch to manual driving, in display area 79g.
Through such display control, the vehicle control unit 7 can report to the driver, as the first behavior, the behavior most suitable for the current traveling environment.
The first behavior information may be placed at the top and the second behavior information at the bottom, with selection functions assigned to operation buttons 51a and 51c, respectively. Alternatively, acceleration behavior information may be placed at the top, deceleration behavior information at the bottom, right-lane-change behavior information on the right, and left-lane-change behavior information on the left, with selection functions assigned to operation buttons 51a, 51c, 51b, and 51d, respectively; these layouts may be switchable, with a separate indication of whether the behavior-priority layout or the operation-priority layout is in effect. Furthermore, the display size of the first behavior information may be enlarged and that of the second behavior information reduced. Arranging the behavior information displays to correspond to the vehicle's forward/backward and left/right motions enables intuitive recognition and operation by the driver.
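The operation-priority layout just described, where each behavior is assigned to the button matching its spatial direction, can be sketched as a pair of lookup tables. The button identifiers 51a to 51d follow the text; the direction table itself is an illustrative assumption.

```python
# Map each behavior to the direction of the vehicle motion it causes.
DIRECTION = {
    "accelerate": "up",
    "decelerate": "down",
    "change_lane_right": "right",
    "change_lane_left": "left",
}

# Operation-priority layout: up -> 51a, down -> 51c, right -> 51b, left -> 51d.
BUTTON = {"up": "51a", "down": "51c", "right": "51b", "left": "51d"}

def button_for(behavior):
    """Return the operation button assigned to a behavior under the
    operation-priority (direction-matched) layout."""
    return BUTTON[DIRECTION[behavior]]

print(button_for("change_lane_right"))  # 51b
```

The behavior-priority layout mentioned in the text would instead assign the first behavior to a fixed position regardless of direction, which is why the embodiment suggests displaying which layout is currently in effect.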
Next, examples of traveling environments other than one with a merge lane ahead will be described.
FIG. 8A shows a third example of the traveling environment, and FIG. 8B shows the display control for it. FIG. 8A is an overhead view of the traveling environment of the vehicle 1. Specifically, FIG. 8A shows an environment in which the preceding vehicle is traveling slower than the vehicle 1 and a lane change to the adjacent lane is possible.
The vehicle control unit 7 determines, from information on the traveling state and the surrounding conditions, that the traveling environment is the one shown in FIG. 8A. In this case, it causes the notification unit 92 to display the determined traveling environment information as character information 89.
As candidate behaviors corresponding to the determined traveling environment, the vehicle control unit 7 reads three candidates from the storage unit 8: overtaking the preceding vehicle, changing to the adjacent lane, and decelerating the vehicle 1 to follow the preceding vehicle.
The vehicle control unit 7 then determines that decelerating the vehicle 1 to follow the preceding vehicle is the most suitable behavior, that is, the first behavior, because, for example, the speed after deceleration remains above a predetermined value and is therefore acceptable.
Here, which of the three candidates is the most suitable behavior is determined using conventional techniques that judge the most suitable behavior from information on the traveling state and the surrounding conditions. Alternatively, the most suitable behavior may be predetermined; the previously selected behavior may be stored in the storage unit 8 and judged to be the most suitable; or the number of times each behavior was selected in the past may be stored in the storage unit 8, and the most frequently selected behavior judged to be the most suitable.
Furthermore, as shown in FIG. 8B, the vehicle control unit 7 displays the character information "follow", indicating the first behavior, with emphasis in display area 89c, and displays "overtake" and "lane change", indicating the second behaviors, in display areas 89a and 89b, respectively. It also displays "end autonomous driving", indicating a switch to manual driving, in display area 89g.
The first behavior information may be placed at the top and the second behavior information at the bottom, with selection functions assigned to operation buttons 51a and 51c, respectively. Alternatively, overtaking behavior information may be placed at the top, following behavior information at the bottom, right-lane-change behavior information on the right, and left-lane-change behavior information on the left, with selection functions assigned to operation buttons 51a, 51c, 51b, and 51d, respectively; these layouts may be switchable, with a separate indication of whether the behavior-priority layout or the operation-priority layout is in effect. Furthermore, the display size of the first behavior information may be enlarged and that of the second behavior information reduced.
FIG. 9A shows a fourth example of the traveling environment, and FIG. 9B shows the display control for it. FIG. 9A is an overhead view of the traveling environment of the vehicle 1. Specifically, FIG. 9A shows an environment in which the number of lanes decreases ahead of the vehicle 1 in its own lane.
The vehicle control unit 7 determines, from information on the traveling state and the surrounding conditions, that the traveling environment is the one shown in FIG. 9A. In this case, it causes the notification unit 92 to display the determined traveling environment information as character information 99.
As candidate behaviors corresponding to the determined traveling environment, the vehicle control unit 7 reads two candidates from the storage unit 8: changing to the adjacent lane, and staying in the current lane.
The vehicle control unit 7 then determines that changing to the adjacent lane is the most suitable behavior, that is, the first behavior, because, for example, the TTC to the lane-reduction point is shorter than a predetermined value.
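The TTC test above can be sketched directly from its definition: the time remaining until the vehicle reaches the lane-reduction point is the distance to that point divided by the current speed, and a lane change is triggered when this falls below a threshold. The 8-second threshold here is an illustrative assumption, not a value from the embodiment.

```python
def ttc_seconds(distance_m, speed_mps):
    """Time remaining until the vehicle reaches a point ahead,
    at constant speed."""
    if speed_mps <= 0:
        return float("inf")  # not approaching the point
    return distance_m / speed_mps

def should_change_lane(distance_m, speed_mps, threshold_s=8.0):
    """True when the TTC to the lane-reduction point is below the
    (assumed) predetermined threshold."""
    return ttc_seconds(distance_m, speed_mps) < threshold_s

# 150 m to the lane-reduction point at 25 m/s (90 km/h) -> TTC = 6 s
print(should_change_lane(150, 25))  # True
```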
Here, which of the two candidates is the most suitable behavior is determined using conventional techniques that judge the most suitable behavior from information on the traveling state and the surrounding conditions. Alternatively, the most suitable behavior may be predetermined; the previously selected behavior may be stored in the storage unit 8 and judged to be the most suitable; or the number of times each behavior was selected in the past may be stored in the storage unit 8, and the most frequently selected behavior judged to be the most suitable.
Furthermore, as shown in FIG. 9B, the vehicle control unit 7 displays the character information "lane change", indicating the first behavior, with emphasis in display area 99b, and displays "stay as is", indicating the second behavior, in display area 99c. It also displays "end autonomous driving", indicating a switch to manual driving, in display area 99g.
The first behavior information may be placed at the top and the second behavior information at the bottom, with selection functions assigned to operation buttons 51a and 51c, respectively. Alternatively, do-nothing behavior information may be placed at the bottom, right-lane-change behavior information on the right, and left-lane-change behavior information on the left, with selection functions assigned to operation buttons 51c, 51b, and 51d, respectively; these layouts may be switchable, with a separate indication of whether the behavior-priority layout or the operation-priority layout is in effect. Furthermore, the display size of the first behavior information may be enlarged and that of the second behavior information reduced. As shown in FIGS. 7B, 8B, and 9B, assigning different functions to the display areas depending on the traveling environment makes it possible to report information and accept operations within a small area.
In the description above, the vehicle control unit 7 causes the notification unit 92 to report behaviors according to information on the traveling environment and the surrounding conditions, but the present invention is not limited to this. For example, the notification unit 92 may be made to report behaviors when the driver performs a predetermined operation.
FIG. 10A shows a fifth example of the traveling environment, and FIG. 10B shows the display control for it. FIG. 10A is an overhead view of the traveling environment of the vehicle 1. Specifically, FIG. 10A shows an environment in which the vehicle 1 can change lanes to both the left and the right.
Unlike the environments of FIGS. 5A to 9A, the environment shown in FIG. 10A allows normal driving with no need for a lane change or for acceleration or deceleration of the vehicle. In this case, as shown by display 109 in FIG. 10B, the vehicle control unit 7 need not display the traveling environment information as character information on the notification unit 92.
If the driver presses one of the operation buttons of the operation unit 51 in this situation, where no character information is displayed on the notification unit 92, the vehicle control unit 7 reads the candidate behaviors for normal driving from the storage unit 8.
Specifically, the storage unit 8 stores, in association with a normal-driving environment such as that shown in FIG. 10A, four candidate behaviors: acceleration of the vehicle 1, deceleration of the vehicle 1, a lane change of the vehicle 1 to the right, and a lane change of the vehicle 1 to the left. The vehicle control unit 7 reads these and displays them in display areas 109a to 109d of the notification unit 92, respectively.
The vehicle control unit 7 also displays "end autonomous driving", indicating a switch to manual driving, in display area 109g, and displays "cancel", indicating cancellation of the behavior update, with emphasis in display area 109e.
According to the present embodiment described above, candidate behaviors to be executed next can be reported to the driver effectively, and the driver can be allowed to select a preferable behavior.
Note that instead of selecting the behavior to be executed, the driver may directly perform a manual operation such as turning the steering wheel. This allows the driver to switch quickly to manual driving at his or her own will.
In the present embodiment described above, the display on the notification unit 92 has been described as character information, but the present invention is not limited to this. For example, symbols representing the behaviors may be displayed visually to the driver. In the following, displays using symbols shown visually to the driver are described, taking the displays corresponding to FIGS. 5B and 7B as examples.
FIG. 11 shows another display control for the first example of the traveling environment shown in FIG. 5A. In this example, the first behavior described above is a lane change of the vehicle 1 to the right, and the second behaviors are acceleration of the vehicle 1 and deceleration of the vehicle 1.
In this case, a symbol 111 indicating "lane change", the first behavior, is displayed large in the center; a symbol 112 indicating "acceleration of the vehicle 1" and a symbol 113 indicating "deceleration of the vehicle 1", the second behaviors, are displayed small on the right; and a symbol 114 indicating the end of autonomous driving is displayed small on the left.
If no instruction to change the behavior of the vehicle 1 is then received from the driver, the lane change is performed.
FIGS. 12A and 12B show another display control for the second example of the traveling environment shown in FIG. 7A. In this example, unlike the first example, another vehicle is traveling to the right of the vehicle 1, so a lane change is not possible. Therefore, for example, "deceleration of the vehicle 1" is set as the first behavior, and "acceleration of the vehicle 1" as the second behavior.
In this case, as shown in FIG. 12A, a symbol 121 indicating "deceleration of the vehicle 1", the first behavior, is displayed large in the center; a symbol 122 indicating "acceleration of the vehicle 1", the second behavior, is displayed small on the right; and a symbol 123 indicating the end of autonomous driving is displayed small on the left.
Suppose here that the operation unit 51 receives an operation from the driver selecting "acceleration of the vehicle 1". In this case, as shown in FIG. 12B, a symbol 122' indicating "acceleration of the vehicle 1", now the first behavior, is displayed large in the center, and a symbol 121' indicating "deceleration of the vehicle 1", now the second behavior, is displayed small on the right.
According to the present embodiment described above, candidate behaviors to be executed next can be reported to the driver effectively, and the driver can be allowed to select a preferable behavior. Meanwhile, the driver can grasp the behavior the vehicle will execute and the other selectable behaviors, and can continue autonomous driving with a sense of reassurance. Alternatively, the driver can give instructions to the vehicle smoothly.
Further, according to the present embodiment, the options reported by the notification unit, that is, the second behaviors, can be varied according to the traveling environment.
(Embodiment 2)
In Embodiment 1, a configuration was described in which operations corresponding to the display on the notification unit 92 are performed with the operation unit 51 provided on the steering wheel 5. The present embodiment describes a configuration in which a touch panel is provided instead of the operation unit 51 on the steering wheel 5.
FIG. 13 is a block diagram showing the configuration of the main parts of a vehicle 1 including an information notification device according to Embodiment 2 of the present invention. In FIG. 13, components shared with FIG. 1 carry the same reference numerals as in FIG. 1, and their detailed description is omitted. The vehicle 1 shown in FIG. 13 is provided with a touch panel 10 instead of the operation unit 51 of the steering wheel 5.
The touch panel 10 is a device, such as a liquid crystal panel, capable of displaying information and accepting input, and is connected to the vehicle control unit 7. The touch panel 10 has a display unit 101 that displays information under the control of the vehicle control unit 7, and an input unit 102 that accepts operations from the driver or others and outputs the accepted operations to the vehicle control unit 7.
Next, display control of the touch panel 10 will be described. Here, display control is described for the case where the vehicle 1 is traveling in the center of three lanes and can change to either the right or the left lane.
FIGS. 14A to 14C are diagrams illustrating the display of the touch panel 10 in Embodiment 2. FIG. 14A shows the initial display of the display unit 101 of the touch panel 10. When the vehicle control unit 7 determines that the vehicle 1 can change to either the right or the left lane, it causes the display unit 101 of the touch panel 10 to produce the display shown in FIG. 14A. Here, the indication "Touch" in display area 121 shows that the touch panel 10 is in a mode in which it can accept touch operations from the driver.
 運転者は、図14Aに示す表示において、表示領域121をタッチするタッチ操作を行う場合、入力部102は、この操作を受けつけて、この操作が行われたことを示す情報を車両制御部7へ出力する。車両制御部7は、この情報を受けつけると、図14Bに示す表示を表示部101に表示させ、また、図14Cに示す表示を報知部92に表示させる。 When the driver performs a touch operation of touching the display area 121 in the display shown in FIG. 14A, the input unit 102 receives this operation and sends information indicating that this operation has been performed to the vehicle control unit 7. Output. Upon receipt of this information, the vehicle control unit 7 displays the display shown in FIG. 14B on the display unit 101 and displays the display shown in FIG. 14C on the notification unit 92.
 図14Bには、車両1へ移動を指示する操作であることを示す「Move」と表示された表示領域121aが示されている。また、図14Bには、車両1が3車線のそれぞれを走行可能であることを示す表示領域121b~121dが示されている。なお、表示領域121b~121dは、それぞれ、図14Cに矢印X、Y、Zで示される車線での走行と対応する。 FIG. 14B shows a display area 121a on which “Move” indicating an operation for instructing the vehicle 1 to move is displayed. FIG. 14B also shows display areas 121b to 121d indicating that the vehicle 1 can travel in each of the three lanes. The display areas 121b to 121d correspond to traveling in the lane indicated by arrows X, Y, and Z in FIG. 14C, respectively.
 また、図14Bの各表示領域と、図14Cの各矢印とは、それぞれ、態様(例えば、色或いは配置など)を一致させる。これにより、運転者により理解しやすい表示となる。 Further, each display area in FIG. 14B and each arrow in FIG. 14C have the same mode (for example, color or arrangement). As a result, the display is easier to understand for the driver.
 さらに、矢印X、Y、Zで示される車線の太さなどを変えて、車両制御が判定した車両が実施する挙動と他に運転者が選択可能な挙動が区別できるように表示してもよい。 Furthermore, the thickness or other attributes of the lanes indicated by the arrows X, Y, and Z may be varied so that the behavior the vehicle control has determined the vehicle will perform can be distinguished from the other behaviors selectable by the driver.
 運転者は、表示領域121b~121dのうち、走行したい車線に対応する表示領域に触れることによって、車両1の挙動の選択を行う。この場合、入力部102は、運転者の挙動の選択操作を受けつけて、選択された挙動の情報を車両制御部7へ出力する。そして、車両制御部7は、選択された挙動を実行するよう車両1を制御する。これにより、運転者が走行したい車線を車両1が走行することになる。 The driver selects the behavior of the vehicle 1 by touching the display area corresponding to the lane to be traveled among the display areas 121b to 121d. In this case, the input unit 102 accepts a driver's behavior selection operation and outputs information on the selected behavior to the vehicle control unit 7. Then, the vehicle control unit 7 controls the vehicle 1 to execute the selected behavior. As a result, the vehicle 1 travels in the lane that the driver wants to travel.
 なお、運転者は、タッチパネル10に対して、タッチ操作の代わりに、スワイプ操作を行ってもよい。例えば、図14Cに示す例において、運転者が図14Cの矢印Xで示される車線への変更を行いたい場合、運転者は、タッチパネル10において右方へのスワイプ操作を行う。 Note that the driver may perform a swipe operation on the touch panel 10 instead of the touch operation. For example, in the example shown in FIG. 14C, when the driver wants to change to the lane indicated by the arrow X in FIG. 14C, the driver performs a swipe operation to the right on the touch panel 10.
 この場合、入力部102は、スワイプ操作を受けつけ、スワイプ操作の内容を示す情報を車両制御部7へ出力する。そして、車両制御部7は、選択された挙動である矢印Xで示される車線への車線変更を実行するよう車両1を制御する。 In this case, the input unit 102 receives the swipe operation and outputs information indicating the content of the swipe operation to the vehicle control unit 7. Then, the vehicle control unit 7 controls the vehicle 1 so as to perform a lane change to the lane indicated by the arrow X, which is the selected behavior.
 さらに、車両1へ移動を指示する操作であることを示す「Move」と表示された表示領域121aが示されるときに、音声で「挙動選択」などと発話してもよい。これにより、手元のタッチパネルを見ることなく、HUDの表示のみで操作が可能となる。 Furthermore, when the display area 121a displaying "Move", which indicates an operation instructing the vehicle 1 to move, is shown, an utterance such as "behavior selection" may be made by voice. This makes it possible to operate using only the HUD display, without looking at the touch panel at hand.
 また、タッチ操作或いはスワイプ操作の際に、選択したタッチパネルの表示領域に対応する車線の表示態様を変更し、どの車線を選択しようとしているのか選択前に確認できるようにしてもよい。例えば、表示領域121bをタッチした瞬間に、車線Xの太さが拡大し、すぐに手を離せば車線Xは選択されず車線Xの太さが元の大きさに戻り、表示領域121cにタッチを移動した瞬間に、車線Yの太さが拡大し、しばらくその状態を保持すると、車線Yが選択され、車線Yが点滅することで決定されたことを伝えても良い。これにより、手元を目視せずに選択或いは決定の操作ができる。 Further, during a touch operation or a swipe operation, the display mode of the lane corresponding to the selected display area of the touch panel may be changed so that the driver can confirm which lane is about to be selected before the selection is made. For example, at the moment the display area 121b is touched, the lane X is thickened, and if the finger is released immediately, the lane X is not selected and returns to its original thickness; at the moment the touch moves to the display area 121c, the lane Y is thickened, and if that state is held for a while, the lane Y is selected and blinks to indicate that the selection has been decided. This allows the selection or decision operation to be performed without looking at the hand.
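 The touch-and-hold selection described above (a quick release only previews a lane; holding the touch decides it) can be sketched as follows. This is only an illustrative sketch: the hold-time threshold, function names, and data layout are assumptions, not part of the embodiment.

```python
HOLD_TIME_S = 0.5  # assumed threshold for "keeping the state for a while"

def resolve_lane_selection(area_to_lane, touch_events):
    """Return the decided lane, or None if every touch was released quickly.

    area_to_lane: mapping from display area to lane, e.g.
        {"121b": "X", "121c": "Y", "121d": "Z"}
    touch_events: list of (display_area, hold_duration_s) in time order.
    Touching an area would enlarge its lane (a preview); holding the touch
    long enough decides the selection, after which the lane would blink.
    """
    decided = None
    for area, held in touch_events:
        lane = area_to_lane.get(area)
        if lane is not None and held >= HOLD_TIME_S:
            decided = lane  # held long enough: this lane is decided
    return decided
```

For instance, briefly touching the area for lane X and then holding the area for lane Y would decide lane Y.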
 なお、実施の形態1と同様に、加速、減速、追越し、そのままなどの車両制御機能を、走行環境に応じて、表示領域に割り当てても良い。 Note that, as in the first embodiment, vehicle control functions such as acceleration, deceleration, overtaking, and maintaining the current state may be assigned to the display areas according to the driving environment.
 以上説明した本実施の形態によれば、操作部の代わりにタッチパネルを設けることにより、運転者に直感的な操作を行わせることができる。また、タッチパネルは、操作を受けつける表示領域の数、形状、色などを自由に変更させることができるため、ユーザインタフェースの自由度が向上する。 According to the present embodiment described above, by providing a touch panel instead of the operation unit, the driver can perform an intuitive operation. In addition, since the touch panel can freely change the number, shape, color, and the like of display areas for receiving operations, the degree of freedom of the user interface is improved.
 (実施の形態3)
 実施の形態1では、第1の挙動と第2の挙動が同時に表示される場合について説明した。本実施の形態では、まず、報知部92に第1の挙動が表示され、運転者の操作を受けつけた場合に、第2の挙動が表示される構成について説明する。
(Embodiment 3)
In the first embodiment, the case where the first behavior and the second behavior are displayed simultaneously has been described. In the present embodiment, a configuration will be described in which the first behavior is first displayed on the notification unit 92 and the second behavior is displayed when a driver's operation is accepted.
 本実施の形態に係る構成は、実施の形態1で説明した図1の構成において、操作部51に運転者がステアリングホイール5を握ったか否かを検出するグリップセンサがさらに含まれた構成となる。 The configuration according to the present embodiment is the configuration of FIG. 1 described in the first embodiment, with the operation unit 51 further including a grip sensor that detects whether or not the driver has gripped the steering wheel 5.
 図15A~15Dは、本発明の実施の形態3における報知部92の表示を説明する図である。図15A~15Dには、図8Aに示した場合と同様に、車両1と同一車線の前方を走行する車両が車両1よりも遅い速度で走行し、かつ、隣の車線への車線変更が可能な走行環境における表示の例が示されている。 FIGS. 15A to 15D are diagrams for explaining the display of the notification unit 92 according to Embodiment 3 of the present invention. FIGS. 15A to 15D show an example of the display in a driving environment in which, as in the case shown in FIG. 8A, a vehicle traveling ahead in the same lane as the vehicle 1 is traveling at a slower speed than the vehicle 1 and a lane change to the adjacent lane is possible.
 車両制御部7は、走行環境が、図8Aに示した走行環境であると判定すると、まず、報知部92に図15Aに示す表示を実行させる。 When the vehicle control unit 7 determines that the traveling environment is the traveling environment illustrated in FIG. 8A, the vehicle control unit 7 first causes the notification unit 92 to execute the display illustrated in FIG. 15A.
 図15Aには、第1の所定時間が経過した後に実施される挙動の候補のうち、第1の挙動である「追い越し」を示す記号131が第1の態様(例えば、第1の色)で示されている。 FIG. 15A shows, among the candidate behaviors to be implemented after a first predetermined time has elapsed, a symbol 131 indicating "passing", which is the first behavior, in a first mode (for example, a first color).
 車両制御部7は、図15Aに示す表示を報知部92に実行させた後、第2の所定時間が経過した場合、記号131を第1の態様から、第1の態様とは異なる第2の態様(例えば、第1の色とは異なる第2の色)で報知部92に表示させる。ここで、第2の所定時間は、実施の形態1で説明した第2の所定時間と同様のものである。 When a second predetermined time has elapsed after the vehicle control unit 7 caused the notification unit 92 to execute the display shown in FIG. 15A, the vehicle control unit 7 causes the notification unit 92 to display the symbol 131 in a second mode different from the first mode (for example, a second color different from the first color) instead of the first mode. Here, the second predetermined time is the same as the second predetermined time described in the first embodiment.
 つまり、記号131が第1の態様で示されている間、運転者は、第2の挙動の選択が可能であるが、記号131が第2の態様に変更された場合、運転者は、第2の挙動の選択が不可能になる。 That is, while the symbol 131 is shown in the first mode, the driver can select the second behavior, but once the symbol 131 has changed to the second mode, the driver can no longer select the second behavior.
 また、図15Aには、第2の挙動が選択可能であることを示すステアリングホイール形状の記号132が示されている。記号132が表示されている場合に運転者がステアリングホイール5を握ることによって、第2の挙動が表示される。記号132は、第2の挙動が選択可能であることを示す表示であるが、記号131が第1の態様にて表示されることによって、運転者に第2の挙動が選択可能であることを示すこととしてもよい。この場合、記号132は、表示されなくてもよい。 FIG. 15A also shows a steering-wheel-shaped symbol 132 indicating that the second behavior can be selected. When the driver grips the steering wheel 5 while the symbol 132 is displayed, the second behavior is displayed. The symbol 132 is a display indicating that the second behavior is selectable; however, the fact that the second behavior is selectable may instead be indicated to the driver by displaying the symbol 131 in the first mode. In this case, the symbol 132 need not be displayed.
 また、図15Aには、現在、自動運転中であることを示す記号133が示されている。記号133は、自動運転で走行中であることを運転者に示す補助的な表示であるが、記号133は表示されなくてもよい。 FIG. 15A also shows a symbol 133 indicating that automatic driving is currently in progress. The symbol 133 is an auxiliary display indicating to the driver that the vehicle is traveling in automatic driving, but the symbol 133 need not be displayed.
 図15Aの表示に対して運転者がステアリングホイール5を握った場合、グリップセンサがそれを検出し、その検出結果の情報を車両制御部7へ出力する。この場合、車両制御部7は、図15Bに示す表示を報知部92に実行させる。 When the driver grips the steering wheel 5 with respect to the display of FIG. 15A, the grip sensor detects it and outputs information of the detection result to the vehicle control unit 7. In this case, the vehicle control unit 7 causes the notification unit 92 to execute the display illustrated in FIG. 15B.
 図15Bには、図15Aと同様に、第1の挙動である「追い越し」を示す記号131が第1の態様(例えば、第1の色)で示されている。また、第2の挙動である「車線変更」を示す記号134と、第2の挙動である「減速」を示す記号135が示されている。 In FIG. 15B, similarly to FIG. 15A, a symbol 131 indicating “passing” which is the first behavior is shown in the first mode (for example, the first color). Further, a symbol 134 indicating “lane change” as the second behavior and a symbol 135 indicating “deceleration” as the second behavior are shown.
 運転者は、ステアリングホイール5の操作部51を操作することによって、第1の挙動から第2の挙動への変更を行う。例えば、運転者は、操作部51の操作ボタン51a、または、操作ボタン51c(図2C参照)を押下することによって、「車線変更」(記号134)、または、「減速」(記号135)への挙動の更新を行う。 The driver changes from the first behavior to the second behavior by operating the operation unit 51 of the steering wheel 5. For example, the driver updates the behavior to "lane change" (symbol 134) or "deceleration" (symbol 135) by pressing the operation button 51a or the operation button 51c (see FIG. 2C) of the operation unit 51.
 また、図15Bには、車両制御部7が、車両1の挙動を学習中であることを示す記号136が示されている。記号136が表示されている場合、車両制御部7は、運転者が選択した挙動を学習する。記号136は表示されなくても構わない。また、学習は常に行っていても構わない。 FIG. 15B also shows a symbol 136 indicating that the vehicle control unit 7 is learning the behavior of the vehicle 1. While the symbol 136 is displayed, the vehicle control unit 7 learns the behavior selected by the driver. The symbol 136 need not be displayed, and the learning may also be performed at all times.
 つまり、車両制御部7は、運転者が選択した挙動を記憶部8に記憶し、次に同様の走行環境になった場合、記憶した挙動を第1の挙動として、報知部92に表示させる。または、車両制御部7は、過去に各挙動が選択された回数を記憶部8に記憶しておき、回数が最も多い挙動を第1の挙動として、報知部92に表示させてもよい。 That is, the vehicle control unit 7 stores the behavior selected by the driver in the storage unit 8 and, the next time the same driving environment occurs, causes the notification unit 92 to display the stored behavior as the first behavior. Alternatively, the vehicle control unit 7 may store in the storage unit 8 the number of times each behavior has been selected in the past and cause the notification unit 92 to display the most frequently selected behavior as the first behavior.
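 The frequency-based learning described above (tallying the driver's selections per driving environment and presenting the most frequent one as the first behavior) can be illustrated by a small sketch. The class name and data layout are assumptions for illustration only.

```python
from collections import defaultdict

class BehaviorHistory:
    """Per-environment tally of behaviors the driver selected; the most
    frequently selected behavior is returned as the first behavior."""

    def __init__(self):
        # environment -> behavior -> selection count
        self._counts = defaultdict(lambda: defaultdict(int))

    def record(self, environment, behavior):
        self._counts[environment][behavior] += 1

    def first_behavior(self, environment):
        behaviors = self._counts.get(environment)
        if not behaviors:
            return None  # no history for this environment yet
        return max(behaviors, key=behaviors.get)
```

A driver who chose "lane change" more often than "decelerate" when approaching a merge would then be shown "lane change" as the first behavior in that environment.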
 また、図15Bには、自動運転中ではないことを示す記号137が示されている。記号137が表示されている場合、車両制御部7は、第1の所定時間経過後に行う挙動が運転者によって選択されるまで待機する。 FIG. 15B also shows a symbol 137 indicating that automatic driving is not in progress. While the symbol 137 is displayed, the vehicle control unit 7 waits until the driver selects the behavior to be performed after the first predetermined time has elapsed.
 図15Bに示す表示に対して、運転者が操作部51の操作ボタン51aを押下して「車線変更」を選択した場合、車両制御部7は、この選択操作の情報を受けつけ、図15Cに示す表示を報知部92に実行させる。 When the driver presses the operation button 51a of the operation unit 51 and selects "lane change" in response to the display shown in FIG. 15B, the vehicle control unit 7 receives the information of this selection operation and causes the notification unit 92 to execute the display shown in FIG. 15C.
 図15Cには、「車線変更」を示す記号134’が、第1の態様で示されている。車両制御部7は、「車線変更」を選択する選択操作の情報を受けつけた場合、この選択された挙動が次に行う挙動であると判定し、「車線変更」を示す記号134’を第1の態様で報知部92に表示させる。 FIG. 15C shows a symbol 134' indicating "lane change" in the first mode. Upon receiving the information of the selection operation selecting "lane change", the vehicle control unit 7 determines that the selected behavior is the behavior to be performed next, and causes the notification unit 92 to display the symbol 134' indicating "lane change" in the first mode.
 また、図15Cの記号131’は、図15Bにおいて第1の挙動として表示されていた記号131が記号134と入れ替わって表示されたものである。 The symbol 131' in FIG. 15C is the symbol 131 that was displayed as the first behavior in FIG. 15B, now displayed after swapping places with the symbol 134.
 また、図15Cに示す表示に対して、運転者が操作ボタンのいずれかを2度連続して押下した場合、運転者が前に行った選択操作をキャンセルできるようにしてもよい。この場合、車両制御部7は、操作ボタンのいずれかを2度連続して押下する操作の情報を受けつけ、図15Cに示す表示から図15Bに示す表示への変更を報知部92に実行させる。 Further, when the driver presses one of the operation buttons twice in succession with respect to the display shown in FIG. 15C, the selection operation previously performed by the driver may be canceled. In this case, the vehicle control unit 7 receives information on an operation of continuously pressing any one of the operation buttons twice, and causes the notification unit 92 to change the display shown in FIG. 15C to the display shown in FIG. 15B.
 車両制御部7は、図15Aに示す表示を報知部92に実行させてから、第2の所定時間が経過するまでの間に、運転者の操作に基づいて、図15B、図15Cへと報知部92の表示を変化させる。その後、車両制御部7は、図15Aに示す表示を報知部92に実行させてから第2の所定時間が経過した後に、図15Dに示す表示を報知部92に表示させる。 From the time the vehicle control unit 7 causes the notification unit 92 to execute the display shown in FIG. 15A until the second predetermined time elapses, the vehicle control unit 7 changes the display of the notification unit 92 to FIG. 15B and then FIG. 15C based on the driver's operations. Thereafter, after the second predetermined time has elapsed since it caused the notification unit 92 to execute the display shown in FIG. 15A, the vehicle control unit 7 causes the notification unit 92 to show the display illustrated in FIG. 15D.
 なお、車両制御部7は、運転者がステアリングホイール5から手を離したことを示す情報をグリップセンサから取得した場合に、第2の所定時間が経過する前に図15Dに示す表示を報知部92に表示させてもよい。 Note that the vehicle control unit 7 may cause the notification unit 92 to show the display illustrated in FIG. 15D before the second predetermined time elapses when it acquires, from the grip sensor, information indicating that the driver has released the steering wheel 5.
 ここで、図15Dには、次の挙動として、運転者が選択した「車線変更」を示す記号134’が第2の態様で表示され、また、自動運転で走行中であることを示す記号133が、再び、表示された状態が示されている。 Here, FIG. 15D shows a state in which the symbol 134' indicating "lane change" selected by the driver is displayed in the second mode as the next behavior, and the symbol 133 indicating that the vehicle is traveling in automatic driving is displayed again.
 以上説明した本実施の形態によれば、車両制御部7は、運転者が次にとる挙動の更新をしたい場合にのみ、他の挙動の候補を確認できるように、報知部92での表示を変更する。この構成により、運転者が視認する表示を減らすことができ、運転者の煩わしさを低減できる。 According to the present embodiment described above, the vehicle control unit 7 changes the display on the notification unit 92 so that the other behavior candidates can be confirmed only when the driver wants to update the behavior to be taken next. With this configuration, the amount of display the driver has to view is reduced, and the driver's annoyance can be reduced.
 (実施の形態4)
 上述した実施の形態において、車両1が実行しうる複数の挙動の候補のうち最も適した挙動がどれかを判定する方法についていくつか説明した。本実施の形態では、最も適した挙動を判定する方法として、予め学習により構築されたドライバモデルを用いる場合について説明する。
(Embodiment 4)
In the embodiment described above, several methods have been described for determining which of the plurality of behavior candidates that the vehicle 1 can execute is the most suitable behavior. In the present embodiment, as a method for determining the most suitable behavior, a case will be described in which a driver model constructed by learning in advance is used.
 ここで、ドライバモデルの構築方法について説明する。ドライバモデルは、走行環境毎の運転者による操作の傾向を各操作の頻度の情報などに基づいてモデル化したものである。ドライバモデルは、複数の運転者の走行履歴を集約し、集約した走行履歴から構築される。 Here, a method of constructing the driver model will be described. The driver model is obtained by modeling the tendency of operations by a driver in each driving environment based on, for example, information on the frequency of each operation. The driver model is constructed by aggregating the driving histories of a plurality of drivers and building the model from the aggregated driving histories.
 運転者の走行履歴は、例えば、各走行環境に対応する挙動の候補のうち、運転者が実際に選択した挙動の頻度が、挙動の候補毎に集約された履歴である。 The driving history of the driver is, for example, a history in which the behavior frequency actually selected by the driver among the behavior candidates corresponding to each driving environment is aggregated for each behavior candidate.
 図16は、走行履歴の一例を示す図である。図16には、運転者xが「合流路が近づく」という走行環境において、「減速」、「加速」、「車線変更」という挙動の候補を、それぞれ、3回、1回、5回選択したことが示されている。また、図16には、運転者xが「前方に低速車あり」という走行環境において、「追従」、「追い越し」、「車線変更」という挙動の候補を、それぞれ、2回、2回、1回選択したことが示されている。運転者yについても同様である。 FIG. 16 is a diagram showing an example of a driving history. FIG. 16 shows that, in the driving environment "approaching a merging lane", a driver x selected the behavior candidates "decelerate", "accelerate", and "lane change" three times, once, and five times, respectively. FIG. 16 also shows that, in the driving environment "there is a low-speed vehicle ahead", the driver x selected the behavior candidates "follow", "passing", and "lane change" twice, twice, and once, respectively. The same applies to a driver y.
 運転者の走行履歴は、自動運転中に選択した挙動を集約してもよいし、運転者が手動運転中に実際に行った挙動を集約してもよい。これにより、自動運転或いは手動運転といった運転状態に応じた走行履歴の収集ができる。 The driving history of the driver may aggregate behaviors selected during automatic driving or may aggregate behaviors actually performed by the driver during manual driving. This makes it possible to collect travel histories according to operating conditions such as automatic driving or manual driving.
 ドライバモデルには、複数の運転者の走行履歴をクラスタリングして構築するクラスタリング型と、特定の運転者(例えば、運転者x)の走行履歴と類似する複数の走行履歴から運転者xのドライバモデルを構築する個別適応型とがある。 Driver models include a clustering type, which is constructed by clustering the driving histories of a plurality of drivers, and an individually adaptive type, in which a driver model for a specific driver (for example, a driver x) is constructed from a plurality of driving histories similar to the driving history of the driver x.
 まず、クラスタリング型について説明する。クラスタリング型のドライバモデルの構築方法は、図16に示したような複数の運転者の走行履歴を予め集約する。そして、互いの走行履歴の類似度が高い複数の運転者、つまり、類似した運転操作傾向を有する複数の運転者をグループ化してドライバモデルを構築する。 First, the clustering type will be described. The clustering type driver model construction method aggregates the driving histories of a plurality of drivers as shown in FIG. Then, a driver model is constructed by grouping a plurality of drivers having a high degree of similarity between the traveling histories, that is, a plurality of drivers having similar driving operation tendencies.
 図17は、クラスタリング型のドライバモデルの構築方法を示す図である。図17には、運転者a~fの走行履歴が表形式で示されている。そして、運転者a~fの走行履歴から、モデルAが運転者a~cの走行履歴から構築され、モデルBが運転者d~fの走行履歴から構築されることが示されている。 FIG. 17 is a diagram illustrating a clustering type driver model construction method. FIG. 17 shows the travel histories of the drivers a to f in a table format. From the driving histories of the drivers a to f, it is shown that the model A is constructed from the traveling histories of the drivers a to c, and the model B is constructed from the traveling histories of the drivers d to f.
 走行履歴の類似度は、例えば、運転者aと運転者bの走行履歴における各頻度(各数値)を頻度分布として扱い、互いの頻度分布の相関値を算出し、算出した相関値を類似度としてもよい。この場合、例えば、運転者aと運転者bの走行履歴から算出した相関値が所定値よりも高い場合に、運転者aと運転者bの走行履歴を1つのグループとする。 The similarity between driving histories may be determined, for example, by treating the frequencies (numerical values) in the driving histories of the driver a and the driver b as frequency distributions, calculating the correlation value between the two frequency distributions, and using the calculated correlation value as the similarity. In this case, for example, when the correlation value calculated from the driving histories of the driver a and the driver b is higher than a predetermined value, the driving histories of the driver a and the driver b are put into one group.
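 The correlation-based similarity described above can be sketched, for example, as a Pearson correlation over the frequency vectors of two driving histories; the threshold value and the function names are assumptions for illustration.

```python
import math

def similarity(freqs_a, freqs_b):
    """Pearson correlation between two drivers' behavior-frequency vectors.

    freqs_a / freqs_b: selection counts listed in the same
    (environment, behavior) order, as in the table of FIG. 16.
    """
    n = len(freqs_a)
    mean_a = sum(freqs_a) / n
    mean_b = sum(freqs_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(freqs_a, freqs_b))
    norm_a = math.sqrt(sum((a - mean_a) ** 2 for a in freqs_a))
    norm_b = math.sqrt(sum((b - mean_b) ** 2 for b in freqs_b))
    if norm_a == 0 or norm_b == 0:
        return 0.0  # a constant vector has no defined correlation
    return cov / (norm_a * norm_b)

def same_group(freqs_a, freqs_b, threshold=0.8):
    # two drivers are grouped together when the correlation
    # exceeds a predetermined value (threshold is assumed here)
    return similarity(freqs_a, freqs_b) > threshold
```

Identical selection patterns yield a correlation of 1.0, opposite patterns yield -1.0, and the grouping decision reduces to a threshold comparison.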
 なお、類似度の算出については、これに限定されない。例えば、運転者aと運転者bの各走行履歴において、最も頻度の高い挙動が一致する数に基づいて、類似度を算出してもよい。 Note that the calculation of similarity is not limited to this. For example, the degree of similarity may be calculated based on the number of the most frequently matched behaviors in the driving histories of the driver a and the driver b.
 そして、クラスタリング型のドライバモデルは、例えば、各グループ内の運転者の走行履歴において、それぞれの頻度の平均を算出することによって構築される。 The clustering type driver model is constructed by, for example, calculating the average of each frequency in the driving history of drivers in each group.
 図18は、構築されたクラスタリング型のドライバモデルの一例を示す図である。図17で示した各グループ内の運転者の走行履歴において、それぞれの頻度の平均を算出することによって、各グループの走行履歴の平均頻度を導出する。このように、クラスタリング型のドライバモデルは、走行環境毎に定められた挙動に対する平均頻度で構築される。 FIG. 18 is a diagram showing an example of a constructed clustering-type driver model. The average frequency of the driving history of each group is derived by calculating the average of the respective frequencies in the driving histories of the drivers in each group shown in FIG. 17. In this way, the clustering-type driver model is constructed with the average frequencies of the behaviors determined for each driving environment.
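 The averaging step that turns a group of driving histories into a clustering-type model might be written, for instance, as follows; the data layout (a dict per driver keyed by environment and behavior) is an assumption for illustration.

```python
def build_cluster_model(histories):
    """Average the behavior frequencies over the drivers of one group.

    histories: one dict per driver in the group, mapping
    (environment, behavior) -> selection count.
    Returns a dict mapping (environment, behavior) -> average frequency.
    """
    keys = set().union(*histories)  # every (environment, behavior) seen
    return {key: sum(h.get(key, 0) for h in histories) / len(histories)
            for key in keys}
```

Applying this to the drivers of one group in FIG. 17 would yield a per-environment average-frequency table like FIG. 18.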
 なお、ドライバモデルは、算出した平均頻度から最も頻度の高いもののみで構築してもよい。図19は、構築されたクラスタリング型のドライバモデルの別の一例を示す図である。図19に示すように、走行環境毎に最頻の挙動が選択され、選択された挙動からドライバモデルが構築される。 It should be noted that the driver model may be constructed with only the highest frequency from the calculated average frequency. FIG. 19 is a diagram illustrating another example of the constructed clustering type driver model. As shown in FIG. 19, the most frequent behavior is selected for each traveling environment, and a driver model is constructed from the selected behavior.
 ここで、構築したクラスタリング型のドライバモデルの使用方法について、例を挙げて説明する。 Here, the method of using the built clustering driver model will be described with an example.
 図18に示したようなドライバモデルは、予め車両1の記憶部8に記憶される。また、車両制御部7は、運転者yが過去に運転した際の走行履歴を記憶部8に記憶しておく。なお、運転者yの検知は、車内に設置されるカメラ等(図示しない)で実行される。 The driver model as shown in FIG. 18 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
 そして、車両制御部7は、運転者yの走行履歴とドライバモデルの各モデルの走行履歴との類似度を算出し、どのモデルが運転者yに最も適しているかを判定する。例えば、図16に示した運転者yの走行履歴と図18に示したドライバモデルの場合、車両制御部7は、モデルBが運転者yに最も適していると判定する。 Then, the vehicle control unit 7 calculates the similarity between the driving history of the driver y and the driving history of each model of the driver model, and determines which model is most suitable for the driver y. For example, in the case of the driving history of the driver y shown in FIG. 16 and the driver model shown in FIG. 18, the vehicle control unit 7 determines that the model B is most suitable for the driver y.
 車両制御部7は、実際の自動走行の際に、モデルBの各走行環境において、最も頻度が高い挙動が運転者yに最も適した挙動、つまり、第1の挙動であると判定する。 The vehicle control unit 7 determines that the behavior with the highest frequency is the behavior most suitable for the driver y, that is, the first behavior in each traveling environment of the model B during actual automatic traveling.
 このように、予め複数の運転者の走行履歴からドライバモデルを構築することにより、運転者により適した挙動を報知できる。 Thus, by constructing a driver model from a plurality of driver's driving histories in advance, it is possible to notify a behavior more suitable for the driver.
 例えば、図16に示すように、運転者yの走行履歴に「前方に低速車あり」という走行環境に対する挙動の頻度が0、つまり、運転者が「前方に低速車あり」という走行環境において「追従」、「追い越し」、「車線変更」という挙動を選択したことが無い場合においても、車両制御部7は、図18に示すモデルBに基づき、「前方に低速車あり」という走行環境において、「追従」を第1の挙動として判定できる。 For example, as shown in FIG. 16, even when the frequency of behaviors for the driving environment "there is a low-speed vehicle ahead" is zero in the driving history of the driver y, that is, even when the driver has never selected the behaviors "follow", "passing", or "lane change" in the driving environment "there is a low-speed vehicle ahead", the vehicle control unit 7 can determine "follow" as the first behavior in the driving environment "there is a low-speed vehicle ahead" based on the model B shown in FIG. 18.
 次に、個別適応型について説明する。個別適応型のドライバモデルの構築方法は、クラスタリング型の場合と同様に、図16に示したような複数の運転者の走行履歴を予め集約する。ここで、クラスタリング型の場合と異なる点は、運転者毎にドライバモデルを構築する点である。以下では、運転者yに対してドライバモデルを構築する例について説明する。 Next, the individually adaptive type will be described. As in the case of the clustering type, the method of constructing an individually adaptive driver model aggregates the driving histories of a plurality of drivers as shown in FIG. 16 in advance. The difference from the clustering type is that a driver model is constructed for each driver. An example of constructing a driver model for the driver y will be described below.
 まず、集約した複数の運転者の走行履歴の中から、運転者yの走行履歴と類似度が高い複数の運転者の走行履歴を抽出する。そして、抽出した複数の運転者の走行履歴から運転者yのドライバモデルを構築する。 First, the driving histories of a plurality of drivers having high similarity to the driving history of the driver y are extracted from the driving histories of the plurality of drivers collected. Then, a driver model of the driver y is constructed from the extracted driving histories of the plurality of drivers.
 図20は、個別適応型のドライバモデルの構築方法を示す図である。図20には、図17と同様に、運転者a~fの走行履歴が表形式で示されている。また、図20には、図16に示した運転者yの走行履歴と類似度が高い運転者c~eの走行履歴とから運転者yのドライバモデルが構築されることが示されている。 FIG. 20 is a diagram showing a method for constructing an individual adaptive driver model. In FIG. 20, the driving histories of the drivers a to f are shown in a table format, as in FIG. FIG. 20 shows that the driver model of the driver y is constructed from the driving history of the driver y shown in FIG. 16 and the driving histories of the drivers c to e having high similarity.
 個別適応型のドライバモデルは、抽出した各運転者の走行履歴において、それぞれの頻度の平均を算出することによって構築される。 The individual adaptive driver model is constructed by calculating the average of each frequency in the extracted driving history of each driver.
 図21は、構築された個別適応型のドライバモデルの一例を示す図である。図16に示した運転者yの走行履歴、及び、図20に示した運転者c~eの走行履歴において、走行環境毎に、各挙動の平均頻度を導出する。このように、運転者yに対する個別適応型のドライバモデルは、各走行環境に対応する挙動の平均頻度で構築される。 FIG. 21 is a diagram illustrating an example of a constructed individual adaptive driver model. In the driving history of the driver y shown in FIG. 16 and the driving history of the drivers c to e shown in FIG. 20, the average frequency of each behavior is derived for each driving environment. As described above, the individually adaptive driver model for the driver y is constructed with an average frequency of behavior corresponding to each traveling environment.
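 The individually adaptive construction (extracting the driving histories most similar to the target driver's and averaging them together with the target's own history) might be sketched as below. The similarity measure used here is the simple per-environment match of the most frequent behavior mentioned earlier as an alternative; `top_k` and the data layout are assumptions.

```python
def build_individual_model(target_history, all_histories, top_k=3):
    """Average the target driver's history with the top_k most similar ones.

    Histories are dicts mapping (environment, behavior) -> count.
    """
    def most_frequent(history, env):
        freqs = {b: c for (e, b), c in history.items() if e == env}
        return max(freqs, key=freqs.get) if freqs else None

    def score(other):
        # number of environments whose most frequent behavior matches
        envs = {e for (e, _) in target_history}
        return sum(1 for e in envs
                   if most_frequent(target_history, e) == most_frequent(other, e))

    similar = sorted(all_histories, key=score, reverse=True)[:top_k]
    group = [target_history] + similar
    keys = set().union(*group)
    return {k: sum(h.get(k, 0) for h in group) / len(group) for k in keys}
```

With the histories of FIG. 16 and FIG. 20, this would average the driver y's history with those of the similar drivers c to e to obtain a table like FIG. 21.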
 ここで、構築した個別適応型のドライバモデルの使用方法について、例を挙げて説明する。 Here, how to use the built individually adaptable driver model is explained with an example.
 図21に示したような運転者yのドライバモデルは、予め車両1の記憶部8に記憶される。また、車両制御部7は、運転者yが過去に運転した際の走行履歴を記憶部8に記憶しておく。なお、運転者yの検知は、車内に設置されるカメラ等(図示しない)で実行される。 The driver model of the driver y as shown in FIG. 21 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
 そして、車両制御部7は、実際の自動走行の際に、運転者yのドライバモデルの各走行環境において、最も頻度が高い挙動が運転者yに最も適した挙動、つまり、第1の挙動であると判定する。 Then, during actual automatic driving, the vehicle control unit 7 determines that, in each driving environment of the driver model of the driver y, the behavior with the highest frequency is the behavior most suited to the driver y, that is, the first behavior.
 このように、予め複数の運転者の走行履歴から運転者個人のドライバモデルを構築することにより、運転者により適した挙動を報知できる。 In this way, by constructing a driver model of an individual driver from the driving histories of a plurality of drivers in advance, a behavior more suitable for the driver can be notified.
 例えば、図16に示すように、運転者yの走行履歴に「前方に低速車あり」という走行環境に対する挙動の頻度が0、つまり、運転者が「前方に低速車あり」という走行環境において「追従」、「追い越し」、「車線変更」という挙動を選択したことが無い場合においても、車両制御部7は、図21に示すドライバモデルに基づき、「前方に低速車あり」という走行環境において、「車線変更」を第1の挙動として判定できる。 For example, as shown in FIG. 16, even when the frequency of behaviors for the driving environment "there is a low-speed vehicle ahead" is zero in the driving history of the driver y, that is, even when the driver has never selected the behaviors "follow", "passing", or "lane change" in the driving environment "there is a low-speed vehicle ahead", the vehicle control unit 7 can determine "lane change" as the first behavior in the driving environment "there is a low-speed vehicle ahead" based on the driver model shown in FIG. 21.
 次に、運転者の運転特性(運転の癖)を取得し、運転者の嗜好に応じた自動運転を行う場合について説明する。一般に、1つの挙動(例えば、車線変更)に対する実際の動作(例えば、加速、減速の大きさ、あるいは、ステアリングホイールの操作量)は、運転者毎に異なる。そのため、運転者の嗜好に応じた自動運転を行うことにより、運転者にとってより快適な走行が可能となる。 Next, the case where the driving characteristics (driving habits) of the driver are acquired and the automatic driving according to the driver's preference is performed will be described. In general, the actual operation (for example, the magnitude of acceleration, deceleration, or the amount of operation of the steering wheel) for one behavior (for example, lane change) differs for each driver. Therefore, it is possible to travel more comfortably for the driver by performing the automatic driving according to the preference of the driver.
 なお、以下の説明では、手動運転中に運転者の運転特性を取得し、取得した運転特性を自動運転の際に反映させる場合について説明するが、本発明はこれに限定されない。 In the following description, a case will be described in which the driving characteristics of the driver are acquired during manual driving and the acquired driving characteristics are reflected in automatic driving, but the present invention is not limited to this.
 車両制御部7は、運転者の車両1の各部の操作内容から、運転者の運転特性を示す特徴量を抽出し、記憶部8に記憶する。ここで、特徴量とは、例えば、速度に関する特徴量、ステアリングに関する特徴量、操作タイミングに関する特徴量、車外センシングに関する特徴量、車内センシングに関する特徴量等がある。 The vehicle control unit 7 extracts a feature amount indicating the driving characteristics of the driver from the operation content of each part of the vehicle 1 of the driver, and stores it in the storage unit 8. Here, the feature amount includes, for example, a feature amount related to speed, a feature amount related to steering, a feature amount related to operation timing, a feature amount related to outside-vehicle sensing, a feature amount related to in-vehicle sensing, and the like.
 速度に関する特徴量は、例えば、車両の速度、加速度、減速度などがあり、これらの特徴量は、車両が有する速度センサ等から取得される。 The feature quantity related to speed includes, for example, the speed, acceleration, and deceleration of the vehicle, and these feature quantities are acquired from a speed sensor or the like that the vehicle has.
 ステアリングに関する特徴量は、例えば、ステアリングの舵角、角速度、角加速度などがあり、これらの特徴量は、ステアリングホイール5から取得される。 The feature quantities related to steering include, for example, the steering angle, the angular velocity, and the angular acceleration of the steering, and these feature quantities are acquired from the steering wheel 5.
 操作タイミングに関する特徴量は、例えば、ブレーキ、アクセル、ウィンカレバー、ステアリングホイールの操作タイミングなどがあり、これらの特徴量は、それぞれ、ブレーキペダル2、アクセルペダル3、ウィンカレバー4、ステアリングホイール5から取得される。 The feature quantities related to operation timing include, for example, the operation timings of the brake, the accelerator, the blinker lever, and the steering wheel, and these feature quantities are acquired from the brake pedal 2, the accelerator pedal 3, the blinker lever 4, and the steering wheel 5, respectively.
 車外センシングに関する特徴量は、例えば、前方、側方、後方に存在する車両との車間距離などがあり、これらの特徴量は、センサ62から取得される。 The feature amount related to outside-vehicle sensing includes, for example, a distance between vehicles in front, side, and rear, and these feature amounts are acquired from the sensor 62.
 車内センシングに関する特徴量は、例えば、運転者が誰であるか、及び、同乗者が誰であるかを示す個人認識情報であり、これらの特徴量は、車内に設置されるカメラ等から取得される。 The feature quantities related to in-vehicle sensing are, for example, personal identification information indicating who the driver is and who the passengers are, and these feature quantities are acquired from a camera or the like installed in the vehicle.
 例えば、運転者が手動で車線変更を行う場合、車両制御部7は、運転者が手動で車線変更を行ったことを検知する。検知方法は、予め車線変更の操作時系列パターンをルール化しておくことにより、CAN(Controller Area Network)情報などから取得した操作時系列データを解析することで検知する。その際、車両制御部7は、上述した特徴量を取得する。車両制御部7は、運転者毎に、特徴量を記憶部8に記憶し、運転特性モデルを構築する。 For example, when the driver manually changes lanes, the vehicle control unit 7 detects that the driver has manually changed lanes. The detection is performed by defining, in advance, rules for the operation time-series pattern of a lane change and analyzing operation time-series data acquired from CAN (Controller Area Network) information or the like. At that time, the vehicle control unit 7 acquires the feature quantities described above. The vehicle control unit 7 stores the feature quantities in the storage unit 8 for each driver and constructs a driving characteristic model.
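 The rule-based detection of a manual lane change from operation time-series data might look like the following sketch. The signal names, the steering threshold, and the simplified rule (turn signal, then a steering excursion in the same direction, then a return toward neutral) are assumptions and not the actual CAN analysis of the embodiment.

```python
def detect_lane_change(samples, steer_threshold=5.0):
    """samples: time-ordered dicts with 'turn_signal' ('left'/'right'/None)
    and 'steering_angle' in degrees (positive = right).

    Flags a lane change when a turn signal is followed by a steering
    excursion in the same direction and then a return toward neutral.
    """
    signal = None   # last turn-signal direction seen
    steered = False # whether the matching steering excursion occurred
    for s in samples:
        if s["turn_signal"] in ("left", "right"):
            signal = s["turn_signal"]
            steered = False
        elif signal is not None:
            angle = s["steering_angle"]
            direction = "right" if angle > 0 else "left"
            if abs(angle) >= steer_threshold and direction == signal:
                steered = True
            elif steered and abs(angle) < steer_threshold:
                return True  # steering returned toward neutral: change done
    return False
```

A steering excursion alone, with no preceding turn signal, would not be flagged under this rule.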
 なお、車両制御部7は、運転者毎の特徴量に基づき、上述したドライバモデルを構築してもよい。つまり、車両制御部7は、速度に関する特徴量、ステアリングに関する特徴量、操作タイミングに関する特徴量、車外センシングに関する特徴量、車内センシングに関する特徴量を抽出し、記憶部8に記憶する。そして、記憶部8に記憶した特徴量に基づいて、走行環境毎の運転者による操作の傾向と各操作の頻度の情報を対応づけたドライバモデルを構築してもよい。 Note that the vehicle control unit 7 may construct the above-described driver model based on the feature amount for each driver. That is, the vehicle control unit 7 extracts a feature value related to speed, a feature value related to steering, a feature value related to operation timing, a feature value related to outside-vehicle sensing, and a feature value related to in-vehicle sensing, and stores them in the storage unit 8. And based on the feature-value memorize | stored in the memory | storage part 8, you may build the driver model which matched the tendency of operation by the driver for every driving | running | working environment, and the information of the frequency of each operation.
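 As an illustrative sketch only, the rule-based detection of a manual lane change from CAN-like operation time-series data and the per-driver accumulation of feature amounts described above might look like the following (the field names, the steering-angle threshold, and the window format are assumptions, not part of the disclosure):

```python
from collections import defaultdict

# Hand-written rule for a manual lane change: the turn signal is on and a
# sustained steering deflection occurs within the same observation window.
def is_lane_change(window):
    return any(s["turn_signal"] for s in window) and \
           max(abs(s["steering_angle"]) for s in window) > 5.0

def update_model(model, driver, window):
    """When a manual lane change is detected, store the feature amounts of
    the maneuver under that driver's entry, per driver as in the text."""
    if not is_lane_change(window):
        return False
    feats = model[driver]
    feats["speed"].append(sum(s["speed"] for s in window) / len(window))
    feats["steering_angle"].append(max(abs(s["steering_angle"]) for s in window))
    return True

model = defaultdict(lambda: {"speed": [], "steering_angle": []})
window = [
    {"speed": 80.0, "steering_angle": 0.0, "turn_signal": False},
    {"speed": 82.0, "steering_angle": 6.5, "turn_signal": True},
    {"speed": 83.0, "steering_angle": 3.0, "turn_signal": True},
]
update_model(model, "driver_x", window)
```

 In an actual system the window would be segmented from a continuous CAN stream rather than passed in whole, but the rule-then-accumulate structure is the same.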
 FIG. 22 is a diagram showing an example of the driving characteristic model. FIG. 22 shows the feature amounts for each driver in tabular form. FIG. 22 also shows, for each driver, the number of times each behavior has been selected in the past. Although only some of the feature amounts are shown, any or all of those listed above may be included.
 Details of the feature amounts shown in FIG. 22 will now be described. The numerical value for speed indicates the actual speed in steps. The numerical values for the steering wheel, brake, and accelerator indicate the operation amounts in steps. These numerical values are obtained, for example, by calculating the average values of the speed and of the steering wheel, brake, and accelerator operation amounts within a predetermined past period, and expressing those averages in steps.
 For example, in FIG. 22, when driver x changes lanes with no passenger on board, the speed level is 8, and the operation amount levels of the steering wheel, brake, and accelerator are 4, 6, and 8, respectively.
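 The stepwise (level) representation described above can be sketched as follows; the 0–100 value range and the 10 levels are illustrative assumptions, since the specification does not fix them:

```python
def to_level(value, vmin, vmax, levels=10):
    """Map an averaged measurement onto a 1..levels step scale."""
    if vmax <= vmin:
        raise ValueError("invalid range")
    clamped = min(max(value, vmin), vmax)
    step = (vmax - vmin) / levels
    return min(levels, int((clamped - vmin) / step) + 1)

speeds = [78.0, 81.0, 84.0]                   # samples in the past period
avg = sum(speeds) / len(speeds)               # average over the period
speed_level = to_level(avg, vmin=0.0, vmax=100.0)
```

 Storing only the level rather than the raw average keeps the model compact and makes entries from different drivers directly comparable.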
 During automatic driving, the vehicle control unit 7 selects, from among the driving characteristic models shown in FIG. 22, the driving characteristic model corresponding to the driver, the behavior, and the passenger, according to who the driver is, what kind of behavior is to be executed, and who the passenger is.
 Then, the vehicle control unit 7 causes the vehicle 1 to travel at the speed corresponding to the selected driving characteristic model, and controls the vehicle 1 with the combination of the steering wheel, brake, and accelerator operation amounts and their timings. This makes it possible to perform automatic driving that matches the driver's preferences. Note that the information of the driving characteristic model shown in FIG. 22 can be presented by the notification unit 92.
 FIGS. 23A to 23D are diagrams for explaining the display of the notification unit 92 in Embodiment 4 of the present invention. FIGS. 23A to 23D are displays for the first example of the traveling environment shown in FIG. 5A.
 FIG. 23A shows the display of the notification unit 92 during normal traveling in which no lane change or acceleration/deceleration of the vehicle is required. FIG. 23A shows a symbol 231 indicating that the driver's driving characteristic is a "frequent deceleration" characteristic, and a symbol 232 indicating that automatic driving is currently in progress.
 The vehicle control unit 7 determines the driver's driving characteristic based on, for example, the number of times each behavior included in the driving characteristic model shown in FIG. 22 has been selected in the past. In this case, for a driver whose driving characteristic shows frequent "deceleration" (that is, a driver who has selected the "deceleration" behavior many times), the vehicle control unit 7 causes the notification unit 92 to show a display including the symbol 231 as in FIGS. 23A to 23D.
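 Determining the driving characteristic from the past selection counts amounts to picking the most frequently selected behavior. A minimal sketch, with illustrative counts not taken from FIG. 22:

```python
def dominant_characteristic(behavior_counts):
    """Return the behavior label the driver has selected most often;
    this label decides which characteristic symbol (e.g. 231) to show."""
    return max(behavior_counts, key=behavior_counts.get)

counts = {"deceleration": 12, "acceleration": 4, "lane change": 6}
label = dominant_characteristic(counts)
```
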
 When the vehicle control unit 7 determines that the traveling environment is that of the first example shown in FIG. 5A, the vehicle control unit 7 determines the first behavior to be "deceleration" based on the driver's driving characteristic being the "frequent deceleration" characteristic, and causes the notification unit 92 to execute the display of FIG. 23B.
 In FIG. 23B, a symbol 233 indicating "deceleration", which is the first behavior, is shown in a first mode (for example, a first color). A symbol 234 indicating "acceleration", which is a second behavior, and a symbol 235 indicating "lane change", which is also a second behavior, are also shown.
 When the driver changes the behavior to "acceleration" by an operation such as that described in Embodiment 1, the vehicle control unit 7 causes the notification unit 92 to execute the display of FIG. 23C.
 In FIG. 23C, a symbol 234' indicating "acceleration", which is the selected behavior, is shown in the first mode. The symbol 233' is displayed as a result of the symbol 233, which was displayed as the first behavior in FIG. 23B, being swapped with the symbol 234.
 Thereafter, the vehicle control unit 7 causes the notification unit 92 to show the display of FIG. 23D after a second predetermined time has elapsed since the notification unit 92 executed the display of FIG. 23A. In FIG. 23D, the symbol 234' indicating the "acceleration" selected by the driver is displayed in a second mode as the next behavior.
 When the vehicle control unit 7 determines that the next behavior to be taken is "acceleration", it reads the feature amounts corresponding to the "acceleration" behavior included in the driving characteristic model and controls the vehicle 1 so as to perform "acceleration" reflecting those feature amounts.
 FIGS. 24A to 24D are diagrams for explaining the display of the notification unit 92 in Embodiment 4 of the present invention. FIGS. 24A to 24D are displays for the second example of the traveling environment shown in FIG. 7A. In FIGS. 24A to 24D, components common to FIGS. 23A to 23D are given the same reference numerals as in FIGS. 23A to 23D, and their detailed description is omitted. FIGS. 24A to 24D are the views of FIGS. 23A to 23D with the symbol 235 indicating "lane change" removed.
 As described above, in the second example (FIG. 7A), unlike the first example (FIG. 5A), another vehicle is traveling to the right of the vehicle 1, so a lane change is not possible. Therefore, "lane change" is not displayed in FIGS. 24B and 24C. In the example of FIG. 24C, "acceleration" is selected as in FIG. 23C, so the vehicle control unit 7, as in FIGS. 23A to 23D, reads the feature amounts corresponding to the "acceleration" behavior included in the driving characteristic model and controls the vehicle 1 so as to perform "acceleration" reflecting those feature amounts.
 FIGS. 25A to 25D are diagrams for explaining the display of the notification unit 92 in Embodiment 4 of the present invention. FIGS. 25A to 25D are displays for the third example of the traveling environment shown in FIG. 8A.
 FIG. 25A is the same as FIG. 23A. When the vehicle control unit 7 determines that the traveling environment is that of the third example shown in FIG. 8A, the vehicle control unit 7 determines the first behavior to be "deceleration" based on the driver's driving characteristic being the "frequent deceleration" characteristic, and causes the notification unit 92 to execute the display of FIG. 25B.
 In FIG. 25B, a symbol 251 indicating "deceleration", which is the first behavior, is shown in the first mode (for example, the first color). A symbol 252 indicating "overtaking", which is a second behavior, and a symbol 253 indicating "lane change", which is also a second behavior, are also shown.
 When the driver changes the behavior to "overtaking" by an operation such as that described in Embodiment 1, the vehicle control unit 7 causes the notification unit 92 to execute the display of FIG. 25C.
 In FIG. 25C, a symbol 252' indicating "overtaking", which is the selected behavior, is shown in the first mode. The symbol 251' is displayed as a result of the symbol 251, which was displayed as the first behavior in FIG. 25B, being swapped with the symbol 252.
 Thereafter, the vehicle control unit 7 causes the notification unit 92 to show the display of FIG. 25D after a second predetermined time has elapsed since the notification unit 92 executed the display of FIG. 25A. In FIG. 25D, the symbol 252' indicating the "overtaking" selected by the driver is displayed in the second mode as the next behavior.
 When the vehicle control unit 7 determines that the next behavior to be taken is "overtaking", it reads the feature amounts corresponding to the "overtaking" behavior included in the driving characteristic model and controls the vehicle 1 so as to perform "acceleration" reflecting those feature amounts.
 Next, a display example for a case where the driver's driving characteristic is not the "frequent deceleration" characteristic will be described.
 FIGS. 26A to 26D are diagrams for explaining the display of the notification unit 92 in Embodiment 4 of the present invention. FIGS. 26A to 26D are displays for the first example of the traveling environment shown in FIG. 5A. FIG. 26A shows an example in which the driver's driving characteristic is a "frequent acceleration" characteristic, and FIG. 26B shows an example in which the driver's driving characteristic is a "frequent lane change" characteristic.
 FIG. 26A shows a symbol 261 indicating that the driver's driving characteristic is the "frequent acceleration" characteristic. A symbol 262 indicating "acceleration", which is the first behavior, is shown in the first mode (for example, the first color). A symbol 263 indicating "lane change", which is a second behavior, and a symbol 264 indicating "deceleration", which is also a second behavior, are also shown.
 For example, for a driver whose driving characteristic shows frequent "acceleration" in the past (that is, a driver who has selected the "acceleration" behavior many times in the past), the vehicle control unit 7 causes the notification unit 92 to show a display including the symbol 261 as in FIG. 26A. The vehicle control unit 7 also determines the first behavior to be "acceleration" based on the driver's driving characteristic being the "frequent acceleration" characteristic, and causes the notification unit 92 to execute the display of FIG. 26A.
 FIG. 26B shows a symbol 265 indicating that the driver's driving characteristic is the "frequent lane change" characteristic. A symbol 266 indicating "lane change", which is the first behavior, is shown in the first mode (for example, the first color). A symbol 267 indicating "lane change", which is a second behavior, and a symbol 268 indicating "deceleration", which is also a second behavior, are also shown.
 For example, for a driver whose driving characteristic shows frequent "lane changes" in the past (that is, a driver who has selected the "lane change" behavior many times in the past), the vehicle control unit 7 causes the notification unit 92 to show a display including the symbol 265 as in FIG. 26B. The vehicle control unit 7 determines the first behavior to be "lane change" based on the driver's driving characteristic being the "frequent lane change" characteristic, and causes the notification unit 92 to execute the display of FIG. 26B.
 The above description uses only the driving characteristic model, but the driver model may also be taken into account; in FIGS. 23A to 23D, 24A to 24D, 25A to 25D, and 26A to 26B, the symbols 231, 261, 265, and so on may indicate the type of driver model selected from the driver's operation history. For example, for the first example of the traveling environment shown in FIG. 5A, with the driver model applied to drivers who often select "deceleration", the notification unit 92 is caused to show a display including the symbol 231 as in FIGS. 23A to 23D, and the first behavior is determined to be "deceleration". With the driver model applied to drivers who often select "acceleration", the notification unit 92 is caused to show a display including the symbol 261 as in FIG. 26A, and the first behavior is determined to be "acceleration". With the driver model applied to drivers who often select "lane change", the notification unit 92 is caused to show a display including the symbol 265 as in FIG. 26B, and the first behavior is determined to be "lane change".
 According to the present embodiment described above, when determining the future behavior of the vehicle, the driver's past traveling history can be learned and the result reflected in the determination of the future behavior. Also, when the vehicle control unit controls the vehicle, the driver's driving characteristics (driving preferences) can be learned and reflected in the control of the vehicle.
 As a result, the vehicle can control automatic driving at the timings or with the operation amounts preferred by the driver or occupant, so that unnecessary operation intervention by the driver during automatic driving can be suppressed without deviating from the feel of actual manual driving by the driver.
 In the present invention, a server device such as a cloud server may execute functions similar to those executed by the vehicle control unit 7. The storage unit 8 may also be located not in the vehicle 1 but in a server device such as a cloud server. Alternatively, the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior by referring to the driver model stored in the storage unit 8.
 As described above, in Embodiment 4, the vehicle control unit 7 acquires information on feature amounts indicating the driver's driving characteristics, the storage unit 8 stores the feature amount information, and based on the feature amount information stored in the storage unit 8, the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model that indicates the tendency of the vehicle behaviors selected by the driver in terms of the frequency of each selected behavior.
 The vehicle control unit 7 also determines groups of drivers who select similar behaviors from among a plurality of drivers, and constructs a driver model for each group and for each traveling environment of the vehicle.
 The vehicle control unit 7 also calculates, for each group of drivers who perform similar operations, the average frequency of the behaviors selected by each driver, and constructs, for each traveling environment of the vehicle, a driver model that indicates the tendency of the vehicle behaviors selected by the drivers in terms of the calculated averages.
 Further, based on the vehicle behaviors selected by other drivers whose tendencies are similar to the tendency of the vehicle behaviors selected by a specific driver, the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model that indicates the tendency of the vehicle behaviors selected by that specific driver in terms of the frequency of each selected behavior.
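 A minimal sketch of building such a model from similar drivers, using cosine similarity over behavior-frequency vectors; the similarity measure, the 0.9 threshold, and the numbers are assumptions, since the specification does not prescribe how similarity between drivers is computed:

```python
def similarity(a, b):
    """Cosine similarity between two behavior-frequency dictionaries."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def group_model(target, histories, threshold=0.9):
    """Average the behavior frequencies of drivers whose selection
    tendencies are similar to `target`, forming the group's model."""
    peers = [h for h in histories if similarity(target, h) >= threshold]
    if not peers:
        return {}
    keys = {k for h in peers for k in h}
    return {k: sum(h.get(k, 0) for h in peers) / len(peers) for k in keys}

target = {"deceleration": 8, "acceleration": 2}
peer = {"deceleration": 4, "acceleration": 1}     # same tendency, scaled
outlier = {"deceleration": 1, "acceleration": 9}  # opposite tendency
model = group_model(target, [peer, outlier])      # averages only `peer`
```

 Because frequencies are compared by direction rather than magnitude, a driver with the same preferences but less driving history still lands in the same group.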
 As described above, the vehicle control unit 7 can construct a driver model better suited to the driver's driving tendencies and, based on the constructed driver model, can perform automatic driving that is more appropriate for the driver.
 (Modified example of the driver model)
 The driver model described above models the tendency of the driver's operations (behaviors) for each traveling environment based on information such as the frequency of each operation, but the present invention is not limited to this.
 For example, the driver model may be constructed based on a traveling history in which environmental parameters indicating the traveling environments (that is, situations) traveled in the past are associated with the operations (behaviors) actually selected by the driver in those traveling environments. By incorporating environmental parameters into the driver model, the options can be determined without the separate procedure of detecting and classifying the traveling environment and inputting (storing) the classification result into the driver model. Specifically, by acquiring the differences in traveling environment as in FIGS. 23A to 23D and 24A to 24D as environmental parameters and inputting (storing) them directly into the driver model, "acceleration", "deceleration", and "lane change" become the options in FIGS. 23A to 23D, and "acceleration" and "deceleration" become the options in FIGS. 24A to 24D. A case where such a driver model is constructed will be described below. Note that the driver model described below may also be referred to as a situation database.
 Here, the traveling history used to construct the driver model of this modified example will be described. FIG. 27 is a diagram showing an example of the traveling history. FIG. 27 shows a traveling history in which environmental parameters indicating the traveling environments in which the vehicle driven by driver x traveled in the past are associated with the operations (behaviors) actually selected by the driver in those traveling environments.
 The environmental parameters in (a) to (c) of the traveling history shown in FIG. 27 indicate the traveling environments at the time the behavior of the vehicle was presented to the driver as shown in, for example, FIGS. 8B, 5B, and 7B, respectively. The environmental parameters of the traveling history are obtained from sensing information and infrastructure information.
 The sensing information is information detected by a sensor, radar, or the like of the vehicle. The infrastructure information includes GPS information, map information, information acquired through road-to-vehicle communication, and the like.
 For example, the environmental parameters of the traveling history shown in FIG. 27 include "host vehicle information"; "preceding vehicle information" indicating information on a vehicle traveling ahead in the lane in which the host vehicle travels; "side lane information" indicating information on the lanes adjacent to the lane in which the host vehicle travels; "merging lane information" indicating information on a merging lane when there is a merging lane at the position where the host vehicle travels; and "position information" indicating the position of the host vehicle and its surroundings. Information on a following vehicle may also be included. In that case, the relative speed between the following vehicle and the host vehicle, the head-to-head distance, the rate of change of the head-to-head distance, and the like may be used. Information on the presence of vehicles may also be included.
 For example, the "host vehicle information" includes information on the speed Va of the host vehicle. The "preceding vehicle information" includes information on the relative speed Vba of the preceding vehicle with respect to the host vehicle, the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, and the rate of change RSb of the size of the preceding vehicle.
 Here, the speed Va of the host vehicle is detected by a speed sensor of the host vehicle. The relative speed Vba and the inter-vehicle distance DRba are detected by a sensor, radar, or the like. The rate of change of size RSb is calculated by the relational expression RSb = -Vba/DRba.
 The "side lane information" includes information on a side-rear vehicle traveling behind the host vehicle in the side lane, information on a side-front vehicle traveling ahead of the host vehicle in the side lane, and information on the remaining side lane length DRda of the host vehicle.
 The side-rear vehicle information includes information on the relative speed Vca of the side-rear vehicle with respect to the host vehicle, the head-to-head distance Dca between the side-rear vehicle and the host vehicle, and the rate of change Rca of the head-to-head distance. The head-to-head distance Dca between the side-rear vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the side-rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the side-rear vehicle). The head-to-head distance may also be calculated from the inter-vehicle distance and the vehicle length, and the inter-vehicle distance may be used in place of the head-to-head distance.
 Here, the relative speed Vca and the head-to-head distance Dca are detected by a sensor, radar, or the like. The rate of change Rca of the head-to-head distance is calculated by the relational expression Rca = Vca/Dca.
 The side-front vehicle information includes information on the relative speed Vda of the side-front vehicle with respect to the host vehicle, the head-to-head distance Dda between the side-front vehicle and the host vehicle, and the rate of change Rda of the head-to-head distance. The head-to-head distance Dda between the side-front vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the side-front vehicle, measured along the traveling direction of the host vehicle (and the side-front vehicle).
 The relative speed Vda and the head-to-head distance Dda are detected by a sensor, radar, or the like. The rate of change Rda of the head-to-head distance is calculated by the relational expression Rda = Vda/Dda.
 The remaining side lane length DRda of the host vehicle is a parameter indicating how likely a lane change to the side lane is. Specifically, when the distance between the front end (head) of the host vehicle and the rear end of the side-front vehicle, measured in the direction along the traveling direction of the host vehicle (and the side-front vehicle), is longer than the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, the remaining side lane length DRda is that distance between the front end (head) of the host vehicle and the rear end of the side-front vehicle; when that distance is shorter than DRba, DRda is DRba. The remaining side lane length DRda of the host vehicle is detected by a sensor, radar, or the like.
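 The rule for DRda above reduces to taking the larger of the two distances. A minimal sketch with illustrative values (the variable names are assumptions):

```python
def remaining_side_lane_length(dist_to_side_front_rear, drba):
    """DRda per the rule above: the distance from the host vehicle's head
    to the side-front vehicle's rear end when it exceeds DRba, otherwise
    DRba itself -- i.e. the maximum of the two."""
    return max(dist_to_side_front_rear, drba)

# Side-front vehicle far ahead: the open gap itself is the remaining length.
drda_open = remaining_side_lane_length(40.0, 25.0)
# Side-front vehicle close by: DRda falls back to DRba.
drda_tight = remaining_side_lane_length(15.0, 25.0)
```
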
 The "merging lane information" includes information on the relative speed Vma of a merging vehicle with respect to the host vehicle, the head-to-head distance Dma between the merging vehicle and the host vehicle, and the rate of change Rma of the head-to-head distance. Here, the head-to-head distance Dma between the merging vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the merging vehicle, measured in the direction along the traveling direction of the host vehicle (and the merging vehicle).
 The relative speed Vma and the head-to-head distance Dma are detected by a sensor, radar, or the like. The rate of change Rma of the head-to-head distance is calculated by the relational expression Rma = Vma/Dma.
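 The four rate-of-change parameters defined above (RSb = -Vba/DRba, Rca = Vca/Dca, Rda = Vda/Dda, Rma = Vma/Dma) can be computed directly from the sensed speeds and distances; the numeric inputs below are illustrative:

```python
def change_rates(vba, drba, vca, dca, vda, dda, vma, dma):
    """Compute the rate-of-change parameters exactly as defined above."""
    return {
        "RSb": -vba / drba,  # apparent size change of the preceding vehicle
        "Rca": vca / dca,    # side-rear head-to-head distance
        "Rda": vda / dda,    # side-front head-to-head distance
        "Rma": vma / dma,    # merging-vehicle head-to-head distance
    }

rates = change_rates(vba=-2.0, drba=20.0, vca=1.5, dca=15.0,
                     vda=-1.0, dda=25.0, vma=2.5, dma=50.0)
# A negative Vba (preceding vehicle closing in) yields a positive RSb,
# matching the intuition that an approaching vehicle appears to grow.
```
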
 In the example of the traveling history shown in FIG. 27, the numerical values of the speeds, distances, and rates of change described above are classified into a plurality of levels, and the numerical values indicating the classified levels are stored. Note that the numerical values of the speeds, distances, and rates of change may also be stored as they are, without being classified into levels.
 The position information includes information such as "position information of the host vehicle", "number of traveling lanes", "traveling lane of the host vehicle", "distance to the start/end point of a merging section", "distance to the start/end point of a branching section", "distance to the start/end point of a construction section", "distance to the start/end point of a lane reduction section", and "distance to a traffic accident occurrence point". FIG. 27 shows, as examples of the position information, "traveling lane of the host vehicle" (the traveling lane in FIG. 27) and "distance to the start/end point of a merging section" (shown in FIG. 27 as "distance to the merging point").
 For example, numerical information indicating the latitude and longitude obtained from GPS is stored in the "position information of the host vehicle" column (not shown). The number of lanes on the road being traveled is stored in the "number of traveling lanes" column. Numerical information indicating the position of the traveling lane is stored in the "traveling lane of the host vehicle" column (not shown). When the start/end point of a merging section exists within a predetermined distance, the distance to the start/end point of the merging section is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored in the "distance to the start/end point of the merging section" column. When no start/end point of a merging section exists within the predetermined distance, "0" is stored in the "distance to the start/end point of the merging section" column.
 「分岐区間の開始・終了地点までの距離」の欄には、所定の距離内に分岐区間の開始・終了地点が存在する場合に、分岐区間の開始・終了地点までの距離が予め決められた複数のレベルに分類され、分類されたレベルの数値が記憶される。なお、所定の距離内に分岐区間の開始・終了地点が存在しない場合、「分岐区間の開始・終了地点までの距離」の欄には「0」が記憶される。「工事区間開始・終了地点までの距離」の欄には、所定の距離内に工事区間開始・終了地点が存在する場合に、工事区間開始・終了地点までの距離が予め決められた複数のレベルに分類され、分類されたレベルの数値が記憶される。なお、所定の距離内に工事区間開始・終了地点が存在しない場合、「工事区間開始・終了地点までの距離」の欄には「0」が記憶される。 In the “distance to start / end point of branch section” field, when the start / end point of the branch section exists within a predetermined distance, the distance to the start / end point of the branch section is determined in advance. It is classified into a plurality of levels, and the numerical values of the classified levels are stored. If there is no start / end point of the branch section within the predetermined distance, “0” is stored in the “distance to the start / end point of the branch section”. In the "Distance to construction section start / end point" column, if there is a construction section start / end point within a predetermined distance, the distance to the construction section start / end point is determined in multiple levels. And the numerical value of the classified level is stored. When there is no construction section start / end point within a predetermined distance, “0” is stored in the column “Distance to construction section start / end point”.
 「車線減少区間開始・終了地点までの距離」の欄には、所定の距離内に車線減少区間開始・終了地点が存在する場合に、車線減少区間開始・終了地点までの距離が予め決められた複数のレベルに分類され、分類されたレベルの数値が記憶される。なお、所定の距離内に車線減少区間開始・終了地点が存在しない場合、「車線減少区間開始・終了地点までの距離」の欄には「0」が記憶される。 In the “Distance to start / end point of lane reduction section” column, the distance to the start / end point of lane decrease section is determined in advance when there is a start / end point of lane reduction section within the predetermined distance. It is classified into a plurality of levels, and the numerical values of the classified levels are stored. When there is no lane decrease section start / end point within a predetermined distance, “0” is stored in the “distance to lane decrease section start / end point” column.
 「交通事故発生地点までの距離」の欄には、所定の距離内に交通事故発生地点が存在する場合に、交通事故発生地点までの距離が予め決められた複数のレベルに分類され、分類されたレベルの数値が記憶される。なお、所定の距離内に交通事故発生地点が存在しない場合、「交通事故発生地点までの距離」の欄には「0」が記憶される。 In the “distance to traffic accident occurrence point” column, when the traffic accident occurrence point exists within a predetermined distance, the distance to the traffic accident occurrence point is classified into a plurality of predetermined levels. The numerical value of the selected level is stored. If there is no traffic accident occurrence point within a predetermined distance, “0” is stored in the “distance to the traffic accident occurrence point” column.
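The level classification used for all of these distance columns can be sketched as below. The threshold values and the function name are illustrative assumptions; the patent states only that the levels are predetermined.

```python
import bisect

# Illustrative level boundaries in metres; the patent leaves the actual
# predetermined levels unspecified.
LEVEL_THRESHOLDS = (0.0, 50.0, 100.0, 200.0, 400.0)

def distance_level(distance_m, thresholds=LEVEL_THRESHOLDS):
    """Classify the distance to a start/end point (or accident point) into
    predetermined levels. Returns 0 when the point does not exist within
    the predetermined distance (here: beyond the largest threshold)."""
    if distance_m is None or distance_m > thresholds[-1]:
        return 0
    # Index of the first threshold >= distance gives the level bucket.
    return max(1, bisect.bisect_left(thresholds, distance_m))
```

For example, with these thresholds a merging point 150 m ahead is stored as level 3, and a point 500 m ahead (outside the predetermined distance) as 0.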
Furthermore, the position information may include information on which of the lanes of the road being traveled is the merging lane, the branching lane, the construction lane, the reduced lane, or the accident lane.
The travel history shown in FIG. 27 is merely an example, and the present invention is not limited to it. For example, when the side lane information described above is information on the right side lane, the travel history may further include "left side lane information" for the opposite side.
The "left side lane information" includes information on a left rear vehicle traveling behind the host vehicle in the left side lane, information on a left front vehicle traveling ahead of the host vehicle in the left side lane, and information on the remaining left side lane length DRda of the host vehicle.
The information on the left rear vehicle includes the relative speed Vfa of the left rear vehicle with respect to the host vehicle, the inter-vehicle head distance Dfa between the left rear vehicle and the host vehicle, and the rate of change Rfa of the inter-vehicle head distance. The inter-vehicle head distance Dfa between the left rear vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the left rear vehicle, measured along the traveling direction of the host vehicle (and of the left rear vehicle).
Here, the relative speed Vfa and the inter-vehicle head distance Dfa are detected by a sensor, radar, or the like. The rate of change Rfa of the inter-vehicle head distance is calculated by the relational expression Rfa = Vfa / Dfa.
The information on the left front vehicle includes the relative speed Vga of the left front vehicle with respect to the host vehicle, the inter-vehicle head distance Dga between the left front vehicle and the host vehicle, and the rate of change Rga of the inter-vehicle head distance. The inter-vehicle head distance Dga between the left front vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the left front vehicle, measured along the traveling direction of the host vehicle (and of the left front vehicle).
Here, the relative speed Vga and the inter-vehicle head distance Dga are detected by a sensor, radar, or the like. The rate of change Rga of the inter-vehicle head distance is calculated by the relational expression Rga = Vga / Dga.
Although the case of left-hand traffic has been described here, the same processing is possible for right-hand traffic by reversing left and right.
The travel history shown in FIG. 27 may also include "rear vehicle information", that is, information on a rear vehicle traveling behind the host vehicle in the travel lane.
The information on the rear vehicle includes the relative speed Vea of the rear vehicle with respect to the host vehicle, the inter-vehicle head distance Dea between the rear vehicle and the host vehicle, and the rate of change Rea of the inter-vehicle head distance. The inter-vehicle head distance Dea between the rear vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the rear vehicle, measured along the traveling direction of the host vehicle (and of the rear vehicle).
Here, the relative speed Vea and the inter-vehicle head distance Dea are detected by a sensor, radar, or the like. The rate of change Rea of the inter-vehicle head distance is calculated by the relational expression Rea = Vea / Dea.
When the inter-vehicle head distance cannot be measured, for example because the vehicle head is hidden behind another moving body, a measurable inter-vehicle gap, or an approximate value obtained by adding a predetermined vehicle length to that gap, may be used instead of the inter-vehicle head distance; alternatively, the distance may be calculated by adding the vehicle length of the recognized vehicle type to the inter-vehicle gap. Moreover, regardless of whether the inter-vehicle head distance can be measured, a measurable inter-vehicle gap, or an approximate value obtained by adding a predetermined vehicle length to that gap, may be used instead of the inter-vehicle head distance, or the distance may be calculated by adding the vehicle length of the recognized vehicle type to the inter-vehicle gap.
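The fallback described above can be sketched as follows. This is one possible reading: the default vehicle length, the per-type lengths, and all names are assumptions, since the patent does not specify them.

```python
DEFAULT_VEHICLE_LENGTH_M = 4.5  # assumed "predetermined vehicle length"

# Assumed lengths per recognized vehicle type; the patent does not list them.
VEHICLE_LENGTH_BY_TYPE_M = {"car": 4.5, "truck": 12.0, "bus": 11.0}

def head_distance(measured=None, gap=None, vehicle_type=None):
    """Return the inter-vehicle head distance. When it cannot be measured
    (e.g. the head is hidden behind another moving body), approximate it
    as the measurable inter-vehicle gap plus a vehicle length."""
    if measured is not None:
        return measured
    if gap is None:
        return None  # nothing measurable at all
    length = VEHICLE_LENGTH_BY_TYPE_M.get(vehicle_type, DEFAULT_VEHICLE_LENGTH_M)
    return gap + length
```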
The travel history may include various other information related to the traveling environment of the vehicle. For example, the travel history may include information on the size or type of the preceding vehicle, a side vehicle, or a merging vehicle, or on its position relative to the host vehicle. For example, the type of a vehicle approaching from behind may be recognized by a camera sensor, and when that vehicle is an emergency vehicle, information indicating that it is an emergency vehicle may be included. This makes it possible to indicate that the notification is for responding to an emergency vehicle. Alternatively, numerical values indicating the steering wheel, brake, and accelerator operation amounts in steps, or passenger information, as described with reference to FIG. 22, may be included in the travel history.
As the driver's travel history, behaviors selected during automatic driving may be aggregated, or behaviors actually performed by the driver during manual driving may be aggregated. This makes it possible to collect travel histories according to the driving state, such as automatic driving or manual driving.
In the example of FIG. 27, the environmental parameters included in the travel history indicate the traveling environment at the time the vehicle behavior was presented to the driver; however, they may instead indicate the traveling environment at the time the driver selected the behavior. Alternatively, the travel history may include both the environmental parameters indicating the traveling environment when the behavior was presented to the driver and the environmental parameters indicating the traveling environment when the driver selected the behavior.
Furthermore, when the vehicle control unit 7 generates the overhead view shown in FIG. 2A, 5A, 6A, 7A, 8A, 9A, or 10A, or the display shown in FIG. 14C, it generates, as notification information, at least one of the information on the environmental parameter with a high degree of contribution that caused the first behavior and the second behavior to be selected, and information related to that environmental parameter (for example, an icon). The vehicle control unit 7 may cause the notification unit 92 to present the notification information, for example by showing the generated notification information on the overhead view.
In this case, for example, if the degree of contribution of the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, or of the rate of change RSb of the size of the preceding vehicle, is high, the vehicle control unit 7 may cause the notification unit 92 to present the notification information by displaying, in the overhead view, a region between the preceding vehicle and the host vehicle with raised brightness or a changed color.
The vehicle control unit 7 may also display, as the notification information, an icon indicating that the degree of contribution of the inter-vehicle distance DRba or the rate of change RSb is high in the region between the preceding vehicle and the host vehicle. Furthermore, the vehicle control unit 7 may cause the notification unit 92 to draw, as the notification information, a line segment connecting the preceding vehicle and the host vehicle on the overhead view, or to draw line segments connecting all surrounding vehicles and the host vehicle and emphasize the line segment connecting the preceding vehicle and the host vehicle.
Instead of the overhead view, the vehicle control unit 7 may realize an AR (Augmented Reality) display by causing the notification unit 92 to display, as the notification information in the viewpoint image seen by the driver, a region between the preceding vehicle and the host vehicle whose brightness is raised above that of the surrounding region or whose color differs from that of the surrounding region. The vehicle control unit 7 may also cause the notification unit 92 to display, as an AR display in the viewpoint image, an icon indicating the environmental parameter with a high degree of contribution in the region between the preceding vehicle and the host vehicle as the notification information.
Furthermore, the vehicle control unit 7 may display, as the notification information in the viewpoint image, an AR display of a line segment connecting the preceding vehicle and the host vehicle, or an AR display of line segments connecting all surrounding vehicles and the host vehicle with the line segment connecting the preceding vehicle and the host vehicle emphasized.
Note that the method of presenting an environmental parameter with a high degree of contribution, or information related to that environmental parameter, is not limited to the above. For example, the vehicle control unit 7 may generate, as the notification information, an image in which the preceding vehicle that is the subject of the environmental parameter with a high degree of contribution is highlighted, and cause the notification unit 92 to display it.
The vehicle control unit 7 may also generate, as the notification information in the overhead view or the AR display, information indicating the direction of the preceding vehicle or other object that is the subject of the environmental parameter with a high degree of contribution, and display that information on or around the host vehicle.
Also, for example, instead of presenting the information on the environmental parameter with a high degree of contribution or the information related to it, the vehicle control unit 7 may make the preceding vehicle or other object that is the subject of an environmental parameter with a low degree of contribution inconspicuous, for example by lowering its display brightness, thereby generating as the notification information the information on the environmental parameter with a high degree of contribution, or the information related to it, made relatively conspicuous, and cause the notification unit 92 to display it.
Next, the construction of a driver model based on the driver's travel history will be described. Driver models include a clustering type, constructed by clustering the travel histories of a plurality of drivers, and an individually adaptive type, in which a driver model for a specific driver (for example, driver x) is constructed from a plurality of travel histories similar to the travel history of driver x.
First, the clustering type will be described. In the clustering-type method of constructing a driver model, travel histories such as the one shown in FIG. 27 are aggregated in advance for each driver. A driver model is then constructed by grouping a plurality of drivers whose travel histories have a high degree of mutual similarity, that is, a plurality of drivers with similar driving operation tendencies.
The similarity between travel histories can be determined, for example, from the correlation value of vectors whose elements are the numerical values of the environmental parameters and the numerical values of the behaviors, when the behaviors in the travel histories of driver a and driver b are quantified according to a predetermined rule. In this case, for example, when the correlation value calculated from the travel histories of driver a and driver b is higher than a predetermined value, the travel histories of driver a and driver b are placed in one group. The calculation of the similarity is not limited to this.
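The correlation-based grouping can be sketched as below, assuming the quantified histories are plain numeric vectors. The Pearson form of the correlation and the 0.8 threshold are illustrative choices; the patent requires only "a correlation value" above "a predetermined value".

```python
import math

def correlation(u, v):
    """Pearson correlation of two equal-length vectors whose elements are
    environmental-parameter values and quantified behavior values."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    du = [x - mu for x in u]
    dv = [y - mv for y in v]
    denom = math.sqrt(sum(x * x for x in du) * sum(y * y for y in dv))
    return 0.0 if denom == 0.0 else sum(x * y for x, y in zip(du, dv)) / denom

def same_group(history_a, history_b, threshold=0.8):
    """Place two drivers in one group when the correlation of their
    travel-history vectors exceeds the predetermined value."""
    return correlation(history_a, history_b) > threshold
```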
Next, the individually adaptive type will be described. In the individually adaptive method of constructing a driver model, travel histories of a plurality of drivers such as those shown in FIG. 27 are aggregated in advance, as in the clustering type. The difference from the clustering type is that a driver model is constructed for each driver. For example, when constructing a driver model for driver y, the travel history of driver y is compared with the travel histories of a plurality of other drivers, and the travel histories of drivers with a high degree of similarity are extracted. An individually adaptive driver model for driver y is then constructed from the extracted travel histories of the plurality of drivers.
Note that the driver model (situation database) based on the travel history shown in FIG. 27 is not limited to the clustering type or the individually adaptive type; for example, it may be configured to include the travel histories of all drivers.
Here, how to use the constructed driver model will be described with an example. In the following example, a case is described in which a driver model aggregating the travel histories of four drivers a to d is used for driver x. The driver model is constructed by the vehicle control unit 7.
[Modification]
FIGS. 28A and 28B are diagrams showing how the driver model is used in this modification. FIG. 28A shows the environmental parameters indicating the current traveling environment of the vehicle driven by driver x. FIG. 28B is an example of a driver model for driver x.
As shown in FIG. 28A, the behavior (operation) for the environmental parameters indicating the current traveling environment is blank. The vehicle control unit 7 acquires the environmental parameters at predetermined intervals and, using any one of the environmental parameters as a trigger, determines the next behavior from the driver model shown in FIG. 28B.
As the trigger, an environmental parameter indicating that a change in vehicle operation is required may be used, for example when the distance to the start point of a merging section falls to or below a predetermined distance, or when the relative speed with respect to the preceding vehicle falls to or below a predetermined value.
The vehicle control unit 7 compares the environmental parameters shown in FIG. 28A with the environmental parameters of each travel history in the driver model shown in FIG. 28B, and determines the behavior associated with the most similar environmental parameters to be the first behavior. Some behaviors associated with other similar environmental parameters are determined to be second behaviors.
Whether environmental parameters are similar can be determined from the correlation value of vectors whose elements are the numerical values of the environmental parameters. For example, when the correlation value calculated from a vector whose elements are the numerical values of the environmental parameters shown in FIG. 28A and a vector whose elements are the numerical values of the environmental parameters shown in FIG. 28B is higher than a predetermined value, these environmental parameters are determined to be similar. The method of determining whether environmental parameters are similar is not limited to this.
For example, the behavior is determined here based on the similarity of the environmental parameters, but it is also possible first to create a group of travel histories whose environmental parameters are highly similar, take statistics of the environmental parameters in that group, and determine the behavior from the statistical data.
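Selecting the first and second behaviors from the driver model can be sketched as follows. This self-contained illustration reuses the same Pearson-correlation similarity described above; the 0.8 threshold and the data layout are assumptions.

```python
import math

def _similarity(u, v):
    # Pearson correlation, the same similarity measure used for grouping.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    du = [x - mu for x in u]
    dv = [y - mv for y in v]
    denom = math.sqrt(sum(x * x for x in du) * sum(y * y for y in dv))
    return 0.0 if denom == 0.0 else sum(x * y for x, y in zip(du, dv)) / denom

def select_behaviors(current_params, driver_model, threshold=0.8):
    """driver_model: list of (environment_parameter_vector, behavior) pairs.
    The behavior of the most similar entry becomes the first behavior; the
    behaviors of remaining entries whose similarity still exceeds the
    threshold become second behaviors."""
    scored = sorted(((_similarity(current_params, params), behavior)
                     for params, behavior in driver_model),
                    key=lambda sb: sb[0], reverse=True)
    first = scored[0][1]
    second = [b for score, b in scored[1:] if score > threshold]
    return first, second
```

For example, with entries ([1, 2, 3], "keep lane"), ([3, 2, 1], "change right"), and ([1, 2, 4], "decelerate"), the current parameters [1, 2, 3] yield "keep lane" as the first behavior and "decelerate" as a second behavior.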
In this way, by constructing a driver model for an individual driver in advance from the travel histories of a plurality of drivers, a behavior better suited to that driver can be notified. To register safer travel histories in the database, the storage unit 8 may store information indicating a criterion for safe traveling, the vehicle control unit 7 may determine whether a travel history satisfies this criterion, and the vehicle control unit 7 may then register in the database only travel histories that satisfy the criterion, without registering those that do not.
Furthermore, by associating the parameters indicating the traveling environment with the behaviors, the vehicle control unit 7 can accurately determine the next behavior without determining the specific traveling environment, that is, without labeling the traveling environment.
Note that the driver model (situation database) may be constructed from a travel history in which a behavior selected by the driver during automatic driving is associated with environmental parameters indicating the traveling environment at the time that behavior was presented. Alternatively, the driver model (situation database) may be constructed from a travel history in which a behavior selected by the driver during automatic driving is associated with environmental parameters indicating the traveling environment at the time the vehicle performed that behavior.
When the environmental parameters indicate the traveling environment at the time the vehicle performed the behavior selected by the driver, environmental parameters indicating the future traveling environment are predicted from the environmental parameters indicating the current traveling environment. Then, among the environmental parameters indicating the traveling environment when the vehicle performed the behavior selected by the driver, the behavior associated with the environmental parameters most similar to the predicted environmental parameters may be determined to be the first behavior, and some behaviors associated with other similar environmental parameters may be determined to be second behaviors.
This prediction is performed, for example, by extrapolating the environmental parameters at a future time from the environmental parameters indicating the traveling environment at the current time and at a time before the current time.
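Such an extrapolation can be sketched under the assumption of a simple linear model over the two sampled times; the patent does not fix the extrapolation method, so this is one illustrative choice.

```python
def extrapolate_params(previous, current, dt_past, dt_future):
    """Linearly extrapolate each environmental parameter to a future time,
    given its value now and its value dt_past seconds earlier."""
    rates = [(c - p) / dt_past for p, c in zip(previous, current)]
    return [c + r * dt_future for c, r in zip(current, rates)]

# A headway shrinking from 60 m to 50 m over 1 s is predicted at 40 m in 1 s.
predicted = extrapolate_params([60.0], [50.0], 1.0, 1.0)  # [40.0]
```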
Alternatively, the driver model (situation database) may be constructed from both a travel history in which a behavior selected by the driver during automatic driving is associated with environmental parameters indicating the traveling environment when that behavior was presented, and a travel history in which a behavior selected by the driver during automatic driving is associated with environmental parameters indicating the traveling environment when the vehicle performed that behavior.
In this case, for example, both travel histories are stored in the format shown in FIG. 28B, and the vehicle control unit 7 determines the next behavior from them. Here, the vehicle control unit 7 may set priorities between the two and, for example, preferentially determine the next behavior from the travel history in which the behavior selected by the driver during automatic driving is associated with the environmental parameters indicating the traveling environment when the vehicle performed that behavior.
In the present invention, a server device such as a cloud server may execute functions similar to those executed by the vehicle control unit 7. In particular, since the storage unit 8 accumulates an enormous amount of data as travel histories are stored, it may reside in a server device such as a cloud server rather than in the vehicle 1. Alternatively, the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior by referring to the driver model stored in the storage unit 8.
In a configuration in which the storage unit 8 is provided in a cloud server, it is desirable to provide a cache in case the storage unit 8 cannot be accessed due to a drop in communication speed, a communication interruption, or the like.
FIG. 29 is a block diagram showing an example of the arrangement of the cache. The vehicle control unit 7 causes the storage unit 8 to store the travel history through the communication unit 291, and causes the cache 292 to hold part of the driver model (situation database) stored in the storage unit 8 through the communication unit 291.
The vehicle control unit 7 accesses the driver model in the cache 292. Conceivable methods of creating the cache include a method of restricting by the presence or absence of environmental parameters, a method of using position information, and a method of processing the data. Each will be described below.
 まず、環境パラメータの有無で限定する方法について説明する。周囲の状況の比較により似た状況を抽出するには、同じ環境パラメータのみが存在する走行環境(シチュエーション)が十分にあれば可能である。従って、車両制御部7は、記憶部8に記憶された走行環境の中から同じ環境パラメータのみを持つ走行環境を抽出して、これらをソートし、キャッシュ292に保持する。 First, a method for limiting the presence or absence of environmental parameters will be described. In order to extract a similar situation by comparing surrounding situations, it is possible to have a sufficient driving environment (situation) in which only the same environmental parameters exist. Therefore, the vehicle control unit 7 extracts driving environments having only the same environmental parameters from the driving environments stored in the storage unit 8, sorts these, and holds them in the cache 292.
 ここで、車両制御部7は、検出された状況から得られる環境パラメータが変更されたタイミングで、一次キャッシュの更新を行う。こうすることで、車両制御部7は、通信速度の低下が発生しても似た周囲の状況を抽出することが可能になる。なお、変更の有無を判断する環境パラメータは、先に挙げた全ての環境パラメータでもよいし、一部の環境パラメータでもよい。 Here, the vehicle control unit 7 updates the primary cache at the timing when the environmental parameter obtained from the detected situation is changed. By doing so, the vehicle control unit 7 can extract a similar surrounding situation even if the communication speed decreases. Note that the environmental parameters for determining whether or not there is a change may be all of the environmental parameters listed above, or some of the environmental parameters.
Furthermore, since these environmental parameters change from moment to moment, a primary cache and a secondary cache may be provided within the cache 292. For example, the vehicle control unit 7 holds driving environments having the same environmental parameters in the primary cache, and holds in the secondary cache at least one of: driving environments in which one environmental parameter has been added to those held in the primary cache, and driving environments in which one environmental parameter has been removed from those held in the primary cache.
In this way, even if communication is temporarily interrupted, the vehicle control unit 7 can extract similar situations using only the data in the cache 292.
This case is described more concretely with reference to FIG. 30. When the sensor 62 detects a surrounding situation 303 in which only the side-front vehicle 302 is present around the host vehicle 301, the vehicle control unit 7 extracts the driving environments in which only a side-front vehicle 302 is present (driving environments in which only the same environmental parameter is present) from the storage unit 8, in which all driving environments (situations) are stored, and stores them in the primary cache 304.
In addition, the vehicle control unit 7 extracts from the storage unit 8 the driving environments in which exactly one vehicle other than the side-front vehicle 302 is also present (driving environments in which one environmental parameter has been added to the same environmental parameters), or the driving environments without the side-front vehicle 302 (driving environments in which one environmental parameter has been removed from the same environmental parameters), and stores them in the secondary cache 305.
Then, when the surrounding situation 303 detected by the sensor 62 changes, the vehicle control unit 7 copies the driving environment corresponding to the changed surrounding situation 303 from the secondary cache 305 to the primary cache 304, and updates the secondary cache 305 by extracting from the storage unit 8 the driving environments with one environmental parameter added to, and the driving environments with one environmental parameter removed from, the driving environment corresponding to the changed surrounding situation 303 and storing them in the secondary cache 305. This allows the vehicle control unit 7 to extract similar surrounding situations smoothly by comparing surrounding situations.
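The primary/secondary cache behavior described for FIG. 30 can be sketched as follows. This is a minimal illustration only, not the patent's implementation: representing a driving environment as a frozenset of environmental parameters, and the names `SituationCache`, `update`, and so on, are assumptions made for the example.

```python
class SituationCache:
    """Two-level cache of driving environments, keyed by the set of
    environmental parameters present (a sketch of the FIG. 30 scheme)."""

    def __init__(self, storage, all_params):
        self.storage = storage            # stands in for storage unit 8
        self.all_params = set(all_params) # every known environmental parameter
        self.primary = []                 # situations matching current params
        self.secondary = []               # situations one parameter away

    def _query(self, params):
        # Stands in for a (slow, possibly remote) lookup in storage unit 8.
        return [entry for p, entry in self.storage if p == params]

    def _neighbor_sets(self, params):
        # Parameter sets differing from `params` by exactly one element.
        plus = [params | {p} for p in self.all_params - params]
        minus = [params - {p} for p in params]
        return plus + minus

    def update(self, params):
        # Called when the detected environmental parameters change:
        # promote matching situations from the secondary cache if possible,
        # then refill the secondary cache with one-parameter neighbors.
        promoted = [entry for p, entry in self.secondary if p == params]
        self.primary = promoted or self._query(params)
        self.secondary = []
        for ns in self._neighbor_sets(params):
            for entry in self._query(ns):
                self.secondary.append((ns, entry))
```

When the detected situation changes to one of the neighbor situations, `update` serves the primary cache from the secondary cache, so a temporary communication interruption does not block the extraction of similar situations.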
Next, the method that uses position information is described. When position information is included in the environmental parameters, the vehicle control unit 7 can extract from the storage unit 8 the driving environments (situations) whose position, as indicated by that position information, falls within a certain range centered on the host vehicle's position, and store them in the cache 292.
In this case, the vehicle control unit 7 updates the cache 292 when the position indicated by the position information corresponding to the cached driving environments leaves the above range. In this way, the vehicle control unit 7 can extract similar surrounding situations even after a long communication interruption, as long as its position remains within a certain range.
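The position-based method amounts to a range filter over the stored situations. The sketch below assumes positions are (latitude, longitude) pairs and uses a flat-earth distance approximation; the function names and the dictionary layout are illustrative, not part of the patent.

```python
import math

def within_range(env_pos, own_pos, radius_m):
    """True if a situation's recorded position lies within radius_m meters
    of the host vehicle (flat-earth approximation, adequate at short range)."""
    lat_m = 111_320.0                                # meters per degree latitude
    lon_m = lat_m * math.cos(math.radians(own_pos[0]))
    dy = (env_pos[0] - own_pos[0]) * lat_m
    dx = (env_pos[1] - own_pos[1]) * lon_m
    return math.hypot(dx, dy) <= radius_m

def refresh_position_cache(storage, own_pos, radius_m):
    """Extract the situations whose position falls inside the range;
    re-run once the host vehicle leaves the range covered by the cache."""
    return [s for s in storage if within_range(s["pos"], own_pos, radius_m)]
```

Because the cache only needs refreshing when the vehicle leaves the cached range, it stays usable through long communication interruptions, matching the behavior described above.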
Finally, the method that processes the data is described. Operation histories including environmental parameters are accumulated in the storage unit 8. The vehicle control unit 7 divides the range of each environmental parameter into fixed-width intervals, creating a mesh in a multidimensional space, and then creates a table counting, by type, the behaviors contained in each mesh cell.
As an example, suppose the environmental parameters used are limited to two. The vehicle control unit 7 maps the environmental parameters included in the operation history onto a plane as shown in FIG. 31A, and divides each axis into fixed-width intervals, thereby dividing the plane into a plurality of blocks. Each block is called a mesh.
The vehicle control unit 7 counts the number of behaviors contained in each mesh by type (for example, acceleration, deceleration, lane change, or overtaking). FIG. 31B shows a table in which the number of behaviors contained in each mesh is counted by type.
The vehicle control unit 7 holds this table in the cache 292. When extracting similar surrounding situations by comparing surrounding situations, the vehicle control unit 7 determines into which mesh the detected environmental parameters fall, selects the behavior with the largest count among the behaviors contained in that mesh, and decides on the selected behavior as the behavior to be notified.
For example, when the vehicle control unit 7 determines that the detected environmental parameters fall into mesh number 3, it decides to notify the operation corresponding to the behavior with the largest count among the behaviors contained in mesh number 3 (here, "acceleration"). With this method, the cache 292 may be updated at any time, and the capacity of the cache 292 can be kept constant.
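The mesh table of FIGS. 31A and 31B can be sketched as follows for the two-parameter case. The bin width, the 4-column grid, the parameter scaling to [0, 1), and the function names are assumptions made for this illustration.

```python
from collections import Counter

def mesh_id(params, width=0.25, cols=4):
    """Map a (p1, p2) pair in [0, 1) x [0, 1) to a 1-based cell number,
    numbered row by row as in FIG. 31B."""
    col = int(params[0] // width)
    row = int(params[1] // width)
    return row * cols + col + 1

def build_table(history, width=0.25, cols=4):
    """history: list of ((p1, p2), behavior) pairs from the operation log.
    Returns {mesh_id: Counter of behavior counts}, the table of FIG. 31B."""
    table = {}
    for params, behavior in history:
        table.setdefault(mesh_id(params, width, cols), Counter())[behavior] += 1
    return table

def behavior_to_notify(table, params, width=0.25, cols=4):
    """Pick the most frequent behavior of the mesh the detected
    environmental parameters fall into (None if the mesh is empty)."""
    counts = table.get(mesh_id(params, width, cols))
    return counts.most_common(1)[0][0] if counts else None
```

Since the table has a fixed number of cells and counters, its size is constant regardless of how much history it summarizes, which is why the cache capacity can be held constant under this method.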
The cache is created by combining one or more of these methods. The methods listed above are only examples, however, and cache creation is not limited to them.
The above is an example of the driver model extension of the fourth embodiment. In this example, the vehicle control unit 7 acquires feature quantity information indicating the driver's driving characteristics, including information on past driving environments, and the storage unit 8 stores that feature quantity information. When it is determined that the behavior of the vehicle needs to be changed, the vehicle control unit 7 determines, from the feature quantity information stored in the storage unit 8, the information similar to the feature quantities indicating the driver's driving characteristics that include the newly acquired driving environment information, and notifies the behavior corresponding to the determined information.
In the driver model extension example of the fourth embodiment, the following is also possible. The feature quantity information indicating the driver's driving characteristics, including past driving environment information, may be at least one of: the feature quantity information at the time the vehicle's behavior was presented to the driver, and the feature quantity information at the time the driver selected a behavior.
Also, when the feature quantity information indicating the driver's driving characteristics, including past driving environment information, comprises both the feature quantity information at the time the vehicle's behavior was presented to the driver and the feature quantity information at the time the driver selected a behavior, the information similar to the feature quantities indicating the driver's driving characteristics that include the newly acquired driving environment information is determined from both sets of feature quantity information, and the behavior corresponding to the determined information is notified.
Also, when the feature quantity information indicating the driver's driving characteristics, including past driving environment information, comprises both the feature quantity information at the time the vehicle's behavior was presented to the driver and the feature quantity information at the time the driver selected a behavior, the information similar to the feature quantities indicating the driver's driving characteristics that include the newly acquired driving environment information is determined preferentially from the feature quantity information at the time the driver selected a behavior, and the behavior corresponding to the determined information is notified.
Also, the feature quantity information indicating the driver's driving characteristics, including past driving environment information, may be feature quantity information indicating the driver's driving characteristics during automatic driving and/or manual driving of the vehicle.
As described above, the vehicle control unit 7 can construct a driver model better suited to the driver's driving tendencies, and can perform automatic driving that is more appropriate for the driver based on the constructed driver model. Because parameters indicating the driving environment are associated with behaviors, the next behavior can be determined accurately without processing to identify a specific driving environment, that is, without labeling the driving environment.
(Common Description for Embodiments 5 to 11)
In recent years, development related to automatic driving of automobiles has advanced. The automation levels defined in 2013 by the NHTSA (National Highway Traffic Safety Administration) are classified as no automation (level 0), function-specific automation (level 1), combined-function automation (level 2), limited self-driving automation (level 3), and full self-driving automation (level 4). Level 1 is a driving assistance system that automatically performs one of acceleration, deceleration, and steering, and level 2 is a driving assistance system that automatically performs two or more of acceleration, deceleration, and steering in coordination; in both cases the driver remains involved in the driving operation. Automation level 4 is a fully automatic driving system that performs all of acceleration, deceleration, and steering automatically, with no driver involvement in the driving operation. Automation level 3 is a semi-fully automatic driving system that performs all of acceleration, deceleration, and steering automatically, but in which the driver performs driving operations when necessary.
The following embodiments mainly propose a device (hereinafter also called a "driving assistance device") that, at automation level 3 or 4, controls an HMI (Human Machine Interface) for exchanging information on the vehicle's automatic driving with a vehicle occupant (for example, the driver). One aim of the technologies described in Embodiments 5 to 11 is to support the realization of safe and comfortable automatic driving by presenting useful information to the driver during the vehicle's automatic driving. In addition, by presenting information that causes the driver little discomfort, the technologies described in Embodiments 5 to 11 allow the driver to change, more simply and easily, an action determined by the vehicle during automatic driving.
The vehicle "action" in the following description corresponds to the vehicle "behavior" in the descriptions of Embodiments 1 to 4, and includes operating states such as steering and braking while the vehicle is traveling or stopped in automatic or manual driving, as well as control content related to automatic driving control: for example, constant-speed traveling, acceleration, deceleration, temporary stop, stop, lane change, course change, right or left turn, and parking. Vehicle actions are divided into the action currently being executed (also called the "current action"), actions to be executed after the currently executing action (also called "scheduled actions"), and so on. Scheduled actions include actions to be executed immediately after the current action and actions scheduled for execution at any time after the currently executing action ends.
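The current-action / scheduled-action distinction above amounts to a small data model. A minimal sketch, with names chosen for this illustration only:

```python
from dataclasses import dataclass
from enum import Enum

class ActionKind(Enum):
    CURRENT = "current"        # the action being executed now
    SCHEDULED = "scheduled"    # executed after the current action ends

@dataclass(frozen=True)
class Action:
    name: str                  # e.g. "lane change", "deceleration"
    kind: ActionKind
```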
FIG. 32 is a block diagram showing the configuration of the vehicle 1000, in particular the configuration related to automatic driving. The vehicle 1000 can travel in an automatic driving mode and includes a notification device 1002, an input device 1004, a wireless device 1008, a driving operation unit 1010, a detection unit 1020, an automatic driving control device 1030, and a driving assistance device 1040. The devices shown in FIG. 32 may be connected by wired communication such as dedicated lines or a CAN (Controller Area Network), or by wired or wireless communication such as USB (Universal Serial Bus), Ethernet (registered trademark), Wi-Fi (registered trademark), or Bluetooth (registered trademark).
The notification device 1002 notifies the driver of information related to the traveling of the vehicle 1000. The notification device 1002 may be a display unit that shows information, such as a car navigation system, head-up display, or center display installed in the vehicle, or a light emitter such as an LED installed on the steering wheel, a pillar, the dashboard, or around the meter panel. Alternatively, it may be a speaker that converts information into sound to notify the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel). The notification device 1002 may also be a combination of these.
The vehicle 1000 corresponds to the vehicle 1 of Embodiments 1 to 4. The notification device 1002 corresponds to the information notification device 9 of FIGS. 1 and 13, the input device 1004 corresponds to the operation unit 51 of FIG. 1 and the input unit 102 of FIG. 13, and the detection unit 1020 corresponds to the detection unit 6 of FIGS. 1 and 13. The automatic driving control device 1030 and the driving assistance device 1040 correspond to the vehicle control unit 7 of FIGS. 1 and 13. In the following, descriptions of configurations already explained in Embodiments 1 to 4 are omitted as appropriate.
The notification device 1002 is a user interface device that presents information on the vehicle's automatic driving to the occupant. The notification device 1002 may be a head unit such as a car navigation system or display audio, a mobile terminal device such as a smartphone or tablet, or a dedicated console terminal device. It may also be a liquid crystal display, an organic EL display, or a head-up display (HUD). The input device 1004 is a user interface device that receives operation input from the occupant. For example, the input device 1004 receives information on the automatic driving of the host vehicle entered by the driver, and outputs the received information to the driving assistance device 1040 as an operation signal.
FIG. 33 schematically shows the interior of the vehicle 1000 of FIG. 32. The notification device 1002 may be a head-up display (HUD) 1002a or a center display 1002b. The input device 1004 may be a first operation unit 1004a provided on the steering wheel 1011 or a second operation unit 1004b provided between the driver's seat and the passenger seat. The notification device 1002 and the input device 1004 may be integrated, for example implemented as a touch panel display.
Although not mentioned below, as shown in FIG. 33, the vehicle 1000 may be provided with a speaker 1006 that presents information on automatic driving to the occupant by voice. In this case, the driving assistance device 1040 may display an image showing information on automatic driving on the notification device 1002 and, together with it or instead of it, output sound indicating information on automatic driving from the speaker 1006.
Returning to FIG. 32, the wireless device 1008 supports mobile phone communication systems, WMAN (Wireless Metropolitan Area Network), and the like, and performs wireless communication with devices (not shown) outside the vehicle 1000. The driving operation unit 1010 includes a steering wheel 1011, a brake pedal 1012, an accelerator pedal 1013, and a winker switch 1014. The steering wheel 1011 corresponds to the steering wheel 5 of FIGS. 1 and 13, the brake pedal 1012 to the brake pedal 2 of FIGS. 1 and 13, the accelerator pedal 1013 to the accelerator pedal 3 of FIGS. 1 and 13, and the winker switch 1014 to the winker lever 4 of FIGS. 1 and 13.
The steering wheel 1011, brake pedal 1012, accelerator pedal 1013, and winker switch 1014 can be electronically controlled by, respectively, a steering ECU, a brake ECU, at least one of an engine ECU and a motor ECU, and a winker controller. In the automatic driving mode, the steering ECU, brake ECU, engine ECU, and motor ECU drive their actuators according to control signals supplied from the automatic driving control device 1030. The winker controller turns the winker lamps on or off according to control signals supplied from the automatic driving control device 1030.
The detection unit 1020 detects the surrounding conditions and traveling state of the vehicle 1000. As partly described in Embodiments 1 to 4, the detection unit 1020 detects, for example, the speed of the vehicle 1000, the relative speed of a preceding vehicle with respect to the vehicle 1000, the distance between the vehicle 1000 and the preceding vehicle, the relative speed of a vehicle in a side lane with respect to the vehicle 1000, the distance between the vehicle 1000 and the vehicle in the side lane, and the position information of the vehicle 1000. The detection unit 1020 outputs the various detected information (hereinafter, "detection information") to the automatic driving control device 1030 and the driving assistance device 1040. Details of the detection unit 1020 are described later.
The automatic driving control device 1030 is an automatic driving controller that implements an automatic driving control function and determines the actions of the vehicle 1000 in automatic driving. The automatic driving control device 1030 includes a control unit 1031, a storage unit 1032, and an I/O unit (input/output unit) 1033. The configuration of the control unit 1031 can be realized by cooperation of hardware and software resources, or by hardware resources alone. A processor, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, applications, and firmware can be used as software resources. The storage unit 1032 has a nonvolatile recording medium such as flash memory. The I/O unit 1033 performs communication control according to various communication formats. For example, the I/O unit 1033 outputs information on automatic driving to the driving assistance device 1040 and receives control commands from the driving assistance device 1040. The I/O unit 1033 also receives detection information from the detection unit 1020.
The control unit 1031 applies the control commands input from the driving assistance device 1040 and the various information collected from the detection unit 1020 or the various ECUs to an automatic driving algorithm, and calculates control values for controlling the targets of automatic control, such as the traveling direction of the vehicle 1000. The control unit 1031 transmits the calculated control values to the ECU or controller of each control target; in this embodiment, to the steering ECU, brake ECU, engine ECU, and winker controller. In the case of an electric vehicle or hybrid car, the control values are transmitted to the motor ECU instead of or in addition to the engine ECU.
The driving assistance device 1040 is an HMI controller that executes the interface function between the vehicle 1000 and the driver, and includes a control unit 1041, a storage unit 1042, and an I/O unit (input/output unit) 1043. The control unit 1041 executes various data processing such as HMI control. The control unit 1041 can be realized by cooperation of hardware and software resources, or by hardware resources alone. A processor, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, applications, and firmware can be used as software resources.
The storage unit 1042 is a storage area that stores data referred to or updated by the control unit 1041, realized for example by a nonvolatile recording medium such as flash memory. The I/O unit 1043 performs various kinds of communication control according to various communication formats. The I/O unit 1043 includes an operation input unit 1050, an image output unit 1051, a detection information input unit 1052, a command IF (interface) 1053, and a communication IF 1056.
The operation input unit 1050 receives from the input device 1004 the operation signals resulting from operations made on the input device 1004 by the driver, an occupant, or a user outside the vehicle, and outputs them to the control unit 1041. The image output unit 1051 outputs the image data generated by the control unit 1041 to the notification device 1002 for display. The detection information input unit 1052 receives from the detection unit 1020 the information that results from the detection processing of the detection unit 1020 and indicates the current surrounding conditions and traveling state of the vehicle 1000 (hereinafter called "detection information"), and outputs it to the control unit 1041.
The command IF 1053 executes interface processing with the automatic driving control device 1030 and includes an action information input unit 1054 and a command output unit 1055. The action information input unit 1054 receives the information on the automatic driving of the vehicle 1000 transmitted from the automatic driving control device 1030 and outputs it to the control unit 1041. The command output unit 1055 receives from the control unit 1041 control commands instructing the automatic driving control device 1030 on the mode of automatic driving, and transmits them to the automatic driving control device 1030.
The communication IF 1056 executes interface processing with the wireless device 1008. The communication IF 1056 transmits data output from the control unit 1041 to the wireless device 1008 and causes the wireless device 1008 to transmit it to devices outside the vehicle. The communication IF 1056 also receives data from devices outside the vehicle transferred by the wireless device 1008 and outputs it to the control unit 1041.
Here, the automatic driving control device 1030 and the driving assistance device 1040 are configured as separate devices. As a modification, as indicated by the broken line in FIG. 32, the automatic driving control device 1030 and the driving assistance device 1040 may be integrated into one controller. In other words, one automatic driving control device may have the functions of both the automatic driving control device 1030 and the driving assistance device 1040 of FIG. 32. In this case, a plurality of ECUs may be provided in the integrated controller, with one ECU realizing the function of the automatic driving control device 1030 and another ECU realizing the function of the driving assistance device 1040. Alternatively, one ECU in the integrated controller may execute a plurality of OSs (operating systems), with one OS realizing the function of the automatic driving control device 1030 and another OS realizing the function of the driving assistance device 1040.
 図34は、図32の検出部1020の詳細な構成を示すブロック図である。検出部1020は、位置情報取得部1021、センサ1022、速度情報取得部1023、地図情報取得部1024を有する。位置情報取得部1021は、GPS受信機から車両1000の現在位置を取得する。 FIG. 34 is a block diagram showing a detailed configuration of the detection unit 1020 in FIG. The detection unit 1020 includes a position information acquisition unit 1021, a sensor 1022, a speed information acquisition unit 1023, and a map information acquisition unit 1024. The position information acquisition unit 1021 acquires the current position of the vehicle 1000 from the GPS receiver.
 センサ1022は、車外の状況および車両1000の状態を検出するための各種センサの総称である。車外の状況を検出するためのセンサとして例えばカメラ、ミリ波レーダ、LIDAR(Light Detection and Ranging、Laser Imaging Detection and Ranging)、気温センサ、気圧センサ、湿度センサ、照度センサ等が搭載される。車外の状況は、車線情報を含む自車の走行する道路状況、天候を含む環境、自車周辺状況、近傍位置にある他車(隣接車線を走行する他車等)を含む。なお、センサが検出できる車外の情報であれば何でもよい。また車両1000の状態を検出するためのセンサとして例えば、加速度センサ、ジャイロセンサ、地磁気センサ、傾斜センサ等が搭載される。 The sensor 1022 is a general term for various sensors for detecting the situation outside the vehicle and the state of the vehicle 1000. For example, a camera, a millimeter wave radar, a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a temperature sensor, a pressure sensor, a humidity sensor, an illuminance sensor, and the like are mounted as sensors for detecting the situation outside the vehicle. The situation outside the vehicle includes a road condition in which the host vehicle travels including lane information, an environment including weather, a situation around the host vehicle, and other vehicles in the vicinity (such as other vehicles traveling in the adjacent lane). Any information outside the vehicle that can be detected by the sensor may be used. For example, an acceleration sensor, a gyro sensor, a geomagnetic sensor, a tilt sensor, and the like are mounted as sensors for detecting the state of the vehicle 1000.
 速度情報取得部1023は、車速センサから車両1000の現在速度を取得する。地図情報取得部1024は、地図データベースから車両1000の現在位置周辺の地図情報を取得する。地図データベースは、車両1000内の記録媒体に記録されていてもよいし、使用時にネットワークを介して地図サーバからダウンロードしてもよい。以下、各実施の形態を説明する。 Speed information acquisition unit 1023 acquires the current speed of vehicle 1000 from the vehicle speed sensor. The map information acquisition unit 1024 acquires map information around the current position of the vehicle 1000 from the map database. The map database may be recorded on a recording medium in the vehicle 1000, or may be downloaded from a map server via a network when used. Hereinafter, each embodiment will be described.
 (実施の形態5)
 まず概要を説明する。車両の自動運転中に、車両の現在の行動を報知するだけで将来の行動について何も報知しない場合、車両の乗員に不安感を抱かせてしまうことがあった。そこで実施の形態5では、自動運転において車両1000に現在実行させる行動(以下「現在行動」と呼ぶ。)と、その行動の次に実行させる予定の行動(以下「予定行動」と呼ぶ。)の両方を運転者へ提示する。具体的には、自動運転制御装置1030が現在行動と予定行動を決定する。そして運転支援装置1040が、現在行動と予定行動の両方を車両1000内の報知装置1002に表示させる。
(Embodiment 5)
First, an outline will be described. During automatic driving, if only the current behavior of the vehicle is notified and nothing is notified about its future behavior, the vehicle occupant may feel uneasy. Therefore, in the fifth embodiment, both an action that the vehicle 1000 is currently made to execute in automatic driving (hereinafter referred to as the "current action") and an action scheduled to be executed next (hereinafter referred to as the "scheduled action") are presented to the driver. Specifically, the automatic driving control device 1030 determines the current action and the scheduled action, and the driving support device 1040 displays both on the notification device 1002 in the vehicle 1000.
 以下では、これまでの実施の形態で説明済の内容は適宜省略する。本実施の形態で説明する構成或いは動作は、趣旨を逸脱しない範囲で、他の実施の形態或いは変形例で説明する構成或いは動作と組み合わせることができ、また置き換えることができる。 In the following, the contents already described in the above embodiments are omitted as appropriate. The structure or operation described in this embodiment can be combined with or replaced with the structure or operation described in another embodiment or modification without departing from the spirit of the present invention.
 実施の形態5の自動運転制御装置1030は、検出部1020が出力した検出情報に基づいて、現在行動と予定行動の両方を決定する。運転支援装置1040の行動情報入力部1054は、自動運転制御装置1030が車両1000に実行させる現在行動を示す現在行動情報と、その現在行動の次に自動運転制御装置1030が車両1000に実行させる予定行動を示す予定行動情報を自動運転制御装置1030から取得する。実施の形態5では、現在行動情報と予定行動情報の両方を含むデータを行動情報として取得する。現在行動とは、車両が現在実行している行動と言える。また予定行動とは、現在実行している行動が終了した後に、実行する行動と言え、例えば現在行動の次に実行する行動であってもよく、現在行動の次の次に実行する行動であってもよい。 The automatic driving control device 1030 according to the fifth embodiment determines both the current action and the scheduled action based on the detection information output by the detection unit 1020. The behavior information input unit 1054 of the driving support device 1040 acquires, from the automatic driving control device 1030, current action information indicating the current action that the automatic driving control device 1030 causes the vehicle 1000 to execute, and scheduled action information indicating the scheduled action that the automatic driving control device 1030 will cause the vehicle 1000 to execute after that current action. In the fifth embodiment, data including both the current action information and the scheduled action information is acquired as the behavior information. The current action is the action that the vehicle is currently executing. The scheduled action is an action to be executed after the currently executed action is completed; it may be, for example, the action executed immediately after the current action, or the action executed after that.
 図35は、自動運転制御装置1030から入力される行動情報を示す。この行動情報には、現在行動の識別情報である現在行動情報と、予定行動の識別情報である予定行動情報が含まれる。現在行動および予定行動の識別情報は、行動の種類をユニークに識別可能なコード等であってもよい。また行動情報には、現在時刻から予定行動が実行されるまでの時間であり、言い換えれば、現在行動から予定行動に切り替わるまでの現在時刻からの時間を示す残り時間情報が含まれる。例えば、図35の行動情報は、現在行動が右への車線変更であり、予定行動が加速であり、現在行動を終了して予定行動を開始するまでの残り時間が10秒であることを示している。「現在時刻」は、自動運転制御装置1030が認識する現在時点の時刻であり、例えば自動運転制御装置1030内部のシステム時刻でもよい。また、車両1000内部で現在時刻を計時する不図示の時計装置から取得した時刻でもよい。 FIG. 35 shows the behavior information input from the automatic driving control device 1030. The behavior information includes current action information, which is identification information of the current action, and scheduled action information, which is identification information of the scheduled action. The identification information of the current action and the scheduled action may be a code or the like that uniquely identifies the type of action. The behavior information also includes remaining time information indicating the time from the current time until the scheduled action is executed, in other words, the time from the current time until the current action switches to the scheduled action. For example, the behavior information in FIG. 35 indicates that the current action is a lane change to the right, the scheduled action is acceleration, and the remaining time from the end of the current action to the start of the scheduled action is 10 seconds. The "current time" is the time recognized by the automatic driving control device 1030 at the present moment, and may be, for example, the system time inside the automatic driving control device 1030. It may also be a time acquired from a clock device (not shown) that keeps the current time inside the vehicle 1000.
Furthermore, it may be the time when the content of the current action is determined, the time when the content of the scheduled action to be executed in the future is determined, or the time when the behavior information is notified to the driving support device 1040.
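The behavior information of FIG. 35 can be summarized as a small record holding the two action identifiers and the remaining time. The following is a minimal sketch in Python; the class name, field names, and action codes such as "LC_RIGHT" are hypothetical illustrations, not defined in the specification:

```python
from dataclasses import dataclass

# Hypothetical model of the behavior information of FIG. 35.
# The action codes ("LC_RIGHT", "ACCEL") are illustrative only.
@dataclass
class BehaviorInformation:
    current_action: str       # identification information of the current action
    scheduled_action: str     # identification information of the scheduled action
    remaining_seconds: float  # time until the scheduled action is executed

# Example corresponding to FIG. 35: a lane change to the right now,
# acceleration scheduled next, 10 seconds remaining.
info = BehaviorInformation(current_action="LC_RIGHT",
                           scheduled_action="ACCEL",
                           remaining_seconds=10.0)
```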
 図36は、運転支援装置1040の制御部1041の詳細な構成を示すブロック図である。制御部1041は画像生成部1060を含む。画像生成部1060は、自動運転制御装置1030から入力された現在行動情報に基づいて現在行動を表す現在行動画像を生成し、予定行動情報に基づいて予定行動を表す予定行動画像を生成する。実施の形態5では、予定行動画像のサイズよりも大きいサイズの現在行動画像を生成する。 FIG. 36 is a block diagram showing a detailed configuration of the control unit 1041 of the driving support device 1040. The control unit 1041 includes an image generation unit 1060. The image generation unit 1060 generates a current action image representing the current action based on the current action information input from the automatic driving control apparatus 1030, and generates a scheduled action image representing the scheduled action based on the scheduled action information. In the fifth embodiment, a current action image having a size larger than the size of the scheduled action image is generated.
 運転支援装置1040の画像出力部1051は、車両1000の運転者の一定視野内に現在行動画像と予定行動画像を表示させるように、現在行動画像と予定行動画像を報知装置1002へ出力する。運転者の一定視野内に表示させるとは、運転者の同一視野内に表示させるとも言える。例えば、現在行動画像と予定行動画像の両方を運転者が同時に視認可能なように、それらの画像を近傍位置或いは予め定められた距離内に同時に表示させてもよい。また、それらの画像を同一画面内の予め定められた位置に時間的な重なりを設けて表示させてもよい。また、現在行動画像と予定行動画像の両方を視線の移動が不要な近傍範囲に表示させてもよい。さらにまた、このような近傍範囲に現在行動画像と予定行動画像を並べた画面の画像データを画像生成部1060が生成してもよい。 The image output unit 1051 of the driving support device 1040 outputs the current action image and the scheduled action image to the notification device 1002 so that they are displayed within a fixed visual field of the driver of the vehicle 1000. Displaying within a fixed visual field of the driver can also be described as displaying within the same visual field of the driver. For example, the current action image and the scheduled action image may be displayed simultaneously at nearby positions, or within a predetermined distance of each other, so that the driver can view both at the same time. These images may also be displayed with a temporal overlap at predetermined positions on the same screen. Both the current action image and the scheduled action image may also be displayed within a range close enough that no movement of the line of sight is required. Furthermore, the image generation unit 1060 may generate image data of a screen in which the current action image and the scheduled action image are arranged within such a range.
 また画像生成部1060は、自動運転制御装置1030から入力された残り時間情報によって更新される予定行動が実行されるまでの時間を表す残り時間画像をさらに生成する。画像出力部1051は、残り時間画像を報知装置1002へさらに出力し、車両1000の運転者の一定視野内に、現在行動画像と、残り時間画像を加えた予定行動画像とを表示させる。 Further, the image generation unit 1060 further generates a remaining time image representing the time until the scheduled action updated by the remaining time information input from the automatic driving control device 1030 is executed. The image output unit 1051 further outputs the remaining time image to the notification device 1002, and displays the current action image and the scheduled action image to which the remaining time image is added within a fixed visual field of the driver of the vehicle 1000.
 報知装置1002に表示される自動運転に関する各種情報を含む画面を「自動運転情報画面」とも呼ぶ。図37A~37Bは、自動運転情報画面の一例を示す。図37Aは第1時点での自動運転情報画面1103の例を示し、図37Bは第1時点より後の第2時点での自動運転情報画面1103の例を示している。 The screen including various information related to automatic driving displayed on the notification device 1002 is also referred to as “automatic driving information screen”. 37A and 37B show examples of the automatic driving information screen. FIG. 37A shows an example of the automatic driving information screen 1103 at the first time point, and FIG. 37B shows an example of the automatic driving information screen 1103 at the second time point after the first time point.
 自動運転情報画面1103における現在行動画像1104の表示態様は、予定行動画像1106の表示態様とは異なるよう設定される。これにより、車両の現在行動と、その現在行動に続く予定行動とを運転者が混同してしまうことを防止する。例えば、現在行動画像1104は、予定行動画像1106より大きいサイズで表示される。また、現在行動画像1104は、車両1000の運転者の一定視野内の中心位置で表示され、例えば自動運転情報画面1103の中心の近傍位置に表示される。その一方、予定行動画像1106は、運転者の一定視野内の周辺位置で表示され、例えば自動運転情報画面1103の端付近に表示される。図37A、37Bの例では、現在行動画像1104は、自動運転情報画面1103上の画面上のメイン領域1100に表示される一方、予定行動画像1106はメイン領域1100よりも小さいサブ領域1102に表示される。 The display mode of the current action image 1104 on the automatic driving information screen 1103 is set to differ from that of the scheduled action image 1106. This prevents the driver from confusing the current action of the vehicle with the scheduled action that follows it. For example, the current action image 1104 is displayed at a larger size than the scheduled action image 1106. The current action image 1104 is also displayed at a central position within a fixed visual field of the driver of the vehicle 1000, for example near the center of the automatic driving information screen 1103. The scheduled action image 1106, on the other hand, is displayed at a peripheral position within the fixed visual field of the driver, for example near an edge of the automatic driving information screen 1103. In the examples of FIGS. 37A and 37B, the current action image 1104 is displayed in a main area 1100 on the automatic driving information screen 1103, while the scheduled action image 1106 is displayed in a sub area 1102 that is smaller than the main area 1100.
 また自動運転情報画面1103では、残り時間画像1108が予定行動画像1106に対応付けて表示される、具体的には、予定行動画像1106と残り時間画像1108は、同じサブ領域1102内の近傍位置に並べて表示される。実施の形態5では、残り時間画像1108は複数の時間インジケータ1109を含む。各時間インジケータ1109は、点灯状態または消灯状態で表示され、予定行動が実行されるまでの残り時間が長いほど多くの時間インジケータ1109が点灯状態で表示される。 On the automatic driving information screen 1103, the remaining time image 1108 is displayed in association with the scheduled action image 1106; specifically, the scheduled action image 1106 and the remaining time image 1108 are displayed side by side at nearby positions within the same sub area 1102. In the fifth embodiment, the remaining time image 1108 includes a plurality of time indicators 1109. Each time indicator 1109 is displayed in a lit or unlit state, and the longer the remaining time until the scheduled action is executed, the more time indicators 1109 are displayed in the lit state.
 また残り時間画像1108では、時間経過に伴って、点灯状態の時間インジケータ1109が徐々に消灯状態へ変化していくことにより、予定行動実行までの残り時間の状況を運転者に報知する。例えば、1つの時間インジケータ1109あたり5秒が割当てられ、5つの時間インジケータ1109で最大25秒の残り時間を示す構成であってもよい。1つの時間インジケータ1109が消灯するまでの時間は、開発者の知見或いは実験等により適切な値が定められてよく、運転者等のユーザが任意の時間を設定可能であってもよい。 Also, in the remaining time image 1108, the time indicator 1109 in the lit state gradually changes to the unlit state as time elapses, thereby notifying the driver of the remaining time until the scheduled action is executed. For example, 5 seconds may be allocated to one time indicator 1109, and the remaining time of the maximum 25 seconds may be indicated by the five time indicators 1109. An appropriate value for the time until one time indicator 1109 is turned off may be determined by a developer's knowledge or experiment, and a user such as a driver may be able to set an arbitrary time.
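Under the example configuration above (5 seconds per indicator, five indicators, a maximum of 25 seconds), the number of time indicators 1109 to show lit for a given remaining time could be computed as follows. This is an illustrative sketch only; the function name and parameter defaults are assumptions based on the example in the text:

```python
import math

def lit_indicator_count(remaining_seconds: float,
                        seconds_per_indicator: int = 5,
                        max_indicators: int = 5) -> int:
    """Number of time indicators 1109 shown in the lit state.

    Each indicator stands for up to `seconds_per_indicator` seconds,
    so 25 seconds lights all five indicators and 0 seconds lights none.
    """
    if remaining_seconds <= 0:
        return 0
    return min(max_indicators,
               math.ceil(remaining_seconds / seconds_per_indicator))
```

For example, the 10-second remaining time of FIG. 35 would light two of the five indicators under these assumptions.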
 典型的には、図37Aの自動運転情報画面1103が表示された第1時点から、1つの時間インジケータ1109分の時間が経過すると、車両1000の現在行動が図37Aの予定行動画像1106で示された「加速」に切り替わる。それとともに、報知装置1002でも図37Bの自動運転情報画面1103が表示される。図37Bの自動運転情報画面1103においては、図37Aの予定行動画像1106で示された「加速」が現在行動画像1104で示される。 Typically, when a time corresponding to one time indicator 1109 has elapsed from the first time point at which the automatic driving information screen 1103 of FIG. 37A was displayed, the current action of the vehicle 1000 switches to the "acceleration" shown by the scheduled action image 1106 in FIG. 37A. At the same time, the notification device 1002 displays the automatic driving information screen 1103 of FIG. 37B, in which the "acceleration" shown by the scheduled action image 1106 in FIG. 37A is now shown by the current action image 1104.
 自動運転制御装置1030は、予定行動として、複数の行動(単一行動)を連続して組み合わせた情報である行動計画を決定してもよい。行動計画は「追い越し」を含む。例えば、行動計画「追い越し」は、3つの単一行動の組み合わせで構成され、具体的には(1)右への車線変更、(2)加速、(3)左への車線変更、の組み合わせで構成される。自動運転制御装置1030が予定行動として行動計画を決定した場合、運転支援装置1040の行動情報入力部1054は、その行動計画を示す予定行動情報を取得する。具体的には、図35で示した行動情報の予定行動欄(予定行動情報)に、複数の単一行動と、それぞれの単一行動の実行順序を示す情報が設定される。 The automatic driving control device 1030 may determine, as the scheduled action, an action plan, that is, information obtained by combining a plurality of actions (single actions) in sequence. One example of an action plan is "overtaking". For example, the action plan "overtaking" is composed of a combination of three single actions: (1) a lane change to the right, (2) acceleration, and (3) a lane change to the left. When the automatic driving control device 1030 determines an action plan as the scheduled action, the behavior information input unit 1054 of the driving support device 1040 acquires scheduled action information indicating that action plan. Specifically, the plurality of single actions and information indicating the execution order of each single action are set in the scheduled action field (scheduled action information) of the behavior information shown in FIG. 35.
 予定行動が行動計画である場合、画像生成部1060は、行動計画に含まれる複数の単一行動を行動単位で表現した複数の画像を、複数の単一行動が実行される順序にしたがって配列した予定行動画像1106を生成する。図38は、自動運転情報画面の一例を示す。同図の自動運転情報画面1103には、追い越しを示す行動計画を表す予定行動画像1106が表示されている。また同図の予定行動画像1106は、3つの単一行動を示す3つの画像を含み、実行順序が早い単一行動の画像ほど下位置に配置されている。 When the scheduled action is an action plan, the image generation unit 1060 generates a scheduled action image 1106 in which a plurality of images, each representing one of the single actions included in the action plan, are arranged in the order in which those single actions are executed. FIG. 38 shows an example of the automatic driving information screen. On the automatic driving information screen 1103 in the figure, a scheduled action image 1106 representing an action plan for overtaking is displayed. The scheduled action image 1106 in the figure includes three images showing the three single actions, and an image of a single action earlier in the execution order is arranged at a lower position.
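The decomposition of the action plan "overtaking" described above can be represented as an ordered list of single actions. A minimal sketch with hypothetical action codes; since the text states that the image of the earliest single action is placed lowest, the top-to-bottom display order is the reverse of the execution order:

```python
# Action plan "overtaking" as the ordered single actions given in the text:
# (1) lane change right, (2) accelerate, (3) lane change left.
# The codes are hypothetical identifiers.
overtaking_plan = ["LC_RIGHT", "ACCEL", "LC_LEFT"]

# The earliest action is drawn lowest, so the top-to-bottom display
# order on the scheduled action image 1106 is the reverse of execution order.
display_top_to_bottom = list(reversed(overtaking_plan))
```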
 また自動運転制御装置1030は、予定行動として、自動運転において現在行動の次に実行することが可能な複数の行動候補を予定行動として決定してもよい。この場合の予定行動は、例えば第1候補「加速」と第2候補「減速」の2つを含んでもよく、さらに第3候補「左への車線変更」を加えた3つを含んでもよい。自動運転制御装置1030は、複数の行動候補のそれぞれを実行する場合の残り時間を決定し、さらに複数の行動候補間の優先順位を決定する。 The automatic driving control device 1030 may also determine, as the scheduled action, a plurality of action candidates that can be executed next after the current action in automatic driving. The scheduled action in this case may include, for example, two candidates, a first candidate "acceleration" and a second candidate "deceleration", or three candidates, with a third candidate "lane change to the left" added. The automatic driving control device 1030 determines the remaining time for executing each of the plurality of action candidates, and further determines the priority order among the plurality of action candidates.
 運転支援装置1040の行動情報入力部1054は、複数の行動候補を含む予定行動情報(ここでは「候補情報」と呼ぶ。)を取得する。具体的には、図35で示した行動情報の予定行動欄に、複数の行動候補と、各行動候補についての残り時間および優先順位を示す候補情報が設定される。 The behavior information input unit 1054 of the driving support device 1040 acquires scheduled behavior information including a plurality of behavior candidates (referred to herein as “candidate information”). Specifically, candidate information indicating a plurality of action candidates and the remaining time and priority for each action candidate is set in the scheduled action column of action information shown in FIG.
 画像生成部1060は、行動情報が示す現在行動を表す現在行動画像1104と、行動情報が示す複数の行動候補に対応する複数の候補画像を生成する。実施の形態5では、複数の候補画像として、複数の行動候補を表す複数の予定行動画像1106と、各行動候補の残り時間を表す複数の残り時間画像1108を生成する。画像出力部1051は、報知装置1002に対して現在行動画像1104を出力するとともに、複数の候補画像を所定の順位付けをして出力する。これにより、運転者の一定視野内に現在行動画像1104とともに、所定の順位付けにより表示配置された複数の候補画像を表示させる。 The image generation unit 1060 generates a current action image 1104 representing the current action indicated by the action information and a plurality of candidate images corresponding to the plurality of action candidates indicated by the action information. In the fifth embodiment, a plurality of scheduled action images 1106 representing a plurality of action candidates and a plurality of remaining time images 1108 representing the remaining time of each action candidate are generated as a plurality of candidate images. The image output unit 1051 outputs the current action image 1104 to the notification device 1002 and outputs a plurality of candidate images with a predetermined ranking. As a result, a plurality of candidate images displayed and arranged in a predetermined ranking are displayed together with the current action image 1104 within a fixed visual field of the driver.
 候補行動間の順位付けは、行動情報に示された各行動候補の優先度にしたがって実行し、具体的には、優先度が高い行動候補ほど順位を高くする。変形例として、実行までの残り時間が短い行動候補ほど順位を高くしてもよい。また、後述の実施の形態で説明するように、現在の車両1000の周囲状況および走行状態の下、運転者の嗜好或いは操作パターンとの合致の度合いが高い行動候補ほど順位を高くしてもよい。 The ranking among the candidate actions is performed according to the priority of each action candidate indicated in the behavior information; specifically, an action candidate with a higher priority is given a higher rank. As a modification, an action candidate with a shorter remaining time until execution may be given a higher rank. Further, as described in a later embodiment, an action candidate that better matches the driver's preference or operation pattern under the current surrounding conditions and running state of the vehicle 1000 may be given a higher rank.
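The two ranking rules described above (by priority, or by remaining time until execution in the modification) amount to sorting the candidates by different keys. A minimal sketch with hypothetical data, where each candidate is a tuple of (action code, priority, remaining seconds):

```python
# Hypothetical candidates: (action code, priority, remaining seconds).
candidates = [("ACCEL", 2, 10), ("DECEL", 1, 10), ("LC_LEFT", 3, 15)]

# Rule in the text: higher priority => higher rank.
by_priority = sorted(candidates, key=lambda c: c[1], reverse=True)

# Modification: shorter remaining time until execution => higher rank.
by_remaining = sorted(candidates, key=lambda c: c[2])
```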
 順位付けによる表示配置は、複数の行動候補に対応する複数の候補画像を、各行動候補の順位に応じた自動運転情報画面1103内の所定位置に、上下左右などに順位の並びに可視化したノンパラメトリック表示の形態で、配置することでもよい。この順位(他の実施の形態でも同様)は、画面上の表示或いは運転者への提案の優先順位、優先度合と言え、推奨順位、推奨度合とも言える。例えば、高順位の行動候補の候補画像ほど自動運転情報画面1103のサブ領域1102における右側(左側でもよく、事前に取決められてよい)に配置してもよい。 The display arrangement by ranking may be a non-parametric display in which the plurality of candidate images corresponding to the plurality of action candidates are placed at predetermined positions on the automatic driving information screen 1103 according to the rank of each action candidate, so that the ranking is visualized by the arrangement itself, such as vertically or horizontally. This ranking (the same applies in the other embodiments) can be regarded as the priority, or degree of priority, of the on-screen display or of the proposal to the driver, and can also be regarded as a recommendation order or degree of recommendation. For example, a candidate image of a higher-ranked action candidate may be placed further to the right (or to the left; this may be decided in advance) in the sub area 1102 of the automatic driving information screen 1103.
 また、順位付けによる表示配置は、ヒストグラムなどで順位を可視化したパラメトリック表示であってもよい。例えば、各候補画像を、画像生成部1060により生成された順位を示すオブジェクトを付加した態様で表示させてもよい。このオブジェクトは、順位に応じた形状のヒストグラム画像或いは順位そのものを示す数字画像等でもよく、画像生成部1060により生成されてもよい。パラメトリック表示の例は後述の図63A、63Bにも示す。さらにまた、高順位の行動候補の候補画像ほど目立つ外観(意匠)で表示してもよく、例えば表示サイズを大きくしてもよく、より視認性が高いと想定される色彩で表示してもよい。 The display arrangement by ranking may also be a parametric display in which the ranking is visualized by a histogram or the like. For example, each candidate image may be displayed with an object indicating its rank attached to it. This object may be a histogram image whose shape corresponds to the rank, a numeric image indicating the rank itself, or the like, and may be generated by the image generation unit 1060. Examples of parametric display are also shown in FIGS. 63A and 63B described later. Furthermore, a candidate image of a higher-ranked action candidate may be displayed with a more conspicuous appearance (design); for example, it may be displayed at a larger size, or in a color assumed to have higher visibility.
 図39は、自動運転情報画面の一例を示す。同図の自動運転情報画面1103では、第1行動候補「加速」を表す第1予定行動画像1106aと、第2行動候補「減速」を表す第2予定行動画像1106bが表示されている。また、第1行動候補が実行されるまでの残り時間を示す第1残り時間画像1108aと、第2行動候補が実行されるまでの残り時間を示す第2残り時間画像1108bが表示されている。ここでは、「加速」の方の優先度が高いこととし、「加速」を表す第1予定行動画像1106aが、より順位が高いことを示す右側に表示されている。図39では、ノンパラメトリック表示の形態で、第1予定行動画像1106aと残り時間画像1108の順位を提示している。既述したように、第1予定行動画像1106aと第2予定行動画像1106bのそれぞれに対して順位に応じた形状のヒストグラム等を付加するパラメトリック形式で順位を提示してもよい。 FIG. 39 shows an example of the automatic driving information screen. In the automatic driving information screen 1103 in the figure, a first scheduled action image 1106a representing the first action candidate “acceleration” and a second scheduled action image 1106b representing the second action candidate “deceleration” are displayed. Also, a first remaining time image 1108a indicating the remaining time until the first action candidate is executed, and a second remaining time image 1108b indicating the remaining time until the second action candidate is executed are displayed. Here, it is assumed that the priority of “acceleration” is higher, and the first scheduled action image 1106a representing “acceleration” is displayed on the right side indicating that the ranking is higher. In FIG. 39, the order of the first scheduled action image 1106a and the remaining time image 1108 is presented in a non-parametric display form. As described above, the ranking may be presented in a parametric format in which a histogram having a shape corresponding to the ranking is added to each of the first scheduled behavior image 1106a and the second scheduled behavior image 1106b.
 この態様によると、車両の自動運転で将来実行されうる複数の行動候補を運転者に事前に把握させ、運転者に一層の安心感を提供できる。また、後の実施の形態で説明するように、運転者が予定行動を選択可能である場合、運転者は、自動運転における車両の将来行動を複数の候補の中から選択可能になる。また、候補間の順位を運転者に示すことで、運転者による候補の選択を支援できる。 According to this aspect, it is possible to allow the driver to know in advance a plurality of action candidates that can be executed in the future by automatic driving of the vehicle, and to provide further security to the driver. Further, as will be described in a later embodiment, when the driver can select the scheduled action, the driver can select the future action of the vehicle in automatic driving from a plurality of candidates. Moreover, the selection of candidates by the driver can be supported by showing the rank among candidates to the driver.
 図40は、車両1000のHMI制御に係る処理の例を示すシーケンス図である。検出部1020は、車両1000の周囲状況および走行状態を検出し、その検出結果を示す検出情報を自動運転制御装置1030へ出力する(P1)。自動運転制御装置1030は、検出部1020から取得した検出情報にしたがって、車両の現在行動、予定行動、予定行動実行までの残り時間を決定する。そして、現在行動の実行を指示する行動指示を運転操作部1010へ出力することにより、その現在行動を車両1000に実行させる(P2)。さらに自動運転制御装置1030は、現在行動情報、予定行動情報、残り時間情報を運転支援装置1040へ送信する(P3)。 FIG. 40 is a sequence diagram illustrating an example of processing related to HMI control of the vehicle 1000. The detection unit 1020 detects the surrounding situation and running state of the vehicle 1000, and outputs detection information indicating the detection result to the automatic driving control device 1030 (P1). The automatic driving control device 1030 determines the current action of the vehicle, the scheduled action, and the remaining time until execution of the scheduled action, according to the detection information acquired from the detection unit 1020. It then causes the vehicle 1000 to execute the current action by outputting an action instruction for executing the current action to the driving operation unit 1010 (P2). The automatic driving control device 1030 further transmits the current action information, the scheduled action information, and the remaining time information to the driving support device 1040 (P3).
 運転支援装置1040は、自動運転制御装置1030から取得した現在行動情報、予定行動情報、残り時間情報をもとに現在行動画像、予定行動画像、残り時間画像を生成し、報知装置1002に表示させる(P4)。具体的には、運転支援装置1040の画像生成部1060は、現在行動画像、予定行動画像、残り時間画像の画面上での表示位置をさらに決定する。運転支援装置1040の画像出力部1051は、現在行動画像、予定行動画像、残り時間画像とともに、各画像の表示位置を示す表示位置情報を報知装置1002へ出力し、各画像を図37A~37B等で示した位置に配置した自動運転情報画面を表示させる。 The driving support device 1040 generates a current action image, a scheduled action image, and a remaining time image based on the current action information, scheduled action information, and remaining time information acquired from the automatic driving control device 1030, and displays them on the notification device 1002 (P4). Specifically, the image generation unit 1060 of the driving support device 1040 further determines the on-screen display positions of the current action image, the scheduled action image, and the remaining time image. The image output unit 1051 of the driving support device 1040 outputs display position information indicating the display position of each image, together with the current action image, the scheduled action image, and the remaining time image, to the notification device 1002, causing it to display the automatic driving information screen with each image arranged at the positions shown in FIGS. 37A, 37B, and elsewhere.
 変形例として、運転支援装置1040の画像生成部1060は、現在行動画像を中心位置に配置し、予定行動画像および残り時間画像を周辺位置に配置した自動運転情報画面全体の画像データを生成してもよい。運転支援装置1040の画像出力部1051は、生成された自動運転情報画面の画像データを報知装置1002へ出力して表示させてもよい。 As a modification, the image generation unit 1060 of the driving support device 1040 may generate image data of the entire automatic driving information screen, with the current action image arranged at the central position and the scheduled action image and the remaining time image arranged at peripheral positions. The image output unit 1051 of the driving support device 1040 may then output the generated image data of the automatic driving information screen to the notification device 1002 for display.
 運転支援装置1040は、P4からの時間を計測し(P5)、時間経過を反映した更新後の残り時間画像を報知装置1002へ出力することにより、自動運転情報画面における残り時間画像の態様を更新する(P6)。更新後の残り時間画像は、例えばそれまで点灯状態であった時間インジケータを消灯状態に変更した画像である。運転支援装置1040は、新たな現在行動情報、予定行動情報、残り時間情報を自動運転制御装置1030から取得するまで、残り時間画像の更新処理を繰り返す(S7~S8)。 The driving support device 1040 measures the time elapsed since P4 (P5) and outputs an updated remaining time image reflecting the passage of time to the notification device 1002, thereby updating the appearance of the remaining time image on the automatic driving information screen (P6). The updated remaining time image is, for example, an image in which a time indicator that had been lit until then is changed to the unlit state. The driving support device 1040 repeats this remaining time image update processing until it acquires new current action information, scheduled action information, and remaining time information from the automatic driving control device 1030 (S7 to S8).
 検出部1020は定期的に、車両1000の周囲状況および走行状態を検出し、その検出結果を示す検出情報を自動運転制御装置1030へ出力する(P9)。自動運転制御装置1030は、検出情報にしたがって現在行動、予定行動、予定行動までの残り時間を新たに決定する。そして、新たに決定した現在行動の実行を指示する行動指示を運転操作部1010へ出力することにより、新たに決定した現在行動を車両1000に実行させる(P10)。自動運転制御装置1030は、新たに決定した現在行動、予定行動、残り時間を示す新たな現在行動情報、予定行動情報、残り時間情報を運転支援装置1040へ送信する(P11)。運転支援装置1040は、自動運転制御装置1030から取得した新たな現在行動情報、新たな予定行動情報、新たな残り時間情報に基づいて、新たな現在行動画像、新たな予定行動画像、新たな残り時間画像を生成し、報知装置1002に表示させる(P12)。 The detection unit 1020 periodically detects the surrounding situation and running state of the vehicle 1000, and outputs detection information indicating the detection result to the automatic driving control device 1030 (P9). The automatic driving control device 1030 newly determines the current action, the scheduled action, and the remaining time until the scheduled action according to the detection information. It then causes the vehicle 1000 to execute the newly determined current action by outputting an action instruction for executing that action to the driving operation unit 1010 (P10). The automatic driving control device 1030 transmits new current action information, scheduled action information, and remaining time information indicating the newly determined current action, scheduled action, and remaining time to the driving support device 1040 (P11). Based on the new current action information, new scheduled action information, and new remaining time information acquired from the automatic driving control device 1030, the driving support device 1040 generates a new current action image, a new scheduled action image, and a new remaining time image, and displays them on the notification device 1002 (P12).
 なお図40の破線で示すように、自動運転制御装置1030と運転支援装置1040は1つの自動運転制御装置に統合されてよい。その場合、1つの自動運転制御装置が自動運転制御装置1030と運転支援装置1040の両方の処理を実行してもよい。以降の実施形態についても同様である。 As indicated by the broken line in FIG. 40, the automatic driving control device 1030 and the driving support device 1040 may be integrated into one automatic driving control device. In that case, the one automatic driving control device may execute the processing of both the automatic driving control device 1030 and the driving support device 1040. The same applies to the following embodiments.
 図41は、運転支援装置1040の処理の例を示すフローチャートである。自動運転制御装置1030から出力された行動情報を行動情報入力部1054が取得すると(S100のY)、画像生成部1060は、行動情報が示す現在行動と、予め記憶部1042に記憶させた現在行動とが一致するか否かを判定する。さらに、行動情報が示す予定行動と、予め記憶部1042に記憶させた予定行動とが一致するか否かを判定する。 FIG. 41 is a flowchart showing an example of processing of the driving support device 1040. When the behavior information input unit 1054 acquires the behavior information output from the automatic driving control device 1030 (Y in S100), the image generation unit 1060 determines whether the current action indicated by the behavior information matches the current action previously stored in the storage unit 1042. It further determines whether the scheduled action indicated by the behavior information matches the scheduled action previously stored in the storage unit 1042.
 When the current action indicated by the action information does not match the current action stored in the storage unit 1042, that is, when the current action has been updated (Y in S102), the image generation unit 1060 generates a current action image representing the current action indicated by the action information (S103). The image output unit 1051 outputs the current action image to the notification device 1002 for display (S104), and the image generation unit 1060 stores the current action indicated by the action information in the storage unit 1042 (S105). When the current action indicated by the action information matches the stored current action, that is, when the current action has not been updated (N in S102), S103 to S105 are skipped.
 When the scheduled action indicated by the action information does not match the scheduled action stored in the storage unit 1042, that is, when the scheduled action has been updated (Y in S106), the image generation unit 1060 generates a scheduled action image representing the scheduled action indicated by the action information. The image generation unit 1060 further generates a remaining time image representing the remaining time indicated by the action information (S107). The image output unit 1051 outputs the scheduled action image and the remaining time image to the notification device 1002 for display (S108), and the image generation unit 1060 stores the scheduled action indicated by the action information in the storage unit 1042 (S109). The image generation unit 1060 then starts measuring the time elapsed since the scheduled action image and the remaining time image began to be displayed (S110).
 When a predetermined end condition is satisfied (Y in S111), the flow of this figure ends; when the end condition is not satisfied (N in S111), the process returns to S100. The end condition is common to the subsequent embodiments and is satisfied, for example, when the driver ends the automatic driving mode, or when the ignition key or power supply of the vehicle is switched off.
 When no action information has been input from the automatic driving control device 1030 (N in S100), the image generation unit 1060 determines whether a predetermined time has elapsed since the elapsed-time measurement started. Likewise, when the scheduled action indicated by the action information matches the scheduled action stored in the storage unit 1042, that is, when the scheduled action has not been updated (N in S106), the image generation unit 1060 determines whether the predetermined time has elapsed since the measurement started. On detecting that the predetermined time has elapsed since the measurement started (Y in S112), the image generation unit 1060 updates the remaining time image (S113). For example, on detecting that the time allotted to one time indicator 1109 has elapsed, it generates a remaining time image in which that time indicator 1109 is changed from a lit state to an unlit state. The image output unit 1051 outputs the updated remaining time image to the notification device 1002 for display (S114). As a modification, the entire image of the automatic driving information screen may be updated when the remaining time image is updated. If the predetermined time has not elapsed since the measurement started (N in S112), S113 and S114 are skipped.
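 The branching of S100 to S114 described above can be sketched in Python as follows. This is a minimal illustration only: the class name, the two-second slot per time indicator, and the ten-segment indicator count are assumptions for the sketch and are not specified by the embodiment.

```python
import time

SLOT_SECONDS = 2.0   # assumed time allotted to one time indicator 1109
NUM_INDICATORS = 10  # assumed number of indicator segments

class DrivingSupportSketch:
    """Hypothetical sketch of the S100-S114 loop of FIG. 41."""

    def __init__(self):
        self.stored_current = None      # current action kept in storage unit 1042
        self.stored_scheduled = None    # scheduled action kept in storage unit 1042
        self.display_started_at = None  # start of elapsed-time measurement (S110)
        self.initial_lit = NUM_INDICATORS
        self.lit_indicators = NUM_INDICATORS

    def on_action_info(self, current, scheduled, remaining_sec, now=None):
        """Handle action information received from the controller (Y in S100)."""
        now = time.monotonic() if now is None else now
        shown = []
        if current != self.stored_current:        # S102: current action updated
            shown.append(("current_action_image", current))  # S103-S104
            self.stored_current = current                    # S105
        if scheduled != self.stored_scheduled:    # S106: scheduled action updated
            self.initial_lit = min(NUM_INDICATORS, int(remaining_sec // SLOT_SECONDS))
            self.lit_indicators = self.initial_lit
            shown.append(("scheduled_action_image", scheduled))          # S107-S108
            shown.append(("remaining_time_image", self.lit_indicators))
            self.stored_scheduled = scheduled                # S109
            self.display_started_at = now                    # S110
        else:
            shown += self.on_tick(now)            # S112-S114 when no update
        return shown

    def on_tick(self, now=None):
        """S112-S114: turn off one indicator per elapsed slot."""
        now = time.monotonic() if now is None else now
        if self.display_started_at is None:
            return []
        elapsed = now - self.display_started_at
        target = max(0, self.initial_lit - int(elapsed // SLOT_SECONDS))
        if target < self.lit_indicators:          # S112: one slot has elapsed
            self.lit_indicators = target          # S113: update remaining time image
            return [("remaining_time_image", target)]  # S114: output for display
        return []
```

Note that, as in the flowchart, images are regenerated only when the stored current action or scheduled action actually changes; an unchanged notification only advances the indicator countdown.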
 In relation to S106 to S109 in FIG. 41, the automatic driving control device 1030 may, while a certain current action is being executed, change the scheduled action from a first scheduled action (for example, a lane change to the right) to a second scheduled action (for example, acceleration). In this case, the action information input unit 1054 of the driving support device 1040 acquires from the automatic driving control device 1030 action information in which the current action information is unchanged while the scheduled action information and the remaining time information have been updated. Specifically, it acquires action information in which the scheduled action information has been updated to indicate the second scheduled action and the remaining time information has been updated to indicate the time until the second scheduled action is executed.
 The image generation unit 1060 generates a new scheduled action image representing the second scheduled action and a new remaining time image representing the time until the second scheduled action is executed. The image output unit 1051 outputs the new scheduled action image and the new remaining time image to the notification device 1002. As a result, within the driver's fixed field of view, the previous scheduled action image and remaining time image representing the first scheduled action are replaced by the new scheduled action image and new remaining time image representing the second scheduled action, displayed together with the unchanged current action image.
 Although the processes in the flowchart of FIG. 41 are executed sequentially, they may be executed in parallel as appropriate. For example, the processing of S102 to S105 and the processing of S106 to S110 may be executed in parallel. The determinations in S102 and S106 may also be skipped whenever action information is input from the automatic driving control device 1030; that is, the current action image, scheduled action image, and remaining time image may always be newly generated and output regardless of whether the current action or the scheduled action has been updated.
 As described above, the driving support device 1040 of the fifth embodiment notifies a vehicle occupant (such as the driver) of the current action in automatic driving, and also notifies the occupant in advance of the scheduled action, that is, the future action that follows the current action. It further notifies the occupant of the remaining time until the current action switches to the scheduled action. This helps the driver of the vehicle foresee the scheduled action in automatic driving and foresee when that scheduled action will be executed. In other words, it suppresses the execution of actions the occupant does not expect during automatic driving of the vehicle, and thereby keeps the occupant from feeling uneasy. It also helps the driver appropriately judge whether to cancel automatic driving and take back driving authority.
 A modification will now be described. During automatic driving of the vehicle, the automatic driving control device 1030 determines a second action (for example, acceleration) scheduled to be executed after the first action that is the current action (for example, a lane change to the right), together with the remaining time until the second action is executed. In this case, the automatic driving control device 1030 may further determine a third action (for example, deceleration) scheduled to be executed after the second action, together with the remaining time until the third action is executed.
 The action information input unit 1054 of the driving support device 1040 may acquire from the automatic driving control device 1030 current action information indicating the first action, scheduled action information indicating the second action and the third action, and remaining time information indicating the remaining time until the second action and the remaining time until the third action. The image generation unit 1060 may generate a current action image representing the first action, a scheduled action image and a remaining time image representing the second action and the remaining time until it, and a scheduled action image and a remaining time image representing the third action and the remaining time until it.
 The image output unit 1051 may output these image data to the notification device 1002 and display the images side by side within the driver's fixed field of view in the manner shown in FIG. 39. For example, the scheduled action image and remaining time image representing the second action and the remaining time until the second action may be arranged like the first scheduled action image 1106a and the first remaining time image 1108a in FIG. 39, and the scheduled action image and remaining time image representing the third action and the remaining time until the third action may be arranged like the second scheduled action image 1106b and the second remaining time image 1108b in FIG. 39.
 Note that while the current action of the vehicle is the first action, the scheduled action image and remaining time image representing the third action and the remaining time until it may be hidden. For example, the processing for generating, outputting, or displaying these images may be skipped. When the third action is hidden, the display may take the form of FIG. 37A.
 When the current action of the vehicle switches from the first action to the second action, the image output unit 1051 may hide the image representing the first action and display, within the same field of view of the driver, an image representing the second action and an image representing the third action. This applies, for example, when a notification (such as new action information) that the current action of the vehicle has switched from the first action to the second action is received from the automatic driving control device 1030. The image generation unit 1060 generates a current action image representing the second action as the current action and a scheduled action image showing the third action as the scheduled action, and the image output unit 1051 may output these images to the notification device 1002 to switch the content of the automatic driving information screen 1103.
 On the automatic driving information screen 1103 of this modification, hiding the third action while the first action is executed results, for example, in a screen transition from FIG. 37A to FIG. 37B, whereas displaying the third action while the first action is executed results, for example, in a screen transition from FIG. 39 to FIG. 37B. In either case, when the current action of the vehicle switches from the first action to the second action, the position of the image representing the second action changes from a peripheral position to the center position within the driver's fixed field of view, and the display switches from a relatively small scheduled action image representing the second action to a relatively large current action image representing the second action. Meanwhile, the scheduled action image representing the third action is displayed at a peripheral position within the driver's fixed field of view, in a smaller size than the current action image representing the second action.
 (Embodiment 6)
 First, an outline will be described. During automatic driving of a vehicle, the driver may be unable to grasp which actions can be instructed for immediate execution by the vehicle, and this may make the driver feel uneasy.
 In the sixth embodiment, therefore, candidates for actions that the vehicle can be made to execute immediately in place of the current action in automatic driving (hereinafter called "current action candidates") are presented to the driver. A current action candidate can be regarded as a candidate for an alternative action replacing the current action, in other words, a candidate for an action that the vehicle 1000 can be made to execute instead of the current action. Specifically, the automatic driving control device 1030 determines the current action, and the driving support device 1040 determines the current action candidates. The driving support device 1040 then displays both the current action and the current action candidates on the notification device 1002 in the vehicle 1000. Note that even when the driver instructs an action to be executed immediately, processing within each device and communication between devices actually take place, so a certain delay from the present moment is of course tolerated.
 In the following, content already described in the preceding embodiments is omitted as appropriate. The configurations and operations described in this embodiment can be combined with, or substituted for, the configurations and operations described in other embodiments or modifications without departing from the spirit of the invention.
 FIG. 42 is a block diagram showing the detailed configuration of the storage unit 1042 of the driving support device 1040. The storage unit 1042 includes a statistical information accumulation unit 1070 and a determination criterion holding unit 1071.
 The statistical information accumulation unit 1070 accumulates statistical information indicating the association between the surrounding situation and running state of a vehicle and the vehicle's actions. FIG. 43 schematically shows the statistical information accumulated in the statistical information accumulation unit 1070. The statistical information of the sixth embodiment corresponds to the travel history of FIG. 27 and the driver models of FIGS. 28A and 28B. The statistical information comprises multiple records, each associating the values of multiple types of environmental parameters indicating the surrounding situation and running state of a vehicle with an action the vehicle is made to execute immediately (or a record of an action the vehicle actually executed immediately). In other words, it is information in which current actions executed under various environmental states are accumulated in association with the parameter values indicating those environmental states. It may also be information modeled or patterned by known statistical processing.
 The actions defined in the statistical information of the sixth embodiment are current actions of a vehicle, in other words, actions the vehicle is made to execute immediately. The actions defined in the statistical information include single actions, as shown in histories (d) and (e), and also action plans combining multiple single actions, as shown in history (f). The meaning of each environmental parameter defined in the statistical information was described in the fourth embodiment and is therefore omitted here. The environmental parameters include the speed of the vehicle 1000, the relative speed of a preceding vehicle with respect to the vehicle 1000, the distance between the vehicle 1000 and the preceding vehicle, the relative speed of another vehicle in an adjacent lane with respect to the vehicle 1000, the distance between the vehicle 1000 and that other vehicle, and the positional information of the vehicle 1000. The environmental parameter items defined in the statistical information are included in the detection information input from the detection unit 1020, or their values can be derived by calculation based on the detection information.
 Returning to FIG. 42, the determination criterion holding unit 1071 holds data (hereinafter called "determination criteria") that serve as the basis for the determination processing by the determination unit 1062 described later. The determination criteria are data that define, for each of multiple patterns of detection information input from the detection unit 1020, the actions the vehicle 1000 can currently (immediately) execute. For example, when a pattern of detection information indicates that other vehicles are present ahead and in the right lane, deceleration and a lane change to the left may be defined as possible action candidates for that pattern. In other words, acceleration and a lane change to the right may be excluded from the possible action candidates.
 The action information input unit 1054 of the driving support device 1040 acquires from the automatic driving control device 1030 action information indicating the current action the automatic driving control device 1030 causes the vehicle 1000 to execute. The detection information input unit 1052 of the driving support device 1040 acquires from the detection unit 1020 detection information indicating the detection results for the surrounding situation and running state of the vehicle 1000.
 FIG. 44 is a block diagram showing the detailed configuration of the control unit 1041 of the driving support device 1040. The control unit 1041 includes an image generation unit 1060, a candidate determination unit 1061, a determination unit 1062, and an instruction unit 1063. The candidate determination unit 1061 and the determination unit 1062 determine, based on the detection information acquired from the detection unit 1020, action candidates that can be executed in place of the current action indicated by the action information acquired from the automatic driving control device 1030.
 Specifically, the candidate determination unit 1061 constructs an n-dimensional vector space corresponding to the number (n) of environmental parameters in the statistical information accumulated in the statistical information accumulation unit 1070, and places the actions defined in the statistical information in that vector space. It then identifies the position in the vector space corresponding to the environmental parameter values indicated by the detection information (hereinafter also called the "current environment position"). It then determines one or more actions located within a predetermined range of the current environment position in the vector space (in other words, within a predetermined distance of it) as provisional current action candidates. That is, among the multiple types of actions defined in the statistical information, the candidate determination unit 1061 determines those associated with environmental parameter values approximating the detection information as provisional current action candidates. The threshold defining the predetermined range may be determined from developer knowledge or by experiment.
 The determination unit 1062 refers to the detection information output from the detection unit 1020 and the determination criteria held in the determination criterion holding unit 1071, and determines whether each of the provisional current action candidates determined by the candidate determination unit 1061 can currently (immediately) be executed by the vehicle. Among the one or more provisional current action candidates determined by the candidate determination unit 1061, the determination unit 1062 determines those the vehicle can currently execute as the final current action candidates to be presented to the driver. For example, when a provisional current action candidate is a lane change to the right and no other vehicle is present in the right lane, that provisional candidate may be adopted as a final current action candidate. Conversely, when a provisional current action candidate is a lane change to the right and another vehicle is present in the right lane, that provisional candidate may be excluded from the final current action candidates. The determination by the determination unit 1062 of whether the vehicle can currently execute a specific action may be realized by a known method.
 In the sixth embodiment, the current action candidates presented to the driver are thus determined through the cooperation of the candidate determination unit 1061 and the determination unit 1062, but only one of the two processes may be executed to determine the current action candidates. Once one or more current action candidates have been determined, the candidate determination unit 1061 or the determination unit 1062 determines whether each candidate matches the current action indicated by the action information. A current action candidate that matches the current action is excluded from the candidates, and only the candidates that do not match the current action are passed to the image generation unit 1060 for presentation to the driver. This keeps an action identical to the current action from being presented as an alternative to the current action.
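 As a rough illustration of how the candidate determination unit 1061 and the determination unit 1062 might cooperate, the sketch below treats each statistics record as a point in the environmental parameter space, picks the actions recorded within a fixed Euclidean distance of the current environment position, filters them against a toy determination criterion, and finally excludes the current action. The record values, the distance threshold, and the feasibility rules are all illustrative assumptions, not data from the embodiment.

```python
import math

# Hypothetical statistics records: (environment parameter vector, action).
# Parameters here: (own speed, relative speed of preceding vehicle, distance to it).
STATISTICS = [
    ((60.0, -5.0, 30.0), "decelerate"),
    ((60.0,  0.0, 80.0), "keep_speed"),
    ((60.0,  5.0, 90.0), "accelerate"),
    ((60.0, -5.0, 25.0), "lane_change_left"),
]

DISTANCE_THRESHOLD = 15.0  # assumed; set from developer knowledge or experiment

def provisional_candidates(env, stats=STATISTICS, threshold=DISTANCE_THRESHOLD):
    """Candidate determination unit 1061: actions whose recorded environment
    parameter values approximate the detected values (within the threshold)."""
    out = []
    for params, action in stats:
        if math.dist(env, params) <= threshold and action not in out:
            out.append(action)
    return out

def feasible(action, detection):
    """Determination unit 1062: a toy determination criterion, e.g. a lane
    change to the right is infeasible while another vehicle occupies it."""
    if action == "lane_change_right" and detection.get("vehicle_right"):
        return False
    if action == "accelerate" and detection.get("vehicle_ahead_close"):
        return False
    return True

def current_action_candidates(env, detection, current_action):
    """Final candidates: feasible neighbours, minus the current action itself."""
    cands = [a for a in provisional_candidates(env) if feasible(a, detection)]
    return [a for a in cands if a != current_action]
```

In this sketch the neighbourhood query stands in for the n-dimensional vector-space search, and the rule table stands in for the pattern-based determination criteria held in the determination criterion holding unit 1071.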
 The image generation unit 1060 generates a current action image representing the current action indicated by the action information, and a current action candidate image representing the final current action candidate determined by the candidate determination unit 1061 and the determination unit 1062. The image output unit 1051 outputs the current action image and the current action candidate image to the notification device 1002 so that they are displayed within the fixed field of view of the driver of the vehicle. Although not described in the sixth embodiment, the driving support device 1040 may, as in the fifth embodiment, acquire scheduled action information and remaining time information from the automatic driving control device 1030 in addition to the current action information, and further display a scheduled action image and a remaining time image on the notification device 1002.
 The notification device 1002 displays an automatic driving information screen including the current action image and the current action candidate images output from the driving support device 1040. FIG. 45 shows an example of the automatic driving information screen. The figure shows an automatic driving information screen 1103 in which the current action image 1104 is placed in the main area 1100 and two current action candidate images 1110 are placed in the sub area 1102. In the figure, the upper current action candidate image 1110 represents the first current action candidate, "decelerate", and the lower current action candidate image 1110 represents the second current action candidate, "go forward (maintain speed)".
 The display mode of the current action image 1104 on the automatic driving information screen 1103 is set to differ from that of the current action candidate images 1110. This prevents the driver from confusing the current action already decided by the autonomous driving controller with the current action candidates proposed as alternatives to it. For example, the current action image 1104 is displayed in a larger size than the current action candidate images 1110. The current action image 1104 is also displayed at the center position within the fixed field of view of the driver of the vehicle 1000, for example near the center of the automatic driving information screen 1103, whereas the current action candidate images 1110 are displayed at peripheral positions within the driver's fixed field of view, for example near the edge of the automatic driving information screen 1103.
 Returning to FIG. 44, when a predetermined time has elapsed after the current action candidate image 1110 is output (displayed), the instruction unit 1063 causes the command output unit 1055 to output to the automatic driving control device 1030 a control command for making the vehicle execute the current action candidate.
 Specifically, the operation input unit 1050 receives from the input device 1004 a signal designating a vehicle action to be executed during automatic driving (hereinafter called an "operation instruction"). When no operation instruction has been received within the predetermined time during which the current action image 1104 and the current action candidate image 1110 are displayed on the notification device 1002, the instruction unit 1063 causes the command output unit 1055 to output to the automatic driving control device 1030 a control command for executing the current action candidate represented by the current action candidate image 1110. This situation can also be described as the case where no operation instruction is accepted within the predetermined time after the current action candidate image 1110 is output. An appropriate value for the predetermined waiting time may be determined from developer knowledge or by experiment, and it may be set to, for example, 5 to 10 seconds.
When an operation instruction is received within the predetermined time during which the current action image 1104 and the current action candidate image 1110 are displayed on the notification device 1002, the instruction unit 1063 causes the command output unit 1055 to output, to the automatic driving control device 1030, a control command for causing the vehicle to execute the action designated by that operation instruction. The action designated by the operation instruction is either the current action determined by the automatic driving control device 1030 or the current action candidate determined by the driving support device 1040.
For example, when an operation instruction selecting the current action candidate image 1110 is input from the input device 1004, a control command for causing the vehicle to execute the current action candidate represented by the current action candidate image 1110 is output from the command output unit 1055 to the automatic driving control device 1030. Conversely, when an operation instruction selecting the current action image 1104 is input from the input device 1004, a control command for causing the vehicle to execute the current action represented by the current action image 1104 is output from the command output unit 1055 to the automatic driving control device 1030.
FIG. 46 is a sequence diagram illustrating an example of processing related to HMI control of the vehicle 1000. The detection unit 1020 periodically detects the surrounding situation and traveling state of the vehicle 1000 and periodically outputs detection information indicating the detection results to the automatic driving control device 1030 (P21). The automatic driving control device 1030 determines the current action of the vehicle according to the detection information acquired from the detection unit 1020. It then causes the vehicle 1000 to execute the current action by outputting, to the driving operation unit 1010, an action instruction directing execution of the current action (P22). The automatic driving control device 1030 further transmits action information indicating the current action to the driving support device 1040 (P23). The driving support device 1040 generates a current action image based on the current action information acquired from the automatic driving control device 1030 and causes the notification device 1002 to display it (P24).
The driving support device 1040 acquires the detection information periodically output by the detection unit 1020 (P25). For example, it may acquire the detection information output by the detection unit 1020 in P21 in parallel with (independently of) the automatic driving control device 1030. Alternatively, it may acquire the detection information forwarded by the automatic driving control device 1030. Each time the driving support device 1040 acquires the periodically output detection information, it determines a current action candidate based on that detection information (P26). The driving support device 1040 generates a current action candidate image representing the current action candidate and causes the notification device 1002 to display it (P27).
When the driver, having checked the automatic driving information screen of the notification device 1002, inputs to the input device 1004 a selection operation directing execution of the current action candidate, the input device 1004 transmits to the driving support device 1040 an operation instruction directing execution of the current action candidate (P28). The driving support device 1040 transmits to the automatic driving control device 1030 a control command directing execution of the current action candidate (P29). The automatic driving control device 1030 identifies the current action candidate designated by the control command as the new current action and outputs, to the driving operation unit 1010, a new action instruction directing execution of the new current action, thereby causing the vehicle 1000 to execute the new current action (P30).
The automatic driving control device 1030 transmits new action information indicating the new current action to the driving support device 1040 (P31). The driving support device 1040 generates a new current action image based on the new current action information acquired from the automatic driving control device 1030 and causes the notification device 1002 to display it (P32). The driving support device 1040 acquires the latest detection information output by the detection unit 1020 (P33) and determines a new current action candidate (P34). The driving support device 1040 causes the notification device 1002 to display the new current action candidate image (P35).
In relation to P29 to P35 in FIG. 46, after the control command is transmitted to the automatic driving control device 1030, the detection information input unit 1052 of the driving support device 1040 acquires from the automatic driving control device 1030 action information indicating the current action updated in accordance with that control command. The updated current action is the action designated by the control command, that is, the current action candidate designated by the driver. The driving support device 1040 then updates the content of the automatic driving information screen 1103 of the notification device 1002 to reflect the latest state.
FIG. 47 is a flowchart showing an example of processing of the driving support device 1040. When the action information input unit 1054 acquires the action information output from the automatic driving control device 1030 (Y in S120), the image generation unit 1060 determines whether the current action indicated by the action information matches the current action stored in advance in the storage unit 1042. The subsequent processing in S121 to S124 is the same as that in S102 to S105 of FIG. 41, so its description is omitted. If no action information has been acquired (N in S120), S121 to S124 are skipped.
When the detection information input unit 1052 acquires the detection information output from the detection unit 1020 (Y in S125), the candidate determination unit 1061 determines one or more current action candidates based on that detection information, the statistical information accumulated in the statistical information accumulation unit 1070, and the determination criteria held in the determination criteria holding unit 1071 (S126). The candidate determination unit 1061 determines whether the current action candidates determined in S126 match the current action candidates stored in advance in the storage unit 1042. If the current action candidates determined in S126 do not match those stored in advance in the storage unit 1042, that is, if the current action candidates have been updated (Y in S127), the candidate determination unit 1061 determines whether any current action candidate determined in S126 matches the current action indicated by the action information. Any current action candidate that matches the current action is then excluded from subsequent processing (S128).
The image generation unit 1060 generates current action candidate images representing the current action candidates that have passed the filtering of S127 and S128 (S129), and the image output unit 1051 outputs the current action candidate images to the notification device 1002 for display (S130). The candidate determination unit 1061 stores in the storage unit 1042 information on the current action candidates for which current action candidate images were generated (S131). If no detection information has been acquired (N in S125), S126 to S131 are skipped. If the current action candidates have not been updated (N in S127), S128 to S131 are skipped. If a predetermined end condition is satisfied (Y in S132), the flow of this figure ends; if the end condition is not satisfied (N in S132), the process returns to S120.
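The filtering of S126 to S131 — skip the display update when the candidate set is unchanged, then drop any candidate identical to the current action — can be sketched as follows. This is an illustrative reconstruction of the flowchart's logic under assumed names, not code from the embodiment.

```python
def update_candidates(new_candidates, current_action, stored_candidates):
    """Return the candidates to display, or None when nothing changed (N in S127).

    new_candidates:    candidates determined from the latest detection info (S126)
    current_action:    current action reported by the automatic driving controller
    stored_candidates: candidate set previously stored in the storage unit (S131)
    """
    if new_candidates == stored_candidates:
        return None  # candidates not updated: skip S128-S131
    # S128: exclude any candidate that matches the current action, since
    # proposing the action already being executed would be redundant
    return [c for c in new_candidates if c != current_action]

# Example: "decelerate" matches the current action and is excluded,
# so shown == ["keep lane"]
shown = update_candidates(["decelerate", "keep lane"], "decelerate", ["keep lane"])
```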
As will be described in a modification below, when presenting a plurality of current action candidates to the driver, the candidate determination unit 1061 may rank an action candidate higher the more closely it matches the driver's preferences or operation pattern under the current surrounding situation and traveling state of the vehicle 1000. Then, as in the case where a plurality of scheduled actions are input in the fifth embodiment, the image generation unit 1060 may generate a plurality of current action candidate images corresponding to the plurality of current action candidates in a manner corresponding to the rank of each candidate. The image output unit 1051 may likewise cause the notification device 1002 to display the plurality of current action candidate images in a manner corresponding to the rank of each candidate.
As already described in the fifth embodiment, the image generation unit 1060 may generate data for an automatic driving information screen containing both the current action image and the current action candidate images, and the image output unit 1051 may output that screen data to the notification device 1002 for display. That is, the current action image and the current action candidate images may be output to the notification device 1002 together.
FIG. 48 is also a flowchart showing an example of processing of the driving support device 1040. When a current action candidate image is output from the image output unit 1051 to the notification device 1002 (Y in S140), the instruction unit 1063 starts measuring the elapsed time from that output (S141). If no operation instruction is input from the input device 1004 (N in S142) and the elapsed time reaches a predetermined threshold (Y in S143), the instruction unit 1063 causes the command output unit 1055 to output, to the automatic driving control device 1030, a control command directing execution of the current action candidate (S144). This threshold is a time value determined in advance from the developer's knowledge or from experiments, and may be, for example, 5 to 10 seconds. When a plurality of current action candidates have been presented to the driver, the instruction unit 1063 may output a control command directing execution of the highest-ranked current action candidate according to the predetermined rank of each candidate.
If the elapsed time has not reached the predetermined threshold (N in S143), the process returns to S142. When an operation instruction designating the current action or a current action candidate is input (Y in S142), the instruction unit 1063 causes the command output unit 1055 to output, to the automatic driving control device 1030, a control command directing execution of the current action or current action candidate designated by the operation instruction (S145). This operation instruction may be a signal indicating that an operation selecting either the current action image or a current action candidate image on the automatic driving information screen has been input. If no current action candidate image has been output (N in S140), the processing from S141 onward is skipped and the flow of this figure ends.
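The S141 to S145 flow of FIG. 48 — start a timer when the candidate image is shown, command the designated action if the driver responds, and fall back to the top-ranked candidate when the threshold elapses — could be sketched like this. The polling loop, function names, injectable clock, and the 10-second threshold are illustrative assumptions, not the embodiment's actual implementation.

```python
import time

TIMEOUT_SEC = 10.0  # assumed threshold (the text suggests 5 to 10 seconds)

def decide_command(poll_instruction, candidates, now=time.monotonic):
    """Wait for an operation instruction; on timeout, pick the top-ranked candidate.

    poll_instruction: callable returning the designated action, or None if no
                      operation instruction has been input yet (S142)
    candidates:       current action candidates, ordered by descending rank
    Returns the action whose execution should be commanded (S144/S145).
    """
    start = now()                       # S141: start measuring elapsed time
    while now() - start < TIMEOUT_SEC:  # S143: threshold not yet reached
        chosen = poll_instruction()     # S142: operation instruction input?
        if chosen is not None:
            return chosen               # S145: command the designated action
        time.sleep(0.05)
    return candidates[0]                # S144: command the top-ranked candidate
```

Injecting the clock (`now`) keeps the timeout path testable without actually waiting for the threshold to elapse.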
As described above, the driving support device 1040 according to the sixth embodiment notifies an occupant of the vehicle (such as the driver) of the current action in automatic driving and proposes current action candidates that can be executed immediately in place of that current action. By presenting the driver with options for the vehicle's immediate action in this way, automatic driving that better reflects the driver's intention and suits the driver's preferences can be realized. Moreover, since the current action candidates selectable by the driver are ones that can be executed in the vehicle's current surrounding situation or traveling state, the driver can direct a change to the automatic driving with confidence.
A modification will now be described. In the sixth embodiment, when no operation instruction was received within the predetermined time during which the current action image 1104 and the current action candidate image 1110 were displayed on the notification device 1002, a control command for executing the current action candidate was output to the automatic driving control device 1030. That is, in the sixth embodiment, the vehicle was made to execute the action determined by the driving support device 1040 in preference to the action determined by the automatic driving control device 1030. As a modification, if no operation instruction is received within the predetermined time, the instruction unit 1063 may cause the command output unit 1055 to output, to the automatic driving control device 1030, a control command for causing the vehicle to execute the current action represented by the current action image 1104. That is, the vehicle may be made to execute the action determined by the automatic driving control device 1030 in preference to the action determined by the driving support device 1040.
Another modification will be described. In the sixth embodiment, the driving support device 1040 accumulated, in a local storage unit, statistical information indicating the relationship between the vehicle's surrounding situation and traveling state and the vehicle's actions. As a modification, this statistical information may be accumulated in an information processing device outside the vehicle, for example a database server connected via a communication network. That is, the statistical information accumulation unit 1070 may be provided at a remote location outside the vehicle (for example, in the cloud). The candidate determination unit 1061 of the driving support device 1040 may access the statistical information accumulated in the statistical information accumulation unit outside the vehicle via the communication IF 1056 and the wireless device 1008, and may determine the current action candidates according to that statistical information and the detection information from the detection unit 1020. The same applies to subsequent embodiments that use the statistical information.
Yet another modification will be described. As partly noted above, the statistical information accumulated in the statistical information accumulation unit 1070 may be information reflecting the preferences or driving pattern of an occupant of the vehicle (typically the driver), that is, a driver model. The statistical information of this modification can be said to accumulate combinations of past environmental parameter values and the current actions actually taken under those environments. For example, the driving support device 1040 may further include a statistical information recording unit that sequentially records, into the statistical information of the statistical information accumulation unit 1070, past environmental parameter values during travel of the vehicle and the driver's immediate operation under those environmental parameter values (and, as a result, the immediate action of the vehicle).
The candidate determination unit 1061 of this modification refers to the statistical information accumulation unit 1070 and determines a plurality of current action candidates associated with environmental parameter values whose difference from the currently detected environmental parameter values falls within a predetermined range. At the same time, it determines a priority order (which can also be called a degree of priority) for each of the plurality of current action candidates. For example, when the candidate determination unit 1061 has determined a plurality of current action candidates, it may rank a current action candidate higher the closer its associated environmental parameter values are to the currently detected environmental parameter values. That is, among the plurality of current action candidates, a higher priority may be given to current action candidates that match the driver's preferences or driving pattern.
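One way to realize this ranking is to measure the distance between the currently detected environmental parameter values and the values stored in each statistical record, keep only records within the predetermined range, and order candidates by ascending distance. The sketch below is a hypothetical illustration under that assumption — the embodiment does not specify a distance metric, and Euclidean distance is used here only as an example.

```python
import math

def rank_candidates(current_params, records, max_distance):
    """Rank action candidates by how closely their recorded environment
    matches the currently detected environmental parameter values.

    current_params: currently detected environmental parameter values (tuple)
    records:        list of (stored_params, action) pairs from the statistics
    max_distance:   predetermined range for the parameter-value difference
    Returns actions ordered from highest to lowest priority.
    """
    scored = []
    for stored_params, action in records:
        d = math.dist(current_params, stored_params)  # Euclidean distance
        if d <= max_distance:            # within the predetermined range
            scored.append((d, action))
    scored.sort(key=lambda pair: pair[0])  # closer record -> higher priority
    # Deduplicate while preserving order, in case one action appears in
    # several records
    seen, ranked = set(), []
    for _, action in scored:
        if action not in seen:
            seen.add(action)
            ranked.append(action)
    return ranked
```

A candidate recorded under an environment nearly identical to the present one thus surfaces first, which is the stated intent of giving priority to candidates matching the driver's pattern.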
The image generation unit 1060 generates a plurality of current action candidate images corresponding to the plurality of current action candidates determined by the candidate determination unit 1061, that is, images representing the content of each candidate. The image output unit 1051 outputs the current action image and the plurality of current action candidate images to the notification device 1002 so that they are displayed within a fixed visual field of the driver of the vehicle 1000. The image output unit 1051 also causes the notification device 1002 to display the plurality of current action candidate images in a manner corresponding to the priority order of each candidate.
For example, as already described in the fifth embodiment, the image generation unit 1060 and the image output unit 1051 may display the plurality of current action candidate images in a parametric display form in which the priority is visualized by a histogram or the like, or in a non-parametric display form in which the ranking is visualized by the arrangement, for example vertically or horizontally. Specifically, an image showing the rank itself may be added to each current action candidate image, higher-ranked current action candidate images may be displayed in a more visible manner, or the current action candidate images may be arranged in descending order of rank from a predetermined position on the automatic driving information screen 1103. Alternatively, images may be generated and displayed only for current action candidates at or above a predetermined rank.
According to this modification, the occupant can be supported in selecting a suitable candidate from among the plurality of current action candidates. Moreover, by determining the priority order in accordance with a driver model, a priority order matching the driving pattern of the person the driver model represents (for example, the driver, a fellow passenger, or a standard or exemplary driver model) can be presented. This modification can also be applied to the generation and display of current action candidate images or scheduled action candidate images in other embodiments.
In FIG. 45, while the current action image 1104 shows the current action "lane change", the upper current action candidate image 1110 shows "decelerate", the first current action candidate, and the lower current action candidate image 1110 shows "go forward (maintain speed)", the second current action candidate. As another example, not shown: suppose that because the preceding vehicle has decelerated and the inter-vehicle distance has shortened, the automatic driving control device 1030 (which can also be called the automatic travel control ECU) decides to execute the current action "change lanes to the overtaking lane", and the driving support device 1040 (which can also be called the HMI control ECU) is presenting the fact that this current action is being executed. At this point, based on detecting that the right turn signal of the preceding vehicle is lit, the driving support device 1040 may determine that "the benefit to the host vehicle of changing lanes will diminish because the preceding vehicle will also change lanes to the overtaking lane". The driving support device 1040 may then recommend and present, as a current action candidate replacing the current action "change lanes to the overtaking lane", the option "keep the lane, then accelerate if possible" as a choice the driver can direct immediately (in other words, a selectable choice).
(Embodiment 7)
First, an outline will be described. During automatic driving of a vehicle, the driver may be unable to grasp which future actions of the vehicle can be directed in order to change the control of the automatic driving, and this can make the driver feel uneasy.
In the seventh embodiment, therefore, candidates for actions to be executed by the vehicle in the future during automatic driving (hereinafter called "scheduled action candidates") are presented to the driver. Specifically, the automatic driving control device 1030 determines the current action, and the driving support device 1040 determines the scheduled action candidates. The driving support device 1040 then causes the notification device 1002 in the vehicle 1000 to display both the current action and the scheduled action candidates. A scheduled action candidate is an action that can be executed after the currently executed action, and can also be called an action plan that can be selected next. A scheduled action candidate corresponds to the scheduled action of the fifth embodiment; in the seventh embodiment it can be said to be a scheduled action that the driving support device 1040 determines and presents to the driver as a selectable candidate.
In the following, content already described in the preceding embodiments is omitted as appropriate. The configurations and operations described in this embodiment can be combined with or replaced by the configurations and operations described in other embodiments and modifications without departing from the spirit of the present disclosure.
The functional blocks of the driving support device 1040 are the same as in the sixth embodiment. That is, as shown in FIG. 44, the control unit 1041 includes the image generation unit 1060, the candidate determination unit 1061, the determination unit 1062, and the instruction unit 1063. As shown in FIG. 42, the storage unit 1042 includes the statistical information accumulation unit 1070 and the determination criteria holding unit 1071.
The determination criteria held in the determination criteria holding unit 1071 are the same as in the sixth embodiment. That is, the determination criteria are data defining, for each of a plurality of patterns of detection information input from the detection unit 1020, the actions that the vehicle 1000 can currently (immediately) be made to execute.
The statistical information accumulated in the statistical information accumulation unit 1070 of the seventh embodiment also corresponds to the travel history of FIG. 27 and the driver models of FIGS. 28A and 28B, and is statistical information indicating the relationship between the vehicle's surrounding situation and traveling state and the vehicle's actions (FIG. 43). However, the statistical information of the seventh embodiment includes a plurality of records, each associating values of a plurality of types of environmental parameters indicating the vehicle's surrounding situation and traveling state with an action (or action record) to be executed by the vehicle at a future time. In other words, it is information in which actions executed at a future time (after a predetermined time) relative to a current environmental state are accumulated in association with parameter values indicating that current environmental state. The future time may be from 10 seconds to several minutes later. Each action defined in the statistical information is associated with remaining time information indicating the time until that action is executed in the future (for example, 10 seconds to several minutes). As already described, the statistical information may be accumulated in a device outside the vehicle 1000, and the driving support device 1040 may access the remote statistical information via the communication IF 1056 and the wireless device 1008.
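A minimal way to picture the seventh embodiment's statistical records — environmental parameter values paired with a future action and the remaining time until that action — is the following hypothetical representation and lookup. The field names, the Euclidean matching rule, and the threshold parameter are assumptions for illustration only.

```python
from dataclasses import dataclass
import math

@dataclass
class ScheduledActionRecord:
    env_params: tuple   # environmental parameter values at recording time
    action: str         # action executed at a future time under that environment
    remaining_sec: int  # remaining time until the action (e.g. 10 s to minutes)

def scheduled_candidates(current_params, records, max_distance):
    """Extract (action, remaining_sec) pairs whose stored environment
    approximates the current detection information."""
    out = []
    for r in records:
        if math.dist(current_params, r.env_params) <= max_distance:
            out.append((r.action, r.remaining_sec))
    return out
```

Carrying `remaining_sec` through the lookup is what later lets the image generation unit render a remaining-time image alongside each scheduled action candidate image.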
The action information input unit 1054 of the driving support device 1040 acquires from the automatic driving control device 1030 action information indicating the current action that the automatic driving control device 1030 is causing the vehicle 1000 to execute. The detection information input unit 1052 of the driving support device 1040 acquires from the detection unit 1020 detection information indicating the detection results for the surrounding situation and traveling state of the vehicle 1000.
The candidate determination unit 1061 determines, based on the detection information output from the detection unit 1020, one or more scheduled action candidates, that is, actions that the vehicle 1000 can be made to execute after the current action indicated by the action information. Specifically, as in the sixth embodiment, the candidate determination unit 1061 extracts as candidates, from among the actions defined in the statistical information, one or more actions associated with environmental parameter values that approximate the detection information. However, unlike in the sixth embodiment, the candidates extracted from the statistical information of the seventh embodiment are scheduled action candidates indicating actions to be executed by the vehicle at a future time. A scheduled action candidate may be an action scheduled to be executed immediately following the current action of the vehicle 1000, or an action scheduled to be executed several tens of seconds or several minutes from now, with other actions intervening after the current action of the vehicle 1000 ends. Different remaining times are defined in the statistical information for these two kinds of actions.
 The image generation unit 1060 generates a current behavior image representing the current behavior indicated by the behavior information, and generates one or more scheduled behavior candidate images representing the one or more scheduled behavior candidates. The image output unit 1051 outputs the current behavior image and the scheduled behavior candidate images to the notification device 1002 so that they are displayed within a fixed field of view of the driver of the vehicle. The notification device 1002 displays an automatic driving information screen containing the current behavior image and the scheduled behavior candidate images output from the driving support device 1040.
 When the candidate determination unit 1061 determines a scheduled behavior candidate, it also passes to the image generation unit 1060 the remaining time information associated with that candidate in the statistical information. When generating the scheduled behavior candidate image, the image generation unit 1060 also generates a remaining time image representing the remaining time indicated by the remaining time information. The image output unit 1051 outputs the remaining time image to the notification device 1002 in addition to the current behavior image and the scheduled behavior candidate image, thereby displaying the scheduled behavior candidate image together with the remaining time image on the automatic driving information screen.
 FIG. 49 shows an example of the automatic driving information screen. The figure shows an automatic driving information screen 1103 in which the current behavior image 1104 is placed in the main area 1100, and a scheduled behavior candidate image 1112 representing one scheduled behavior candidate and a remaining time image 1108 are placed in the sub area 1102. The scheduled behavior candidate image 1112 in the figure shows, as a single scheduled behavior candidate, a behavior plan combining three single behaviors (overtaking, in this example).
 The remaining time image 1108 in the figure indicates the length of the remaining time until the scheduled behavior candidate is executed by the proportion of the sub area 1102 occupied by a shaded region. For example, the proportion of the shaded region may increase as the remaining time decreases. Specifically, when the initial remaining time is 60 seconds, the display mode of the remaining time image 1108 may be updated so that the bottom third of the sub area 1102 becomes shaded 20 seconds after the scheduled behavior candidate image 1112 first appears, and so that the bottom two thirds of the sub area 1102 become shaded 40 seconds after it first appears. The remaining time may thus be indicated by a change in pattern or color, or it may be indicated by another method, such as displaying a timer object that counts down the remaining time.
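The shading rule in the example above reduces to simple arithmetic, sketched below. The function name and clamping behavior are assumptions made for illustration; the specification only gives the 60-second example.

```python
# Minimal sketch of the shading rule: the shaded fraction of the sub area
# grows linearly from 0 toward 1 as the remaining time runs out.
def shaded_fraction(initial_remaining_s, elapsed_s):
    """Fraction of sub area 1102 to shade, e.g. 60 s total -> 1/3 after 20 s."""
    if initial_remaining_s <= 0:
        return 1.0
    return min(elapsed_s / initial_remaining_s, 1.0)

f20 = shaded_fraction(60, 20)  # bottom third shaded, per the example
f40 = shaded_fraction(60, 40)  # bottom two thirds shaded
```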
 When the remaining time until execution of the scheduled behavior candidate reaches zero after display of the scheduled behavior candidate image 1112 has begun, the image output unit 1051 may transmit to the notification device 1002 a command instructing it to end the display of the scheduled behavior candidate image 1112, thereby terminating its display. The image generation unit 1060 may then generate a new scheduled behavior candidate image showing a scheduled behavior candidate newly determined by the candidate determination unit 1061, and the image output unit 1051 may transmit the new scheduled behavior candidate image to the notification device 1002 for display.
 FIGS. 50A to 50F also show examples of the automatic driving information screen. On the automatic driving information screens 1103 of FIGS. 50A to 50F, the scheduled behavior candidate image 1112 is placed at the center of a double circle, and the current behavior image 1104 is placed in the outer circle. The outer circle also contains a remaining time display area 1114; the shorter the remaining time until the scheduled behavior candidate is executed, the larger the shaded region indicated by the remaining time image 1108 becomes (FIGS. 50A to 50E). For example, when the initial remaining time is 60 seconds, the display mode of the remaining time image 1108 may be updated so that one third of the remaining time display area 1114 becomes shaded 20 seconds after the scheduled behavior candidate image 1112 first appears.
 FIG. 50F shows the automatic driving information screen 1103 when the remaining time until execution of the scheduled behavior candidate reaches zero, in other words, when the elapsed time since the scheduled behavior candidate image 1112 first appeared reaches the remaining time associated with the scheduled behavior candidate in the statistical information. In this example, the scheduled behavior candidate (acceleration) previously shown in the scheduled behavior candidate image 1112 becomes the current behavior, and accordingly the current behavior image 1104 switches from one indicating a lane change to the right to one indicating acceleration.
 FIGS. 51A to 51F also show examples of the automatic driving information screen. The automatic driving information screens 1103 of FIGS. 51A to 51F differ from those shown in FIGS. 50A to 50F in the following three respects. (1) As in the fifth embodiment, the scheduled behavior is acquired from the automatic driving control device 1030, and a scheduled behavior image 1106 representing that scheduled behavior is placed at the center of the double circle. (2) The scheduled behavior candidate images 1112 are placed around the periphery of the inner circle; an upward triangle indicates acceleration, a downward triangle deceleration, a leftward triangle a lane change to the left, and a rightward triangle a lane change to the right. (3) The remaining time image 1108 indicates the remaining time until the scheduled behavior represented by the scheduled behavior image 1106 is executed.
 As shown in FIGS. 51A and 51E, while the remaining time until execution of the scheduled behavior "acceleration" indicated by the scheduled behavior image 1106 decreases, the scheduled behavior candidates presented to the driver (the scheduled behavior candidate images 1112) are updated as needed. FIG. 51F shows the automatic driving information screen 1103 when the remaining time until execution of the scheduled behavior reaches zero, in other words, when the elapsed time since the scheduled behavior image 1106 first appeared reaches the remaining time notified by the automatic driving control device 1030. In this example, the scheduled behavior (acceleration) previously shown in the scheduled behavior image 1106 becomes the current behavior, and accordingly the current behavior image 1104 switches from one indicating a lane change to the right to one indicating acceleration.
 The driver may press a cross-shaped button provided on the input device 1004 to instruct execution of the scheduled behavior candidate represented by a scheduled behavior candidate image 1112. For example, while a scheduled behavior candidate image 1112 showing an upward triangle is displayed, the driver may instruct execution of the acceleration it indicates by selecting the up button of the cross-shaped button.
 FIG. 52 also shows an example of the automatic driving information screen. On this automatic driving information screen 1103, two scheduled behavior candidate images 1112 representing two scheduled behavior candidates (two behavior plans) are displayed. The screen also shows a remaining time image 1108 representing the remaining time until each scheduled behavior candidate is executed, composed of a plurality of time indicators 1109 as in the fifth embodiment. The screen further displays a selection frame 1116 for allowing the driver to select a specific scheduled behavior candidate from among the scheduled behavior candidates represented by the scheduled behavior candidate images 1112. The driver inputs to the input device 1004 an operation that designates the desired scheduled behavior candidate with the selection frame 1116.
 When an operation designating a scheduled behavior candidate is input via the operation input unit 1050, the image generation unit 1060 of the driving support device 1040 generates an inquiry image asking the driver whether to "execute" or "reserve" the designated scheduled behavior candidate. "Execute" means causing the vehicle to execute the scheduled behavior candidate immediately. "Reserve" refers to a timing a predetermined time after the operation instruction designating the reservation, and means causing the scheduled behavior candidate to be executed at the timing when the vehicle becomes able to execute it. The image output unit 1051 outputs the inquiry image to the notification device 1002 for display.
 When an operation designating "execute" is input via the operation input unit 1050 while the inquiry image is displayed, the instruction unit 1063 outputs a control command for causing the vehicle 1000 to execute the scheduled behavior candidate to the automatic driving control device 1030 at a first timing. When, on the other hand, an operation designating "reserve" is input while the inquiry image is displayed, the instruction unit 1063 outputs the control command for causing the vehicle 1000 to execute the scheduled behavior candidate to the automatic driving control device 1030 at a second timing later than the first timing.
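The execute/reserve branching above can be sketched as follows. This is an illustrative assumption only: the string choices, the feasibility callback, and the return values are not taken from the specification.

```python
# Hedged sketch of the instruction unit's branching: "execute" sends the
# control command immediately (first timing); "reserve" defers sending until
# the vehicle becomes able to execute the behavior (second timing).
def issue_command(choice, candidate, can_execute_now, send):
    if choice == "execute":
        send(candidate)            # first timing: immediate
        return "sent"
    if choice == "reserve":
        if can_execute_now():
            send(candidate)        # second timing: as soon as feasible
            return "sent"
        return "pending"           # poll again later
    raise ValueError(choice)

sent = []
state_exec = issue_command("execute", "overtake", lambda: False, sent.append)
state_resv = issue_command("reserve", "decelerate", lambda: False, sent.append)
```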
 The processing sequence of the vehicle 1000 in the seventh embodiment is the same as the processing sequence of FIG. 46 described in the sixth embodiment, except that the determination of current behavior candidates and the display of current behavior candidate images are replaced by the determination of scheduled behavior candidates and the display of scheduled behavior candidate images (P26, P27). The input device 1004 also notifies the driving support device 1040 of the operation selecting a scheduled behavior candidate (P28), and the driving support device 1040 outputs a control command instructing execution of the scheduled behavior candidate to the automatic driving control device 1030 (P29).
 In relation to P29 to P35 of FIG. 46, after the control command has been transmitted to the automatic driving control device 1030, the behavior information input unit 1054 of the driving support device 1040 acquires, from the automatic driving control device 1030, behavior information indicating the current behavior updated in accordance with that control command. The updated current behavior is the behavior designated by the control command, that is, the scheduled behavior candidate designated by the driver. The driving support device 1040 updates the content of the automatic driving information screen 1103 on the notification device 1002 to reflect the latest behavior of the vehicle.
 FIG. 53 is a flowchart showing an example of the processing of the driving support device 1040. When the behavior information input unit 1054 acquires behavior information output from the automatic driving control device 1030 (Y in S150), the image generation unit 1060 determines whether the current behavior indicated by the behavior information matches the current behavior previously stored in the storage unit 1042. The subsequent processing of S151 to S154 is the same as that of S102 to S105 in FIG. 41, so its description is omitted. When no behavior information has been acquired (N in S150), S151 to S154 are skipped.
 When the detection information input unit 1052 acquires detection information output from the detection unit 1020 (Y in S155), the candidate determination unit 1061 determines one or more scheduled behavior candidates based on that detection information and the statistical information accumulated in the statistical information accumulation unit 1070 (S156). The image generation unit 1060 determines whether the scheduled behavior candidates determined in S156 match the scheduled behavior candidates previously stored in the storage unit 1042. When they do not match, that is, when the scheduled behavior candidates have been updated (Y in S157), the image generation unit 1060 generates scheduled behavior candidate images representing the scheduled behavior candidates (S158).
 The image generation unit 1060 also identifies the remaining time until execution associated in advance with each scheduled behavior candidate in the statistical information, and in S158 further generates a remaining time image representing that remaining time. The image output unit 1051 outputs the scheduled behavior candidate image and the remaining time image to the notification device 1002 to display the automatic driving information screen (S159). The image generation unit 1060 stores information indicating the scheduled behavior candidate for which the image was generated in the storage unit 1042 (S160), and starts measuring the elapsed time since the scheduled behavior candidate image was output, that is, since its display began (S161). When a predetermined end condition is satisfied (Y in S162), the flow of this figure ends; when it is not satisfied (N in S162), the flow returns to S150.
 When no detection information is input (N in S155), or when the scheduled behavior candidates are not updated (N in S157), the image generation unit 1060 determines whether a predetermined time has elapsed since measurement of the elapsed time began. This predetermined time is the unit time at which the remaining time image should be updated, and may be, for example, the time allotted to one time indicator 1109. On detecting that the predetermined time has elapsed since measurement began (Y in S163), the image generation unit 1060 updates the remaining time image (S164), and the image output unit 1051 outputs the updated remaining time image to the notification device 1002 for display (S165). This, for example, enlarges the shaded region in the remaining time display area 1114 of FIG. 49 and FIGS. 50A to 50F, or switches one time indicator 1109 of FIG. 52 from lit to unlit. When the predetermined time has not yet elapsed since measurement began (N in S163), S164 and S165 are skipped.
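The rule that one time indicator 1109 goes out per elapsed unit of time can be sketched as below. The function name and the indicator count are assumptions for illustration; the specification does not fix the number of indicators.

```python
# Illustrative sketch of the FIG. 52 time-indicator update (S163-S165):
# one indicator is extinguished each time a unit of remaining time elapses.
import math

def indicators_lit(total_indicators, remaining_s, unit_s):
    """Number of time indicators 1109 still lit for the given remaining time."""
    return max(0, min(total_indicators, math.ceil(remaining_s / unit_s)))

# With 5 indicators each worth 12 s of a 60 s remaining time:
lit_at_start = indicators_lit(5, 60, 12)
lit_after_one_unit = indicators_lit(5, 48, 12)
```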
 FIG. 54 is also a flowchart showing an example of the processing of the driving support device 1040. After the image output unit 1051 outputs a scheduled behavior candidate image to the notification device 1002, while the scheduled behavior candidate image is displayed on the automatic driving information screen of the notification device 1002 (Y in S170), the flow returns to S170 as long as no scheduled behavior candidate (scheduled behavior candidate image) has been selected (N in S171). When a scheduled behavior candidate is selected on the automatic driving information screen (Y in S171), the candidate determination unit 1061 temporarily suspends the process of determining new scheduled behavior candidates (S172).
 The determination unit 1062 determines whether the vehicle 1000 can immediately execute the selected scheduled behavior candidate (hereinafter also called the "selected behavior") (S173). Specifically, the determination unit 1062 refers to the latest detection information output from the detection unit 1020 and the criteria held in the determination criterion holding unit 1071 to determine whether the vehicle can currently be caused to execute the selected behavior. Even a selected behavior that the vehicle cannot execute immediately may become executable if the surrounding environment or running state of the vehicle changes. The seventh embodiment therefore makes it possible to reserve the selected behavior. A reservation can be regarded as an instruction to execute the selected behavior at whatever timing becomes possible within a certain time range.
 The image generation unit 1060 generates an inquiry image for asking the driver whether to execute the selected behavior immediately or to reserve it, and the image output unit 1051 outputs the inquiry image to the notification device 1002 for display (S174). FIGS. 55A and 55B show examples of the automatic driving information screen. On the automatic driving information screen 1103 of FIG. 55A, the scheduled behavior candidate image 1112 selected on the automatic driving information screen 1103 of FIG. 52 is displayed, together with an inquiry image containing an execute button 1118 and a reserve button 1120. When the determination unit 1062 determines that immediate execution of the selected behavior is possible, the image generation unit 1060 generates an inquiry image containing both the execute button 1118 and the reserve button 1120; when the determination unit 1062 determines that immediate execution is not possible, it generates an inquiry image containing the reserve button 1120 but not the execute button 1118. A selected behavior that cannot be executed immediately can thus only be reserved.
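The button-selection rule above reduces to a single conditional, sketched here. The list representation and identifier names are assumptions for illustration only.

```python
# Minimal sketch of the inquiry-image rule: which buttons appear depends on
# whether the selected behavior is immediately executable (S173/S174).
def inquiry_buttons(immediately_executable):
    buttons = ["reserve_1120"]             # reservation is always offered
    if immediately_executable:
        buttons.insert(0, "execute_1118")  # execution only when feasible now
    return buttons
```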
 Returning to FIG. 54, when "execute" is selected for the selected behavior on the automatic driving information screen (N in S175), the operation input unit 1050 receives from the input device 1004 an operation instruction instructing immediate execution of the selected behavior. This operation instruction may be a signal indicating that immediate execution of the scheduled behavior candidate represented by the scheduled behavior candidate image was chosen on the inquiry image. The instruction unit 1063 causes the command output unit 1055 to transmit a control command instructing immediate execution of the selected behavior to the automatic driving control device 1030 (S176). This can be regarded as a control command instructing the vehicle to immediately execute the scheduled behavior candidate represented by the scheduled behavior candidate image, in place of the current behavior represented by the current behavior image on the automatic driving information screen.
 Thereafter, through the processing of S150 to S153 in FIG. 53, the current behavior image on the automatic driving information screen changes to one showing the selected behavior. The candidate determination unit 1061 resumes the process of determining new scheduled behavior candidates (S177). When no scheduled behavior candidate image is displayed (N in S170), the subsequent processing is skipped and the flow of this figure ends.
 When "reserve" is selected for the selected behavior on the automatic driving information screen (Y in S175), the operation input unit 1050 receives from the input device 1004 an operation instruction instructing reservation of the selected behavior. The image generation unit 1060 generates a cancel time setting image for letting the driver set the time after which execution of the reserved selected behavior (hereinafter also called the "reserved behavior") is canceled. The image output unit 1051 outputs the cancel time setting image to the notification device 1002 for display (S178). FIG. 55B shows an automatic driving information screen 1103 containing a cancel time setting image 1122. On the cancel time setting image 1122 in the figure, the cancel time can be set between 30 seconds and 10 minutes. While the reservation is in effect, the determination of new scheduled behavior candidates is suspended; because the driver sets a cancel time for the reservation, new scheduled behavior candidates are again presented to the driver once the reservation is canceled.
 Returning to FIG. 54, when the driver inputs an operation setting the reservation cancel time to the input device 1004, the operation input unit 1050 receives a signal indicating the set cancel time from the input device 1004 (S179). The instruction unit 1063 starts measuring the elapsed time since the reservation began (S180). The image generation unit 1060 generates a reserved behavior image, an image representing the reserved behavior, and a time limit image, an image representing the length of time until the reservation is canceled. The image output unit 1051 outputs the reserved behavior image and the time limit image to the notification device 1002 for display (S181). The reserved behavior image is an image whose display mode differs from that of the scheduled behavior candidate images, indicating that the specific behavior selected by the driver has been reserved; for example, it may be an image to which a predetermined symbol indicating a reservation is added.
 FIGS. 56A and 56B show examples of the automatic driving information screen. On the automatic driving information screen 1103 of FIG. 56A, a reserved behavior image 1124 and a time limit image 1126 are displayed in the sub area 1102. In this example, the time limit image 1126 represents the length of time until reservation cancellation by means of a plurality of time indicators 1109. The image generation unit 1060 generates the reserved behavior image 1124 in a mode different from that of the scheduled behavior candidate image 1112, or the image output unit 1051 displays the reserved behavior image 1124 in a mode different from that of the scheduled behavior candidate image 1112. This lets the driver easily tell on the automatic driving information screen 1103 whether a behavior is being proposed as a candidate or has already been reserved.
 Specifically, when a scheduled behavior candidate is presented to the driver as on the automatic driving information screen 1103 of FIG. 52, the remaining time image 1108 is placed to the left of the scheduled behavior candidate image 1112, whereas when a reserved behavior is presented to the driver as on the automatic driving information screen 1103 of FIG. 56A, the time limit image 1126 is placed to the right of the reserved behavior image 1124. As modifications, the scheduled behavior candidate image 1112 and the reserved behavior image 1124 may be given different background colors or different display sizes, or an image of a predetermined symbol indicating a reservation may be added to the reserved behavior image 1124.
 Returning to FIG. 54, the determination unit 1062 determines, in accordance with the latest detection information output from the detection unit 1020 and the criteria held in the determination criterion holding unit 1071, whether the vehicle can be caused to execute the reserved behavior immediately (S182). When immediate execution of the reserved behavior is possible (Y in S183), the instruction unit 1063 causes the command output unit 1055 to transmit a control command instructing immediate execution of the reserved behavior to the automatic driving control device 1030 (S176). Thereafter, through the processing of S150 to S153 in FIG. 53, the current behavior image on the automatic driving information screen changes to one showing the reserved behavior.
 When it is determined that immediate execution of the reserved behavior is not possible (N in S183) and the cancel time has elapsed (Y in S184), the candidate determination unit 1061 resumes the determination of new scheduled behavior candidates without instructing execution of the reserved behavior (S177), and the flow of this figure ends. When the cancel time has not yet elapsed (N in S184) and the predetermined time at which the time limit image should be updated has also not elapsed (N in S185), the flow returns to S182 and immediate executability of the reserved behavior is determined again. When the predetermined time at which the time limit image should be updated has elapsed (Y in S185), the image generation unit 1060 updates the time limit image, and the image output unit 1051 outputs the updated time limit image to the notification device 1002 for display (S186). The flow then returns to S182, and immediate executability of the reserved behavior is determined again.
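The reservation loop of S182 to S186 can be sketched as follows. This is a hedged illustration: the callback interfaces, the injected clock, and the return values are assumptions, not part of the specification.

```python
# Hedged sketch of the FIG. 54 reservation loop (S182-S186): poll until the
# reserved behavior becomes executable or the driver-set cancel time expires,
# refreshing the time limit image every update_s along the way.
def run_reservation(can_execute_now, send, on_update, cancel_s, update_s, now):
    start = now()
    next_update = start + update_s
    while True:
        if can_execute_now():        # S182/S183: execute as soon as possible
            send()
            return "executed"
        t = now()
        if t - start >= cancel_s:    # S184: cancel time reached, give up
            return "cancelled"
        if t >= next_update:         # S185/S186: redraw the time limit image
            on_update()
            next_update += update_s

# Simulated run with a fake clock that advances one second per call.
ticks = iter(range(100))
updates = []
result = run_reservation(lambda: False, lambda: None, lambda: updates.append(1),
                         cancel_s=5, update_s=2, now=lambda: next(ticks))
```

Injecting `now` as a callable keeps the sketch testable without real waiting; a production loop would instead sleep between polls.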
 FIG. 56B shows the automatic driving information screen 1103 at a point in time after FIG. 56A. Between FIG. 56A and FIG. 56B, the current action of the vehicle represented by the current action image 1104 has switched from deceleration to forward travel (speed maintenance), but since the reserved action cannot yet be executed, only the time limit image 1126 has changed in the sub-region 1102. Specifically, by changing part of the time indicator 1109 from a lit state to an unlit state, the driver is notified that the remaining time until the reservation is canceled has decreased. When the elapsed time from the start of the reservation reaches the cancellation time (for example, five minutes), display of the reserved action image 1124 ends, and the screen returns to the automatic driving information screen 1103 (FIG. 52, etc.) presenting the scheduled action candidates newly determined by the candidate determination unit 1061.
 As described above, the driving support device 1040 according to the seventh embodiment notifies a vehicle occupant (such as the driver) of the current action in automatic driving, and also presents to the driver future actions of the vehicle that can be instructed in order to change the control of automatic driving. By presenting the driver with options for the vehicle's future actions in automatic driving in this way, automatic driving that better reflects the driver's intention and that matches the driver's preferences can be realized, and the driver's anxiety about automatic driving can be suppressed. Furthermore, according to the driving support device 1040 of the seventh embodiment, even an action that cannot be executed at present can be reserved for the future, which makes it easier to realize automatic driving that matches the driver's preferences.
 A modification will be described. In the seventh embodiment, the driving support device 1040 determines whether the action selected by the driver can be executed immediately. As a modification, the automatic driving control device 1030 may determine whether the action selected by the driver can be executed immediately. In this case, the driving support device 1040 need not execute this determination process. The driving support device 1040 may generate an inquiry image in which both immediate execution and reservation are selectable, regardless of whether the action selected by the driver can be executed immediately, and present it to the driver.
 When the instruction unit 1063 of the driving support device 1040 receives, as the driver's operation on the inquiry image, an operation instruction indicating immediate execution of the selected action, it may cause the command output unit 1055 to transmit to the automatic driving control device 1030 a first control command that causes the vehicle to execute the selected action immediately. When it receives, as the driver's operation on the inquiry image, an operation instruction indicating reservation of the selected action, it may cause the command output unit 1055 to transmit to the automatic driving control device 1030 a second control command that causes the vehicle to execute the selected action after a predetermined time. The second control command may be, for example, a control command instructing execution of the selected action within the cancellation time set by the driver on the cancellation time setting image 1122; in other words, it may be a control command instructing that execution of the selected action be canceled when the cancellation time has elapsed.
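As a rough illustration of the two command types just described, the following sketch maps the driver's choice on the inquiry image to either an immediate (first) control command or a reserved (second) control command carrying a cancellation time. The `ControlCommand` structure, its field names, and the `operation` strings are assumptions for illustration only; the patent does not specify a command format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlCommand:
    action: str                        # e.g. "lane_change_right" (illustrative)
    immediate: bool                    # True: first command, execute now
    cancel_time_sec: Optional[float]   # second command: cancel if not executed in time

def build_command(action: str, operation: str,
                  cancel_time_sec: float = 300.0) -> ControlCommand:
    """Map the driver's choice on the inquiry image to a control command.

    "execute_now" -> first control command (immediate execution)
    "reserve"     -> second control command (execute later; cancel on timeout)
    """
    if operation == "execute_now":
        return ControlCommand(action, immediate=True, cancel_time_sec=None)
    if operation == "reserve":
        return ControlCommand(action, immediate=False, cancel_time_sec=cancel_time_sec)
    raise ValueError(f"unknown operation: {operation}")
```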
 Another modification will be described. Although not mentioned in the seventh embodiment, the configuration for causing the vehicle to execute the scheduled action candidate may be the same as the configuration described in the sixth embodiment. That is, the instruction unit 1063 of the driving support device 1040 may cause the command output unit 1055 to transmit to the automatic driving control device 1030 a control command for causing the vehicle to execute the scheduled action candidate after the current action image 1104 and the scheduled action candidate image 1112 have been displayed on the automatic driving information screen 1103 for a predetermined time.
 The instruction unit 1063 may also cause the command output unit 1055 to transmit to the automatic driving control device 1030 a control command for causing the vehicle to execute the scheduled action candidate when no operation instruction designating a vehicle action is input within the predetermined time during which the current action image 1104 and the scheduled action candidate image 1112 are displayed on the automatic driving information screen 1103. Furthermore, when the instruction unit 1063 receives an operation instruction selecting the scheduled action candidate (scheduled action candidate image 1112) within the predetermined time during which the current action image 1104 and the scheduled action candidate image 1112 are displayed on the automatic driving information screen 1103, it may cause the command output unit 1055 to transmit to the automatic driving control device 1030 a control command for causing the vehicle to execute the scheduled action candidate.
 Still another modification will be described. The statistical information stored in the statistical information storage unit 1070 may be information (a driver model) reflecting the preferences or driving patterns of a vehicle occupant (typically the driver). The statistical information of this modification can be said to be an accumulation of combinations of past environmental parameter values and the scheduled actions actually taken under those environments. For example, the driving support device 1040 may further include a statistical information recording unit that sequentially records, in the statistical information of the statistical information storage unit 1070, past environmental parameter values during travel of the vehicle and the driver's operation at a predetermined future time under those environmental parameter values (that is, the resulting future action of the vehicle).
 The candidate determination unit 1061 of this modification refers to the statistical information storage unit 1070 and determines a plurality of scheduled action candidates associated with environmental parameter values whose difference from the currently detected environmental parameter values is within a predetermined range. At the same time, it determines the priority order (which can also be called the degree of priority) of each of the plurality of scheduled action candidates. For example, when the candidate determination unit 1061 determines a plurality of scheduled action candidates, it may rank higher those scheduled action candidates associated with environmental parameter values closer to the currently detected environmental parameter values. That is, among the plurality of scheduled action candidates, a higher priority may be given to scheduled action candidates that better match the driver's preferences or driving patterns.
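A minimal sketch of how such priority ordering could work, assuming the statistical information is a list of (environmental parameters, scheduled action) records and that similarity is Euclidean distance over the currently detected parameters. The patent does not specify a distance metric or a data layout, so everything here is an illustrative assumption.

```python
import math

def rank_candidates(current_params, history):
    """Rank scheduled-action candidates by how closely their recorded
    environmental parameters match the currently detected ones.

    current_params: dict of parameter name -> value (current detection).
    history: list of (params_dict, action) records from the statistical info.
    Returns actions ordered from highest to lowest priority; a smaller
    parameter distance means a better match to the driver model.
    """
    def distance(params):
        return math.sqrt(sum((params[k] - current_params[k]) ** 2
                             for k in current_params))

    # Keep the closest record per distinct action, then sort by distance.
    best = {}
    for params, action in history:
        d = distance(params)
        if action not in best or d < best[action]:
            best[action] = d
    return sorted(best, key=best.get)
```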
 The image generation unit 1060 generates a plurality of scheduled action candidate images corresponding to the plurality of scheduled action candidates determined by the candidate determination unit 1061, that is, images representing the content of each candidate. The image output unit 1051 outputs the current action image and the plurality of scheduled action candidate images to the notification device 1002 so that they are displayed within a fixed field of view of the driver of the vehicle 1000. The image output unit 1051 also causes the notification device 1002 to display the plurality of scheduled action candidate images in a manner corresponding to the priority order of each candidate.
 For example, as already described in the fifth embodiment, the image generation unit 1060 and the image output unit 1051 may display the plurality of scheduled action candidate images in the form of a parametric display in which the priority is visualized by a histogram or the like, or a non-parametric display in which the ranking is visualized by the vertical or horizontal arrangement. Specifically, an image indicating the rank itself may be added to each scheduled action candidate image, higher-ranked scheduled action candidate images may be displayed with higher visibility, or the scheduled action candidate images may be arranged in descending order of rank from a predetermined position on the automatic driving information screen 1103. Alternatively, images may be generated and displayed only for scheduled action candidates of a predetermined rank or higher. According to this modification, the occupant can be assisted in selecting a suitable candidate from among the plurality of scheduled action candidates. Moreover, by determining the priority order in accordance with the driver model, a priority order matching the driving pattern of the person the driver model represents (for example, the driver himself or herself, a fellow passenger, or a standard or exemplary driver model) can be presented.
 Although not illustrated, as another example, suppose that the preceding vehicle has decelerated and the inter-vehicle distance has shortened, but because vehicles faster than the host vehicle are continuously approaching from behind in the overtaking lane, the automatic driving control device 1030 has decided to execute the current action "decelerate", and the driving support device 1040 is presenting that this current action is being executed. At this time, the driving support device 1040 may determine, based on the detection information from the detection unit 1020, that "the benefit would be greater if no vehicle were approaching from behind in the overtaking lane". The driving support device 1040 may then recommend and present to the driver, as a scheduled action candidate whose travel-control instruction can be issued at an appropriate timing after the current action "decelerate", "change lanes to the overtaking lane" as an option whose instruction can be reserved.
 (Embodiment 8)
 First, an outline will be described. During automatic driving of a vehicle, even when the planned action of the vehicle at a future time (hereinafter referred to as a "scheduled action") is presented to the driver, the driver may be unable to grasp what actions can be instructed as actions to be executed by the vehicle before that scheduled action. As a result, the driver may be made to feel anxious.
 Therefore, in the eighth embodiment, in addition to the scheduled action planned by the automatic driving controller, candidates for actions to be executed by the vehicle before the scheduled action are presented to the driver. The candidates presented to the driver in the eighth embodiment are candidates for actions to be executed by the vehicle immediately (hereinafter referred to as "current action candidates"). Specifically, the automatic driving control device 1030 determines the scheduled action, and the driving support device 1040 determines the current action candidates. The driving support device 1040 then causes the notification device 1002 in the vehicle 1000 to display both the scheduled action and the current action candidates.
 In the following, content already described in the preceding embodiments is omitted as appropriate. The configurations or operations described in this embodiment can be combined with, or replaced by, the configurations or operations described in other embodiments or modifications without departing from the spirit of the present invention.
 The eighth embodiment can be said to replace the display of the current action in the sixth embodiment with the display of the scheduled action, and the functional blocks of the driving support device 1040 of the eighth embodiment are the same as in the sixth embodiment. That is, the control unit 1041 includes the image generation unit 1060, the candidate determination unit 1061, the determination unit 1062, and the instruction unit 1063, and the storage unit 1042 includes the statistical information storage unit 1070 and the determination criterion holding unit 1071.
 The statistical information stored in the statistical information storage unit 1070 and the determination criteria held in the determination criterion holding unit 1071 are also the same as in the sixth embodiment. That is, the statistical information storage unit 1070 stores statistical information for determining current action candidates, and the determination criterion holding unit 1071 holds determination criteria for actions that can be executed immediately according to the environmental parameters. As described above, the statistical information may be accumulated in a device outside the vehicle 1000, and the driving support device 1040 may access the remote statistical information via the communication IF 1056 and the wireless device 1008.
 The behavior information input unit 1054 acquires from the automatic driving control device 1030 behavior information (the "scheduled behavior information" of the fifth embodiment) indicating the scheduled action, which is an action that the automatic driving control device 1030 plans to cause the vehicle 1000 to execute at a future time. The behavior information input unit 1054 further acquires from the automatic driving control device 1030 remaining time information indicating the time from the current time until the scheduled action is executed. As shown in FIG. 35, the remaining time information is included in the data set of the behavior information acquired from the automatic driving control device 1030.
 The detection information input unit 1052 acquires from the detection unit 1020 detection information indicating the detection results of the surrounding conditions and the traveling state of the vehicle 1000. The candidate determination unit 1061 and the determination unit 1062 of the driving support device 1040 determine, based on the detection information, an action that the vehicle 1000 can be caused to execute before the scheduled action indicated by the behavior information. In the eighth embodiment, an action that the vehicle 1000 can currently be caused to execute immediately is determined as the current action candidate. As a modification, an action that is executed before the execution time of the scheduled action indicated by the behavior information but later than the current time, for example an action executed after the current action being executed by the vehicle 1000 ends, may be determined as the current action candidate.
 Specifically, as in the sixth embodiment, the candidate determination unit 1061 determines, as provisional current action candidates, one or more actions that, among the plurality of types of actions defined in the statistical information of the statistical information storage unit 1070, are associated with environmental parameter values approximating the detection information. As in the sixth embodiment, the determination unit 1062 also refers to the detection information output from the detection unit 1020 and the determination criteria in the determination criterion holding unit 1071, and determines whether each of the provisional current action candidates determined by the candidate determination unit 1061 can currently (immediately) be executed by the vehicle. The candidates that the vehicle can currently execute are then determined as the final current action candidates to be presented to the driver.
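The two-stage determination just described (provisional candidates looked up from the statistical information, then filtered by the immediate-executability criterion) could be sketched as follows. The distance measure, the threshold, and the `is_executable` predicate are illustrative assumptions standing in for the statistical matching of the candidate determination unit 1061 and the criterion check of the determination unit 1062.

```python
def determine_current_action_candidates(detected_params, statistics,
                                        is_executable, max_distance=5.0):
    """Two-stage current-action-candidate determination (sixth-embodiment style).

    Stage 1 (candidate determination unit): pick provisional candidates whose
    recorded environmental parameters approximate `detected_params`.
    Stage 2 (determination unit): keep only candidates the vehicle can
    execute right now according to the criterion `is_executable`.
    """
    provisional = []
    for params, action in statistics:
        # Sum of absolute differences over the detected parameters (assumed metric).
        distance = sum(abs(params[k] - detected_params[k]) for k in detected_params)
        if distance <= max_distance and action not in provisional:
            provisional.append(action)
    return [a for a in provisional if is_executable(a, detected_params)]
```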
 The image generation unit 1060 generates a scheduled action image representing the scheduled action indicated by the behavior information, and generates current action candidate images representing the final current action candidates determined by the candidate determination unit 1061 and the determination unit 1062. The image output unit 1051 outputs the scheduled action image and the current action candidate images to the notification device 1002 so that they are displayed within a fixed field of view of the driver of the vehicle 1000.
 The image generation unit 1060 further generates a remaining time image representing the time until the scheduled action is executed, which is updated by the remaining time information input from the automatic driving control device 1030. The image output unit 1051 further outputs the remaining time image to the notification device 1002, and causes the scheduled action image with the remaining time image added, together with the current action candidate images, to be displayed within the fixed field of view of the driver of the vehicle 1000. The notification device 1002 displays an automatic driving information screen including these images.
 FIGS. 57A to 57E show examples of the automatic driving information screen. In the automatic driving information screens 1103 of FIGS. 57A to 57E, the scheduled action image 1106 and the remaining time image 1108 are arranged in the inner circle, and the current action candidate images 1110 are arranged in the outer circle. The entire inner circle is the remaining time display area, and the remaining time image 1108 is updated so that its shaded range expands as the remaining time until execution of the scheduled action decreases. As for the current action candidate images 1110, for example, an upward triangle indicates acceleration, a downward triangle indicates deceleration, a leftward triangle indicates a lane change to the left, and a rightward triangle indicates a lane change to the right. For example, the automatic driving information screen 1103 of FIG. 57B presents two current action candidates, "lane change to the left" and "deceleration".
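The remaining-time presentation (a shaded range that grows as the remaining time shrinks, and a segmented indicator whose segments go dark, cf. time indicator 1109 in FIG. 56) might be computed as in this sketch. The linear mapping and the five-segment count are assumptions; the patent only describes the visual behavior.

```python
import math

def shaded_fraction(total_sec, remaining_sec):
    """Fraction of the remaining-time display area to shade: the shaded
    range grows as the remaining time until the scheduled action shrinks."""
    if total_sec <= 0:
        return 1.0
    elapsed = max(0.0, min(total_sec, total_sec - remaining_sec))
    return elapsed / total_sec

def lit_segments(total_sec, remaining_sec, segments=5):
    """Segments still lit on a discrete time indicator: segments switch
    from lit to unlit as the elapsed time approaches the cancellation time."""
    if remaining_sec <= 0:
        return 0
    return math.ceil(remaining_sec / total_sec * segments)
```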
 As shown in FIGS. 57A to 57D, even while the remaining time until execution of the scheduled action decreases, the current action candidate images 1110 are updated at any time according to changes in the vehicle's surrounding conditions or traveling state. The driver may press the cross-shaped button provided on the input device 1004 to instruct execution of the current action candidate represented by a current action candidate image 1110. For example, while a current action candidate image 1110 showing a downward triangle is displayed, the driver may instruct execution of the deceleration indicated by that image by selecting the down button of the cross-shaped button.
 FIG. 57E shows the result of the driver selecting one of the current action candidates (current action candidate images 1110) before the acceleration indicated by the scheduled action image 1106 of FIG. 57D is executed. For example, when the driver selects the current action candidate "decelerate" on the automatic driving information screen 1103 of FIG. 57D, the current action of the vehicle switches to deceleration. At the same time, on the automatic driving information screen 1103 of FIG. 57E, the scheduled action of the vehicle determined by the automatic driving control device 1030 (scheduled action image 1106) and the current action candidates of the vehicle determined by the driving support device 1040 (current action candidate images 1110) have changed.
 The processing sequence of the vehicle 1000 in the eighth embodiment is the same as the processing sequence of FIG. 46 described in the sixth embodiment, and therefore its description is omitted. However, the acquisition of behavior information indicating the current action (P23, P31) and the generation and output of the current action image (P24, P32) are replaced, respectively, by the acquisition of behavior information indicating the scheduled action and the remaining time, and by the generation and output of the scheduled action image.
 FIG. 58 is a flowchart showing an example of processing by the driving support device 1040. The processing of S190 to S199 in the figure, relating to the generation and display of the scheduled action image, is the same as the processing of S100, S106 to S110, and S112 to S114 of FIG. 41 described in the fifth embodiment, and therefore its description is omitted. When the detection information input unit 1052 acquires the detection information output from the detection unit 1020 (Y in S200), the candidate determination unit 1061 determines one or more current action candidates based on the detection information, the statistical information accumulated in the statistical information storage unit 1070, and the determination criteria held in the determination criterion holding unit 1071 (S201). The image generation unit 1060 determines whether the current action candidates determined in S201 match the current action candidates previously stored in the storage unit 1042.
 When the current action candidates determined in S201 do not match the current action candidates previously stored in the storage unit 1042, that is, when the current action candidates have been updated (Y in S202), the image generation unit 1060 generates current action candidate images representing the current action candidates (S203). The image output unit 1051 outputs the current action candidate images to the notification device 1002 for display (S204). The image generation unit 1060 causes the storage unit 1042 to store information on the current action candidates for which the current action candidate images were generated (S205). If the detection information has not been acquired (N in S200), S201 to S205 are skipped. If the current action candidates have not been updated (N in S202), S203 to S205 are skipped. When a predetermined end condition is satisfied (Y in S206), the flow of this figure ends; when the end condition is not satisfied (N in S206), the process returns to S190.
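One pass of the S200–S206 loop (recompute candidates from fresh detection information, redraw only when the candidate set changed) can be sketched as a single function. The `determine` and `render` callbacks are hypothetical stand-ins for the candidate determination unit 1061 and the image generation/output units; the patent specifies the control flow, not this API.

```python
def display_update_step(detection, previous_candidates, determine, render):
    """One pass of the S200-S206 loop: recompute candidates from fresh
    detection info and redraw only when the candidate set changed (S202).

    Returns the candidate list to remember for the next pass (S205).
    """
    if detection is None:                  # N in S200: no new detection info
        return previous_candidates
    candidates = determine(detection)      # S201: determine current action candidates
    if candidates == previous_candidates:  # N in S202: candidates unchanged
        return previous_candidates
    render(candidates)                     # S203/S204: generate and output images
    return candidates                      # S205: store for the next comparison
```

Keeping the step side-effect-free apart from `render` makes the change-detection behavior of S202 easy to exercise in isolation.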
 The processing of the driving support device 1040 relating to the driver's selection of a current action candidate is the same as in the sixth embodiment, and the processing flow is the same as that of FIG. 48. The length of time during which selection of a current action candidate is accepted may be a predetermined fixed value, or a value less than the remaining time until execution of the scheduled action determined by the automatic driving control device 1030 (for example, the remaining time minus five seconds).
 As described above, the driving support device 1040 of the eighth embodiment notifies a vehicle occupant (such as the driver) of the scheduled action planned in automatic driving, and also proposes current action candidates that can be executed immediately. By presenting the driver with options for the vehicle's immediate actions in this way, automatic driving that better reflects the driver's intention and that matches the driver's preferences can be realized. Furthermore, since the current action candidates selectable by the driver are those that can be executed in the vehicle's current surrounding conditions or traveling state, the driver can issue instructions to change the automatic driving with peace of mind.
 The modifications already described for the sixth embodiment can also be applied to the eighth embodiment, with the same effects. For example, the statistical information stored in the statistical information storage unit 1070 may be information (a driver model) reflecting the preferences or driving patterns of a vehicle occupant (typically the driver). The candidate determination unit 1061 may refer to the statistical information storage unit 1070 and determine a plurality of current action candidates associated with environmental parameter values whose difference from the currently detected environmental parameter values is within a predetermined range, and may at the same time determine the priority order of each of the plurality of current action candidates. The image generation unit 1060 may generate a plurality of current action candidate images corresponding to the plurality of current action candidates determined by the candidate determination unit 1061, that is, images representing the content of each candidate. The image output unit 1051 may output the scheduled action image and the plurality of current action candidate images to the notification device 1002 so that they are displayed within a fixed field of view of the driver of the vehicle 1000, and may cause the notification device 1002 to display the plurality of current action candidate images in a manner corresponding to the priority order of each candidate.
 Although not illustrated, another example of presenting a scheduled action (which can also be called a travel control plan for the future) and a current action candidate (which can also be called a situation-adaptive recommendation for the present) will be described. Here, based on the absence of a merging vehicle while the host vehicle travels in the leftmost lane approaching a merge section, the automatic driving control device 1030 decides to execute the scheduled action "keep lane" after the current driving action, and the driving support device 1040 presents that "keep lane" is scheduled to be executed. At this time, the driving support device 1040 may determine that if a merging vehicle appears before the merge section is reached, a sudden operation will be required and the disadvantage will be large. The driving support device 1040 may then recommend "change lanes from the leftmost lane to the right lane" as a current action candidate for which a travel control instruction can be issued right now, before the "keep lane" action that the vehicle 1000 is scheduled to execute.
 (Embodiment 9)
 First, an outline will be described. During automatic driving of a vehicle, even when the vehicle's planned action at a future time point (hereinafter referred to as a "scheduled action") is presented to the driver, the driver may not be able to grasp what actions can be instructed in place of that scheduled action. As a result, the driver may feel uneasy.
 Therefore, in the ninth embodiment, in addition to the scheduled action planned by the automatic driving controller, scheduled action candidates, which are candidates for actions the vehicle may be caused to execute at a future time point in place of the scheduled action, are presented to the driver. Specifically, the automatic driving control device 1030 determines the scheduled action, and the driving support device 1040 determines the scheduled action candidates. The driving support device 1040 then causes the notification device 1002 in the vehicle 1000 to display both the scheduled action and the scheduled action candidates. A scheduled action candidate is an action different from the scheduled action determined by the automatic driving controller, and can be said to be an action that may become the action the vehicle is scheduled to execute.
 In the following, content already described in the preceding embodiments is omitted as appropriate. The configurations and operations described in this embodiment can be combined with, or replaced by, the configurations and operations described in other embodiments or modifications without departing from the spirit of the present invention.
 The ninth embodiment can be regarded as the seventh embodiment with the display of the current action replaced by the display of the scheduled action, and the functional blocks of the driving support device 1040 of the ninth embodiment are the same as those of the seventh embodiment. That is, the control unit 1041 includes the image generation unit 1060, the candidate determination unit 1061, the determination unit 1062, and the instruction unit 1063, and the storage unit 1042 includes the statistical information accumulation unit 1070 and the determination criterion holding unit 1071.
 The statistical information accumulated in the statistical information accumulation unit 1070 and the determination criteria held in the determination criterion holding unit 1071 are also the same as in the seventh embodiment. That is, the statistical information accumulation unit 1070 accumulates statistical information for determining scheduled action candidates, and the determination criterion holding unit 1071 holds criteria, depending on the environmental parameters, for determining which actions can be executed immediately. As already described, the statistical information may be accumulated in a device outside the vehicle 1000, and the driving support device 1040 may access the remote statistical information via the communication IF 1056 and the wireless device 1008.
 The behavior information input unit 1054 acquires, from the automatic driving control device 1030, behavior information ("scheduled behavior information" in the fifth embodiment) indicating the scheduled action, which is the action the automatic driving control device 1030 plans to cause the vehicle 1000 to execute at a future time point. The behavior information input unit 1054 further acquires, from the automatic driving control device 1030, remaining time information indicating the time from the current time until the scheduled action is executed. As shown in FIG. 35, the remaining time information is included in the behavior information data set acquired from the automatic driving control device 1030.
 The detection information input unit 1052 acquires, from the detection unit 1020, detection information indicating the detection results of the surrounding situation and running state of the vehicle 1000. Based on the detection information output from the detection unit 1020, the candidate determination unit 1061 determines one or more scheduled action candidates, which are actions different from the scheduled action indicated by the behavior information and which may become actions the vehicle 1000 is scheduled to execute. A scheduled action candidate can also be said to be an action that the vehicle 1000 can be caused to execute in place of the scheduled action indicated by the behavior information. Specifically, as in the seventh embodiment, the candidate determination unit 1061 extracts, as scheduled action candidates, one or more of the actions defined in the statistical information that are associated with environmental parameter values approximating the detection information.
 If the one or more scheduled action candidates extracted from the statistical information include a candidate identical to the scheduled action indicated by the behavior information, the candidate determination unit 1061 excludes that candidate from the candidates to be presented to the driver. In other words, among the scheduled action candidates extracted from the statistical information, the candidate determination unit 1061 determines those that differ from the scheduled action indicated by the behavior information as the candidates to be presented to the driver. This prevents a scheduled action candidate identical to the scheduled action from being presented to the driver.
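 The extraction-and-exclusion logic described above can be illustrated with the following minimal sketch. The record layout (a parameter dictionary paired with an action label), the Euclidean distance measure, and the threshold value are assumptions for illustration and are not specified by this disclosure.

```python
def determine_candidates(statistics, detected_params, scheduled_action,
                         threshold=1.0):
    """Return scheduled-action candidates whose recorded environmental
    parameter values approximate the currently detected values, excluding
    any candidate identical to the scheduled action (illustrative only)."""
    candidates = []
    for params, action in statistics:
        # Euclidean distance between recorded and detected parameter values.
        dist = sum((params[k] - detected_params[k]) ** 2
                   for k in detected_params) ** 0.5
        if dist <= threshold and action != scheduled_action:
            if action not in candidates:
                candidates.append(action)
    return candidates
```

 In this sketch a candidate qualifies when every compared environmental parameter is close to the detected value in aggregate; the actual similarity measure used by the candidate determination unit 1061 is not limited to this.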
 The image generation unit 1060 generates a scheduled action image representing the scheduled action indicated by the behavior information, and one or more scheduled action candidate images representing the one or more scheduled action candidates determined by the candidate determination unit 1061 as candidates to be presented to the driver. The image output unit 1051 outputs the scheduled action image and the scheduled action candidate images to the notification device 1002 so that they are displayed within a fixed field of view of the driver of the vehicle. The notification device 1002 displays an automatic driving information screen including the scheduled action image and the scheduled action candidate images output from the driving support device 1040.
 The image generation unit 1060 further generates a first remaining time image representing the time until the scheduled action is executed, updated according to the remaining time information input from the automatic driving control device 1030. The image output unit 1051 further outputs the first remaining time image to the notification device 1002 in association with the scheduled action image, thereby causing the scheduled action image with the first remaining time image added to be displayed on the automatic driving information screen.
 When the candidate determination unit 1061 determines a scheduled action candidate, it further passes the remaining time information associated with that scheduled action candidate in the statistical information to the image generation unit 1060. When generating the scheduled action candidate image representing the scheduled action candidate, the image generation unit 1060 further generates a second remaining time image representing the remaining time indicated by the remaining time information. The image output unit 1051 further outputs the second remaining time image to the notification device 1002 in association with the scheduled action candidate image, thereby causing the scheduled action candidate image with the second remaining time image added to be displayed on the automatic driving information screen.
 FIG. 59 shows an example of the automatic driving information screen. On the automatic driving information screen 1103 in the figure, one scheduled action image 1106 representing one scheduled action and one scheduled action candidate image 1112 representing one scheduled action candidate are displayed. When a plurality of scheduled actions are determined, a plurality of scheduled action images 1106 may be displayed, and when a plurality of scheduled action candidates are determined, a plurality of scheduled action candidate images 1112 may be displayed. The first remaining time image 1108a is arranged near the scheduled action image 1106, and the second remaining time image 1108b is arranged near the scheduled action candidate image 1112; both indicate the length of the remaining time by the display mode of a plurality of time indicators 1109. Note that the scheduled action and the scheduled action candidate shown in FIG. 59 are both action plans combining a plurality of single actions, but each may instead be a single action.
 On the automatic driving information screen 1103, the display mode of the scheduled action image 1106 is set to differ from that of the scheduled action candidate image 1112. For example, the pattern, color, size, or the like may differ between the scheduled action image 1106 and the scheduled action candidate image 1112. This prevents the driver from confusing the scheduled action already planned by the automatic driving controller with the scheduled action candidate proposed as an alternative to it. In FIG. 59, the label "scheduled action" is added to the scheduled action image 1106, and the label "recommended action" is added to the scheduled action candidate image 1112. A shaded area is further added to the scheduled action candidate image 1112.
 To differentiate the display modes of the scheduled action image 1106 and the scheduled action candidate image 1112, the image generation unit 1060 may set a symbol or decoration for the scheduled action candidate image 1112 that differs from that of the scheduled action image 1106. The image output unit 1051 may further output to the notification device 1002 display control data specifying that the display mode of the scheduled action image 1106 differ from that of the scheduled action candidate image 1112.
 The automatic driving information screen 1103 also displays a selection frame 1116 for causing the driver to select a specific action from among the one or more scheduled actions (scheduled action images 1106) and the one or more scheduled action candidates (scheduled action candidate images 1112). Using the selection frame 1116, the driver inputs to the input device 1004 an operation selecting the desired scheduled action or scheduled action candidate.
 The processing when a scheduled action or a scheduled action candidate is selected is the same as in the seventh embodiment. That is, the determination unit 1062 determines whether the selected action can be executed immediately. According to the determination result of the determination unit 1062, the image generation unit 1060 generates an inquiry image (for example, FIGS. 55A and 55B) for causing the driver to designate "execute" or "reserve" for the selected action. When "execute" or "reserve" is designated while the inquiry image is displayed, the instruction unit 1063 transmits to the automatic driving control device 1030 a control command for causing the action selected by the driver to be executed at the timing corresponding to the designated "execute" or "reserve".
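 The execute/reserve branching described above can be sketched as follows. Polling a judgment criterion once per processing cycle until a reserved action becomes executable is an assumed behavior for illustration; the actual timing control of the instruction unit 1063 and the determination unit 1062 is not limited to this, and the command names are hypothetical.

```python
def dispatch_selected_action(action, executable_now, executable_at,
                             max_cycles=100):
    """Issue an 'execute' command immediately when the selected action is
    currently executable; otherwise treat it as reserved and re-evaluate
    the judgment criterion each cycle until it becomes executable or the
    reservation is abandoned (illustrative sketch)."""
    if executable_now:
        return {"command": "execute", "action": action, "cycle": 0}
    for cycle in range(1, max_cycles + 1):
        if executable_at(cycle):  # re-check the criterion this cycle
            return {"command": "execute", "action": action, "cycle": cycle}
    return {"command": "cancel", "action": action, "cycle": max_cycles}
```

 A usage example: a reserved lane change whose criterion becomes satisfied at cycle 3 would be dispatched at that cycle rather than immediately.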
 The processing sequence of the vehicle 1000 in the ninth embodiment is the same as the processing sequence of FIG. 46 described in the sixth embodiment, and its description is therefore omitted. However, the acquisition of behavior information indicating the current action (P23, P31) and the generation and output of the current action image (P24, P32) are replaced by the acquisition of behavior information indicating the scheduled action and the remaining time, and the generation and output of the scheduled action image, respectively. Likewise, the generation and output of current action candidate images (P26, P27, P34, P35) are replaced by the generation and output of scheduled action candidate images.
 FIG. 60 is a flowchart showing an example of the processing of the driving support device 1040. The processes of S210 to S218 in the figure, relating to the generation and display of the scheduled action image, are the same as the processes of S100, S106 to S110, and S112 to S114 in FIG. 41 described in the fifth embodiment, and their description is omitted. When the detection information input unit 1052 acquires the detection information output from the detection unit 1020 (Y in S219), the candidate determination unit 1061 determines one or more scheduled action candidates based on that detection information and the statistical information accumulated in the statistical information accumulation unit 1070. In doing so, when the candidate determination unit 1061 extracts from the statistical information an action identical to the scheduled action indicated by the behavior information, it excludes that action from the scheduled action candidates. That is, the candidate determination unit 1061 determines scheduled action candidates that differ from the scheduled action (S220).
 The image generation unit 1060 determines whether the scheduled action candidates determined in S220 match the scheduled action candidates previously stored in the storage unit 1042. When the scheduled action candidates determined in S220 do not match the scheduled action candidates previously stored in the storage unit 1042, that is, when the scheduled action candidates have been updated (Y in S221), the image generation unit 1060 generates scheduled action candidate images representing the scheduled action candidates (S222). The image generation unit 1060 further identifies the remaining time until execution associated in advance with each scheduled action candidate in the statistical information, and in S222 further generates a remaining time image representing that remaining time.
 The image output unit 1051 outputs the scheduled action candidate images and the remaining time images to the notification device 1002 to display the automatic driving information screen (S223). The image generation unit 1060 stores information indicating the scheduled action candidates for which images were generated in the storage unit 1042 (S224), and starts measuring the elapsed time from the output (display start) of the scheduled action candidate images (S225). When a predetermined end condition is satisfied (Y in S229), the flow of this figure ends; when the end condition is not satisfied (N in S229), the processing returns to S210.
 When no detection information is input (N in S219), or when the scheduled action candidates are not updated (N in S221), the image generation unit 1060 determines whether a predetermined time has elapsed since the start of the elapsed-time measurement. This predetermined time is the unit time at which the remaining time image should be updated, and may be, for example, the time allocated to one time indicator 1109. Upon detecting that the predetermined time has elapsed since the start of the elapsed-time measurement (Y in S226), the image generation unit 1060 updates the remaining time image (S227). The image output unit 1051 outputs the updated remaining time image to the notification device 1002 for display (S228). Thereby, for example, one time indicator 1109 in FIG. 59 is switched from the lit state to the unlit state. If the predetermined time has not yet elapsed since the start of the elapsed-time measurement (N in S226), S227 and S228 are skipped.
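 The remaining-time display update of S226 to S228 can be sketched as follows, assuming that each time indicator 1109 represents an equal share of the total remaining time and that one indicator is switched off per elapsed unit time. The indicator count and the equal division are illustrative assumptions.

```python
def lit_indicator_count(total_time, elapsed, num_indicators=5):
    """Return how many time indicators 1109 should remain lit after
    `elapsed` seconds, where each indicator stands for
    total_time / num_indicators seconds (assumed equal division)."""
    unit = total_time / num_indicators  # unit time per indicator
    lit = num_indicators - int(elapsed // unit)
    return max(lit, 0)  # never report a negative count
```

 For example, with a 10-second remaining time and five indicators, one indicator would be extinguished every 2 seconds.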
 The processing of the driving support device 1040 (the image generation unit 1060, the determination unit 1062, the instruction unit 1063, etc.) relating to the driver's selection of a scheduled action or scheduled action candidate is the same as in the seventh embodiment. For example, the processing shown in the flowchart of FIG. 54 and the user interfaces shown on the automatic driving information screens 1103 of FIGS. 55A, 55B, 56A, and 56B apply to the ninth embodiment as they are. However, the targets selectable by the driver include the scheduled action determined by the automatic driving control device 1030 in addition to the scheduled action candidates determined by the driving support device 1040.
 As described above, the driving support device 1040 according to the ninth embodiment notifies the vehicle occupant (such as the driver) of the scheduled action in automatic driving, and also proposes scheduled action candidates that can be executed in its place. Presenting the driver with options for the vehicle's action at a future time point in this way realizes automatic driving that better reflects the driver's intention and suits the driver's preferences, and also suppresses the driver's anxiety about automatic driving. Furthermore, as in the seventh embodiment, even an action that cannot be executed at present can be reserved for the future, making it even easier to realize automatic driving in accordance with the driver's preferences.
 The modifications already described in the seventh embodiment can also be applied to the ninth embodiment, with the same effects. For example, the statistical information accumulated in the statistical information accumulation unit 1070 may be information (a driver model) reflecting the preferences or driving pattern of a vehicle occupant (typically the driver). The candidate determination unit 1061 may refer to the statistical information accumulation unit 1070 and determine a plurality of scheduled action candidates associated with environmental parameter values whose difference from the currently detected environmental parameter values is within a predetermined range, and may also determine the priority order of the plurality of scheduled action candidates. The image generation unit 1060 may generate a plurality of scheduled action candidate images corresponding to the plurality of scheduled action candidates determined by the candidate determination unit 1061, that is, images representing the contents of each candidate. The image output unit 1051 may output the scheduled action image and the plurality of scheduled action candidate images to the notification device 1002 so that they are displayed within a fixed field of view of the driver of the vehicle 1000. The image output unit 1051 may also cause the notification device 1002 to display the plurality of scheduled action candidate images in a manner corresponding to the priority order of each candidate.
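 The priority ordering in this modification can be sketched as ranking the extracted candidates by how closely their recorded environmental parameter values match the currently detected values. The maximum-difference measure, the predetermined range, and the deduplication policy below are assumptions for illustration.

```python
def rank_candidates(statistics, detected_params, max_diff):
    """Return candidate actions whose recorded parameter values differ from
    the detected values by at most max_diff in every parameter, ordered so
    that the closest match has the highest priority (illustrative only)."""
    scored = []
    for params, action in statistics:
        diff = max(abs(params[k] - detected_params[k])
                   for k in detected_params)
        if diff <= max_diff:
            scored.append((diff, action))
    scored.sort(key=lambda pair: pair[0])  # smaller difference = higher rank
    seen, ranked = set(), []
    for _, action in scored:  # deduplicate, keeping the best rank per action
        if action not in seen:
            seen.add(action)
            ranked.append(action)
    return ranked
```

 The returned order could then drive the display mode (for example, vertical position or emphasis) of the candidate images on the notification device 1002.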
 Although not illustrated, another example of presenting a scheduled action (which can also be called a travel control plan for the future) and a scheduled action candidate (which can also be called a situation-adaptive recommendation for the future) will be described. Here, based on the fact that vehicles faster than the host vehicle are successively approaching from behind in the right lane while the host vehicle travels in the leftmost lane approaching a merge section, the automatic driving control device 1030 decides to execute the scheduled action "keep lane" after the current driving action, and the driving support device 1040 presents that "keep lane" is scheduled to be executed. At this time, the driving support device 1040 may determine that if a merging vehicle appears before the merge section is reached, a sudden operation will be required and the disadvantage will be large. The driving support device 1040 may then recommend "change lanes from the leftmost lane to the right lane" as a scheduled action candidate for which a travel control instruction can be issued at a suitable timing, before the "keep lane" action that the vehicle 1000 is scheduled to execute, presenting it as an option whose instruction can be reserved.
 (Embodiment 10)
 First, an outline will be described. During automatic driving of a vehicle, when the driver is asked to select an action to be executed immediately by the vehicle (hereinafter referred to as the "current action"), sufficient information to support that action selection may not be provided to the driver, making smooth action selection by the driver difficult. The driver may also feel uneasy.
 Therefore, in the tenth embodiment, information adapted to the individual driver is provided as information supporting the selection of the vehicle's current action. Specifically, the driving support device 1040 determines current action candidates from both a viewpoint based on the surrounding situation or running state of the vehicle 1000 and a viewpoint based on the individual driver, and causes the notification device 1002 in the vehicle 1000 to display them. This supports smooth action selection in accordance with the driver's intention, and helps the driver issue action selection or action change instructions with a sense of security.
 In the following, content already described in the preceding embodiments is omitted as appropriate. The configurations and operations described in this embodiment can be combined with, or replaced by, the configurations and operations described in other embodiments or modifications without departing from the spirit of the present invention.
 FIG. 61 is a block diagram showing the detailed configuration of the storage unit 1042 of the driving support device 1040. The determination criterion holding unit 1071 holds the same determination criteria as in the other embodiments. The statistical information accumulation unit 1070 includes a first statistical information accumulation unit 1072 that accumulates first statistical information and a second statistical information accumulation unit 1073 that accumulates second statistical information.
 Like the statistical information in the preceding embodiments, both the first statistical information and the second statistical information are statistical information indicating the relationship between the vehicle's surrounding situation and running state and the vehicle's actions. Specifically, as with the statistical information of the sixth embodiment (FIG. 43), each is information containing a plurality of records that associate the values of a plurality of types of environmental parameters indicating the vehicle's surrounding situation and running state with an action to be executed immediately by the vehicle (or an action actually executed immediately by the vehicle).
 However, the first statistical information and the second statistical information differ in statistical scope; specifically, the first statistical information has a wider statistical scope than the second statistical information. The first statistical information records the operation results and action results of a group of many people and a large number of vehicles in various environmental states. Of course, it may instead be the history of operation results and action results in various environmental states modeled as operation patterns and action patterns by a known statistical method. The second statistical information records the operation results or action results of the individual driver and the vehicle 1000 alone in the environmental states encountered so far. The history of the individual driver's operation results and of the action results of the vehicle 1000 alone may likewise be modeled as operation patterns and action patterns by a known statistical method.
 例えば、第1統計情報は、大人数の集団における操作履歴、言い換えれば、複数の車両の行動履歴が、環境パラメータ値とともに逐次記録されたものであってもよい。また、大人数の集団における環境パラメータ値と行動との平均的な組み合わせが記録されたものであってもよい。第1統計情報は、様々な周囲環境或いは走行状態に応じた、典型的な操作パターン、言い換えれば、車両の典型的な行動パターンを示す情報と言える。また第1統計情報は、大人数の集団における操作履歴、多数の車両の行動履歴に基づくため、環境パラメータ値および行動の網羅性が高い。 For example, the first statistical information may be information in which operation histories in a large group, in other words, action histories of a plurality of vehicles are sequentially recorded together with environmental parameter values. Alternatively, an average combination of environmental parameter values and behavior in a large group of people may be recorded. The first statistical information can be said to be information indicating a typical operation pattern according to various surrounding environments or running conditions, in other words, a typical behavior pattern of the vehicle. In addition, since the first statistical information is based on the operation history of a large group and the behavior histories of a large number of vehicles, the environmental parameter value and the behavior are comprehensive.
 その一方、第2統計情報は、運転者個人の操作履歴、言い換えれば、車両1000単独の行動履歴を蓄積したものであってもよい。また、運転者個人の操作に基づく環境パラメータ値と行動との組み合わせが逐次記録されたものであってもよい。第2統計情報は、運転者個人の嗜好或いは操作パターンを第1統計情報より強く反映した統計情報であると言える。また第2統計情報は、運転者個人の操作履歴、車両1000単独の行動履歴を蓄積したものであるため、環境パラメータ値および行動の網羅性が低い。 On the other hand, the second statistical information may be an accumulation of an individual driver's operation history, in other words, an action history of the vehicle 1000 alone. Further, a combination of environmental parameter values and actions based on individual driver operations may be sequentially recorded. It can be said that the second statistical information is statistical information that reflects the individual preference or operation pattern of the driver more strongly than the first statistical information. Further, since the second statistical information is obtained by accumulating the individual driver's operation history and the behavior history of the vehicle 1000 alone, the environmental parameter value and the behavior coverage are low.
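The record structure shared by the two kinds of statistical information can be sketched as follows. This is a minimal illustration only; the field names, the example parameter values, and the use of a Python dataclass are assumptions for exposition and are not part of the specification.

```python
from dataclasses import dataclass


@dataclass
class StatisticsRecord:
    # Values of multiple environmental parameter types, e.g. speed of a
    # preceding vehicle, inter-vehicle distance, rate of change of distance.
    env_params: tuple
    # Action to be immediately executed (or actually executed) by the vehicle.
    action: str


# First statistical information: wide scope (many drivers, many vehicles),
# hence high coverage of environmental parameter values and actions.
first_stats = [
    StatisticsRecord((0.2, 0.5, 0.1), "lane change right"),
    StatisticsRecord((0.8, 0.1, 0.4), "decelerate"),
]

# Second statistical information: narrow scope (one driver, vehicle 1000
# alone), hence lower coverage but stronger reflection of the driver.
second_stats = [
    StatisticsRecord((0.21, 0.48, 0.12), "lane change right"),
]
```

Both collections share one record shape; only the population over which the records were gathered differs, which is what the text above calls the difference in statistical scope.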
In the tenth embodiment, both the first statistical information and the second statistical information are stored locally in the vehicle 1000, but at least one of them may instead be accumulated in a device outside the vehicle, for example in a database on the cloud. For example, since its statistical scope is the individual driver or the vehicle 1000 alone, the second statistical information may be accumulated sequentially in the vehicle 1000. The first statistical information, whose scope spans a large group of people or a large number of vehicles, may instead be aggregated, statistically processed, and accumulated on a cloud server. The driving support device 1040 may access such remote statistical information via the communication IF 1056 and the wireless device 1008.
The action information input unit 1054 of the driving support device 1040 acquires, from the automatic driving control device 1030, action information indicating the current action that the automatic driving control device 1030 is causing the vehicle 1000 to execute. The detection information input unit 1052 of the driving support device 1040 acquires, from the detection unit 1020, detection information indicating the detection results for the surrounding conditions and traveling state of the vehicle 1000.
FIG. 62 is a block diagram showing the detailed configuration of the control unit 1041 of the driving support device 1040. The control unit 1041 includes a first determination unit 1080, a second determination unit 1082, an image generation unit 1060, a determination unit 1062, and an instruction unit 1063.
The first determination unit 1080 determines one or more first actions that the vehicle 1000 can be caused to execute, based on the detection information output from the detection unit 1020 and the first statistical information accumulated in the first statistical information accumulation unit 1072. The first determination unit 1080 includes a situation-adaptive determination unit 1081.
The situation-adaptive determination unit 1081 corresponds to the candidate determination unit 1061 of the sixth embodiment. The situation-adaptive determination unit 1081 determines, as the first actions, one or more action candidates that the vehicle 1000 can be caused to execute in place of the current action indicated by the action information (hereinafter referred to as "situation-adaptive current action candidates"). Specifically, as in the sixth embodiment, among the actions defined in the first statistical information, those associated with environmental parameter values approximating the detection information are determined as situation-adaptive current action candidates (corresponding to the current action candidates of the sixth embodiment). A situation-adaptive current action candidate can be regarded as a typical operation pattern or action pattern executed immediately under the current surrounding conditions or traveling state.
Note that if the situation-adaptive current action candidates extracted from the first statistical information include a candidate identical to the current action indicated by the action information, the situation-adaptive determination unit 1081 excludes that candidate from presentation to the driver. In other words, among the situation-adaptive current action candidates extracted from the first statistical information, the situation-adaptive determination unit 1081 determines only candidates that differ from the current action indicated by the action information as candidates to be presented to the driver. This prevents a candidate identical to the current action from being presented to the driver.
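The exclusion described above amounts to a simple filter over the extracted candidates. A minimal sketch, in which the function name and the plain-string representation of actions are assumptions:

```python
def exclude_current_action(candidates, current_action):
    """Drop any extracted candidate identical to the action the vehicle is
    already executing, so it is never re-proposed to the driver."""
    return [c for c in candidates if c != current_action]


# e.g. the vehicle is already keeping its lane:
remaining = exclude_current_action(
    ["lane keep", "lane change right", "decelerate"], "lane keep")
# remaining == ["lane change right", "decelerate"]
```

The same filter serves the personal-adaptive determination unit described later, which likewise removes the current action before ranking.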
As in the sixth embodiment, the determination unit 1062 refers to the detection information output from the detection unit 1020 and the determination criteria in the determination criteria holding unit 1071, and judges whether each of the situation-adaptive current action candidates determined by the situation-adaptive determination unit 1081 can be immediately executed by the vehicle. Among the one or more situation-adaptive current action candidates extracted from the first statistical information, the situation-adaptive determination unit 1081 determines those judged immediately executable by the determination unit 1062 as the candidates to be presented to the driver.
The second determination unit 1082 determines one or more second actions that the vehicle 1000 can be caused to execute, based on the detection information output from the detection unit 1020 and the second statistical information accumulated in the second statistical information accumulation unit 1073. In the tenth embodiment, the second determination unit 1082 determines, as the second actions, information indicating a priority for each of the one or more situation-adaptive current action candidates, the priority depending on the correlation between the second statistical information and the detection information. In other words, the one or more second actions determined by the second determination unit 1082 indicate the priority of each of the one or more first actions.
The second determination unit 1082 includes a personal-adaptive determination unit 1083 and a priority determination unit 1084. The personal-adaptive determination unit 1083 corresponds to the candidate determination unit 1061 of the sixth embodiment. The personal-adaptive determination unit 1083 determines one or more actions that the vehicle 1000 can be caused to execute in place of the current action indicated by the action information (hereinafter referred to as "personal-adaptive current action candidates"). Specifically, as in the sixth embodiment, among the actions defined in the second statistical information, those associated with environmental parameter values approximating the detection information are determined as personal-adaptive current action candidates (corresponding to the current action candidates of the sixth embodiment). A personal-adaptive current action candidate can be regarded as the individual driver's operation pattern, or equivalently the action pattern of the vehicle 1000 alone, under the current surrounding conditions or traveling state.
Note that if the personal-adaptive current action candidates extracted from the second statistical information include a candidate identical to the current action indicated by the action information, the personal-adaptive determination unit 1083 excludes that candidate from the ranking described below. In other words, among the personal-adaptive current action candidates extracted from the second statistical information, the personal-adaptive determination unit 1083 determines only candidates that differ from the current action indicated by the action information as targets for ranking.
As with the situation-adaptive current action candidates, the determination unit 1062 may refer to the detection information and the determination criteria and judge whether each of the personal-adaptive current action candidates determined by the personal-adaptive determination unit 1083 can be immediately executed. The personal-adaptive determination unit 1083 may narrow the one or more personal-adaptive current action candidates extracted from the second statistical information down to those judged immediately executable before ranking, excluding candidates that cannot be immediately executed.
The personal-adaptive determination unit 1083 ranks the one or more personal-adaptive current action candidates. The personal-adaptive determination unit 1083 may rank a personal-adaptive current action candidate higher the closer its associated environmental parameter values are to those indicated by the latest detection information. For example, in an n-dimensional vector space corresponding to n environmental parameters, actions associated with environmental parameter values lying within a predetermined range of the position indicated by the latest detection information (the current environment position) may be extracted as personal-adaptive current action candidates. Among the extracted candidates, those whose positions in the vector space are closer to the current environment position may then be ranked higher.
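The extraction-and-ranking step above can be sketched as a nearest-neighbor search in the environmental parameter space. This is an illustrative sketch only: the specification requires only a notion of proximity within a predetermined range, so the Euclidean metric, the function names, and the tuple-based record format are assumptions.

```python
import math


def rank_candidates(records, current_env, radius):
    """Extract actions whose environmental parameter vectors lie within
    `radius` of the current environment position, then rank them so that
    closer records come first (index 0 = rank 1, the highest).

    records: list of (env_params, action) pairs, e.g. drawn from the
    second statistical information."""
    def dist(p, q):
        # Euclidean distance in the n-dimensional parameter space
        # (an assumed choice of metric).
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

    nearby = [(dist(env, current_env), action)
              for env, action in records
              if dist(env, current_env) <= radius]
    nearby.sort(key=lambda pair: pair[0])

    # Keep only the best (closest) occurrence of each distinct action.
    ranking, seen = [], set()
    for _, action in nearby:
        if action not in seen:
            seen.add(action)
            ranking.append(action)
    return ranking
```

For example, with records at (0.0, 0.0) "decelerate", (1.0, 1.0) "lane change right", and (5.0, 5.0) "accelerate", a current environment position of (0.9, 0.9) and a radius of 3.0 yield the ranking ["lane change right", "decelerate"]; the accelerate record falls outside the predetermined range and is not extracted.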
Based on the ranking of the one or more personal-adaptive current action candidates determined by the personal-adaptive determination unit 1083, the priority determination unit 1084 determines the priority of the one or more situation-adaptive current action candidates that the situation-adaptive determination unit 1081 has determined to present to the driver. For example, a situation-adaptive current action candidate indicating the same action as a personal-adaptive current action candidate is given that personal-adaptive candidate's rank as a provisional rank. Then, according to the provisional ranks assigned to the one or more situation-adaptive current action candidates, a higher provisional rank yields a higher on-screen display priority, in other words, a higher priority as a suggestion or recommendation to the driver. A situation-adaptive current action candidate for which no personal-adaptive current action candidate indicates the same action, that is, one to which no provisional rank can be assigned, is given the lowest priority.
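The provisional-rank mapping just described can be sketched as follows. The function name and dictionary representation are assumptions; the logic follows the text: matched candidates inherit the personal-adaptive rank, unmatched candidates sink to the bottom.

```python
def assign_priorities(situation_candidates, personal_ranking):
    """Assign each situation-adaptive candidate a display priority
    (1 = highest) using the rank of the matching personal-adaptive
    candidate as a provisional rank; candidates with no matching
    personal-adaptive candidate are placed last, in original order."""
    provisional = {action: i + 1 for i, action in enumerate(personal_ranking)}
    matched = sorted((c for c in situation_candidates if c in provisional),
                     key=lambda c: provisional[c])
    unmatched = [c for c in situation_candidates if c not in provisional]
    ordered = matched + unmatched
    return {action: i + 1 for i, action in enumerate(ordered)}


# Situation-adaptive candidates from the first statistical information,
# personal-adaptive ranking from the second statistical information:
prio = assign_priorities(
    ["accelerate", "lane change right", "decelerate"],
    ["lane change right", "decelerate"])
# prio == {"lane change right": 1, "decelerate": 2, "accelerate": 3}
```

Here "accelerate" has no personal-adaptive counterpart, so it receives the lowest priority, matching the rule stated above.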
The image generation unit 1060 generates a first image representing the one or more first actions and a second image representing the one or more second actions. Specifically, the image generation unit 1060 generates, as the first image, current action candidate images each representing one of the one or more situation-adaptive current action candidates. It also generates, as the second image, priority images representing the priorities that the priority determination unit 1084 has set for the one or more situation-adaptive current action candidates. The image generation unit 1060 may generate priority images that express the priority assigned to each situation-adaptive current action candidate as a histogram or as a number. The histogram may be an image of a graphic object whose display size corresponds to the degree of priority.
The image output unit 1051 outputs the current action candidate images and the priority images to the notification device 1002 so that they are displayed within a fixed field of view of the driver of the vehicle. The notification device 1002 displays an automatic driving information screen including the current action candidate images and the priority images.
FIGS. 63A and 63B show examples of the automatic driving information screen. On the automatic driving information screen 1103 of FIG. 63A, the priority of each situation-adaptive current action candidate is indicated by a number. In FIG. 63A, the outer circle serves as the action candidate display area 1128, in which four current action candidate images 1110 representing four situation-adaptive current action candidates are displayed. The inner circle serves as the priority display area 1130, in which four priority images 1132 representing the priorities of the four situation-adaptive current action candidates are displayed. In FIG. 63A, the priority image 1132 indicating the highest priority "1" is attached to the current action candidate image 1110 indicating a lane change to the right, showing that a lane change to the right is the current action most recommended for the individual driver.
On the automatic driving information screen 1103 of FIG. 63B, the degree of priority of each situation-adaptive current action candidate is indicated by a histogram. In FIG. 63B, four current action candidate images 1110 representing four situation-adaptive current action candidates are displayed around the periphery of the circle. Four priority images 1132 representing the priorities of the four situation-adaptive current action candidates are also displayed, a larger priority image 1132 being attached to a situation-adaptive current action candidate of higher priority. In FIG. 63B, the largest priority image 1132, indicating the highest priority, is attached to the current action candidate image 1110 indicating a lane change to the right, showing that a lane change to the right is the current action most recommended for the individual driver.
While the automatic driving information screen 1103 of FIG. 63A or 63B is displayed, the driver may press the cross-shaped button provided on the input device 1004 to instruct execution of the current action candidate represented by a current action candidate image 1110. For example, while a current action candidate image 1110 showing a downward triangle is displayed, the driver may select the down button of the cross-shaped button to instruct execution of the deceleration indicated by that image. The processing of the instruction unit 1063 of the driving support device 1040 is the same as in the sixth embodiment.
FIG. 64 is a sequence diagram showing an example of processing related to the HMI control of the vehicle 1000. P41 to P44 in the figure are the same as P21 to P23 and P25 in the sequence diagram of FIG. 46 described in the sixth embodiment. Although not shown in FIG. 64, as indicated by P24 in FIG. 46, the driving support device 1040 may additionally cause the notification device 1002 to display a current action image based on the action information acquired from the automatic driving control device 1030.
The driving support device 1040 determines one or more situation-adaptive current action candidates according to the correlation between the detection information and the first statistical information, and determines the priority of each situation-adaptive current action candidate according to the correlation between the detection information and the second statistical information (P45). The driving support device 1040 generates current action candidate images representing the one or more situation-adaptive current action candidates and priority images representing the priority of each candidate, and outputs them to the notification device 1002 for display (P46). The subsequent processing of P47 to P49 is the same as P28 to P30 in the sequence diagram of FIG. 46 described in the sixth embodiment.
FIG. 65 is a flowchart showing an example of the processing of the driving support device 1040. When the action information input unit 1054 acquires the action information output from the automatic driving control device 1030 (Y in S240), the control unit 1041 judges whether the current action indicated by the action information matches the current action previously stored in the storage unit 1042. If they do not match, that is, if the current action has been updated (Y in S241), the control unit 1041 stores the current action indicated by the action information in the storage unit 1042 (S242). If no action information has been acquired (N in S240), S241 and S242 are skipped. If the current action has not been updated (N in S241), S242 is skipped.
When the detection information input unit 1052 acquires the detection information output from the detection unit 1020 (Y in S243), the situation-adaptive determination unit 1081 determines one or more situation-adaptive current action candidates based on the detection information and the first statistical information (S244). The situation-adaptive determination unit 1081 then excludes from those candidates any that match the current action stored in the storage unit 1042 (S245). The personal-adaptive determination unit 1083 determines one or more personal-adaptive current action candidates based on the detection information and the second statistical information, and determines the rank of each candidate. The personal-adaptive determination unit 1083 may likewise exclude from those candidates any that match the current action stored in the storage unit 1042. The priority determination unit 1084 determines the on-screen display priority of the one or more situation-adaptive current action candidates based on the ranking of the one or more personal-adaptive current action candidates (S246). The priority determination unit 1084 stores information indicating the situation-adaptive current action candidates and their priorities in the storage unit 1042.
When the situation-adaptive current action candidates have been updated (Y in S247), the image generation unit 1060 generates current action candidate images representing the updated situation-adaptive current action candidates (S248). The image output unit 1051 outputs the current action candidate images to the notification device 1002 for display (S249). The image generation unit 1060 generates priority images representing the priority of each of the one or more situation-adaptive current action candidates (S250), and the image output unit 1051 outputs the priority images to the notification device 1002 for display (S251). If the situation-adaptive current action candidates have not been updated (N in S247) but the priorities have been updated (Y in S252), the processing proceeds to S250; if the priorities have not been updated either (N in S252), it proceeds to S253.
If a predetermined end condition is satisfied (Y in S253), the flow of this figure ends; if not (N in S253), the processing returns to S240. If no detection information has been input (N in S243), S244 to S251 are skipped. In this way, each time detection information is input, the driving support device 1040 of the tenth embodiment determines the situation-adaptive current action candidates and determines the priority of the suggestions according to the driver's preferences. When at least one of the situation-adaptive current action candidates and the priorities is updated, the display content of the automatic driving information screen 1103 is updated accordingly.
The processing of the driving support device 1040 related to the driver's selection of a current action candidate is the same as in the sixth embodiment, and the processing flow is the same as that of FIG. 48. The length of the waiting time during which selection of a current action candidate is accepted may be a predetermined fixed value. If no action has been selected by the time the waiting time elapses after the current action candidate images 1110 are displayed, the instruction unit 1063 of the driving support device 1040 may cause the command output unit 1055 to transmit to the automatic driving control device 1030 a control command instructing immediate execution of the current action candidate (situation-adaptive current action candidate) with the highest priority.
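The timeout fallback just described reduces to a simple default-selection rule. A minimal sketch, assuming priorities are represented as a mapping from candidate action to priority number (1 = highest), with names chosen for illustration:

```python
def select_action(driver_choice, priorities):
    """Return the driver's selected candidate, or, if the waiting time
    elapsed with no input (driver_choice is None), the situation-adaptive
    current action candidate with the highest priority (priority 1)."""
    if driver_choice is not None:
        return driver_choice
    # No selection before the waiting time expired: fall back to the
    # candidate whose priority number is smallest (highest priority).
    return min(priorities, key=priorities.get)


# Driver made no selection within the waiting time:
chosen = select_action(None, {"decelerate": 2, "lane change right": 1})
# chosen == "lane change right"
```

The chosen action would then be sent to the automatic driving control device 1030 as a control command via the command output unit 1055, as stated above.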
 以上説明したように、実施の形態10の運転支援装置1040は、1つ以上の状況適応現在行動候補に対して、運転者個人または車両1000単体に適応した優先度を付与した自動運転情報画面1103を運転者へ提示する。これにより、車両の現在行動の選択を支援するために有用な情報を運転者へ提示できる。実施の形態10では、自動運転情報画面1103に表示させる現在行動の候補を、様々な環境パラメータ値と様々な行動が網羅的に記録された第1統計情報から抽出することで、刻々と変化する車両の環境パラメータ値により適合した現在行動候補を提示しやすくなる。また、各候補の優先順位を、運転者の嗜好を強く反映する第2統計情報をもとに決定することで、運転手個人の嗜好或いは操作パターンに応じた優先度を提示することができる。 As described above, the driving support apparatus 1040 according to the tenth embodiment has an automatic driving information screen 1103 in which one or more situation-adapted current action candidates are assigned priorities adapted to the individual driver or the vehicle 1000 alone. To the driver. Thereby, useful information can be presented to the driver to assist in selecting the current behavior of the vehicle. In the tenth embodiment, the current action candidates to be displayed on the automatic driving information screen 1103 are extracted from the first statistical information in which various environmental parameter values and various actions are comprehensively recorded, thereby changing every moment. It becomes easy to present the current action candidate that is more suitable for the environmental parameter value of the vehicle. Moreover, the priority according to a driver | operator's individual preference or operation pattern can be shown by determining the priority of each candidate based on the 2nd statistical information which strongly reflects a driver | operator's preference.
 変形例を説明する。実施の形態10では、状況適応型決定部1081が決定した状況適応現在行動候補と、優先度決定部1084が決定した優先度を自動運転情報画面1103に表示させた。変形例として、状況適応型決定部1081が決定した状況適応現在行動候補と、個人適応型決定部1083が決定した個人適応現在行動候補の両方を並列して自動運転情報画面1103に表示させてもよい。 A modification will be described. In the tenth embodiment, the situation adaptive current action candidate determined by the situation adaptive determination unit 1081 and the priority determined by the priority determination unit 1084 are displayed on the automatic driving information screen 1103. As a modification, both the situation adaptive current action candidate determined by the situation adaptive determination unit 1081 and the personal adaptation current action candidate determined by the personal adaptive determination unit 1083 may be displayed on the automatic driving information screen 1103 in parallel. Good.
 本変形例の状況適応型決定部1081は、自動運転制御装置1030から取得された行動情報が示す現在行動とは異なる状況適応現在行動候補を決定することが望ましい。同様に個人適応型決定部1083も、行動情報が示す現在行動とは異なる個人適応現在行動候補を決定することが望ましい。これにより、車両の現在行動と同じ候補を運転者へ提示することを防止できる。 It is desirable that the situation adaptive determination unit 1081 of the present modification determines a situation adaptive current action candidate different from the current action indicated by the action information acquired from the automatic driving control apparatus 1030. Similarly, it is desirable that the personal adaptive determination unit 1083 also determines a personal adaptive current behavior candidate that is different from the current behavior indicated by the behavior information. Thereby, it is possible to prevent the same candidate as the current behavior of the vehicle from being presented to the driver.
 本変形例の自動運転情報画面1103では、状況適応現在行動候補の表示態様と、個人適応現在行動候補の表示態様とが異なることが望ましい。例えば、運転者が状況適応現在行動候補と個人適応現在行動候補を容易に区別できるように、表示位置、模様、色彩、サイズ等を異ならせてもよい。画像生成部1060は、状況適応現在行動候補の画像の態様とは異なる態様で個人適応現在行動候補の画像を生成してもよい。画像出力部1051は、状況適応現在行動候補の画像と個人適応現在行動候補の画像を報知装置1002へ送信する際に、両者の画像を異なる態様で表示させるように指示する表示制御データをさらに送信してもよい。 In the automatic driving information screen 1103 of this modification, it is desirable that the display mode of the situation adaptation current action candidate and the display mode of the individual adaptation current action candidate are different. For example, the display position, the pattern, the color, the size, and the like may be different so that the driver can easily distinguish the situation adaptation current action candidate and the individual adaptation current action candidate. The image generation unit 1060 may generate an image of the personal adaptation current action candidate in a manner different from that of the situation adaptation current action candidate image. When transmitting the situation adaptation current action candidate image and the personal adaptation current action candidate image to the notification device 1002, the image output unit 1051 further transmits display control data instructing to display both images in different modes. May be.
 また、本変形例の状況適応型決定部1081は、1つ以上の状況適応現在行動候補の優先順位を、検出情報と第1統計情報との相関性に基づき決定してもよい。個人適応型決定部1083も、1つ以上の個人適応現在行動候補の優先順位を、実施の形態と同様に検出情報と第2統計情報との相関性に基づき決定してもよい。画像生成部1060および画像出力部1051は、1つ以上の状況適応現在行動候補を表わす画像と各候補の優先順位を表わす画像、および、1つ以上の個人適応現在行動候補を表わす画像と各候補の優先順位を表わす画像を自動運転情報画面1103に表示させてもよい。また、優先順位毎に、1つ以上の状況適応現在行動候補と1つ以上の個人適応現在行動候補を比較容易な態様で並べて表示させてもよい。 In addition, the situation-adaptive determination unit 1081 of this modification may determine the priority of one or more situation-adaptive current action candidates based on the correlation between the detection information and the first statistical information. The individual-adaptive determination unit 1083 may likewise determine the priority of one or more individual-adaptive current action candidates based on the correlation between the detection information and the second statistical information, as in the embodiment. The image generation unit 1060 and the image output unit 1051 may display, on the automatic driving information screen 1103, an image representing the one or more situation-adaptive current action candidates together with an image representing the priority of each candidate, and an image representing the one or more individual-adaptive current action candidates together with an image representing the priority of each candidate. Further, for each priority level, one or more situation-adaptive current action candidates and one or more individual-adaptive current action candidates may be displayed side by side in a manner that makes them easy to compare.
 なお、図示しないが、状況適応現在行動候補(現在についての状況適応型推薦とも言える)と個人適応現在行動候補(現在についての個人適応型推薦とも言える)を提示する別例を説明する。ここでは、合流路に接近する最左側車線を走行中に、右車線後方から自車より速度の速い車両が連続して接近していることに基づいて、自動運転制御装置1030は、現在行動「車線維持(任意)」を実行することを決定し、運転支援装置1040は、「車線維持(任意)」を実行していることを提示していることとする(例えば実施の形態5、6等を参照)。 Although not illustrated, another example of presenting a situation-adaptive current action candidate (which can also be called a situation-adaptive recommendation for the present) and an individual-adaptive current action candidate (an individual-adaptive recommendation for the present) will be described. Here, while the vehicle travels in the leftmost lane approaching a merge section, and based on the fact that vehicles faster than the host vehicle are successively approaching from behind in the right lane, the automatic driving control device 1030 decides to execute the current action "lane keeping (optional)", and the driving support device 1040 presents that "lane keeping (optional)" is being executed (see, for example, Embodiments 5 and 6).
 このときに運転支援装置1040は、『合流路に接近するまでに合流車両が現れると急な操作が必要となりデメリットが大きくなること』を判定してもよい。そして運転支援装置1040は、車両1000が実行している「車線維持」に代わり走行制御指示できる状況適応現在行動候補として「減速」を、今すぐ指示が出せる選択肢として推薦提示してもよい。 At this time, the driving support device 1040 may determine that "if a merging vehicle appears before the merge section is reached, a sudden operation will be required and the disadvantages will be large". The driving support device 1040 may then recommend and present "decelerate", as a situation-adaptive current action candidate for which a travel control instruction can be issued in place of the "lane keeping" being executed by the vehicle 1000, as an option that can be instructed immediately.
 さらに運転支援装置1040は、『合流路の手前において、左側車線の後方からある速度で接近する車両がある場合にドライバが採る運転行動の傾向に基づいて』、車両1000が実行している「車線維持」に代わり走行制御指示できる個人適応現在行動候補として「加速」を決定してもよい。そして運転支援装置1040は、個人適応現在行動候補「加速」を、今すぐ指示が出せる選択肢としてさらに推薦提示してもよい。このように、運転支援装置1040は、近似のタイミングで決定した状況適応現在行動候補と個人適応現在行動候補のそれぞれを同時並行して運転者へ提示してもよい。 Further, based on "the tendency of the driving action the driver takes when a vehicle approaches at a certain speed from behind in the left lane before the merge section", the driving support device 1040 may determine "accelerate" as an individual-adaptive current action candidate for which a travel control instruction can be issued in place of the "lane keeping" being executed by the vehicle 1000. The driving support device 1040 may then further recommend and present the individual-adaptive current action candidate "accelerate" as an option that can be instructed immediately. In this way, the driving support device 1040 may present the situation-adaptive current action candidate and the individual-adaptive current action candidate, determined at approximately the same timing, to the driver simultaneously and in parallel.
 (実施の形態11)
 まず概要を説明する。車両の自動運転中、車両に将来時点で実行させる行動、言い換えれば、所定時間後に車両に実行させる予定の行動(以下「予定行動」と呼ぶ。)を運転者に選択させる場合に、その行動選択を支援する情報が運転者に対して十分に提供されておらず、運転者によるスムーズな行動選択が困難なことがあった。また、運転者に不安感を抱かせることがあった。
(Embodiment 11)
First, an outline will be described. During automatic driving of a vehicle, when the driver is asked to select an action to be executed by the vehicle at a future point in time, in other words an action scheduled to be executed by the vehicle after a predetermined time (hereinafter referred to as a "scheduled action"), information supporting that action selection has not always been sufficiently provided to the driver, which can make smooth action selection by the driver difficult. It can also leave the driver feeling uneasy.
 そこで実施の形態11では、車両の予定行動の選択を支援する情報として、運転者個人に適応した情報を提供する。具体的には、運転支援装置1040が、車両1000の周囲状況或いは走行状態に基づく観点と、運転者個人に基づく観点の両方から予定行動候補を決定し、車両1000内の報知装置1002に表示させる。これにより、運転者の意に即したスムーズな行動選択を支援し、また、運転者が安心して行動選択或いは行動変更の指示を出せるよう支援する。 上記の実施の形態10では、車両の現在行動の候補を運転者へ提示したが、本実施の形態11では、車両の予定行動の候補を運転者へ提示する点で異なる。以下では、これまでの実施の形態で説明済の内容は適宜省略する。本実施の形態で説明する構成或いは動作は、趣旨を逸脱しない範囲で、他の実施の形態或いは変形例で説明する構成或いは動作と組み合わせることができ、また置き換えることができる。 Therefore, in the eleventh embodiment, information adapted to the individual driver is provided as information supporting the selection of the vehicle's scheduled action. Specifically, the driving support device 1040 determines scheduled action candidates from both a viewpoint based on the surrounding situation or running state of the vehicle 1000 and a viewpoint based on the individual driver, and displays them on the notification device 1002 in the vehicle 1000. This supports smooth action selection in line with the driver's intention, and also helps the driver issue action selection or action change instructions with peace of mind. The tenth embodiment described above presents candidates for the vehicle's current action to the driver, whereas the present eleventh embodiment differs in that candidates for the vehicle's scheduled action are presented to the driver. In the following, content already described in the preceding embodiments is omitted as appropriate. The configurations and operations described in this embodiment can be combined with, or replaced by, configurations and operations described in other embodiments or modifications without departing from the spirit of the invention.
 運転支援装置1040の記憶部1042は、実施の形態10で説明した図61に示す構成であり、運転支援装置1040の制御部1041は、実施の形態10で説明した図62に示す構成である。 The storage unit 1042 of the driving support device 1040 has the configuration shown in FIG. 61 described in the tenth embodiment, and the control unit 1041 of the driving support device 1040 has the configuration shown in FIG. 62 described in the tenth embodiment.
 第1統計情報蓄積部1072に蓄積される第1統計情報と、第2統計情報蓄積部1073に蓄積される第2統計情報はいずれも、車両の周囲状況および走行状態と車両の行動との関連性を示す統計情報(図43)である。具体的には、実施の形態7と同様に、車両の周囲状況および走行状態を示す複数種類の環境パラメータの値と、将来時点で車両に実行させる行動(もしくは行動実績)を対応付けたレコードを複数含む情報である。言い換えれば、現在の環境状態に対して将来時点で(所定時間後に)実行された行動を、現在の環境状態を示すパラメータ値と対応付けて蓄積した情報である。このように、第1統計情報と第2統計情報で規定される行動が予定行動である点で実施の形態10と異なる。 The first statistical information stored in the first statistical information storage unit 1072 and the second statistical information stored in the second statistical information storage unit 1073 are both related to the vehicle surroundings and driving conditions and vehicle behavior. It is statistical information (FIG. 43) which shows sex. Specifically, as in the seventh embodiment, a record in which values of a plurality of types of environmental parameters indicating the surrounding situation and the running state of the vehicle are associated with actions (or action results) to be executed by the vehicle at a future time point is recorded. It is information that includes a plurality. In other words, it is information obtained by accumulating actions executed at a future time point (after a predetermined time) with respect to the current environmental state in association with a parameter value indicating the current environmental state. Thus, it differs from Embodiment 10 in the action prescribed | regulated by 1st statistical information and 2nd statistical information is a schedule action.
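Purely as an illustration of the record structure described above (the embodiment does not prescribe any data format, and all class, field, and parameter names below are hypothetical), the pairing of environment-parameter values with the action executed a predetermined time later might be sketched in Python as follows:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class StatRecord:
    # Environment-parameter values at observation time (cf. FIG. 43),
    # e.g. own speed, headway distance, surrounding-vehicle speeds.
    env_params: Dict[str, float]
    # Action executed a predetermined time after the observation.
    scheduled_action: str

@dataclass
class StatisticalInfo:
    records: List[StatRecord] = field(default_factory=list)

    def accumulate(self, env_params: Dict[str, float], action: str) -> None:
        """Store the action executed later, keyed by the earlier state."""
        self.records.append(StatRecord(dict(env_params), action))

# The first statistical information would aggregate many drivers/vehicles,
# the second only the individual driver or this vehicle; the structure is the same.
first_info = StatisticalInfo()
first_info.accumulate({"own_speed": 0.6, "gap_ahead": 0.3}, "lane_change_right")
```

The same shape serves both statistical stores; only the population over which records are accumulated differs.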
 その一方、実施の形態10と同様に、第1統計情報と第2統計情報は統計の範囲が異なり、具体的には第1統計情報の方が第2統計情報よりも統計の範囲が広い。第1統計情報は、複数人の集団・多数の車両を対象として、様々な環境状態における操作実績、行動実績を記録したものである。もちろん、様々な環境状態における操作実績、行動実績の履歴を既知の統計手法により操作パターン、行動パターンとしてモデル化したものでもよい。第2統計情報は、運転者個人・車両1000単体を対象として、これまでの環境状態における操作実績或いは行動実績を記録したものである。運転者個人の操作実績、車両1000単体の行動実績の履歴を、既知の統計手法により操作パターン、行動パターンとしてモデル化したものでもよい。 On the other hand, as in the tenth embodiment, the first statistical information and the second statistical information have different statistical ranges, and specifically, the first statistical information has a wider statistical range than the second statistical information. The first statistical information is a record of operation results and action results in various environmental states for a group of a plurality of people and a large number of vehicles. Of course, the history of operation results and behavior results in various environmental states may be modeled as operation patterns and behavior patterns by a known statistical method. The second statistical information is a record of an operation result or an action result in an environmental state so far for an individual driver and a vehicle 1000 alone. A driver's individual operation results and a history of behavior results of the vehicle 1000 alone may be modeled as operation patterns and behavior patterns by a known statistical method.
 実施の形態11では、第1統計情報と第2統計情報の両方が車両1000のローカルに記憶されるが、第1статист統計情報と第2統計情報の少なくとも一方が車両外部の装置、例えばクラウド上のデータベース等に蓄積されてもよい。例えば、第2統計情報は、統計の範囲が運転者個人または車両1000単独であるため、車両1000のローカルに逐次蓄積されてもよい。その一方、第1統計情報は、統計の範囲が大人数の集団または多数の車両に亘るため、クラウド上のサーバで集計・統計・蓄積等の処理が実行されてもよい。運転支援装置1040は、通信IF1056および無線装置1008を介してリモートの統計情報にアクセスしてもよい。 In the eleventh embodiment, both the first statistical information and the second statistical information are stored locally in the vehicle 1000, but at least one of them may be stored in a device outside the vehicle, for example in a database on the cloud. For example, since its statistical scope is the individual driver or the vehicle 1000 alone, the second statistical information may be accumulated sequentially and locally in the vehicle 1000. The first statistical information, on the other hand, covers a large group of people or a large number of vehicles, so processing such as aggregation, statistics, and accumulation may be executed on a server on the cloud. The driving support device 1040 may access such remote statistical information via the communication IF 1056 and the wireless device 1008.
 運転支援装置1040の検出情報入力部1052は、車両1000の周囲状況および走行状態の検出結果を示す検出情報を検出部1020から取得する。第1決定部1080は、検出部1020から出力された検出情報と、第1統計情報蓄積部1072に蓄積された第1統計情報に基づいて、車両1000に実行させる予定の行動になり得る1つ以上の第1行動を決定する。 The detection information input unit 1052 of the driving support device 1040 acquires, from the detection unit 1020, detection information indicating the detection results of the surrounding situation and running state of the vehicle 1000. Based on the detection information output from the detection unit 1020 and the first statistical information stored in the first statistical information storage unit 1072, the first determination unit 1080 determines one or more first actions, each of which can be an action scheduled to be executed by the vehicle 1000.
 第1決定部1080の状況適応型決定部1081は、車両1000に実行させる予定の行動になり得る1つ以上の行動の候補(以下「状況適応予定行動候補」と呼ぶ。)を第1行動として決定する。具体的には、実施の形態7と同様に、第1統計情報に規定された行動のうち、検出情報に近似する環境パラメータ値に対応付けられた行動を状況適応予定行動候補(実施の形態7の予定行動候補に対応)として決定する。状況適応予定行動候補は、現在の周囲状況或いは走行状態に対して将来時点で実行される典型的な操作パターン・行動パターンと言える。 The situation-adaptive determination unit 1081 of the first determination unit 1080 determines, as the first actions, one or more action candidates that can be actions scheduled to be executed by the vehicle 1000 (hereinafter referred to as "situation-adaptive scheduled action candidates"). Specifically, as in the seventh embodiment, among the actions defined in the first statistical information, the actions associated with environmental parameter values that approximate the detection information are determined as the situation-adaptive scheduled action candidates (corresponding to the scheduled action candidates of the seventh embodiment). A situation-adaptive scheduled action candidate can be regarded as a typical operation pattern or behavior pattern executed at a future point in time for the current surrounding situation or running state.
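The extraction of actions "associated with environmental parameter values that approximate the detection information" could, under one simple assumption (a nearest-neighbor match over normalized parameter values; all names and the distance metric are hypothetical, not part of the embodiment), be sketched as:

```python
import math

def candidate_actions(records, detected, k=3):
    """Return up to k distinct actions whose stored environment-parameter
    values lie closest (Euclidean distance) to the detected values."""
    def dist(env):
        return math.sqrt(sum((env.get(key, 0.0) - detected[key]) ** 2
                             for key in detected))
    ranked = sorted(records, key=lambda r: dist(r["env"]))  # stable sort
    out = []
    for r in ranked:
        if r["action"] not in out:
            out.append(r["action"])
        if len(out) == k:
            break
    return out

# Toy statistical records (normalized parameter values, hypothetical actions)
records = [
    {"env": {"speed": 0.8, "gap": 0.1}, "action": "decelerate"},
    {"env": {"speed": 0.7, "gap": 0.2}, "action": "lane_change"},
    {"env": {"speed": 0.2, "gap": 0.9}, "action": "accelerate"},
]
print(candidate_actions(records, {"speed": 0.75, "gap": 0.15}, k=2))
```

A real implementation might instead use a learned model over the statistical information; this nearest-neighbor lookup is only the minimal form of "approximating" the detection information.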
 第2決定部1082は、検出部1020から出力された検出情報と、第2統計情報蓄積部1073に蓄積された第2統計情報に基づいて、車両1000に実行させる予定の行動になり得る1つ以上の第2行動を決定する。実施の形態11では、第2決定部1082は、1つ以上の状況適応予定行動候補それぞれに対する優先度であり、第2統計情報と検出情報との相関性に応じた優先度を示す情報を第2行動として決定する。言い換えれば、第2決定部1082が決定する1つ以上の第2行動は、1つ以上の第1行動それぞれの優先度を示すものである。 Based on the detection information output from the detection unit 1020 and the second statistical information stored in the second statistical information storage unit 1073, the second determination unit 1082 determines one or more second actions, each of which can be an action scheduled to be executed by the vehicle 1000. In the eleventh embodiment, the second determination unit 1082 determines, as the second actions, information indicating a priority for each of the one or more situation-adaptive scheduled action candidates, the priority being in accordance with the correlation between the second statistical information and the detection information. In other words, the one or more second actions determined by the second determination unit 1082 indicate the priority of each of the one or more first actions.
 第2決定部1082の個人適応型決定部1083は、車両1000に実行させる予定の行動になり得る1つ以上の行動の候補(以下「個人適応予定行動候補」と呼ぶ。)を決定する。具体的には、実施の形態7と同様に、第2統計情報に規定された行動のうち、検出情報に近似する環境パラメータ値に対応付けられた行動を個人適応予定行動候補(実施の形態7の予定行動候補に対応)として決定する。個人適応予定行動候補は、現在の周囲状況或いは走行状態に対する運転者個人の将来の操作パターンと言え、また車両1000単体の将来の行動パターンと言える。 The individual-adaptive determination unit 1083 of the second determination unit 1082 determines one or more action candidates that can be actions scheduled to be executed by the vehicle 1000 (hereinafter referred to as "individual-adaptive scheduled action candidates"). Specifically, as in the seventh embodiment, among the actions defined in the second statistical information, the actions associated with environmental parameter values that approximate the detection information are determined as the individual-adaptive scheduled action candidates (corresponding to the scheduled action candidates of the seventh embodiment). An individual-adaptive scheduled action candidate can be regarded as the individual driver's future operation pattern for the current surrounding situation or running state, or as a future behavior pattern of the vehicle 1000 alone.
 個人適応型決定部1083は、実施の形態10と同様に、第2統計情報から抽出した1つ以上の個人適応予定行動候補を順位付けする。優先度決定部1084は、実施の形態10と同様に、個人適応型決定部1083により決定された1つ以上の個人適応予定行動候補の順位に基づいて、状況適応型決定部1081により決定された1つ以上の状況適応予定行動候補の優先順位を決定する。 As in the tenth embodiment, the individual-adaptive determination unit 1083 ranks the one or more individual-adaptive scheduled action candidates extracted from the second statistical information. Also as in the tenth embodiment, the priority determination unit 1084 determines the priority of the one or more situation-adaptive scheduled action candidates determined by the situation-adaptive determination unit 1081, based on the ranking of the one or more individual-adaptive scheduled action candidates determined by the individual-adaptive determination unit 1083.
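One plausible way to prioritize the situation-adaptive candidates using the individual-adaptive ranking, sketched under the assumption that candidates appearing in the personal ranking inherit its order and unranked candidates follow in their original order (the function and argument names are hypothetical):

```python
def assign_priorities(situation_candidates, personal_ranking):
    """Give each situation-adaptive candidate a priority (1 = most
    recommended) by sorting according to the individual-adaptive ranking;
    candidates absent from that ranking keep their original order at the end."""
    rank = {action: i for i, action in enumerate(personal_ranking)}
    ordered = sorted(
        situation_candidates,
        key=lambda a: (rank.get(a, len(personal_ranking)),
                       situation_candidates.index(a)),
    )
    return {action: i + 1 for i, action in enumerate(ordered)}

prio = assign_priorities(
    ["decelerate", "lane_change", "accelerate"],  # situation-adaptive candidates
    ["accelerate", "decelerate"],                 # personal ranking, best first
)
print(prio)
```

Here "accelerate" receives priority 1 because the individual-adaptive ranking places it first, even though the situation-adaptive extraction listed it last.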
 画像生成部1060は、1つ以上の第1行動を表す第1画像を生成し、1つ以上の第2行動を表す第2画像を生成する。具体的には、画像生成部1060は、1つ以上の状況適応予定行動候補のそれぞれを表わす予定行動候補画像を第1画像として生成する。また画像生成部1060は、1つ以上の状況適応予定行動候補のそれぞれに対して優先度決定部1084が設定した優先順位を表わす優先度画像を第2画像として生成する。画像生成部1060は、1つ以上の状況適応予定行動候補のそれぞれに付与された優先度合をヒストグラムまたは数字で表わす優先度画像を生成してもよい。ヒストグラムは、優先度合に対応する表示サイズが設定された図形オブジェクトの画像であってもよい。 The image generation unit 1060 generates a first image representing one or more first actions, and generates a second image representing one or more second actions. Specifically, the image generation unit 1060 generates a scheduled action candidate image representing each of one or more situation adaptation scheduled action candidates as a first image. In addition, the image generation unit 1060 generates a priority image representing the priority set by the priority determination unit 1084 for each of the one or more situation adaptation scheduled action candidates as the second image. The image generation unit 1060 may generate a priority image that represents a priority level given to each of one or more situation adaptation scheduled action candidates by a histogram or a number. The histogram may be an image of a graphic object in which a display size corresponding to the priority is set.
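The histogram rendering described above, in which a graphic object's display size reflects the priority, could be sketched as follows (a minimal illustration; the linear mapping and the pixel scale are assumptions, not part of the embodiment):

```python
def histogram_height(priority, n_candidates, max_px=60):
    """Map priority 1 (most recommended) .. n_candidates to a bar height
    in pixels, so higher-priority candidates get visibly larger bars."""
    if n_candidates <= 0 or not 1 <= priority <= n_candidates:
        raise ValueError("priority out of range")
    return round(max_px * (n_candidates - priority + 1) / n_candidates)

heights = [histogram_height(p, 3) for p in (1, 2, 3)]
print(heights)
```

A numeric priority image would simply render the priority value itself instead of a sized bar.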
 画像出力部1051は、車両の運転者の一定視野内に予定行動候補画像と優先度画像を表示させるように、予定行動候補画像と優先度画像を報知装置1002に出力する。報知装置1002は、予定行動候補画像と優先度画像を含む自動運転情報画面を表示する。 The image output unit 1051 outputs the scheduled action candidate image and the priority image to the notification device 1002 so that the scheduled action candidate image and the priority image are displayed within a certain field of view of the driver of the vehicle. The notification device 1002 displays an automatic driving information screen including a scheduled action candidate image and a priority image.
 図66は、自動運転情報画面の一例を示す。同図の自動運転情報画面1103では、予定行動候補画像として第1予定行動候補画像1112aと第2予定行動候補画像1112bが表示されている。同図の予定行動候補(状況適応予定行動候補)はいずれも、複数の単一行動を組み合わせた行動計画であるが、状況適応予定行動候補として単一行動が決定された場合、その単一行動を表わす予定行動候補画像が表示される。 FIG. 66 shows an example of the automatic driving information screen. In the automatic driving information screen 1103 of the figure, a first scheduled action candidate image 1112a and a second scheduled action candidate image 1112b are displayed as scheduled action candidate images. Each scheduled action candidate (situation-adaptive scheduled action candidate) in the figure is an action plan combining a plurality of single actions; when a single action is determined as a candidate, a scheduled action candidate image representing that single action is displayed.
 また同図では、各予定行動候補の優先度をヒストグラムで示している。具体的には、第1予定行動候補画像1112aが示す予定行動候補の優先度を第1優先度画像1132a(網掛け)で示し、第2予定行動候補画像1112bが示す予定行動候補の優先度を第2優先度画像1132b(網掛け)で示している。同図では、第1予定行動候補画像1112aに対して相対的に大きい優先度画像1132が付加されている。これにより、第1予定行動候補画像1112aが示す行動の方が、運転者個人に対するお勧め度が高いことを示している。なお、図63Aで示したように、各予定行動候補の優先度を数字で表わす優先度画像を表示してもよい。 In the figure, the priority of each scheduled action candidate is indicated by a histogram. Specifically, the priority of the scheduled action candidate indicated by the first scheduled action candidate image 1112a is shown by a first priority image 1132a (shaded), and the priority of the scheduled action candidate indicated by the second scheduled action candidate image 1112b is shown by a second priority image 1132b (shaded). In the figure, a relatively large priority image 1132 is attached to the first scheduled action candidate image 1112a, indicating that the action represented by the first scheduled action candidate image 1112a is more strongly recommended for the individual driver. Note that, as shown in FIG. 63A, a priority image representing the priority of each scheduled action candidate by a number may be displayed instead.
 実施の形態11における車両1000の処理シーケンスは、実施の形態10で説明した図64の処理シーケンスと同様であるため説明を省略する。ただし、図64のP43で示した自動運転制御装置1030から運転支援装置1040への行動情報の入力は、実施の形態11ではないものとし、後述の変形例において説明する。また、図64に記載の現在行動候補と現在行動候補画像はそれぞれ、予定行動候補(具体的には状況適応予定行動候補、個人適応予定行動候補)と予定行動候補画像に置き換わる。 Since the processing sequence of the vehicle 1000 in the eleventh embodiment is the same as the processing sequence of FIG. 64 described in the tenth embodiment, its description is omitted. Note, however, that the input of behavior information from the automatic driving control device 1030 to the driving support device 1040, indicated by P43 in FIG. 64, is assumed not to occur in the eleventh embodiment; it is described in a modification below. Also, the current action candidates and current action candidate images in FIG. 64 are replaced with scheduled action candidates (specifically, situation-adaptive scheduled action candidates and individual-adaptive scheduled action candidates) and scheduled action candidate images, respectively.
 図67は、運転支援装置1040の処理の例を示すフローチャートである。検出部1020から出力された検出情報を検出情報入力部1052が取得すると(S260のY)、状況適応型決定部1081は、検出情報と第1統計情報に基づいて1つ以上の状況適応予定行動候補を決定する(S261)。個人適応型決定部1083は、検出情報と第2統計情報に基づいて1つ以上の個人適応予定行動候補を決定し、各候補の順位を決定する。優先度決定部1084は、1つ以上の個人適応予定行動候補の順位に基づいて、1つ以上の状況適応予定行動候補の優先度を決定する(S262)。優先度決定部1084は、状況適応予定行動候補と優先度を示す情報を記憶部1042へ格納する。 FIG. 67 is a flowchart showing an example of processing by the driving support device 1040. When the detection information input unit 1052 acquires the detection information output from the detection unit 1020 (Y in S260), the situation-adaptive determination unit 1081 determines one or more situation-adaptive scheduled action candidates based on the detection information and the first statistical information (S261). The individual-adaptive determination unit 1083 determines one or more individual-adaptive scheduled action candidates based on the detection information and the second statistical information, and determines the rank of each candidate. The priority determination unit 1084 determines the priority of the one or more situation-adaptive scheduled action candidates based on the ranking of the one or more individual-adaptive scheduled action candidates (S262). The priority determination unit 1084 stores information indicating the situation-adaptive scheduled action candidates and their priorities in the storage unit 1042.
 状況適応予定行動候補が更新された場合(S263のY)、画像生成部1060は、更新された状況適応予定行動候補を表わす予定行動候補画像を生成する(S264)。画像出力部1051は、予定行動候補画像を報知装置1002へ出力して表示させる(S265)。画像生成部1060は、1つ以上の状況適応予定行動候補それぞれの優先度を表わす優先度画像を生成し(S266)、画像出力部1051は、優先度画像を報知装置1002へ出力して表示させる(S267)。状況適応予定行動候補が未更新であるが(S263のN)、優先度が更新された場合(S268のY)、S266へ進み、優先度も未更新であれば(S268のN)、S269へ進む。 When the situation-adaptive scheduled action candidates have been updated (Y in S263), the image generation unit 1060 generates scheduled action candidate images representing the updated situation-adaptive scheduled action candidates (S264). The image output unit 1051 outputs the scheduled action candidate images to the notification device 1002 for display (S265). The image generation unit 1060 generates priority images representing the priority of each of the one or more situation-adaptive scheduled action candidates (S266), and the image output unit 1051 outputs the priority images to the notification device 1002 for display (S267). When the situation-adaptive scheduled action candidates have not been updated (N in S263) but the priorities have been updated (Y in S268), the process proceeds to S266; when the priorities have not been updated either (N in S268), the process proceeds to S269.
 所定の終了条件が満たされた場合(S269のY)、本図のフローを終了し、終了条件が満たされなければ(S269のN)、S260に戻る。検出情報が未入力であれば(S260のN)、S261~S268をスキップする。このように、実施の形態10の運転支援装置1040は、検出情報が入力される都度、状況適応予定行動候補を決定し、運転者の嗜好に応じた提案の優先度を決定する。そして、状況適応予定行動候補と優先度の少なくとも一方が更新されると、自動運転情報画面1103の表示内容も更新される。 If the predetermined termination condition is satisfied (Y in S269), the flow of this figure is terminated. If the termination condition is not satisfied (N in S269), the process returns to S260. If the detection information has not been input (N in S260), S261 to S268 are skipped. As described above, the driving support device 1040 according to the tenth embodiment determines the situation adaptation scheduled action candidate every time the detection information is input, and determines the priority of the proposal according to the driver's preference. When at least one of the situation adaptation scheduled action candidate and the priority is updated, the display content of the automatic driving information screen 1103 is also updated.
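The control flow of FIG. 67 can be summarized in a short sketch. For brevity this merges the two re-render branches (S263 candidate update and S268 priority-only update) into a single re-render; all function names and the stub data are hypothetical:

```python
def run_support_loop(get_detection, decide_candidates, decide_priority,
                     render, should_stop):
    """Skeleton of the S260-S269 loop: on each new detection, recompute
    candidates and priorities, and re-render only when either changed."""
    last_candidates, last_priority = None, None
    while not should_stop():                                   # S269
        detection = get_detection()                            # S260
        if detection is None:                                  # no input: skip S261-S268
            continue
        candidates = decide_candidates(detection)              # S261
        priority = decide_priority(detection, candidates)      # S262
        if candidates != last_candidates or priority != last_priority:
            render(candidates, priority)                       # S264-S267
        last_candidates, last_priority = candidates, priority

# Driving the loop with stubs: two identical detections, one change, one miss.
events = iter([{"speed": 0.5}, {"speed": 0.5}, {"speed": 0.9}, None])
rendered = []
ticks = {"n": 0}

def stop_after(limit):
    def stop():
        ticks["n"] += 1
        return ticks["n"] > limit
    return stop

run_support_loop(
    get_detection=lambda: next(events),
    decide_candidates=lambda d: ["decelerate"] if d["speed"] > 0.8 else ["keep_lane"],
    decide_priority=lambda d, c: {a: 1 for a in c},
    render=lambda c, p: rendered.append(list(c)),
    should_stop=stop_after(4),
)
```

The screen is redrawn only twice: once for the initial candidates and once when the changed detection produces new candidates, matching the update-only-on-change behavior of S263/S268.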
 運転者による予定行動候補の選択に係る運転支援装置1040(画像生成部1060、判定部1062、指示部1063等)の処理は、実施の形態7と同様である。例えば、図54のフローチャートに示す処理と、図55A、55Bおよび図56A、56Bの自動運転情報画面1103に示すユーザインタフェースは、実施の形態11にそのまま適用される。 The processing of the driving support device 1040 (image generation unit 1060, determination unit 1062, instruction unit 1063, etc.) related to the selection of the scheduled action candidate by the driver is the same as that in the seventh embodiment. For example, the process shown in the flowchart of FIG. 54 and the user interface shown in the automatic driving information screen 1103 of FIGS. 55A and 55B and FIGS. 56A and 56B are applied to the eleventh embodiment as they are.
 以上説明したように、実施の形態11の運転支援装置1040は、1つ以上の状況適応予定行動候補に対して、運転者個人または車両1000単体に適応した優先度を付与した自動運転情報画面1103を運転者へ提示する。これにより、車両の予定行動の選択を支援するために有用な情報を運転者へ提示できる。実施の形態11では、自動運転情報画面1103に表示させる予定行動の候補を、様々な環境パラメータ値と様々な行動が網羅的に記録された第1統計情報から抽出することで、刻々と変化する車両の環境パラメータ値により適合した予定行動候補を提示しやすくなる。また、各候補の優先順位を、運転者の嗜好を強く反映する第2統計情報をもとに決定することで、運転手個人の嗜好或いは操作パターンに応じた優先度を提示することができる。 As described above, the driving support device 1040 of the eleventh embodiment presents to the driver an automatic driving information screen 1103 in which one or more situation-adaptive scheduled action candidates are given priorities adapted to the individual driver or to the vehicle 1000 alone. This allows information useful for supporting the selection of the vehicle's scheduled action to be presented to the driver. In the eleventh embodiment, the scheduled action candidates to be displayed on the automatic driving information screen 1103 are extracted from the first statistical information, in which various environmental parameter values and various actions are comprehensively recorded, making it easier to present scheduled action candidates better suited to the vehicle's constantly changing environmental parameter values. In addition, by determining the priority of each candidate based on the second statistical information, which strongly reflects the driver's preferences, priorities in line with the individual driver's preferences or operation patterns can be presented.
 なお、図示しないが、状況適応予定行動候補(将来についての状況適応型推薦とも言える)と個人適応予定行動候補(将来についての個人適応型推薦とも言える)を提示する別例を説明する。ここでは、片側二車線道路の交差点を右折する際に、対向車線を接近する車両がいることに基づいて、自動運転制御装置1030は、現在行動「一時停止」を実行することを決定し、運転支援装置1040は、「一時停止」を実行していることを提示していることとする(例えば実施の形態5、6等を参照)。 Although not illustrated, another example of presenting a situation-adaptive scheduled action candidate (which can also be called a situation-adaptive recommendation for the future) and an individual-adaptive scheduled action candidate (an individual-adaptive recommendation for the future) will be described. Here, when turning right at an intersection on a road with two lanes in each direction, and based on the fact that a vehicle is approaching in the oncoming lane, the automatic driving control device 1030 decides to execute the current action "pause", and the driving support device 1040 presents that "pause" is being executed (see, for example, Embodiments 5 and 6).
 このとき運転支援装置1040は、『自車が右折するために要する時間が充分確保できる対向車線前方から接近する車両との距離』を判定してもよい。そして運転支援装置1040は、現在行動「一時停止」のあとに、タイミングを見計らって走行制御指示できる状況適応予定行動候補として「右折」を、指示が予約できる選択肢として推薦提示してもよい。 At this time, the driving support device 1040 may determine "the distance to a vehicle approaching from ahead in the oncoming lane that leaves enough time for the host vehicle to turn right". The driving support device 1040 may then recommend and present "turn right", as a situation-adaptive scheduled action candidate for which a travel control instruction can be issued at a suitable timing after the current action "pause", as an option for which an instruction can be reserved.
 さらに運転支援装置1040は、『対抗車線の前方から接近する車両がある場合において運転者が採る運転行動の傾向』に基づいて、現在行動「一時停止」のあとに、タイミングを見計らって走行制御指示できる個人適応予定行動候補として「タイミングを見て一車線分右折→一時停止→タイミングを見て一車線分右折」(ここでは複数の単一行動を組み合わせた行動計画)を決定してもよい。そして運転支援装置1040は、その個人適応現在行動候補を、指示が予約できる選択肢としてさらに推薦提示してもよい。なお運転者が採る運転行動の傾向は、次のようにしてもよい。例えば、対向車線手前側後方が信号で詰まっており前方から接近する車両が減速しており、かつ対向車線奥側前方から接近する車両が左折ウィンカーを点灯する場合には、対向車線手前側前方から接近する車両の前に右折で割込み、一時停止したあと再度対向車線奥側前方から接近する車両の減速或いは左折ウィンカーの継続など左折意思が確認できる場合に右折を再開する傾向があること、などであってもよい。 Further, based on "the tendency of the driving action the driver takes when there is a vehicle approaching from ahead in the oncoming lane", the driving support device 1040 may determine, as an individual-adaptive scheduled action candidate for which a travel control instruction can be issued at a suitable timing after the current action "pause", the plan "turn right across one lane when the timing allows → pause → turn right across the next lane when the timing allows" (here, an action plan combining a plurality of single actions). The driving support device 1040 may then further recommend and present that individual-adaptive candidate as an option for which an instruction can be reserved. The driver's tendency may, for example, be as follows: when the near oncoming lane is backed up behind a traffic signal and the vehicle approaching from ahead in it is decelerating, and a vehicle approaching from ahead in the far oncoming lane has its left-turn indicator on, the driver tends to cut in with a right turn in front of the vehicle approaching in the near oncoming lane, pause, and then resume the right turn once the left-turn intention of the vehicle approaching in the far oncoming lane can be confirmed again, for example by its deceleration or its continued left-turn indicator.
In this way, the driving support device 1040 may present the situation-adaptive scheduled action candidate and the individual-adaptive scheduled action candidate, determined at approximately the same timing, to the driver simultaneously and in parallel.
 変形例を説明する。行動情報入力部1054は、自動運転制御装置1030により決定された予定行動を示す行動情報を自動運転制御装置1030から取得してもよい。画像生成部1060は、予定行動を示す予定行動画像をさらに生成してもよい。画像出力部1051は、予定行動画像を報知装置1002へさらに出力して、予定行動画像、予定行動候補画像、優先度画像を運転者の一定視野内に同時に表示させてもよい。この場合、運転支援装置1040が決定した予定行動候補に加えて、自動運転制御装置1030が決定した予定行動が、運転者による選択対象となってよく、運転支援装置1040から自動運転制御装置1030へ指示する行動の候補となってよい。 A modification will be described. The behavior information input unit 1054 may acquire, from the automatic driving control device 1030, behavior information indicating the scheduled action determined by the automatic driving control device 1030. The image generation unit 1060 may further generate a scheduled action image indicating that scheduled action. The image output unit 1051 may further output the scheduled action image to the notification device 1002 so that the scheduled action image, the scheduled action candidate images, and the priority images are displayed simultaneously within the driver's fixed field of view. In this case, in addition to the scheduled action candidates determined by the driving support device 1040, the scheduled action determined by the automatic driving control device 1030 may also become a selection target for the driver, and a candidate for the action that the driving support device 1040 instructs the automatic driving control device 1030 to execute.
 また、この変形例において、状況適応型決定部1081は、第1統計情報から抽出した状況適応予定行動候補の中に、行動情報が示す予定行動と同じ候補があれば、その候補を状況適応予定行動候補(例えば運転者への提示対象)から除外してもよい。また、個人適応型決定部1083は、第2統計情報から抽出した個人適応予定行動候補の中に、行動情報が示す予定行動と同じ候補があれば、その候補を個人適応予定行動候補(例えば順位付けの対象)から除外してもよい。これにより、自動運転制御装置1030で計画済の予定行動と同じ候補を運転者へ提示することを防止できる。 Further, in this modification, if the situation-adaptive scheduled action candidates extracted from the first statistical information include a candidate identical to the scheduled action indicated by the behavior information, the situation-adaptive determination unit 1081 may exclude that candidate from the situation-adaptive scheduled action candidates (for example, from the candidates to be presented to the driver). Likewise, if the individual-adaptive scheduled action candidates extracted from the second statistical information include a candidate identical to the scheduled action indicated by the behavior information, the individual-adaptive determination unit 1083 may exclude that candidate from the individual-adaptive scheduled action candidates (for example, from the candidates to be ranked). This prevents a candidate identical to the scheduled action already planned by the automatic driving control device 1030 from being presented to the driver.
 別の変形例として、実施の形態11では言及していないが、実施の形態7或いは実施の形態9と同様に、統計情報(例えば第1統計情報)で規定された予定行動が実行されるまでの残り時間を自動運転情報画面1103にさらに表示させてもよい。例えば、予定行動候補画像に対して、その画像が示す予定行動候補(例えば状況適応現在行動候補)が実行されるまでの残り時間を示す残り時間画像を付加してもよい。 As another modification, although not mentioned in the eleventh embodiment, the remaining time until the scheduled action defined by the statistical information (for example, the first statistical information) is executed may be further displayed on the automatic driving information screen 1103, as in the seventh or ninth embodiment. For example, a remaining-time image indicating the remaining time until the scheduled action candidate represented by a scheduled action candidate image (for example, a situation-adaptive current action candidate) is executed may be added to that image.
 次に各実施の形態に共通の変形例を説明する。 Next, a modification common to the embodiments will be described.
 車両1000内のシートが、手動運転中の前向き状態と、自動運転中に前列と後列が向い合せる対面状態とに変形できる場合、運転支援装置1040は、シート状態を検知する不図示のシートセンサの検出情報に基づいて、シートが対面状態の場合、報知方法を全て音声にしてもよい。また、報知方法を表示から音声に切り替えてもよい。また、運転支援装置1040は、現在行動を音声で報知し、予定行動を表示で報知してもよい。言い換えれば、現在行動、現在行動候補、予定行動、予定行動候補について、その一部種類の情報を報知する手段(媒体)を、他の種類の情報を報知する手段(媒体)と異ならせてもよい。さらにまた、自動運転制御装置1030により決定された行動情報を音声で報知し、運転支援装置1040により決定された行動候補を画面表示で報知してもよく、この逆の組み合わせでもよい。運転者、乗員などを含む車両1000内の状態に合わせて報知方法を適切に変更することで、受け取り側が煩わしく感じにくくなり、且つ情報伝達の確実性を高めることができる。 When the seats in the vehicle 1000 can be shifted between a forward-facing state used during manual driving and a face-to-face state in which the front and rear rows face each other during automatic driving, the driving support device 1040 may, based on detection information from a seat sensor (not shown) that detects the seat state, use voice for all notifications while the seats are in the face-to-face state. The notification method may also be switched from display to voice. The driving support device 1040 may also notify the current action by voice and the scheduled action by display. In other words, for the current action, current action candidates, scheduled action, and scheduled action candidates, the means (medium) used to notify some types of this information may differ from the means (medium) used to notify the other types. Furthermore, the behavior information determined by the automatic driving control device 1030 may be notified by voice while the action candidates determined by the driving support device 1040 are notified on screen, or vice versa. By appropriately changing the notification method according to the state inside the vehicle 1000, including that of the driver and passengers, the notifications become less bothersome to the recipients and the reliability of information transmission can be improved.
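The seat-state-dependent choice of notification medium described above can be sketched as a small policy function. This is an illustrative sketch only; the `Notification` type, the two-media policy, and the action kinds are assumptions, not the patented design:

```python
from dataclasses import dataclass

@dataclass
class Notification:
    kind: str    # e.g. "current_action" or "scheduled_action"
    medium: str  # "display" or "voice"

def choose_media(seat_face_to_face: bool) -> list:
    """Pick a notification medium per information type. When the seats
    are in the face-to-face state the driver may not see the display,
    so every notification falls back to voice (hypothetical policy)."""
    if seat_face_to_face:
        return [Notification("current_action", "voice"),
                Notification("scheduled_action", "voice")]
    # Forward-facing seats: current action by voice, scheduled action
    # by display, mirroring the mixed-media example in the text.
    return [Notification("current_action", "voice"),
            Notification("scheduled_action", "display")]
```

A real system would feed `seat_face_to_face` from the seat sensor's detection information and route each `Notification` to the speaker 1006 or the notification device 1002 accordingly.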
 また、運転支援装置1040は、報知装置1002(ここではヘッドアップディスプレイ等の表示装置)を見るべきことを振動により運転者へ知らせた後(自動運転では運転者は報知装置1002を常に見ている必要がないため)、運転者が報知装置1002の表示を見るタイミング付近で音声による説明音声をスピーカから出力させてもよい。人間は、触覚(振動)と、聴覚(音)≒触覚(鼓膜への振動)への反応は早いが、視覚(表示)への反応は遅い。一方で、触覚や単音の聴覚は、その意味するところがわかりにくい。音声による聴覚は、意味は伝わるがその意味が伝わりきるまでに時間がかかる。視覚は、意味を表現する情報を伝達可能である。本変形例の構成は、このような各感覚の特徴を好適に組み合わせて利用するものである。 The driving support device 1040 may also notify the driver by vibration that the notification device 1002 (here, a display device such as a head-up display) should be looked at (since during automatic driving the driver need not watch the notification device 1002 at all times), and then output an explanatory voice from the speaker around the time the driver looks at the display of the notification device 1002. Humans react quickly to touch (vibration) and to hearing (sound, which is essentially vibration of the eardrum), but slowly to vision (display). On the other hand, the meaning of a tactile cue or a single tone is hard to grasp. Spoken audio conveys meaning, but it takes time for the full meaning to come across. Vision can convey information that expresses meaning. The configuration of this modification exploits a suitable combination of these characteristics of each sense.
 また、報知すべき状況は単発の連続でなく、重なることがあってもよい。例えば、踏切の矢印表示のように、「←」が表示されていて、しばらく待つうちに「→」が追加され、「←」は消えるが「→」は残り、両方が消えることで、出発可能になることを、触覚、聴覚、視覚の組み合わせにより、運転者や同乗者に伝達してもよい。例えば、報知装置1002(ここではヘッドアップディスプレイ等の表示装置)を見るべきことを振動により知らせた後、運転者が報知装置1002の表示を見るタイミング付近で音声による説明音声を出力してもよい。さらに、その説明が終わらないうちに、状況が変化したことを所定の報知音で知らせて、報知装置1002の表示内容を変化させ、或いは追加してもよい。 Also, the situations to be notified need not occur one at a time in succession; they may overlap. For example, as with the arrow indicators at a railroad crossing, "←" may be displayed, "→" may be added after a while, then "←" may disappear while "→" remains, and finally both may disappear to indicate that departure is possible; such a sequence may be conveyed to the driver and passengers by a combination of touch, hearing, and vision. For example, after notifying the driver by vibration that the notification device 1002 (here, a display device such as a head-up display) should be looked at, an explanatory voice may be output around the time the driver looks at the display of the notification device 1002. Furthermore, before that explanation ends, a predetermined notification sound may announce that the situation has changed, and the display content of the notification device 1002 may be changed or supplemented.
 以上、本発明に係る実施形態について図面を参照して詳述してきたが、上述した装置或いは各処理部の機能は、コンピュータプログラムにより実現され得る。 Embodiments according to the present invention have been described above in detail with reference to the drawings; the functions of the above-described devices and processing units can be realized by a computer program.
 上述した機能をプログラムにより実現するコンピュータは、キーボード或いはマウス、タッチパッドなどの入力装置、ディスプレイ或いはスピーカなどの出力装置、CPU(Central Processing Unit)、ROM(Read Only Memory)、RAM(Random Access Memory)、ハードディスク装置或いはSSD(Solid State Drive)などの記憶装置を有する。さらに、DVD-ROM(Digital Versatile Disk Read Only Memory)或いはUSB(Universal Serial Bus)メモリなどの記録媒体から情報を読み取る読取装置、ネットワークを介して通信を行うネットワークカードなどを有し、各部はバスにより接続される。 A computer that realizes the above-described functions by a program includes an input device such as a keyboard, mouse, or touch pad, an output device such as a display or speaker, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and a storage device such as a hard disk drive or SSD (Solid State Drive). It further includes a reader that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disk Read Only Memory) or USB (Universal Serial Bus) memory, and a network card that communicates via a network; these components are interconnected by a bus.
 そして、読取装置は、上記プログラムを記録した記録媒体からそのプログラムを読み取り、記憶装置に記憶させる。あるいは、ネットワークカードが、ネットワークに接続されたサーバ装置と通信を行い、サーバ装置からダウンロードした上記各装置の機能を実現するためのプログラムを記憶装置に記憶させる。 The reader reads the program from the recording medium on which it is recorded and stores it in the storage device. Alternatively, the network card communicates with a server device connected to the network and stores, in the storage device, a program downloaded from the server device for realizing the functions of each of the above devices.
 そして、CPUが、記憶装置に記憶されたプログラムをRAMにコピーし、そのプログラムに含まれる命令をRAMから順次読み出して実行することにより、上記各装置の機能が実現される。 Then, the CPU copies the program stored in the storage device to the RAM, and sequentially reads out and executes the instructions included in the program from the RAM, thereby realizing the functions of the respective devices.
 なお、実施の形態は、以下の項目によって特定されてもよい。 Note that the embodiment may be specified by the following items.
 [項目1]
 車両の自動運転における車両の行動を決定する自動運転制御部から、車両に実行させる第1行動を示す行動情報を取得する行動情報入力部と、
 車両の周囲状況および走行状態を検出する検出部から、検出結果を示す検出情報を取得する検出情報入力部と、
 行動情報が示す第1行動の後に車両に実行させることが可能な第2行動を検出情報に基づいて決定する候補決定部と、
 行動情報が示す第1行動を表す第1画像を生成し、第2行動を表す第2画像を生成する画像生成部と、
 車両の運転者の一定視野内に第1画像と第2画像を表示させるように第1画像と第2画像を車両内の表示部に出力する画像出力部と、
 を有する運転支援装置。
[Item 1]
A behavior information input unit that acquires behavior information indicating a first behavior to be executed by the vehicle from an automatic driving control unit that determines the behavior of the vehicle in automatic driving of the vehicle;
A detection information input unit that acquires detection information indicating a detection result from a detection unit that detects a surrounding situation and a running state of the vehicle;
A candidate determination unit that determines, based on the detection information, a second action that can be executed by the vehicle after the first action indicated by the action information;
An image generation unit that generates a first image representing the first behavior indicated by the behavior information and generates a second image representing the second behavior;
An image output unit for outputting the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle;
A driving support device comprising the above.
 これにより、自動運転制御部が決定した車両の現在行動の後に実行させる予定行動の候補を運転者へ提示でき、車両の将来行動を指示するか否かの運転者の判断を支援できる。また、自動運転に対して運転者が抱く不安感を低減できる。 This makes it possible to present to the driver a candidate for a scheduled action to be executed after the current action of the vehicle determined by the automatic driving control unit, and to assist the driver in determining whether to instruct the future action of the vehicle. In addition, it is possible to reduce anxiety that the driver has with respect to automatic driving.
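The data flow through the units listed in item 1 can be sketched end to end. This is a hypothetical toy model for illustration only; the patent does not prescribe code, and the function names, rule for choosing second actions, and dict-based "images" are all assumptions:

```python
def decide_second_actions(first_action, detection):
    """Candidate determination unit: pick second actions that the
    vehicle could execute after the first action, given the detected
    surrounding situation (toy rules for illustration)."""
    if first_action == "lane_change_right" and detection["right_lane_free"]:
        return ["accelerate", "keep_lane"]
    return ["keep_lane"]

def generate_images(first_action, second_actions):
    """Image generation unit: produce a first image for the current
    action and second images for the candidates (represented here as
    labeled dicts instead of real bitmaps)."""
    first = {"role": "first", "action": first_action}
    seconds = [{"role": "second", "action": a} for a in second_actions]
    return first, seconds

# Behavior information from the automatic driving control unit, and
# detection information from the detection unit (both hypothetical).
behavior_info = "lane_change_right"
detection_info = {"right_lane_free": True}

seconds = decide_second_actions(behavior_info, detection_info)
first_img, second_imgs = generate_images(behavior_info, seconds)
# The image output unit would now send first_img and second_imgs to a
# display placed within the driver's fixed field of view.
```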
 [項目2]
 車両の周囲状況および走行状態と車両の行動との関連性を示す統計情報を蓄積する蓄積部をさらに有し、
 候補決定部は、統計情報と検出情報に基づいて第2行動を決定する項目1に記載の運転支援装置。
[Item 2]
A storage unit that accumulates statistical information indicating the relevance between the vehicle's surrounding situation and running state and the vehicle's behavior;
The driving support device according to item 1, wherein the candidate determination unit determines the second action based on the statistical information and the detection information.
 これにより、予定行動の適切な候補を決定しやすくなる。 This makes it easier to determine appropriate candidates for scheduled actions.
 [項目3]
 通信網を介して、車両の周囲状況および走行状態と車両の行動との関連性を示す統計情報を蓄積する本車両外部の蓄積部と通信可能な通信インタフェースをさらに有し、
 候補決定部は、通信インタフェースを介して蓄積部の統計情報にアクセスし、統計情報と検出情報に基づいて第2行動を決定する項目1に記載の運転支援装置。
[Item 3]
A communication interface capable of communicating with an accumulation unit outside the vehicle for accumulating statistical information indicating the relationship between the vehicle surroundings and the running state and the behavior of the vehicle via the communication network;
The driving support device according to item 1, wherein the candidate determination unit accesses the statistical information in the storage unit via the communication interface and determines the second action based on the statistical information and the detection information.
 これにより、予定行動の適切な候補を決定しやすくなる。 This makes it easier to determine appropriate candidates for scheduled actions.
 [項目4]
 画像生成部は、第2行動が実行されるまでの時間を表す付加画像を生成し、
 画像出力部は、車両内の表示部に付加画像を出力し、付加画像を加えた第2画像を表示させる項目1に記載の運転支援装置。
[Item 4]
The image generation unit generates an additional image representing a time until the second action is executed,
The driving support device according to item 1, wherein the image output unit outputs the additional image to a display unit in the vehicle and displays a second image including the additional image.
 これにより、予定行動が実行されるタイミングを運転者へ報知することができる。 This makes it possible to notify the driver when the scheduled action is executed.
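Item 4's additional image can be sketched as an annotation attached to a second-action image. The dict representation and the label format are assumptions made for illustration; only the idea of pairing a candidate with its time-to-execution comes from the text:

```python
def attach_remaining_time(second_image: dict, seconds_left: int) -> dict:
    """Add a remaining-time annotation to a second-action image, as in
    item 4. The original image is left unmodified; the display would
    render the annotated copy."""
    if seconds_left < 0:
        raise ValueError("time already elapsed")
    annotated = dict(second_image)
    annotated["remaining_time"] = f"{seconds_left} s"
    return annotated

# Hypothetical second-action image and a 10-second countdown.
img = attach_remaining_time({"action": "merge_left"}, 10)
```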
 [項目5]
 第1画像および第2画像が車両の表示部に所定時間表示された後、第2行動を車両に実行させるためのコマンドを自動運転制御部に出力するコマンド出力部をさらに有する項目1から4のいずれか一項に記載の運転支援装置。
[Item 5]
The driving support device according to any one of items 1 to 4, further comprising a command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the second action after the first image and the second image have been displayed on the display unit of the vehicle for a predetermined time.
 これにより、運転支援装置が決定した予定行動を車両に実行させることができる。 This allows the vehicle to execute the scheduled action determined by the driving support device.
 [項目6]
 車両の行動を指定する操作指示を受け付ける操作信号入力部と、
 第1画像および第2画像が車両の表示部に表示されている所定時間内に、操作指示を受け付けなかった場合、第2行動を車両に実行させるためのコマンドを自動運転制御部に出力するコマンド出力部と、
 をさらに有する項目1から4のいずれか一項に記載の運転支援装置。
[Item 6]
An operation signal input unit for receiving an operation instruction for specifying the behavior of the vehicle;
A command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the second action when no operation instruction is received within the predetermined time during which the first image and the second image are displayed on the display unit of the vehicle;
The driving support device according to any one of items 1 to 4, further comprising:
 これにより、運転支援装置が決定した予定行動を車両に実行させることができる。 This allows the vehicle to execute the scheduled action determined by the driving support device.
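Item 6's behavior, falling back to the candidate action when the driver gives no instruction within the display period, can be sketched as a timeout loop. The polling approach, interval, and names are illustrative stand-ins for the operation signal input unit, not the patented mechanism:

```python
import time

def await_driver_choice(poll_choice, timeout_s, default_action,
                        poll_interval_s=0.01):
    """Wait up to timeout_s for an operation instruction; if none
    arrives, return the candidate (second) action so a command for it
    can be sent to the automatic driving control unit."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        choice = poll_choice()
        if choice is not None:
            return choice          # driver specified an action: use it
        time.sleep(poll_interval_s)
    return default_action          # timeout: command the second action

# No driver input: the second action is selected after the timeout.
selected = await_driver_choice(lambda: None, timeout_s=0.05,
                               default_action="keep_lane")
```

Passing a `poll_choice` that returns an action immediately models item 7, where an explicit selection within the period triggers the command instead.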
 [項目7]
 車両の行動を指定する操作指示を受け付ける操作信号入力部と、
 第1画像および第2画像が車両の表示部に表示されている所定時間内に、第2画像を選択する操作指示を受け付けた場合、第2行動を車両に実行させるためのコマンドを自動運転制御部に出力するコマンド出力部と、
 をさらに有する項目1から4のいずれか一項に記載の運転支援装置。
[Item 7]
An operation signal input unit for receiving an operation instruction for specifying the behavior of the vehicle;
A command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the second action when an operation instruction selecting the second image is received within the predetermined time during which the first image and the second image are displayed on the display unit of the vehicle;
The driving support device according to any one of items 1 to 4, further comprising:
 これにより、運転支援装置が決定した予定行動を、運転者の意思を反映して車両に実行させることができる。 This allows the vehicle to execute the scheduled action determined by the driving support device, reflecting the driver's intention.
 [項目8]
 コマンド出力部をさらに有し、
 第2行動を指定する操作が入力された場合、画像出力部は、第2行動の実行または予約を指定させるための問い合わせ画像を表示部に出力し、
 コマンド出力部は、問い合わせ画像の表示中に実行を指定する操作が入力された場合、第2行動を車両に実行させるためのコマンドを第1のタイミングにおいて自動運転制御部に出力し、問い合わせ画像の表示中に予約を指定する操作が入力された場合、第1のタイミングより後の第2のタイミングにおいてコマンドを自動運転制御部に出力する項目1から4のいずれか一項に記載の運転支援装置。
[Item 8]
A command output unit;
When an operation for specifying the second action is input, the image output unit outputs an inquiry image for specifying execution or reservation of the second action to the display unit,
The driving support device according to any one of items 1 to 4, wherein the command output unit outputs the command for causing the vehicle to execute the second action to the automatic driving control unit at a first timing when an operation designating execution is input while the inquiry image is displayed, and outputs the command to the automatic driving control unit at a second timing later than the first timing when an operation designating reservation is input while the inquiry image is displayed.
 これにより、運転支援装置が決定した予定行動を、運転者の意思を反映したタイミングで車両に実行させることができる。 This allows the vehicle to execute the scheduled action determined by the driving support device at a timing reflecting the driver's intention.
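The first-timing/second-timing distinction of item 8 reduces to computing when the command is dispatched. The sketch below is illustrative; the concrete reservation delay is an assumption (item 8 only requires the second timing to be later than the first):

```python
def schedule_command(choice: str, now_s: float,
                     reserve_delay_s: float = 5.0) -> float:
    """Return the time at which the command for the second action is
    sent to the automatic driving control unit: the first timing for
    'execute', a later second timing for 'reserve'."""
    if choice == "execute":
        return now_s                      # first timing: send immediately
    if choice == "reserve":
        return now_s + reserve_delay_s    # second timing: send later
    raise ValueError(f"unknown choice: {choice}")

t_exec = schedule_command("execute", now_s=100.0)
t_res = schedule_command("reserve", now_s=100.0)
```

While waiting for the reserved dispatch time, the display would show the third image of item 9 indicating that the second action is reserved.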
 [項目9]
 画像出力部は、問い合わせ画像の表示中に予約を指定する操作が入力された場合、第2行動が予約中であることを表す第3画像を表示部に出力する項目8に記載の運転支援装置。
[Item 9]
The driving support device according to item 8, wherein, when an operation designating a reservation is input while the inquiry image is displayed, the image output unit outputs to the display unit a third image indicating that the second action is reserved.
 これにより、予約中の行動を運転者に把握させやすくなる。 This makes it easier for the driver to grasp which action is currently reserved.
 [項目10]
 コマンドが自動運転制御部に出力された後、行動情報入力部は、コマンドに応じて更新された第1行動を示す第1行動情報を取得する項目5から9のいずれか一項に記載の運転支援装置。
[Item 10]
The driving support device according to any one of items 5 to 9, wherein, after the command is output to the automatic driving control unit, the behavior information input unit acquires first behavior information indicating the first behavior updated according to the command.
 これにより、車両の最新の行動を運転者へ提示することができる。 This makes it possible to present the latest behavior of the vehicle to the driver.
 [項目11]
 車両の自動運転中に実行する車両の行動を指定する操作指示を受け付ける操作入力部をさらに有し、
 画像生成部は、第1画像および第2画像が車両の表示部に表示されている所定時間内に、第2画像を選択する操作指示を受け付けた場合、選択された第2画像が表す第2行動を即時に実行するか、または、所定時間後に実行するかを車両の運転者に問い合わせる問い合わせ画像を生成し、
 画像出力部は、問い合わせ画像を表示部に出力する項目1に記載の運転支援装置。
[Item 11]
An operation input unit that receives an operation instruction that specifies an action of the vehicle to be executed during automatic driving of the vehicle;
An image generation unit that, when an operation instruction selecting the second image is received within the predetermined time during which the first image and the second image are displayed on the display unit of the vehicle, generates an inquiry image asking the driver of the vehicle whether the second action represented by the selected second image is to be executed immediately or after a predetermined time;
The driving support device according to item 1, wherein the image output unit outputs the inquiry image to the display unit.
 これにより、運転者により選択された行動を、運転者の意思を反映したタイミングに実行することができる。 This makes it possible to execute the action selected by the driver at a timing that reflects the driver's intention.
 [項目12]
 コマンド出力部をさらに有し、
 操作入力部が、第2画像が表す第2行動を即時に実行することが問い合わせ画像において選択されたことを示す操作指示を受け付けた場合、コマンド出力部は、第1画像が表す第1行動に替えて第2画像が表す第2行動を即時に車両に実行させる制御コマンドを自動運転制御部に送信する項目11に記載の運転支援装置。
[Item 12]
A command output unit;
The driving support device according to item 11, wherein, when the operation input unit receives an operation instruction indicating that immediate execution of the second action represented by the second image has been selected on the inquiry image, the command output unit transmits to the automatic driving control unit a control command that causes the vehicle to immediately execute the second action represented by the second image in place of the first action represented by the first image.
 これにより、運転者により選択された行動を車両に即時実行させることができる。 This allows the vehicle to immediately execute the action selected by the driver.
 [項目13]
 コマンド出力部をさらに有し、
 操作入力部が、第2画像が表す第2行動を所定時間後に実行することが問い合わせ画像において選択されたことを示す操作指示を受け付けた場合、コマンド出力部は、第2画像が表す第2行動を所定時間後に車両に実行させる制御コマンドを自動運転制御部に送信する項目11または12に記載の運転支援装置。
[Item 13]
A command output unit;
The driving support device according to item 11 or 12, wherein, when the operation input unit receives an operation instruction indicating that execution of the second action represented by the second image after a predetermined time has been selected on the inquiry image, the command output unit transmits to the automatic driving control unit a control command that causes the vehicle to execute the second action represented by the second image after the predetermined time.
 これにより、運転者により選択された行動を所定時間後に車両に実行させることができる。 This allows the vehicle to execute the action selected by the driver after a predetermined time.
 [項目14]
 車両の自動運転における車両の行動を決定する自動運転制御部と、
 車両の周囲状況および走行状態を検出する検出部から、検出結果を示す検出情報を取得する検出情報入力部と、
 自動運転制御部が車両に実行させる第1行動の後に車両に実行させることが可能な第2行動を検出情報に基づいて決定する候補決定部と、
 第1行動を表す第1画像を生成し、第2行動を表す第2画像を生成する画像生成部と、
 車両の運転者の一定視野内に第1画像と第2画像を表示させるように第1画像と第2画像を車両内の表示部に出力する画像出力部と、
 を有する運転制御装置。
[Item 14]
An automatic driving control unit for determining the behavior of the vehicle in the automatic driving of the vehicle;
A detection information input unit that acquires detection information indicating a detection result from a detection unit that detects a surrounding situation and a running state of the vehicle;
A candidate determining unit that determines, based on the detection information, a second action that can be executed by the vehicle after the first action that the automatic driving control unit causes the vehicle to execute;
An image generation unit for generating a first image representing the first action and generating a second image representing the second action;
An image output unit for outputting the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle;
A driving control device comprising the above.
 これにより、自動運転制御部が決定した車両の現在行動の後に実行させる予定行動の候補を運転者へ提示でき、車両の将来行動を指示するか否かの運転者の判断を支援できる。また、自動運転に対して運転者が抱く不安感を低減できる。 This makes it possible to present to the driver a candidate for a scheduled action to be executed after the current action of the vehicle determined by the automatic driving control unit, and to assist the driver in determining whether to instruct the future action of the vehicle. In addition, it is possible to reduce anxiety that the driver has with respect to automatic driving.
 [項目15]
 本車両の自動運転における本車両の行動を決定する自動運転制御部と、
 本車両の周囲状況および走行状態を検出する検出部から、検出結果を示す検出情報を取得する検出情報入力部と、
 自動運転制御部が本車両に実行させる第1行動の後に本車両に実行させることが可能な第2行動を検出情報に基づいて決定する候補決定部と、
 第1行動を表す第1画像を生成し、第2行動を表す第2画像を生成する画像生成部と、
 本車両の運転者の一定視野内に第1画像と第2画像を表示させるように第1画像と第2画像を本車両内の表示部に出力する画像出力部と、
 を有する車両。
[Item 15]
An automatic driving control unit for determining the behavior of the vehicle in the automatic driving of the vehicle;
A detection information input unit for acquiring detection information indicating a detection result from a detection unit for detecting the surrounding state and the running state of the vehicle;
A candidate determining unit that determines, based on the detection information, a second action that can be executed by the vehicle after the first action that the automatic driving control unit causes the vehicle to execute;
An image generation unit for generating a first image representing the first action and generating a second image representing the second action;
An image output unit for outputting the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle;
A vehicle comprising the above.
 これにより、自動運転制御部が決定した車両の現在行動の後に実行させる予定行動の候補を運転者へ提示でき、車両の将来行動を指示するか否かの運転者の判断を支援できる。また、自動運転に対して運転者が抱く不安感を低減できる。 This makes it possible to present to the driver a candidate for a scheduled action to be executed after the current action of the vehicle determined by the automatic driving control unit, and to assist the driver in determining whether to instruct the future action of the vehicle. In addition, it is possible to reduce anxiety that the driver has with respect to automatic driving.
 [項目16]
 車両の自動運転における車両の行動を決定する自動運転制御部から、車両に実行させる第1行動を示す行動情報を取得するステップと、
 車両の周囲状況および走行状態を検出する検出部から、検出結果を示す検出情報を取得するステップと、
 行動情報が示す第1行動の後に車両に実行させることが可能な第2行動を検出情報に基づいて決定するステップと、
 行動情報が示す第1行動を表す第1画像を生成し、第2行動を表す第2画像を生成するステップと、
 車両の運転者の一定視野内に第1画像と第2画像を表示させるように第1画像と第2画像を車両内の表示部に出力するステップと、
 をコンピュータが実行することを特徴とする運転支援方法。
[Item 16]
A step of acquiring behavior information indicating a first behavior to be executed by the vehicle from an automatic driving control unit that determines the behavior of the vehicle in the automatic driving of the vehicle;
A step of acquiring detection information indicating a detection result from a detection unit that detects a surrounding situation and a running state of the vehicle;
Determining a second action that can be executed by the vehicle after the first action indicated by the action information based on the detection information;
Generating a first image representing the first behavior indicated by the behavior information and generating a second image representing the second behavior;
Outputting the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle;
A driving support method characterized in that the computer executes the above.
 これにより、自動運転制御部が決定した車両の現在行動の後に実行させる予定行動の候補を運転者へ提示でき、車両の将来行動を指示するか否かの運転者の判断を支援できる。また、自動運転に対して運転者が抱く不安感を低減できる。 This makes it possible to present to the driver a candidate for a scheduled action to be executed after the current action of the vehicle determined by the automatic driving control unit, and to assist the driver in determining whether to instruct the future action of the vehicle. In addition, it is possible to reduce anxiety that the driver has with respect to automatic driving.
 [項目17]
 車両の自動運転における車両の行動を決定する自動運転制御部から、車両に実行させる第1行動を示す行動情報を取得する機能と、
 車両の周囲状況および走行状態を検出する検出部から、検出結果を示す検出情報を取得する機能と、
 行動情報が示す第1行動の後に車両に実行させることが可能な第2行動を検出情報に基づいて決定する機能と、
 行動情報が示す第1行動を表す第1画像を生成し、第2行動を表す第2画像を生成する機能と、
 車両の運転者の一定視野内に第1画像と第2画像を表示させるように第1画像と第2画像を車両内の表示部に出力する機能と、
 をコンピュータに実現させるための運転支援プログラム。
[Item 17]
A function of acquiring behavior information indicating a first behavior to be executed by the vehicle from an automatic driving control unit that determines the behavior of the vehicle in automatic driving of the vehicle;
A function of acquiring detection information indicating a detection result from a detection unit that detects a surrounding situation and a running state of the vehicle;
A function for determining a second action that can be executed by the vehicle after the first action indicated by the action information based on the detection information;
A function of generating a first image representing the first action indicated by the action information and generating a second image representing the second action;
A function of outputting the first image and the second image to a display unit in the vehicle so as to display the first image and the second image within a fixed visual field of the driver of the vehicle;
A driving support program for causing a computer to realize the above functions.
 これにより、自動運転制御部が決定した車両の現在行動の後に実行させる予定行動の候補を運転者へ提示でき、車両の将来行動を指示するか否かの運転者の判断を支援できる。また、自動運転に対して運転者が抱く不安感を低減できる。 This makes it possible to present to the driver a candidate for a scheduled action to be executed after the current action of the vehicle determined by the automatic driving control unit, and to assist the driver in determining whether to instruct the future action of the vehicle. In addition, it is possible to reduce anxiety that the driver has with respect to automatic driving.
 本発明によれば、車両の自動運転中に、運転者に対して有用な情報を提示することや運転者に対して違和感の少ない情報を提示することができる。 According to the present invention, it is possible to present useful information to the driver or information with less discomfort to the driver during automatic driving of the vehicle.
 1,1000 車両
 2 ブレーキペダル
 3 アクセルペダル
 4 ウィンカレバー
 5 ステアリングホイール
 6 検出部
 7 車両制御部
 8 記憶部
 9 情報報知装置
 10 タッチパネル
 29a,29b,39a,69a,79a,79c,89a,89c,99b,99c,109a,109e,121,121a,121b,121c 表示領域
 51 操作部
 51a,51c,51e 操作ボタン
 59 文字情報
 59a,59b 表示領域
 61 位置情報取得部
 62 センサ
 63 速度情報取得部
 64 地図情報取得部
 69,79,89,99 文字情報
 91 情報取得部
 92 報知部
 101 表示部
 102 入力部
 109 表示
 291 通信部
 292 キャッシュ
 301 自車両
 302 側前方車両
 303 周囲状況
 304 一次キャッシュ
 305 二次キャッシュ
 1002 報知装置
 1002b センターディスプレイ
 1004 入力装置
 1004a 操作部
 1004b 操作部
 1006 スピーカ
 1008 無線装置
 1010 運転操作部
 1011 ステアリング
 1012 ブレーキペダル
 1013 アクセルペダル
 1014 ウィンカスイッチ
 1020 検出部
 1021 位置情報取得部
 1022 センサ
 1023 速度情報取得部
 1024 地図情報取得部
 1030 自動運転制御装置
 1031,1041 制御部
 1032,1042 記憶部
 1033,1043 I/O部(入出力部)
 1040 運転支援装置
 1050 操作入力部
 1051 画像出力部
 1052 検出情報入力部
 1054 行動情報入力部
 1055 コマンド出力部
 1060 画像生成部
 1061 候補決定部
 1062 判定部
 1063 指示部
 1070 統計情報蓄積部
 1071 判定基準保持部
 1072 統計情報蓄積部
 1073 統計情報蓄積部
 1080 第1決定部
 1081 状況適応型決定部
 1082 第2決定部
 1083 個人適応型決定部
 1084 優先度決定部
 1100 メイン領域
 1102 サブ領域
 1103 自動運転情報画面
 1104 現在行動画像
 1106,1106a,1106b 予定行動画像
 1108,1108a,1108b 時間画像
 1109 時間インジケータ
 1110 現在行動候補画像
 1112,1112a,1112b 予定行動候補画像
 1114 時間表示領域
 1116 選択枠
 1118 実行ボタン
 1120 予約ボタン
 1122 キャンセル時間設定画像
 1124 予約行動画像
 1126 制限時間画像
 1128 行動候補表示領域
 1130 優先度表示領域
 1132,1132a,1132b 優先度画像
DESCRIPTION OF SYMBOLS 1,1000 Vehicle 2 Brake pedal 3 Accelerator pedal 4 Winker lever 5 Steering wheel 6 Detection unit 7 Vehicle control unit 8 Storage unit 9 Information notification device 10 Touch panel 29a, 29b, 39a, 69a, 79a, 79c, 89a, 89c, 99b, 99c, 109a, 109e, 121, 121a, 121b, 121c Display area 51 Operation part 51a, 51c, 51e Operation button 59 Character information 59a, 59b Display area 61 Position information acquisition part 62 Sensor 63 Speed information acquisition part 64 Map information acquisition part 69, 79, 89, 99 Character information 91 Information acquisition unit 92 Notification unit 101 Display unit 102 Input unit 109 Display 291 Communication unit 292 Cache 301 Own vehicle 302 Front vehicle 303 Surrounding condition 304 Primary cache 305 Secondary cache 1002 Notification Device 1002b Center Display 1004 Input Device 1004a Operation Unit 1004b Operation Unit 1006 Speaker 1008 Wireless Device 1010 Driving Operation Unit 1011 Steering 1012 Brake Pedal 1013 Accelerator Pedal 1014 Winker Switch 1020 Detection Unit 1021 Position Information Acquisition Unit 1022 Speed Information Acquisition Unit 1022 Speed Information Acquisition Unit 1022 1024 Map information acquisition unit 1030 Automatic operation control device 1031, 1041 Control unit 1032, 1042 Storage unit 1033, 1043 I / O unit (input / output unit)
DESCRIPTION OF SYMBOLS 1040 Driving assistance apparatus 1050 Operation input part 1051 Image output part 1052 Detection information input part 1054 Action information input part 1055 Command output part 1060 Image generation part 1061 Candidate determination part 1062 Determination part 1063 Instruction part 1070 Statistical information storage part 1071 Determination reference holding part 1072 Statistical information storage unit 1073 Statistical information storage unit 1080 First determination unit 1081 Situation adaptive determination unit 1082 Second determination unit 1083 Personal adaptive determination unit 1084 Priority determination unit 1100 Main area 1102 Sub area 1103 Automatic driving information screen 1104 Current Action image 1106, 1106a, 1106b Scheduled action image 1108, 1108a, 1108b Time image 1109 Time indicator 1110 Current action candidate image 1112, 1112a, 1112b Scheduled action sign Complementary image 1114 Time display area 1116 Selection frame 1118 Execution button 1120 Reservation button 1122 Cancel time setting image 1124 Reservation action image 1126 Time limit image 1128 Action candidate display area 1130 Priority display area 1132, 1132a, 1132b Priority image

Claims (23)

  1.  車両の自動運転における前記車両の行動を決定する自動運転制御部から、前記車両に実行させる第1行動を示す行動情報を取得する行動情報入力部と、
     前記車両の周囲状況および走行状態を検出する検出部から、検出結果を示す検出情報を取得する検出情報入力部と、
     前記行動情報が示す前記第1行動の後に前記車両に実行させることが可能な少なくとも1つの第2行動を前記検出情報に基づいて決定する候補決定部と、
     前記行動情報が示す前記第1行動を表す第1画像を生成し、前記少なくとも1つの第2行動を表す第2画像を生成する画像生成部と、
     前記第1画像と前記第2画像を前記車両内の表示部に出力する画像出力部と、
     を備える運転支援装置。
    A behavior information input unit that acquires behavior information indicating a first behavior to be executed by the vehicle from an automatic driving control unit that determines the behavior of the vehicle in automatic driving of the vehicle;
    A detection information input unit for acquiring detection information indicating a detection result from a detection unit for detecting the surrounding situation and the running state of the vehicle;
    A candidate determination unit that determines, based on the detection information, at least one second action that can be executed by the vehicle after the first action indicated by the action information;
    An image generating unit that generates a first image representing the first behavior indicated by the behavior information and generates a second image representing the at least one second behavior;
    An image output unit for outputting the first image and the second image to a display unit in the vehicle;
    A driving support apparatus comprising:
  2.  前記車両の周囲状況および走行状態と前記車両の行動との関連性を示す統計情報を蓄積する蓄積部をさらに備え、
     前記候補決定部は、前記統計情報と前記検出情報に基づいて前記少なくとも1つの第2行動を決定する請求項1に記載の運転支援装置。
    An accumulator that accumulates statistical information indicating the relevance between the vehicle's surroundings and running state and the vehicle's behavior;
    The driving support device according to claim 1, wherein the candidate determination unit determines the at least one second action based on the statistical information and the detection information.
  3.  通信網を介して、前記車両の周囲状況および走行状態と前記車両の行動との関連性を示す統計情報を蓄積する前記車両の外部の蓄積部と通信可能な通信インタフェースをさらに備え、
     前記候補決定部は、前記通信インタフェースを介して前記蓄積部の統計情報にアクセスし、前記統計情報と前記検出情報に基づいて前記少なくとも1つの第2行動を決定する請求項1に記載の運転支援装置。
    A communication interface capable of communicating, via a communication network, with a storage unit external to the vehicle that accumulates statistical information indicating the relevance between the vehicle's surrounding situation and running state and the vehicle's behavior;
    The driving support device according to claim 1, wherein the candidate determination unit accesses the statistical information in the storage unit via the communication interface and determines the at least one second action based on the statistical information and the detection information.
  4.  The driving support device according to claim 1, wherein the image generation unit generates an additional image representing the time remaining until the at least one second behavior is executed,
    and the image output unit outputs the additional image to the display unit in the vehicle so that the second image is displayed with the additional image added.
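Claim 4's additional image conveys the time remaining until the second behavior executes. A tiny sketch of how such a countdown label might be derived; the function name and label format are assumptions, not from the patent:

```python
# Illustrative countdown label for the additional image of claim 4.

def remaining_time_label(execute_at_s, now_s):
    """Text for an additional image showing the time until the second
    behavior is executed; clamped at zero once the time has passed."""
    remaining = max(0.0, execute_at_s - now_s)
    return f"{remaining:.0f} s"

assert remaining_time_label(12.0, 8.0) == "4 s"
```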
  5.  The driving support device according to any one of claims 1 to 4, further comprising a command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the at least one second behavior after the first image and the second image have been displayed on the display unit of the vehicle for a predetermined time.
  6.  The driving support device according to any one of claims 1 to 4, further comprising:
    an operation signal input unit that receives an operation instruction designating a behavior of the vehicle; and
    a command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the at least one second behavior when no operation instruction is received within the predetermined time during which the first image and the second image are displayed on the display unit of the vehicle.
  7.  The driving support device according to any one of claims 1 to 4, further comprising:
    an operation signal input unit that receives an operation instruction designating a behavior of the vehicle; and
    a command output unit that outputs, to the automatic driving control unit, a command for causing the vehicle to execute the at least one second behavior when an operation instruction selecting the second image is received within the predetermined time during which the first image and the second image are displayed on the display unit of the vehicle.
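Claims 5 to 7 describe a simple arbitration rule: the second behavior is commanded after the images have been shown for a predetermined time (claim 5), when no instruction arrives within that time (claim 6), or when the driver explicitly selects the second image (claim 7). A hedged sketch of that decision logic follows; the event names and return values are illustrative, the claims only fix the conditions:

```python
# Illustrative decision logic for claims 5-7. "select_second",
# "select_first", and the returned strings are invented names.

def decide_command(elapsed_s, display_time_s, driver_input):
    """Decide which behavior to command once the first and second
    images are on the display.

    driver_input: None (no operation), "select_second", or "select_first".
    Returns the behavior to send to the automatic driving control unit,
    or None while still waiting inside the display window.
    """
    if driver_input == "select_second":
        # Claim 7: driver selected the second image within the window.
        return "second_behavior"
    if driver_input is None and elapsed_s >= display_time_s:
        # Claims 5/6: the predetermined display time elapsed with no
        # operation, so the candidate second behavior is commanded.
        return "second_behavior"
    if driver_input == "select_first":
        return "first_behavior"
    return None

assert decide_command(3.0, 5.0, "select_second") == "second_behavior"
assert decide_command(5.0, 5.0, None) == "second_behavior"
assert decide_command(2.0, 5.0, None) is None
```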
  8.  The driving support device according to any one of claims 1 to 4, further comprising a command output unit, wherein:
    when an operation designating the at least one second behavior is input, the image output unit outputs, to the display unit, an inquiry image for designating either execution or reservation of the at least one second behavior; and
    the command output unit outputs, to the automatic driving control unit, a command for causing the vehicle to execute the at least one second behavior at a first timing when an operation designating execution is input while the inquiry image is displayed, and outputs the command to the automatic driving control unit at a second timing later than the first timing when an operation designating reservation is input while the inquiry image is displayed.
  9.  The driving support device according to claim 8, wherein, when an operation designating reservation is input while the inquiry image is displayed, the image output unit outputs, to the display unit, a third image indicating that the at least one second behavior is reserved.
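Claims 8 and 9 distinguish two command timings: immediate execution (first timing) versus a reservation sent later (second timing), with a status image shown while the reservation is pending. The sketch below is an assumption-laden illustration only; the delay value, function name, and `"reserved_indicator"` token are not from the patent:

```python
# Sketch of the execute-vs-reserve timing in claims 8-9.

def schedule_command(choice, now_s, reservation_delay_s=10.0):
    """Map the driver's choice on the inquiry image to the timing at
    which the command is sent to the automatic driving control unit.

    choice: "execute" sends at the first timing (immediately);
            "reserve" sends at a later, second timing.
    Returns (send_time_s, status_image), where status_image stands in
    for the third image of claim 9 while a reservation is pending.
    """
    if choice == "execute":
        return now_s, None                                        # first timing
    if choice == "reserve":
        return now_s + reservation_delay_s, "reserved_indicator"  # second timing
    raise ValueError("inquiry image offers only execute or reserve")

send_at, status = schedule_command("reserve", now_s=100.0)
assert send_at == 110.0 and status == "reserved_indicator"
```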
  10.  The driving support device according to any one of claims 5 to 9, wherein, after the command is output to the automatic driving control unit, the behavior information input unit acquires first behavior information indicating the first behavior updated in accordance with the command.
  11.  The driving support device according to claim 1, further comprising an operation input unit that receives an operation instruction designating a behavior of the vehicle to be executed during automatic driving of the vehicle,
    wherein, when an operation instruction selecting the second image is received within the predetermined time during which the first image and the second image are displayed on the display unit of the vehicle, the image generation unit generates an inquiry image asking the driver of the vehicle whether the at least one second behavior represented by the selected second image should be executed immediately or after a predetermined time,
    and the image output unit outputs the inquiry image to the display unit.
  12.  The driving support device according to claim 11, further comprising a command output unit,
    wherein, when the operation input unit receives an operation instruction indicating that immediate execution of the at least one second behavior represented by the second image has been selected in the inquiry image, the command output unit transmits, to the automatic driving control unit, a control command for causing the vehicle to immediately execute the at least one second behavior represented by the second image in place of the first behavior represented by the first image.
  13.  The driving support device according to claim 11 or 12, further comprising a command output unit,
    wherein, when the operation input unit receives an operation instruction indicating that execution of the at least one second behavior represented by the second image after a predetermined time has been selected in the inquiry image, the command output unit transmits, to the automatic driving control unit, a control command for causing the vehicle to execute the second behavior represented by the second image after the predetermined time.
  14.  The driving support device according to any one of claims 1 to 13, wherein the at least one second behavior is a plurality of second behaviors, and the candidate determination unit determines the plurality of second behaviors and a priority order for each of the plurality of second behaviors,
    the image generation unit generates a plurality of third images corresponding to the plurality of second behaviors,
    and the image output unit displays the first image and displays the plurality of third images in a manner according to their priorities.
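Claim 14 requires the candidate images to be displayed "in a manner according to priority" without fixing that manner. One possible reading, sketched below purely as an assumption, is that higher-priority candidates are placed more prominently; here prominence is reduced to list position, and the behavior names and `layout_candidate_images` function are invented:

```python
# Illustrative priority-ordered layout of the third images (claim 14).

def layout_candidate_images(second_behaviors_with_priority):
    """Order the third images so that higher-priority candidates
    (smaller priority number) appear earlier, i.e. more prominently."""
    ranked = sorted(second_behaviors_with_priority, key=lambda bp: bp[1])
    return [f"image_{behavior}" for behavior, _ in ranked]

images = layout_candidate_images(
    [("decelerate", 2), ("change_lane_right", 1), ("keep_lane", 3)]
)
assert images == ["image_change_lane_right", "image_decelerate", "image_keep_lane"]
```

Other "manners according to priority" (size, color, opacity) would satisfy the claim equally; ordering is just the simplest to show.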
  15.  A driving control device comprising:
    an automatic driving control unit that determines a behavior of a vehicle in automatic driving of the vehicle;
    a detection information input unit that acquires, from a detection unit that detects the surrounding conditions and running state of the vehicle, detection information indicating a detection result;
    a candidate determination unit that determines, based on the detection information, a second behavior that the vehicle can execute after the first behavior that the automatic driving control unit causes the vehicle to execute;
    an image generation unit that generates a first image representing the first behavior and a second image representing the second behavior; and
    an image output unit that outputs the first image and the second image to a display unit in the vehicle.
  16.  A vehicle comprising:
    an automatic driving control unit that determines a behavior of the host vehicle in automatic driving of the host vehicle;
    a detection information input unit that acquires, from a detection unit that detects the surrounding conditions and running state of the host vehicle, detection information indicating a detection result;
    a candidate determination unit that determines, based on the detection information, a second behavior that the host vehicle can execute after the first behavior that the automatic driving control unit causes the host vehicle to execute;
    an image generation unit that generates a first image representing the first behavior and a second image representing the second behavior; and
    an image output unit that outputs the first image and the second image to a display unit in the host vehicle.
  17.  A driving support method in which a computer executes the steps of:
    acquiring, from an automatic driving control unit that determines a behavior of a vehicle in automatic driving of the vehicle, behavior information indicating a first behavior to be executed by the vehicle;
    acquiring, from a detection unit that detects the surrounding conditions and running state of the vehicle, detection information indicating a detection result;
    determining, based on the detection information, a second behavior that the vehicle can execute after the first behavior indicated by the behavior information;
    generating a first image representing the first behavior indicated by the behavior information and a second image representing the second behavior; and
    outputting the first image and the second image to a display unit in the vehicle.
  18.  A driving support program for causing a computer to implement the functions of:
    acquiring, from an automatic driving control unit that determines a behavior of a vehicle in automatic driving of the vehicle, behavior information indicating a first behavior to be executed by the vehicle;
    acquiring, from a detection unit that detects the surrounding conditions and running state of the vehicle, detection information indicating a detection result;
    determining, based on the detection information, a second behavior that the vehicle can execute after the first behavior indicated by the behavior information;
    generating a first image representing the first behavior indicated by the behavior information and a second image representing the second behavior; and
    outputting the first image and the second image to a display unit in the vehicle.
  19.  The driving support device according to any one of claims 1 to 14, wherein the image output unit displays the first image and the second image within a fixed visual field of the driver of the vehicle.
  20.  The driving control device according to claim 15, wherein the image output unit displays the first image and the second image within a fixed visual field of the driver of the vehicle.
  21.  The vehicle according to claim 16, wherein the image output unit displays the first image and the second image within a fixed visual field of the driver of the vehicle.
  22.  The driving support method according to claim 17, wherein the step of outputting to the display unit displays the first image and the second image within a fixed visual field of the driver of the vehicle.
  23.  The driving support program according to claim 18, wherein the function of outputting to the display unit displays the first image and the second image within a fixed visual field of the driver of the vehicle.
PCT/JP2016/002049 2015-04-21 2016-04-15 Driving assistance method and driving assistance device, driving control device, vehicle, and driving assistance program using such method WO2016170764A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201680034946.7A CN107683237B (en) 2015-04-21 2016-04-15 Driving support method, and driving support device, driving control device, vehicle, driving support program, and recording medium using the driving support method
EP16782788.0A EP3269610B1 (en) 2015-04-21 2016-04-15 Driving assistance method and driving assistance device, driving control device and driving assistance program using such method
US15/565,887 US10252726B2 (en) 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, driving control device, vehicle, driving assistance program, and recording medium using said method
US16/255,338 US11072343B2 (en) 2015-04-21 2019-01-23 Driving assistance method, and driving assistance device, driving control device, vehicle, driving assistance program, and recording medium using said method

Applications Claiming Priority (20)

Application Number Priority Date Filing Date Title
JP2015087069 2015-04-21
JP2015-087069 2015-04-21
JP2015-099474 2015-05-14
JP2015099474 2015-05-14
JP2015119139 2015-06-12
JP2015-119139 2015-06-12
JP2015-252672 2015-12-24
JP2015252668A JP6685008B2 (en) 2015-04-21 2015-12-24 Driving support method and driving support device, driving control device, vehicle, driving support program using the same
JP2015-252673 2015-12-24
JP2015-252671 2015-12-24
JP2015252669A JP6598019B2 (en) 2015-04-21 2015-12-24 Driving support method, driving support device, driving control device, vehicle, and driving support program using the same
JP2015252673A JP6558734B2 (en) 2015-04-21 2015-12-24 Driving support method, driving support device, driving control device, vehicle, and driving support program using the same
JP2015-252670 2015-12-24
JP2015-252668 2015-12-24
JP2015252672A JP6558733B2 (en) 2015-04-21 2015-12-24 Driving support method, driving support device, driving control device, vehicle, and driving support program using the same
JP2015252674A JP6558735B2 (en) 2015-04-21 2015-12-24 Driving support method, driving support device, driving control device, vehicle, and driving support program using the same
JP2015252671A JP6558732B2 (en) 2015-04-21 2015-12-24 Driving support method, driving support device, driving control device, vehicle, and driving support program using the same
JP2015-252674 2015-12-24
JP2015-252669 2015-12-24
JP2015252670A JP6558731B2 (en) 2015-04-21 2015-12-24 Driving support method, driving support device, driving control device, vehicle, and driving support program using the same

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/565,887 A-371-Of-International US10252726B2 (en) 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, driving control device, vehicle, driving assistance program, and recording medium using said method
US16/255,338 Continuation US11072343B2 (en) 2015-04-21 2019-01-23 Driving assistance method, and driving assistance device, driving control device, vehicle, driving assistance program, and recording medium using said method

Publications (1)

Publication Number Publication Date
WO2016170764A1

Family

ID=57143482

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/002049 WO2016170764A1 (en) 2015-04-21 2016-04-15 Driving assistance method and driving assistance device, driving control device, vehicle, and driving assistance program using such method

Country Status (1)

Country Link
WO (1) WO2016170764A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004034917A (en) * 2002-07-08 2004-02-05 Nissan Motor Co Ltd Display device for following control object
JP2005067483A (en) * 2003-08-26 2005-03-17 Fuji Heavy Ind Ltd Vehicular running control device
JP2011096105A (en) * 2009-10-30 2011-05-12 Toyota Motor Corp Driving support device
JP2014127183A (en) * 2012-12-27 2014-07-07 Nissan Motor Co Ltd Parking support apparatus and parking support method

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018101400A (en) * 2016-12-20 2018-06-28 Baidu USA LLC Method and system for recognizing individual driving preference of autonomous vehicle
CN108973682B (en) * 2017-06-02 2021-06-01 本田技研工业株式会社 Vehicle control system, vehicle control method, and storage medium
CN108973682A (en) * 2017-06-02 2018-12-11 本田技研工业株式会社 Vehicle control system, control method for vehicle and storage medium
CN108995645A (en) * 2017-06-06 2018-12-14 丰田自动车株式会社 Change auxiliary device in lane
CN109389766A (en) * 2017-08-10 2019-02-26 通用汽车环球科技运作有限责任公司 User's identifying system and method for autonomous vehicle
CN109389766B (en) * 2017-08-10 2021-07-27 通用汽车环球科技运作有限责任公司 User identification system and method for autonomous vehicle
US11066082B2 (en) * 2017-08-22 2021-07-20 Toyota Jidosha Kabushiki Kaisha Cancel point management system, cancel point notification system, cancel point guide system, and non-transitory computer-readable storage medium
CN109747655A (en) * 2017-11-07 2019-05-14 北京京东尚科信息技术有限公司 Steering instructions generation method and device for automatic driving vehicle
CN109747655B (en) * 2017-11-07 2021-10-15 北京京东乾石科技有限公司 Driving instruction generation method and device for automatic driving vehicle
CN110789342A (en) * 2018-08-02 2020-02-14 丰田自动车株式会社 Display device for vehicle
JP2020021350A (en) * 2018-08-02 2020-02-06 トヨタ自動車株式会社 Vehicular display device
JP7095468B2 (en) 2018-08-02 2022-07-05 トヨタ自動車株式会社 Vehicle display control device, vehicle display control method, and vehicle display control program
CN110789342B (en) * 2018-08-02 2022-08-19 丰田自动车株式会社 Display device for vehicle
JP2022128471A (en) * 2018-08-02 2022-09-01 トヨタ自動車株式会社 Display control device for vehicle, display control method for vehicle, and display control program for vehicle
JP7363974B2 (en) 2018-08-02 2023-10-18 トヨタ自動車株式会社 Vehicle display control device, vehicle display control method, and vehicle display control program
CN109808600A (en) * 2019-01-07 2019-05-28 北京百度网讯科技有限公司 The method for visualizing and device of the perception information of automatic driving vehicle
CN115249413A (en) * 2021-04-27 2022-10-28 丰田自动车株式会社 Information processing apparatus and information processing method
CN115249413B (en) * 2021-04-27 2023-11-10 丰田自动车株式会社 Information processing apparatus and information processing method
CN115273543A (en) * 2022-07-06 2022-11-01 上海工物高技术产业发展有限公司 Road anti-collision multistage early warning system and use method thereof

Similar Documents

Publication Publication Date Title
JP6883766B2 (en) Driving support method and driving support device, driving control device, vehicle, driving support program using it
JP6731619B2 (en) Information processing system, information processing method, and program
JP6807559B2 (en) Information processing systems, information processing methods, and programs
WO2016170764A1 (en) Driving assistance method and driving assistance device, driving control device, vehicle, and driving assistance program using such method
WO2016170763A1 (en) Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program
WO2016170773A1 (en) Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16782788

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15565887

Country of ref document: US

REEP Request for entry into the european phase

Ref document number: 2016782788

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE