WO2016170763A1 - Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program - Google Patents


Info

Publication number
WO2016170763A1
Authority
WO
WIPO (PCT)
Prior art keywords
driver, vehicle, unit, travel history, driving
Prior art date
Application number
PCT/JP2016/002048
Other languages
English (en)
Japanese (ja)
Inventor
勝長 辻
森 俊也
江村 恒一
渉 仲井
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date
Filing date
Publication date
Priority claimed from JP2015252667A external-priority patent/JP6761967B2/ja
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Priority to EP16782787.2A (EP3269609B1)
Priority to US15/564,702 (US10919540B2)
Priority to CN201680021986.8A (CN107531252B)
Priority to EP20177508.7A (EP3738854A1)
Publication of WO2016170763A1

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
        • B60 — VEHICLES IN GENERAL
            • B60K — Arrangement or mounting of propulsion units or of transmissions in vehicles; arrangement or mounting of plural diverse prime-movers in vehicles; auxiliary drives for vehicles; instrumentation or dashboards for vehicles; arrangements in connection with cooling, air intake, gas exhaust or fuel supply of propulsion units in vehicles
                • B60K 31/00 — Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
                • B60K 35/00 — Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
            • B60R — Vehicles, vehicle fittings, or vehicle parts, not otherwise provided for
                • B60R 16/00 — Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
                    • B60R 16/02 — Electric constitutive elements
            • B60W — Conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit
                • B60W 30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
                • B60W 50/00 — Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
                    • B60W 50/08 — Interaction between the driver and the control system
                        • B60W 50/10 — Interpretation of driver requests or demands
                        • B60W 50/14 — Means for informing the driver, warning the driver or prompting a driver intervention
    • G — PHYSICS
        • G08 — SIGNALLING
            • G08G — Traffic control systems
                • G08G 1/00 — Traffic control systems for road vehicles
                    • G08G 1/16 — Anti-collision systems

Definitions

  • The present invention relates to a vehicle, a driving support method used in the vehicle, a driving support device using the method, an automatic driving control device, and a driving support program.
  • Patent Document 1 discloses a travel control device that allows the driver to visually recognize the operating state of automatic steering control or automatic acceleration/deceleration control when the host vehicle performs such control.
  • The present invention provides a driving support method capable of solving at least one of the above problems during fully automatic or partially automatic driving, as well as a driving support device using the method, an automatic driving control device, a vehicle, and a driving support program.
  • According to one aspect of the present invention, a driving support device includes a travel history generation unit that generates, for each driver, a travel history in which environmental parameters indicating travel environments in which the vehicle has traveled in the past are associated with the actions the driver selected in response to those environmental parameters.
  • The driving support device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to that of the current driver.
  • The driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the generated driver model and an environmental parameter indicating the current driving environment of the vehicle.
  • Another aspect of the present invention is an automatic driving control device.
  • This device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver in response to that parameter are associated with each other.
  • The automatic driving control device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to that of the current driver.
  • The automatic driving control device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, a determination unit that determines the next action based on the generated driver model and an environmental parameter indicating the current driving environment of the vehicle, and an automatic driving control unit that controls the automatic driving of the vehicle based on the next action determined by the determination unit.
  • Still another aspect of the present invention is a vehicle.
  • This vehicle includes a driving support device, and the driving support device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver in response to that parameter are associated with each other.
  • The driving support device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to that of the current driver.
  • The driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the generated driver model and an environmental parameter indicating the current driving environment of the vehicle.
  • Still another aspect of the present invention is a driving support method.
  • This method includes a step of generating, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver in response to that parameter are associated with each other, and a step of acquiring, from among the generated travel histories, a travel history similar to that of the current driver.
  • The driving support method further includes a step of generating a driver model based on the acquired travel history, and a step of determining the next action based on the generated driver model and an environmental parameter indicating the current driving environment of the vehicle.
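The method steps above (generate per-driver travel histories, acquire a similar history, build a driver model, and determine the next action from the current environmental parameters) can be sketched as follows. The data, the distance-based similarity measure, and the nearest-environment selection rule are illustrative assumptions only, not the patent's concrete algorithm.

```python
from math import dist

# Per-driver travel histories: (environmental parameter vector, chosen action).
# Parameter vectors here are (own speed [km/h], headway to preceding vehicle [m]).
histories = {
    "driver_a": [((80.0, 30.0), "decelerate"), ((60.0, 120.0), "accelerate"),
                 ((70.0, 25.0), "change_lane")],
    "driver_b": [((82.0, 28.0), "decelerate"), ((62.0, 115.0), "accelerate"),
                 ((71.0, 24.0), "change_lane")],
    "driver_c": [((40.0, 10.0), "follow"), ((45.0, 15.0), "follow"),
                 ((50.0, 200.0), "accelerate")],
}

def history_distance(h1, h2):
    """Mean distance between paired environment vectors: a stand-in for
    whatever similarity measure the acquisition unit actually uses."""
    return sum(dist(e1, e2) for (e1, _), (e2, _) in zip(h1, h2)) / min(len(h1), len(h2))

def acquire_similar(current_driver):
    """Acquisition unit: pick the other driver whose history is most similar."""
    mine = histories[current_driver]
    others = {d: h for d, h in histories.items() if d != current_driver}
    return min(others.values(), key=lambda h: history_distance(mine, h))

def build_driver_model(similar_history):
    """Driver model generation unit: here simply the (environment, action) pairs."""
    return list(similar_history)

def determine_next_action(model, current_env):
    """Determination unit: action of the model entry nearest the current environment."""
    _, action = min(model, key=lambda pair: dist(pair[0], current_env))
    return action

model = build_driver_model(acquire_similar("driver_a"))
print(determine_next_action(model, (75.0, 27.0)))  # environment close to (71, 24)
```

In this toy data, driver_b's history is closest to driver_a's, so driver_b's choices serve as the model for driver_a's next action.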
  • According to the present invention, information can be appropriately transmitted from the vehicle to the occupant, enabling comfortable automatic driving in which the operations of the vehicle and the driver are less likely to conflict, whether in fully automatic or partially automatic driving.
  • FIG. 1 is a block diagram showing the main configuration of a vehicle including an information notification device according to Embodiment 1 of the present invention. FIG. 2 is a diagram explaining a first example of the travel environment, together with the corresponding display of the notification unit 92 and operation of the operation unit 51.
  • FIG. 10 is a diagram illustrating display on a touch panel in Embodiment 2.
  • FIG. 10 is a diagram illustrating display on a touch panel in Embodiment 2.
  • FIG. 32 is a block diagram illustrating the configuration of a vehicle according to Embodiment 5. Further figures schematically show the interior of the vehicle of FIG. 32, the detailed structure of the detection unit and detection information input unit of FIG. 32, and the detailed structure of the control unit of FIG. 32.
  • FIG. 36 is a diagram illustrating a data structure of a travel history generated by a travel history generation unit in FIG. 35.
  • Another diagram shows a further data structure of the travel history generated by the travel history generation unit in FIG. 35, together with an outline of the associated processing.
  • FIG. 33 is a sequence diagram illustrating a procedure for generating a driver model by the driving support apparatus of FIG. 32. A further flowchart shows the procedure for updating the travel history performed by the travel history generation unit.
  • FIG. 1 is a block diagram showing a main configuration of a vehicle 1 including an information notification device according to Embodiment 1 of the present invention.
  • The vehicle 1 is a vehicle that can automatically perform all or part of its driving control without requiring operation by the driver.
  • The vehicle 1 includes a brake pedal 2, an accelerator pedal 3, a winker lever (turn signal lever) 4, a steering wheel 5, a detection unit 6, a vehicle control unit 7, a storage unit 8, and an information notification device 9.
  • The brake pedal 2 receives a brake operation by the driver and decelerates the vehicle 1.
  • The brake pedal 2 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of deceleration of the vehicle 1.
  • The accelerator pedal 3 receives an accelerator operation by the driver and accelerates the vehicle 1. The accelerator pedal 3 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of acceleration of the vehicle 1.
  • The winker lever 4 receives a lever operation by the driver and turns on a direction indicator (not shown) of the vehicle 1.
  • The winker lever 4 may also receive a control result from the vehicle control unit 7, change to a state corresponding to the indicated direction of the vehicle 1, and turn on the direction indicator (not shown) of the vehicle 1.
  • The steering wheel 5 receives a steering operation by the driver and changes the traveling direction of the vehicle 1. The steering wheel 5 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the change in the traveling direction of the vehicle 1.
  • the steering wheel 5 has an operation unit 51.
  • the operation unit 51 is provided on the front surface (surface facing the driver) of the steering wheel 5 and receives an input operation from the driver.
  • the operation unit 51 is a device such as a button, a touch panel, or a grip sensor, for example.
  • the operation unit 51 outputs information on the input operation received from the driver to the vehicle control unit 7.
  • the detection unit 6 detects the traveling state of the vehicle 1 and the situation around the vehicle 1. Then, the detection unit 6 outputs information on the detected traveling state and surrounding conditions to the vehicle control unit 7.
  • the detection unit 6 includes a position information acquisition unit 61, a sensor 62, a speed information acquisition unit 63, and a map information acquisition unit 64.
  • the position information acquisition unit 61 acquires the position information of the vehicle 1 as travel state information by GPS (Global Positioning System) positioning or the like.
  • The sensor 62 detects the situation around the vehicle 1, such as obstacles in its vicinity. For example, from the position and lane position information of another vehicle existing around the vehicle 1, the type of the other vehicle, whether it is a preceding vehicle, the speed of the other vehicle, and the speed of the own vehicle, the sensor 62 determines the predicted time to collision (TTC: Time To Collision).
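The quantity the sensor 62 is described as computing can be illustrated with a minimal sketch. This is the standard TTC definition (gap divided by closing speed), not necessarily the patent's exact formula; the function name and units are assumptions.

```python
def time_to_collision(own_speed_mps, other_speed_mps, gap_m):
    """Predicted time to collision (TTC) with a preceding vehicle.

    Closing speed is the own vehicle's speed minus the preceding vehicle's
    speed; if the gap is not closing, no collision is predicted.
    """
    closing = own_speed_mps - other_speed_mps
    if closing <= 0:
        return float("inf")  # gap not closing: no predicted collision
    return gap_m / closing

# Own vehicle at 30 m/s, preceding vehicle at 25 m/s, 50 m ahead:
print(time_to_collision(30.0, 25.0, 50.0))  # -> 10.0 seconds
```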
  • the speed information acquisition unit 63 acquires information such as the speed of the vehicle 1 or the traveling direction from a speed sensor or the like (not shown) as traveling state information.
  • The map information acquisition unit 64 acquires, as surrounding situation information, map information around the vehicle 1, such as the road on which the vehicle 1 is traveling, merging points with other vehicles on that road, the lane currently being traveled, and the positions of intersections.
  • the sensor 62 is constituted by a millimeter wave radar, a laser radar, a camera, or a combination thereof.
  • The storage unit 8 is a storage device such as a ROM (Read Only Memory), RAM (Random Access Memory), hard disk drive, or SSD (Solid State Drive), and stores correspondences between the current travel environment and the candidates for the behavior that the vehicle can take next (after a first predetermined time has elapsed).
  • the current traveling environment is an environment determined by the position of the vehicle 1, the road on which the vehicle 1 is traveling, the position and speed of other vehicles existing around the vehicle 1, and the like.
  • Depending on the position or speed of another vehicle, it may also be judged that the other vehicle could cut in while the own vehicle is accelerating or decelerating, and that a collision could occur one second later. This makes it possible to predict the behavior of other vehicles and to grasp the travel environment in more detail and more accurately.
  • the candidate for behavior is a candidate for behavior that the vehicle 1 can take next (after the first predetermined time) with respect to the current traveling environment.
  • For example, for a travel environment in which there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of that lane, and a lane change to the right of the lane in which the vehicle 1 travels is possible, the storage unit 8 stores in advance three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right.
  • Similarly, for a travel environment in which a vehicle traveling ahead in the same lane as the vehicle 1 (hereinafter, the "preceding vehicle") travels more slowly than the vehicle 1 and a lane change to an adjacent lane is possible, the storage unit 8 stores in advance three behavior candidates: overtaking the preceding vehicle, changing lanes to the adjacent lane, and decelerating to follow the preceding vehicle.
  • The storage unit 8 may also store a priority for each behavior candidate. For example, it may store the number of times each behavior was actually adopted in the same travel environment in the past, and assign a higher priority to behaviors adopted more frequently.
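The adoption-count priority scheme just described can be sketched as follows; the environment label, behavior names, and counts are illustrative assumptions standing in for the storage unit 8's actual contents.

```python
from collections import Counter

# Hypothetical priority store keyed by a travel-environment label: how often
# each behavior candidate was actually adopted in that environment.
adoption_counts = {
    "merge_ahead_right_lane_free": Counter(
        {"change_lane_right": 12, "decelerate": 5, "accelerate": 2}),
}

def behaviors_by_priority(environment):
    """Behavior candidates ordered from most to least frequently adopted."""
    counts = adoption_counts[environment]
    return [behavior for behavior, _ in counts.most_common()]

print(behaviors_by_priority("merge_ahead_right_lane_free"))
# -> ['change_lane_right', 'decelerate', 'accelerate']
```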
  • the vehicle control unit 7 can be realized as a part of an LSI circuit or an electronic control unit (ECU) that controls the vehicle, for example.
  • The vehicle control unit 7 controls the vehicle based on the travel state information and the surrounding situation information acquired from the detection unit 6, and controls the brake pedal 2, the accelerator pedal 3, the winker lever 4, and the information notification device 9 according to the vehicle control result.
  • The objects controlled by the vehicle control unit 7 are not limited to these.
  • the vehicle control unit 7 determines the current driving environment based on information on the driving state and surrounding conditions. For this determination, various conventionally proposed methods can be used.
  • For example, the vehicle control unit 7 determines, based on the information on the travel state and the surrounding situation, that the current travel environment is one in which "there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of that lane, and a lane change to the right of the lane in which the vehicle 1 travels is possible."
  • Alternatively, the vehicle control unit 7 determines, based on the information on the travel state and the surrounding situation, that the travel environment is one in which "a vehicle traveling ahead in the same lane as the vehicle 1 is traveling more slowly than the vehicle 1, and a lane change to the adjacent lane is possible."
  • the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify information related to the traveling environment indicating the traveling state and the surrounding situation. Further, the vehicle control unit 7 reads, from the storage unit 8, behavior candidates that the vehicle 1 can take next (after the first predetermined time has elapsed) with respect to the determined traveling environment.
  • the vehicle control unit 7 determines which behavior is most suitable for the current traveling environment from the read behavior candidates, and sets the behavior most suitable for the current traveling environment as the first behavior.
  • The first behavior may be the behavior the vehicle is currently implementing, that is, a continuation of the current behavior.
  • The vehicle control unit 7 then sets the behavior candidates other than the first behavior, which the driver can select, as the second behavior.
  • the vehicle control unit 7 may set the most suitable behavior as the first behavior using a conventional technique that determines the most suitable behavior based on information on the running state and the surrounding situation.
  • Alternatively, the vehicle control unit 7 may set a preset behavior among the plurality of behavior candidates as the most suitable one, may store information on the previously selected behavior in the storage unit 8 and determine that behavior as the most suitable, or may store in the storage unit 8 the number of times each behavior has been selected in the past and determine the most frequently selected behavior as the most suitable one.
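Under the last of the selection rules above (the most frequently selected candidate becomes the first behavior, the remainder become the second behavior), the split can be sketched as follows; the candidate names and counts are hypothetical.

```python
def split_behaviors(candidates, selection_counts):
    """First behavior = most frequently selected candidate; the remaining
    candidates become the second behavior."""
    first = max(candidates, key=lambda b: selection_counts.get(b, 0))
    second = [b for b in candidates if b != first]
    return first, second

first, second = split_behaviors(
    ["accelerate", "decelerate", "change_lane_right"],
    {"change_lane_right": 9, "decelerate": 4, "accelerate": 1})
print(first, second)  # -> change_lane_right ['accelerate', 'decelerate']
```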
  • The vehicle control unit 7 causes the notification unit 92 to notify the driver of information on the first behavior and the second behavior. The vehicle control unit 7 may cause the notification unit 92 to present the information on the first and second behaviors simultaneously with the information on the travel state and the surrounding situation.
  • the vehicle control unit 7 acquires information on the operation received by the operation unit 51 from the driver. After notifying the first behavior and the second behavior, the vehicle control unit 7 determines whether or not the operation unit 51 has accepted the operation within the second predetermined time. This operation is, for example, an operation for selecting one behavior from behaviors included in the second behavior.
  • When the operation unit 51 does not accept an operation within the second predetermined time, the vehicle control unit 7 controls the vehicle so as to execute the first behavior, and controls the brake pedal 2, the accelerator pedal 3, and the winker lever 4 according to the vehicle control result.
  • the vehicle control unit 7 performs control corresponding to the accepted operation when the operation unit 51 accepts the operation within the second predetermined time.
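The accept-within-a-window-or-fall-back logic described above can be sketched as follows. The polling interface, behavior names, and time values are illustrative assumptions, not the patent's implementation.

```python
import time

def await_driver_selection(first_behavior, second_behaviors, poll_operation,
                           second_predetermined_time_s):
    """Wait up to the second predetermined time for the driver to select via
    the operation unit; otherwise fall back to the first behavior.

    `poll_operation` is a hypothetical callable returning the selected
    behavior, "end_automatic_driving", or None if no button was pressed yet.
    """
    deadline = time.monotonic() + second_predetermined_time_s
    while time.monotonic() < deadline:
        selected = poll_operation()
        if selected == "end_automatic_driving":
            return selected              # operation for ending automatic driving
        if selected in second_behaviors:
            return selected              # driver chose one of the second behaviors
        time.sleep(0.01)
    return first_behavior                # timeout: execute the first behavior

# With an operation unit that never reports a press, the first behavior wins:
print(await_driver_selection("change_lane_right", ["accelerate", "decelerate"],
                             lambda: None, 0.05))  # -> change_lane_right
```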
  • the information notification device 9 acquires various information related to the traveling of the vehicle 1 from the vehicle control unit 7 and notifies the acquired information.
  • the information notification device 9 includes an information acquisition unit 91 and a notification unit 92.
  • The information acquisition unit 91 acquires various information related to the travel of the vehicle 1 from the vehicle control unit 7. For example, when the vehicle control unit 7 determines that the behavior of the vehicle 1 may be updated, the information acquisition unit 91 acquires the first behavior information and the second behavior information from the vehicle control unit 7.
  • The information acquisition unit 91 stores the acquired information in a storage unit (not shown) and reads out the stored information as needed.
  • the notification unit 92 notifies the driver of information related to the traveling of the vehicle 1.
  • The notification unit 92 may be a display unit that displays information, such as a car navigation system installed in the vehicle, a head-up display, a center display, or a light emitter such as a light-emitting diode (LED) installed in the steering wheel 5 or a pillar.
  • The notification unit 92 may also be a speaker that converts information into sound and notifies the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel 5).
  • the notification unit 92 may be a combination of these.
  • the notification unit 92 is a notification device.
  • The notification unit 92 may be, for example, a head-up display (HUD), an LCD (Liquid Crystal Display), an HMD (Head-Mounted Display or Helmet-Mounted Display), a glasses-type display (smart glasses), or another dedicated display.
  • The HUD may use, for example, the windshield of the vehicle 1, or a separately provided glass or plastic surface (for example, a combiner).
  • The window glass used may be, for example, the front windshield, a side window, or the rear window of the vehicle 1.
  • The HUD may also be a transmissive display provided on the surface of, or inside, the windshield.
  • the transmissive display is, for example, a transmissive organic EL (Electroluminescence) display or a transparent display using glass that emits light when irradiated with light of a specific wavelength.
  • the driver can view the display on the transmissive display at the same time as viewing the background.
  • the notification unit 92 may be a display medium that transmits light. In either case, an image is displayed on the notification unit 92.
  • the notification unit 92 notifies the driver of information related to travel acquired from the vehicle control unit 7 via the information acquisition unit 91.
  • the notification unit 92 notifies the driver of information on the first behavior and the second behavior acquired from the vehicle control unit 7.
  • FIG. 2 is a diagram for explaining a first example of the traveling environment, the display of the notification unit 92 and the operation of the operation unit 51 corresponding thereto.
  • (a) of FIG. 2 is an overhead view showing the travel environment of the vehicle 1. Specifically, it shows a travel environment in which there is a merging lane ahead of the lane in which the vehicle 1 travels and a lane change to the right of that lane is possible.
  • The vehicle control unit 7 determines that the travel environment is the one shown in (a) of FIG. 2, based on the information on the travel state and the surrounding situation. The vehicle control unit 7 may also generate the overhead view shown in (a) of FIG. 2 and cause the notification unit 92 to present it in addition to the information on the first and second behaviors.
  • (b) of FIG. 2 shows an example of the display of the notification unit 92 for the travel environment shown in (a) of FIG. 2.
  • In the display range of the notification unit 92, options for the behavior of the vehicle 1 are displayed on the right side, and information for switching to manual driving is displayed on the left side.
  • the first behavior is “lane change” shown in the highlighted display area 29b among the display areas 29a to 29c, 29g.
  • the second behavior is “acceleration” and “deceleration” shown in the display areas 29a and 29c, respectively.
  • The display area 29g displays "end automatic driving", which indicates switching to manual driving.
  • FIG. 2C shows an example of the operation unit 51 provided in the steering wheel 5.
  • The operation unit 51 includes operation buttons 51a to 51d provided on the right side of the steering wheel 5 and operation buttons 51e to 51h provided on the left side of the steering wheel 5.
  • The number, shape, and arrangement of the operation units 51 provided on the steering wheel 5 are not limited to these.
  • The display areas 29a to 29c shown in (b) of FIG. 2 correspond to the operation buttons 51a to 51c, and the display area 29g corresponds to the operation button 51g.
  • the driver presses an operation button corresponding to each display area when selecting any of the contents displayed in each display area. For example, when the driver selects the behavior “acceleration” displayed in the display area 29a, the driver presses the operation button 51a.
  • In (b) of FIG. 2, only character information is displayed in each display area, but symbols or icons relating to driving of the vehicle may be displayed instead, as described below. This allows the driver to grasp the display contents at a glance.
  • FIG. 3 is a diagram showing another example of display by the notification unit 92. As shown in FIG. 3, both character information and symbols representing that information are displayed in the display areas 39a to 39c and 39g; alternatively, only the symbols may be displayed.
  • FIG. 4 is a flowchart showing a processing procedure of information notification processing in the present embodiment.
  • FIG. 5 is a diagram illustrating a first example of a traveling environment and display control for the first example.
  • the detection unit 6 detects the traveling state of the vehicle (step S11). Next, the detection unit 6 detects the situation around the vehicle (step S12). Information on the detected traveling state of the vehicle and the situation around the vehicle is output by the detection unit 6 to the vehicle control unit 7.
  • the vehicle control unit 7 determines the current traveling environment based on the information on the traveling state and the surrounding situation (step S13).
  • For example, the vehicle control unit 7 determines that the current travel environment is one in which "there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of that lane, and a lane change to the right of the lane in which the vehicle 1 travels is possible."
  • the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the determined traveling environment information (step S14).
  • the vehicle control unit 7 outputs information on the determined traveling environment to the information acquisition unit 91.
  • the notification unit 92 acquires travel environment information from the information acquisition unit 91 and displays it as character information 59.
  • The vehicle control unit 7 may notify the driver of the travel environment information as sound through a speaker or the like instead of displaying it on the notification unit 92. This ensures the information reaches the driver even when the driver is not looking at, or has overlooked, the display or monitor.
  • The vehicle control unit 7 determines whether the determined travel environment may require a behavior update; if it determines that an update is possible, it further determines the first behavior and the second behavior (step S15). Whether the behavior may need updating is determined based on whether the travel environment has changed.
  • Behaviors implemented after an update include, for example, decelerating when a collision with another vehicle or the like is possible, changing speed when the preceding vehicle disappears during ACC (Adaptive Cruise Control), and changing lanes when the adjacent lane is free. Whether to update is determined using conventional techniques.
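The update gate in step S15 (re-evaluate behavior candidates only when the determined travel environment has changed) can be stated as a one-line predicate; the environment labels are hypothetical.

```python
def needs_behavior_update(previous_environment, current_environment):
    """Step S15 precondition: the behavior may need updating if and only if
    the determined travel environment has changed since the last check."""
    return previous_environment != current_environment

previous = "following_preceding_vehicle"
print(needs_behavior_update(previous, "preceding_vehicle_disappeared"))  # -> True
print(needs_behavior_update(previous, previous))                         # -> False
```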
  • the vehicle control unit 7 reads, from the storage unit 8, candidate behaviors that the vehicle 1 can take next (after the first predetermined time has elapsed) with respect to the determined traveling environment. Then, the vehicle control unit 7 determines which behavior is most suitable for the current traveling environment from the behavior candidates, and sets the behavior most suitable for the current traveling environment as the first behavior. Then, the vehicle control unit 7 sets behavior candidates excluding the first behavior to the second behavior.
  • For example, the vehicle control unit 7 reads from the storage unit 8 three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right. Based on the speed of the vehicle merging from the left and the situation in the right lane of the vehicle 1, it determines that a rightward lane change is the most suitable behavior and sets that behavior as the first behavior. It then sets the remaining behavior candidates as the second behavior.
  • the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the first behavior and the second behavior (step S16).
  • The notification unit 92 highlights and displays the character information “lane change”, which is the first behavior information, in the display area 59b, and displays “acceleration” and “deceleration”, which are the second behavior information, in the display areas 59a and 59c, respectively.
  • the vehicle control unit 7 determines whether or not the operation unit 51 has received an operation from the driver within the second predetermined time (step S17).
  • For example, the vehicle control unit 7 sets the time from when it determines that the current traveling environment is the one illustrated in FIG. 5A until arrival at the merging point as the first predetermined time. It then sets a second predetermined time, shorter than the first predetermined time, as the time during which an operation for the next behavior to be performed at the merging point can be accepted.
  • The vehicle control unit 7 determines whether the received operation is an operation for terminating automatic driving or a behavior selection operation (a so-called update) (step S18).
  • each display area of the notification unit 92 and each operation button of the operation unit 51 correspond to each other.
  • When selecting the end of automatic driving in FIG. 5B, the driver presses the operation button 51g shown in FIG. 2C. When selecting a behavior, the driver presses one of the operation buttons 51a to 51c shown in FIG. 2C.
  • The vehicle control unit 7 terminates automatic driving when the operation received by the operation unit 51 is an operation for terminating automatic driving (that is, when it detects that the operation button 51g has been pressed) (step S19).
  • When the operation received by the operation unit 51 is a behavior selection operation (that is, when any of the operation buttons 51a to 51c is pressed), the vehicle control unit 7 controls the vehicle 1 so as to execute the behavior corresponding to the pressed operation button (step S20).
  • When the operation unit 51 does not receive an operation from the driver within the second predetermined time (NO in step S17), the vehicle control unit 7 controls the vehicle 1 so as to execute the first behavior (step S21).
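The flow of steps S17 to S21 can be summarized in a small sketch. The encoding of operations (`None` for a timeout, `"end"` for the end-automatic-driving button, or a behavior name for buttons 51a to 51c) is an assumption made for illustration only.

```python
def handle_driver_response(operation, first_behavior):
    """Sketch of steps S17-S21.

    operation: None if no operation arrived within the second predetermined
    time, "end" if the end-automatic-driving button (51g) was pressed, or the
    behavior selected with one of the buttons 51a-51c.
    Returns a (driving_mode, behavior_to_execute) pair.
    """
    if operation is None:        # NO in step S17: timeout
        return "automatic", first_behavior   # step S21: execute first behavior
    if operation == "end":       # step S18 -> S19
        return "manual", None                # terminate automatic driving
    return "automatic", operation            # step S20: execute the selection
```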
  • FIG. 6 is a diagram showing the first example of the traveling environment and another display control for it. FIG. 6(a) is the same as FIG. 5(a), but the display control of FIG. 6(b) differs from that of FIG. 5(b).
  • With respect to the traveling environment illustrated in FIG. 6(a), the vehicle control unit 7 reads from the storage unit 8 three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right of the vehicle 1. At that time, it is assumed that the storage unit 8 stores the lane change to the right of the vehicle 1 as the highest-priority behavior.
  • the vehicle control unit 7 causes the notification unit 92 to notify the traveling environment information and the first behavior information.
  • the vehicle control unit 7 generates character information 69 indicating information on the driving environment and information on the first behavior, and causes the notification unit 92 to display the character information 69.
  • the vehicle control unit 7 causes the display areas 69a and 69c to display a display prompting the driver to adopt or reject the first behavior.
  • The vehicle control unit 7 also displays “automatic driving end”, indicating that switching to manual driving is possible, in the display area 69g.
  • The vehicle control unit 7 highlights and displays “YES”, which corresponds to adopting the first behavior. Which of “YES” and “NO” is highlighted may be determined in advance, the option selected last time may be highlighted, or the number of times each option has been selected in the past may be stored in the storage unit 8 and the notification unit 92 may highlight the option with the larger number of times.
  • In this way, the vehicle control unit 7 can appropriately notify the driver of information. Moreover, a display that alerts the driver can be presented with a smaller amount of displayed information.
  • FIG. 7 is a diagram showing a second example of the driving environment and display control for the second example.
  • FIG. 7A is an overhead view showing the traveling environment.
  • The traveling environment shown in FIG. 7A is the same as those of FIG. 5A and FIG. 6A in that there is a merging path ahead, but differs from them in that a traveling vehicle exists on the right side of the vehicle 1. In such a case, the vehicle control unit 7 determines that a lane change cannot be performed.
  • When the vehicle control unit 7 determines that the traveling environment of the vehicle 1 is as shown in FIG. 7A, it causes the notification unit 92 to display the determined traveling environment information as character information 79, as shown in FIG. 7B.
  • Among the three behavior candidates read from the storage unit 8 (acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right of the vehicle 1), the vehicle control unit 7 selects only the acceleration of the vehicle 1 and the deceleration of the vehicle 1, since the lane change to the right cannot be performed.
  • The vehicle control unit 7 predicts that the vehicle 1 would come too close to the merging vehicle if it proceeded at the current speed, and determines that deceleration of the vehicle 1 is the most suitable behavior, that is, the first behavior.
  • Which behavior is most suitable is determined using a conventional technique that judges the most suitable behavior based on information on the traveling state and the surrounding situation. Alternatively, the most suitable behavior may be determined in advance, information on the behavior selected last time may be stored in the storage unit 8 and that behavior determined to be the most suitable, or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the behavior with the largest number of times determined to be the most suitable.
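Of the options above, the last one (treat the behavior selected most often in the past as the most suitable) can be sketched as follows. The function name, count store, and fallback to a predetermined default are illustrative assumptions, not part of the patent text.

```python
def most_suitable_by_history(candidates, selection_counts, default):
    """Return the candidate selected most often in the past (counts would be
    kept in the storage unit 8); fall back to a predetermined default when no
    candidate has ever been selected."""
    chosen = [b for b in candidates if selection_counts.get(b, 0) > 0]
    if not chosen:
        return default
    return max(chosen, key=lambda b: selection_counts[b])
```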
  • the vehicle control unit 7 displays “Deceleration” as the first behavior in the display area 79c, and displays “Acceleration” as the second behavior in the display area 79a. Further, the vehicle control unit 7 causes the display area 79g to display “automatic driving end” indicating switching to manual driving.
  • the vehicle control unit 7 can notify the driver of the behavior most suitable for the traveling environment as the first behavior according to the traveling environment.
  • Note that the information on the first behavior may be arranged on the upper side and the information on the second behavior on the lower side, with selection functions assigned to the operation buttons 51a and 51c, respectively. Alternatively, acceleration behavior information may be arranged at the top, deceleration behavior information at the bottom, right-lane-change behavior information on the right, and left-lane-change behavior information on the left, with selection functions assigned to the operation buttons 51a, 51c, 51b, and 51d, respectively. It may also be possible to switch between these arrangements, and whether the current arrangement is the behavior-priority arrangement or the operation-priority arrangement may be displayed separately. Furthermore, the display size of the first behavior information may be increased and that of the second behavior information decreased. By arranging the behavior information displays to correspond to the forward/backward/left/right behaviors of the vehicle, the driver can recognize and operate them intuitively.
  • FIG. 8 is a diagram showing a third example of the driving environment and display control for it.
  • FIG. 8A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 8A shows a travel environment in which the preceding vehicle travels at a slower speed than the vehicle 1 and the lane can be changed to the adjacent lane.
  • the vehicle control unit 7 determines that the traveling environment is a traveling environment as shown in FIG. 8A based on information on the traveling state and the surrounding situation. In this case, the vehicle control unit 7 causes the notification unit 92 to display the determined traveling environment information as character information 89.
  • The vehicle control unit 7 reads from the storage unit 8 three behavior candidates corresponding to the determined traveling environment: overtaking the preceding vehicle, changing the lane to the adjacent lane, and decelerating the vehicle 1 to follow the preceding vehicle.
  • For example, because the speed of the preceding vehicle after its deceleration is higher than a predetermined value, the vehicle control unit 7 determines that the behavior in which the vehicle 1 decelerates and follows the preceding vehicle is the most suitable behavior, that is, the first behavior.
  • Which behavior is most suitable is determined using a conventional technique that judges the most suitable behavior based on information on the traveling state and the surrounding situation. Alternatively, the most suitable behavior may be determined in advance, information on the behavior selected last time may be stored in the storage unit 8 and that behavior determined to be the most suitable, or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the behavior with the largest number of times determined to be the most suitable.
  • The vehicle control unit 7 highlights and displays the character information “follow”, indicating the first behavior, in the display area 89c, and displays “overtake” and “lane change”, indicating the second behaviors, in the display areas 89a and 89b, respectively. The vehicle control unit 7 also displays “automatic driving end”, indicating switching to manual driving, in the display area 89g.
  • Note that the information on the first behavior may be arranged on the upper side and the information on the second behavior on the lower side, with selection functions assigned to the operation buttons 51a and 51c, respectively. Alternatively, overtaking behavior information may be arranged at the top, following behavior information at the bottom, right-lane-change behavior information on the right, and left-lane-change behavior information on the left, with selection functions assigned to the operation buttons 51a, 51c, 51b, and 51d, respectively. It may also be possible to switch between these arrangements, and whether the current arrangement is the behavior-priority arrangement or the operation-priority arrangement may be displayed separately. Furthermore, the display size of the first behavior information may be increased and that of the second behavior information decreased.
  • FIG. 9 is a diagram showing a fourth example of the driving environment and display control for it.
  • FIG. 9A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 9A shows a traveling environment in which the number of lanes decreases ahead in the lane in which the vehicle 1 is traveling.
  • the vehicle control unit 7 determines that the traveling environment is a traveling environment as shown in FIG. 9A based on information on the traveling state and the surrounding situation. In this case, the vehicle control unit 7 causes the notification unit 92 to display the determined traveling environment information as the character information 99.
  • The vehicle control unit 7 reads from the storage unit 8 two behavior candidates corresponding to the determined traveling environment: changing the lane to the adjacent lane, and traveling while maintaining the current lane.
  • Because the TTC to the lane-reduction point is shorter than a predetermined value, the vehicle control unit 7 determines that changing the lane to the adjacent lane is the most suitable behavior, that is, the first behavior.
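The TTC criterion used here is simply the remaining distance to the lane-reduction point divided by the closing speed. The numbers in the comment below are assumed values for illustration; the patent does not specify them.

```python
def ttc_seconds(distance_m, closing_speed_mps):
    """Time to reach the lane-reduction point: remaining distance divided by
    closing speed; infinite when the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

# With 100 m remaining at 25 m/s, the TTC is 4 s; if the predetermined value
# were, say, 5 s, the lane change would be chosen as the first behavior.
```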
  • Which of the two behavior candidates is the most suitable is determined using a conventional technique that judges the most suitable behavior based on information on the traveling state and the surrounding situation. Alternatively, the most suitable behavior may be determined in advance, information on the behavior selected last time may be stored in the storage unit 8 and that behavior determined to be the most suitable, or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the behavior with the largest number of times determined to be the most suitable.
  • The vehicle control unit 7 highlights and displays the character information “lane change”, indicating the first behavior, in the display area 99b, and displays the character information “as is”, indicating the second behavior, in the display area 99c. The vehicle control unit 7 also displays “automatic driving end”, indicating switching to manual driving, in the display area 99g.
  • Note that the first behavior information may be arranged above and the second behavior information below, with selection functions assigned to the operation buttons 51a and 51c, respectively. Alternatively, right-lane-change behavior information may be arranged on the right side and left-lane-change behavior information on the left side, with selection functions assigned to the operation buttons 51c, 51b, and 51d. Whether the arrangement is the behavior-priority arrangement or the operation-priority arrangement may also be displayed, and switching between the arrangements may be possible.
  • the display size of the first behavior information may be increased and the display size of the second behavior information may be decreased.
  • different functions are assigned to the display areas according to different traveling environments, so that information notification or operation can be performed in a small area.
  • In the above description, the vehicle control unit 7 causes the notification unit 92 to notify the behavior in accordance with the information on the traveling environment and the surrounding situation, but the present invention is not limited to this. For example, the notification unit 92 may be caused to notify the behavior when the driver performs a predetermined operation.
  • FIG. 10 is a diagram showing a fifth example of the driving environment and display control for it.
  • FIG. 10A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 10A shows a traveling environment indicating that the vehicle 1 is a traveling environment in which lanes can be changed to the left and right.
  • The traveling environment shown in FIG. 10(a) differs from those of FIGS. 5(a) to 9(a) in that it is a traveling environment in which normal traveling is possible without changing lanes or accelerating or decelerating the vehicle.
  • the vehicle control unit 7 does not have to display the information on the driving environment on the notification unit 92 as character information.
  • The vehicle control unit 7 reads the behavior candidates for normal traveling from the storage unit 8.
  • In the storage unit 8, four behavior candidates (acceleration of the vehicle 1, deceleration of the vehicle 1, a lane change to the right of the vehicle 1, and a lane change to the left of the vehicle 1) are stored in association with the normal traveling environment as shown in FIG. 10(a).
  • the vehicle control unit 7 reads out these and displays them on the display areas 109a to 109d of the notification unit 92, respectively.
  • Further, the vehicle control unit 7 displays “automatic driving end”, indicating switching to manual driving, in the display area 109g, and highlights and displays “cancel”, indicating cancellation of the behavior update, in the display area 109e.
  • In the above description, the display on the notification unit 92 has been described as character information, but the present invention is not limited to this. For example, the behavior may be displayed visually to the driver using symbols. In the following, display using symbols shown visually to the driver will be described, taking the displays for FIG. 5 and FIG. 7 as examples.
  • FIG. 11 is a diagram showing another display control for the first example of the traveling environment shown in FIG.
  • In this case, the first behavior described above is a lane change to the right of the vehicle 1, and the second behaviors are acceleration of the vehicle 1 and deceleration of the vehicle 1. In FIG. 11, a symbol 111 indicating “lane change”, which is the first behavior, is displayed large in the center, and symbols indicating acceleration of the vehicle 1 and deceleration of the vehicle 1, which are the second behaviors, are displayed small to the right. A symbol 114 indicating the end of automatic driving is displayed small to the left. If no instruction to change the behavior is received from the driver, the lane change is performed.
  • FIG. 12 is a diagram showing another display control for the second example of the traveling environment shown in FIG.
  • In the traveling environment shown in FIG. 7(a), the lane cannot be changed. Therefore, for example, “deceleration of the vehicle 1” is set as the first behavior, and “acceleration of the vehicle 1” is set as the second behavior.
  • In this case, the symbol 121 indicating “deceleration of the vehicle 1”, which is the first behavior, is displayed large in the center, and the symbol 122 indicating “acceleration of the vehicle 1”, which is the second behavior, is displayed small to the right.
  • a symbol 123 indicating the end of automatic driving is displayed small on the left.
  • When the operation unit 51 receives an operation for selecting “acceleration of the vehicle 1” from the driver, a symbol 122′ indicating “acceleration of the vehicle 1”, now the first behavior, is displayed large in the center, and a symbol 121′ indicating “deceleration of the vehicle 1”, now the second behavior, is displayed small to the right.
  • With the above display, the driver can grasp the behavior to be performed by the vehicle and the other selectable behaviors, and can continue automatic driving with a sense of security, or can smoothly give instructions to the vehicle.
  • In this way, the options notified by the notification unit, that is, the second behaviors, can be varied according to the traveling environment.
  • FIG. 13 is a block diagram showing a main configuration of the vehicle 1 including the information notification device according to Embodiment 2 of the present invention.
  • The same components as those in FIG. 1 are denoted by the same reference numerals as in FIG. 1, and their detailed description is omitted.
  • a vehicle 1 shown in FIG. 13 is provided with a touch panel 10 instead of the operation unit 51 of the steering wheel 5.
  • the touch panel 10 is a device composed of a liquid crystal panel or the like capable of displaying information and receiving input, and is connected to the vehicle control unit 7.
  • the touch panel 10 includes a display unit 101 that displays information based on control by the vehicle control unit 7 and an input unit 102 that receives an operation from a driver or the like and outputs the received operation to the vehicle control unit 7.
  • display control of the touch panel 10 will be described.
  • display control when the vehicle 1 is traveling in the center of three lanes and the lane can be changed to either the right lane or the left lane will be described.
  • FIG. 14 is a diagram for explaining a display on the touch panel 10 according to the second embodiment.
  • FIG. 14A shows an initial display of the display unit 101 of the touch panel 10.
  • When the vehicle control unit 7 determines that the lane can be changed to either the right lane or the left lane, it causes the display unit 101 of the touch panel 10 to execute the display shown in FIG. 14A.
  • the display “Touch” in the display area 121 indicates that the touch panel 10 is in a mode in which a touch operation by the driver can be received.
  • When the driver touches the display area 121, the input unit 102 accepts this operation and outputs information indicating that the operation has been performed to the vehicle control unit 7.
  • The vehicle control unit 7 then causes the display unit 101 to execute the display shown in FIG. 14B and causes the notification unit 92 to execute the display shown in FIG. 14C.
  • FIG. 14B shows a display area 121a on which “Move” indicating an operation for instructing the vehicle 1 to move is displayed. Further, FIG. 14B shows display areas 121b to 121d indicating that the vehicle 1 can travel in each of the three lanes. The display areas 121b to 121d correspond to traveling in the lane indicated by arrows X, Y, and Z in FIG. 14C, respectively.
  • the display areas in FIG. 14B and the corresponding arrows in FIG. 14C have the same mode (for example, color or arrangement). As a result, the display is easier to understand for the driver.
  • In addition, the behavior to be performed by the vehicle as determined by vehicle control may be displayed in such a way that it can be distinguished from the behaviors selectable by the driver.
  • the driver selects the behavior of the vehicle 1 by touching the display area corresponding to the lane to be traveled among the display areas 121b to 121d.
  • the input unit 102 accepts a driver's behavior selection operation and outputs information on the selected behavior to the vehicle control unit 7.
  • the vehicle control unit 7 controls the vehicle 1 to execute the selected behavior.
  • the vehicle 1 travels in the lane that the driver wants to travel.
  • the driver may perform a swipe operation on the touch panel 10 instead of the touch operation.
  • For example, when the driver wants to change to the lane indicated by the arrow X in FIG. 14C, the driver performs a rightward swipe operation on the touch panel 10.
  • In this case, the input unit 102 receives the swipe operation and outputs information indicating its content to the vehicle control unit 7. The vehicle control unit 7 then controls the vehicle 1 so as to execute the selected behavior, a lane change to the lane indicated by the arrow X.
  • Further, the user may utter a voice command such as “behavior selection”. This makes it possible to operate while viewing only the HUD display, without looking at the touch panel at hand.
  • The display mode of the lane corresponding to the touched display area may be changed so that the driver can confirm which lane is being selected before confirming the selection. For example, at the moment the display area 121b is touched, the lane X may be drawn thicker; if the driver releases immediately, the lane X is not selected and its thickness returns to the original, whereas if the display area 121c is touched, the lane Y is drawn thicker, and that state is maintained for a while, the lane Y may be selected and the selection notified by blinking the lane Y. This allows the selection or confirmation operation to be performed without looking at one’s hands.
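The touch-and-hold confirmation described above can be modeled as a tiny decision rule: a touch previews a lane, and only a touch held past a threshold confirms it. The 0.5 s threshold is an assumed value, not one given in the text.

```python
def resolve_touch(touch_duration_s, lane, hold_threshold_s=0.5):
    """Preview-then-confirm selection: touching a display area previews the
    mapped lane (drawn thicker); only a touch held past the threshold confirms
    it. Returns the selected lane, or None if the driver released early."""
    return lane if touch_duration_s >= hold_threshold_s else None
```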
  • vehicle control functions such as acceleration, deceleration, overtaking, and the like may be assigned to the display area according to the driving environment.
  • the driver can perform an intuitive operation.
  • the touch panel can freely change the number, shape, color, and the like of display areas for receiving operations, the degree of freedom of the user interface is improved.
  • The configuration according to the present embodiment is the configuration of FIG. 1 described in Embodiment 1, in which the operation unit 51 further includes a grip sensor that detects whether or not the driver is gripping the steering wheel 5.
  • FIG. 15 is a diagram illustrating the display of the notification unit 92 according to Embodiment 3 of the present invention.
  • FIG. 15 shows an example of the display in a traveling environment in which a vehicle traveling ahead in the same lane as the vehicle 1 is traveling at a slower speed than the vehicle 1 and a lane change to the adjacent lane is possible.
  • When the vehicle control unit 7 determines that the traveling environment is the one illustrated in FIG. 8A, it first causes the notification unit 92 to execute the display shown in FIG. 15A.
  • In FIG. 15A, the symbol 131 indicating “overtake”, which is the first behavior, is displayed in a first mode (for example, a first color).
  • When the second predetermined time has elapsed, the vehicle control unit 7 causes the notification unit 92 to display the symbol 131 in a second mode different from the first mode (for example, in a second color different from the first color).
  • the second predetermined time is the same as the second predetermined time described in the first embodiment.
  • Until the symbol 131 changes to the second mode, the driver can select the second behavior; once the symbol 131 has changed to the second mode, selection of the second behavior becomes impossible.
  • FIG. 15A also shows a steering-wheel-shaped symbol 132 indicating that the second behavior is selectable.
  • While the symbol 132 is displayed, the second behavior is displayed when the driver grips the steering wheel 5. The symbol 132 is a display indicating that the second behavior is selectable, but the driver may instead be shown that the second behavior is selectable by the second behavior itself being displayed. In that case, the symbol 132 need not be displayed.
  • the symbol 133 is an auxiliary display that indicates to the driver that the vehicle is traveling in automatic driving, but the symbol 133 may not be displayed.
  • When the driver grips the steering wheel 5, the grip sensor detects this and outputs information on the detection result to the vehicle control unit 7.
  • In this case, the vehicle control unit 7 causes the notification unit 92 to execute the display shown in FIG. 15B.
  • In FIG. 15B, as in FIG. 15A, the symbol 131 indicating “overtake”, which is the first behavior, is shown in the first mode (for example, the first color). In addition, a symbol 134 indicating “lane change” and a symbol 135 indicating “deceleration”, which are the second behaviors, are shown.
  • The driver changes from the first behavior to a second behavior by operating the operation unit 51 of the steering wheel 5. For example, by pressing the operation button 51a or the operation button 51c of the operation unit 51 (see (c) of FIG. 2), the driver updates the behavior to “lane change” (symbol 134) or “deceleration” (symbol 135).
  • FIG. 15B also shows a symbol 136 indicating that the vehicle control unit 7 is learning the behavior of the vehicle 1.
  • When the symbol 136 is displayed, the vehicle control unit 7 learns the behavior selected by the driver.
  • the symbol 136 may not be displayed. In addition, learning may always be performed.
  • In the learning, the vehicle control unit 7 stores the behavior selected by the driver in the storage unit 8, and when the same traveling environment occurs next time, it causes the notification unit 92 to display the stored behavior as the first behavior.
  • Alternatively, the vehicle control unit 7 may store in the storage unit 8 the number of times each behavior has been selected in the past, and cause the notification unit 92 to display the behavior with the largest number of times as the first behavior.
  • In FIG. 15B, a symbol 137 indicating that automatic driving is not being performed is also shown.
  • The vehicle control unit 7 waits for the driver to select the behavior to be performed after the first predetermined time has elapsed.
  • Upon receiving information on the selection operation, the vehicle control unit 7 causes the notification unit 92 to execute the display shown in FIG. 15C.
  • In FIG. 15C, a symbol 134′ indicating “lane change” is shown in the first mode.
  • When the vehicle control unit 7 receives information on the operation selecting “lane change”, it determines that the selected behavior is the behavior to be performed next, and causes the notification unit 92 to display the symbol 134′ indicating “lane change” in the first mode.
  • The symbol 131′ in FIG. 15C is the symbol 131, which was displayed as the first behavior in FIG. 15B, now displayed in place of the symbol 134.
  • When the vehicle control unit 7 receives information on an operation of pressing any one of the operation buttons twice in succession, it causes the notification unit 92 to change from the display shown in FIG. 15C back to the display shown in FIG. 15B.
  • In this way, until the second predetermined time has elapsed since the notification unit 92 executed the display shown in FIG. 15A, the vehicle control unit 7 changes the display of the notification unit 92 to FIG. 15B and FIG. 15C based on the driver’s operations.
  • Then, after the second predetermined time has elapsed since the notification unit 92 executed the display shown in FIG. 15A, the vehicle control unit 7 causes the notification unit 92 to display the display shown in FIG. 15D.
  • Note that when information indicating that the driver has released the steering wheel 5 is acquired from the grip sensor, the vehicle control unit 7 may cause the notification unit 92 to display the display shown in FIG. 15D before the second predetermined time elapses.
  • In this way, the vehicle control unit 7 changes the display on the notification unit 92 so that the other behavior candidates can be confirmed only when the driver intends to update the next behavior.
  • As a result, the number of displays the driver must visually check can be reduced, and the annoyance to the driver can be lessened.
  • The driver model is obtained by modeling the tendencies of drivers’ operations in each traveling environment, based on information on the frequency of each operation.
  • the driver model aggregates the traveling histories of a plurality of drivers and is constructed from the aggregated traveling histories.
  • The driving history of a driver is, for example, a history in which the frequency of each behavior actually selected by the driver, among the behavior candidates corresponding to each traveling environment, is aggregated for each behavior candidate.
  • FIG. 16 is a diagram showing an example of a travel history.
  • For example, the travel history of the driver x in FIG. 16 shows that, in one traveling environment, the behavior candidates “deceleration”, “acceleration”, and “lane change” were selected three times, once, and five times, respectively, and that, in another traveling environment, the behavior candidates “follow”, “overtake”, and “lane change” were selected twice, twice, and once, respectively. The same applies to the driver y.
  • the driving history of the driver may aggregate behaviors selected during automatic driving or may aggregate behaviors actually performed by the driver during manual driving. This makes it possible to collect travel histories according to operating conditions such as automatic driving or manual driving.
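A travel history like the one in FIG. 16 can be kept as per-environment behavior counts, updated each time a behavior is selected (during automatic driving) or performed (during manual driving). The environment label below is illustrative.

```python
from collections import defaultdict

def make_history():
    """Travel history: traveling environment -> behavior -> selection count."""
    return defaultdict(lambda: defaultdict(int))

def record(history, environment, behavior):
    """Record one selected or performed behavior in the given environment."""
    history[environment][behavior] += 1

history = make_history()
for b in ["decelerate"] * 3 + ["accelerate"] + ["lane change"] * 5:
    record(history, "merging road ahead", b)
# history["merging road ahead"] now holds decelerate: 3, accelerate: 1,
# lane change: 5, matching the driver-x example above.
```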
  • Driver models include a clustering type, which is constructed by clustering the travel histories of a plurality of drivers, and an individually adapted type, in which the driver model of a specific driver (for example, driver x) is constructed from a plurality of travel histories similar to the travel history of the driver x.
  • In the clustering-type driver model construction method, the travel histories of a plurality of drivers as shown in FIG. 16 are aggregated in advance. A driver model is then constructed by grouping drivers whose travel histories have a high degree of similarity, that is, drivers with similar driving operation tendencies.
  • FIG. 17 is a diagram illustrating a clustering type driver model construction method.
  • FIG. 17 shows the travel histories of the drivers a to f in table format, and shows that a model A is constructed from the travel histories of the drivers a to c and a model B is constructed from the travel histories of the drivers d to f.
  • the similarity between travel histories may be obtained, for example, by treating each frequency (each numerical value) in the travel histories of the driver a and the driver b as a frequency distribution, calculating a correlation value between the frequency distributions, and using the calculated correlation value as the similarity.
  • in this case, for example, when the calculated correlation value is higher than a predetermined value, the travel histories of the driver a and the driver b are set as one group.
  • the calculation of similarity is not limited to this.
  • the degree of similarity may be calculated from the number of matches between the most frequent behaviors in the travel histories of the driver a and the driver b.
  • the clustering type driver model is constructed by, for example, calculating the average of each frequency in the driving history of drivers in each group.
  • FIG. 18 is a diagram illustrating an example of a built clustering driver model.
  • the average frequency of each group is derived by averaging the corresponding frequencies in the travel histories of the drivers belonging to that group.
  • the clustering type driver model is constructed with an average frequency for the behavior determined for each driving environment.
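As an illustration of the clustering procedure described above (correlation-based similarity between frequency distributions, grouping by a similarity threshold, then per-group averaging), a minimal sketch could look like the following. All names, the greedy grouping strategy, and the threshold value are assumptions for illustration; the text does not specify an algorithm at this level of detail.

```python
def similarity(hist_a, hist_b):
    """Pearson correlation between two travel histories, each treated as a
    flat frequency distribution (one value per environment/behavior pair)."""
    n = len(hist_a)
    mean_a = sum(hist_a) / n
    mean_b = sum(hist_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(hist_a, hist_b))
    sd_a = sum((a - mean_a) ** 2 for a in hist_a) ** 0.5
    sd_b = sum((b - mean_b) ** 2 for b in hist_b) ** 0.5
    return cov / (sd_a * sd_b) if sd_a and sd_b else 0.0

def build_clustered_models(histories, threshold=0.8):
    """Greedily group histories whose correlation with a group's first member
    exceeds the threshold, then average frequencies element-wise per group."""
    groups = []
    for hist in histories:
        for group in groups:
            if similarity(hist, group[0]) >= threshold:
                group.append(hist)
                break
        else:
            groups.append([hist])
    # one model per group: the element-wise average frequency
    return [[sum(col) / len(group) for col in zip(*group)] for group in groups]
```

Each returned model corresponds to one "model A / model B" row of average frequencies, as in FIG. 18.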
  • FIG. 19 is a diagram illustrating another example of the constructed clustering type driver model. As shown in FIG. 19, the most frequent behavior is selected for each traveling environment, and a driver model is constructed from the selected behavior.
  • the driver model as shown in FIG. 18 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
  • the vehicle control unit 7 calculates the similarity between the driving history of the driver y and the driving history of each model of the driver model, and determines which model is most suitable for the driver y. For example, in the case of the driving history of the driver y shown in FIG. 16 and the driver model shown in FIG. 18, the vehicle control unit 7 determines that the model B is most suitable for the driver y.
  • during actual automatic travel, the vehicle control unit 7 determines that, in each traveling environment of the model B, the behavior with the highest frequency is the behavior most suitable for the driver y, that is, the first behavior.
  • for example, based on the model B shown in FIG. 18, the vehicle control unit 7 can determine “follow” as the first behavior.
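The matching step just described (find the model most similar to driver y's history, then take the highest-frequency behavior for the current environment as the first behavior) might be sketched as follows. The dictionary layout, the use of Pearson correlation, and all function names are hypothetical:

```python
def best_model(driver_hist, models):
    """Return the name of the stored model whose frequency pattern correlates
    best with the driver's travel history.
    driver_hist and each model: {environment: {behavior: frequency}}."""
    def flat(h):
        # flatten to a vector in a stable (sorted) order
        return [h[env][b] for env in sorted(h) for b in sorted(h[env])]

    def corr(x, y):
        mx, my = sum(x) / len(x), sum(y) / len(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den if den else 0.0

    return max(models, key=lambda name: corr(flat(driver_hist), flat(models[name])))

def first_behavior(model, environment):
    """The highest-frequency behavior of the model in the current environment."""
    freqs = model[environment]
    return max(freqs, key=freqs.get)
```

With histories shaped like FIG. 16 and models shaped like FIG. 18, `best_model` plays the role of "determining which model is most suitable for the driver y".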
  • the individual adaptive type driver model construction method, like the clustering type, first aggregates, in advance, the travel histories of a plurality of drivers, each of the form shown in FIG. 16.
  • the difference from the clustering type is that a driver model is constructed for each driver.
  • hereinafter, the construction of a driver model for the driver y is described.
  • the driving histories of a plurality of drivers having high similarity to the driving history of the driver y are extracted from the driving histories of the plurality of drivers collected. Then, a driver model of the driver y is constructed from the extracted driving histories of the plurality of drivers.
  • FIG. 20 is a diagram showing a method for constructing an individual adaptive driver model.
  • in FIG. 20, the travel histories of the drivers a to f are shown in a table format, as in FIG. 17. FIG. 20 shows that the driver model of the driver y is constructed from the travel histories of the drivers c to e, which have a high degree of similarity to the travel history of the driver y shown in FIG. 16.
  • the individual adaptive driver model is constructed by calculating the average of each frequency in the extracted driving history of each driver.
  • FIG. 21 is a diagram illustrating an example of a constructed individual adaptive driver model.
  • the average frequency of each behavior is derived for each driving environment.
  • the individually adaptive driver model for the driver y is constructed with an average frequency of behavior corresponding to each traveling environment.
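The individually adaptive construction above (extract the most similar collected histories, then average them) could be sketched as follows. The number of neighbors `k`, the flat-vector representation of a history, and the function name are assumptions for illustration:

```python
def adaptive_model(target, collected, k=2):
    """Build a model for `target` (a flat frequency vector, one value per
    environment/behavior pair) by averaging the k collected histories that
    correlate best with it."""
    def corr(x, y):
        mx, my = sum(x) / len(x), sum(y) / len(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den if den else 0.0

    nearest = sorted(collected, key=lambda h: corr(target, h), reverse=True)[:k]
    # element-wise average of the nearest histories, as in FIG. 21
    return [sum(col) / len(nearest) for col in zip(*nearest)]
```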
  • the driver model of the driver y as shown in FIG. 21 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
  • during actual automatic driving, the vehicle control unit 7 determines that, in each driving environment of the driver model of the driver y, the behavior with the highest frequency is the behavior most suitable for the driver y, that is, the first behavior.
  • for example, based on the driver model shown in FIG. 21, the vehicle control unit 7 can determine “lane change” as the first behavior.
  • next, a case is described in which the actual operation (for example, the magnitude of acceleration or deceleration, or the amount of steering-wheel operation) performed when one behavior (for example, a lane change) is selected is learned as the driving characteristics of the driver and reflected in automatic driving.
  • the vehicle control unit 7 extracts a feature amount indicating the driving characteristics of the driver from the operation content of each part of the vehicle 1 of the driver, and stores it in the storage unit 8.
  • the feature amount includes, for example, a feature amount related to speed, a feature amount related to steering, a feature amount related to operation timing, a feature amount related to outside-vehicle sensing, a feature amount related to in-vehicle sensing, and the like.
  • the feature quantity related to speed includes, for example, the speed, acceleration, and deceleration of the vehicle, and these feature quantities are acquired from a speed sensor or the like that the vehicle has.
  • the feature amount related to steering includes, for example, the steering angle, angular velocity, and angular acceleration of the steering, and these feature amounts are acquired from the steering wheel 5.
  • the feature quantities related to operation timing include, for example, the operation timings of the brake, the accelerator, the blinker lever, and the steering wheel. These feature quantities are acquired from the brake pedal 2, the accelerator pedal 3, the blinker lever 4, and the steering wheel 5, respectively.
  • the feature amount related to outside-vehicle sensing includes, for example, a distance between vehicles in front, side, and rear, and these feature amounts are acquired from the sensor 62.
  • the feature amount related to in-vehicle sensing is, for example, personal recognition information indicating who the driver is and who is the passenger, and these feature amounts are acquired from a camera or the like installed in the vehicle.
  • the vehicle control unit 7 detects that the driver has manually changed the lane.
  • the lane change is detected by setting, in advance, a rule for the operation time-series pattern of a lane change, and analyzing operation time-series data acquired from CAN (Controller Area Network) information or the like against that rule. When a lane change is detected, the vehicle control unit 7 acquires the feature amounts described above.
  • the vehicle control unit 7 stores the feature amount in the storage unit 8 for each driver, and constructs a driving characteristic model.
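The rule for the lane-change operation pattern is not specified in the text; one naive illustration of how such a rule over CAN time-series samples could look (a blinker signal followed by a steering swing in the same direction) is sketched below. The sample format, threshold, and function name are assumptions, not the actual rule set:

```python
def detect_lane_change(samples, steer_threshold=5.0):
    """samples: list of (blinker, steering_angle) tuples per time step, where
    blinker is -1 (left), 0 (off) or +1 (right) and steering_angle is in
    degrees (positive = right). Returns True if a blinker signal is followed
    by a steering-angle swing in the indicated direction."""
    armed = 0  # direction the blinker indicated; 0 = no blinker seen yet
    for blinker, angle in samples:
        if blinker != 0:
            armed = blinker
        if armed and angle * armed >= steer_threshold:
            return True
    return False
```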
  • the vehicle control unit 7 may construct the above-described driver model based on the feature amounts of each driver. That is, the vehicle control unit 7 extracts feature quantities related to speed, steering, operation timing, outside-vehicle sensing, and in-vehicle sensing, and stores them in the storage unit 8. Then, based on the feature quantities stored in the storage unit 8, it may construct a driver model.
  • FIG. 22 is a diagram showing an example of the driving characteristic model.
  • FIG. 22 shows the feature values in a tabular format for each driver.
  • FIG. 22 also shows, for each driver, the number of times each behavior has been selected in the past. Although only some of the feature amounts are shown, any or all of the feature amounts listed above may be included.
  • the numerical value of speed is a numerical value indicating the actual speed in stages.
  • the numerical values for the steering wheel, the brake, and the accelerator indicate the operation amounts in stages. These numerical values are obtained, for example, by calculating the average values of the speed and of the steering-wheel, brake, and accelerator operation amounts within a predetermined past period, and expressing those average values stepwise.
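The stepwise numerical values in FIG. 22 can be thought of as a binning of a recent average. A minimal sketch of that mapping is shown below; the bin edges are arbitrary placeholders, not values taken from the text:

```python
def to_level(values, edges):
    """Average `values` over the past period, then return the 1-based level
    of the bin the average falls into (len(edges) + 1 levels in total).
    `edges` must be ascending bin boundaries."""
    avg = sum(values) / len(values)
    level = 1
    for edge in edges:
        if avg >= edge:
            level += 1
    return level
```

For example, with edges `[40, 60, 80]`, speeds averaging 80 or more map to level 4, and speeds averaging below 40 map to level 1.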
  • during automatic driving, the vehicle control unit 7 selects, from the driving characteristic models shown in FIG. 22, the driving characteristic model corresponding to the driver, the behavior, and the passenger, that is, according to who the driver is, what kind of behavior is executed, and who the passenger is.
  • the vehicle control unit 7 then causes the vehicle 1 to travel at the speed corresponding to the selected driving characteristic model, and controls the vehicle 1 with the corresponding combination of steering-wheel, brake, and accelerator operation amounts and timings. Thereby, automatic driving that reflects the driving characteristics of the driver can be realized.
  • FIG. 23 is a diagram illustrating the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 23 shows displays for the first example of the traveling environment (shown in FIG. 5).
  • (a) of FIG. 23 is the display of the notification unit 92 in a state in which the vehicle is performing normal travel that requires neither a lane change nor acceleration or deceleration of the vehicle.
  • (a) of FIG. 23 shows a symbol 231 indicating that the driving characteristic of the driver is a “high deceleration” driving characteristic, and a symbol 232 indicating that automatic driving is currently in progress.
  • the vehicle control unit 7 determines the driving characteristic of the driver based on, for example, the number of times each behavior included in the driving characteristic model shown in FIG. 22 has been selected in the past. In this case, for a driver who has frequently “decelerated” (that is, whose so-called “deceleration” behavior has been selected many times), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 231, as shown in (a) of FIG. 23.
  • when the vehicle control unit 7 determines that the driving environment is the driving environment of the first example illustrated in FIG. 5, it determines that the first behavior is “deceleration”, based on the driving characteristic of the driver being a “high deceleration” driving characteristic, and causes the notification unit 92 to execute the display of (b) of FIG. 23.
  • in (b) of FIG. 23, a symbol 233 indicating “deceleration”, which is the first behavior, is shown in a first mode (for example, in a first color). Further, a symbol 234 indicating “acceleration”, which is a second behavior, and a symbol 235 indicating “lane change”, which is also a second behavior, are shown.
  • when the driver selects the behavior “acceleration”, the vehicle control unit 7 causes the notification unit 92 to execute the display of (c) of FIG. 23.
  • in (c) of FIG. 23, a symbol 234′ indicating “acceleration”, which is the selected behavior, is shown in the first mode. The symbol 233′ is the symbol 233, which was displayed as the first behavior in (b) of FIG. 23, now displayed having switched positions with the symbol 234.
  • after the second predetermined time has elapsed since the notification unit 92 executed the above display, the vehicle control unit 7 causes the notification unit 92 to execute the display illustrated in (d) of FIG. 23.
  • in (d) of FIG. 23, the symbol 234′ indicating “acceleration” selected by the driver is displayed in the second mode as the behavior to be taken next.
  • when it is determined that the behavior to be taken next is “acceleration”, the vehicle control unit 7 reads out the feature amounts corresponding to the “acceleration” behavior included in the driving characteristic model, and controls the vehicle 1 so as to perform “acceleration” reflecting those feature amounts.
  • FIG. 24 is a diagram illustrating the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 24 shows displays for the second example of the traveling environment (shown in FIG. 7). In FIG. 24, components common to FIG. 23 are given the same reference numerals as in FIG. 23, and their detailed description is omitted.
  • FIG. 24 is a diagram in which the symbol 235 indicating “lane change” is deleted from FIG. 23.
  • as described above, in the second example (FIG. 7), unlike the first example (FIG. 5), another vehicle is traveling to the right of the vehicle 1, so the lane cannot be changed. Therefore, “lane change” is not displayed in (b) and (c) of FIG. 24. Further, since “acceleration” is selected in the example of (c) of FIG. 24, as in the case of (c) of FIG. 23, the vehicle control unit 7 reads out the feature amounts corresponding to the “acceleration” behavior included in the driving characteristic model, as in FIG. 23, and controls the vehicle 1 so as to perform “acceleration” reflecting those feature amounts.
  • FIG. 25 is a diagram illustrating the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 25 shows displays for the third example of the traveling environment (shown in FIG. 8).
  • (a) of FIG. 25 is the same as (a) of FIG. 23.
  • when the vehicle control unit 7 determines that the driving environment is the driving environment of the third example illustrated in FIG. 8, it determines that the first behavior is “deceleration”, based on the driving characteristic of the driver being a “high deceleration” driving characteristic, and causes the notification unit 92 to execute the display of (b) of FIG. 25.
  • in (b) of FIG. 25, a symbol 251 indicating “deceleration”, which is the first behavior, is shown in a first mode (for example, in a first color). Further, a symbol 252 indicating “passing”, which is a second behavior, and a symbol 253 indicating “lane change”, which is also a second behavior, are shown.
  • when the driver selects the behavior “passing”, the vehicle control unit 7 causes the notification unit 92 to execute the display of (c) of FIG. 25.
  • in (c) of FIG. 25, a symbol 252′ indicating “passing”, which is the selected behavior, is shown in the first mode. The symbol 251′ is the symbol 251, which was displayed as the first behavior in (b) of FIG. 25, now displayed having switched positions with the symbol 252.
  • after the second predetermined time has elapsed since the notification unit 92 executed the above display, the vehicle control unit 7 causes the notification unit 92 to execute the display shown in (d) of FIG. 25.
  • in (d) of FIG. 25, the symbol 252′ indicating “passing” selected by the driver is displayed in the second mode as the behavior to be taken next.
  • when it is determined that the behavior to be taken next is “passing”, the vehicle control unit 7 reads out the feature amounts corresponding to the “passing” behavior included in the driving characteristic model, and controls the vehicle 1 so as to perform “acceleration” for the passing maneuver reflecting those feature amounts.
  • FIG. 26 is a diagram for explaining the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 26 shows displays for the first example of the traveling environment (shown in FIG. 5). (a) of FIG. 26 shows an example in which the driving characteristic of the driver is a “high acceleration” driving characteristic, and (b) of FIG. 26 shows an example in which the driving characteristic of the driver is a “many lane changes” driving characteristic.
  • FIG. 26 (a) shows a symbol 261 indicating that the driving characteristic of the driver is a driving characteristic with “high acceleration”. Further, a symbol 262 indicating “acceleration” which is the first behavior is shown in the first mode (for example, the first color). Further, a symbol 263 indicating “lane change” as the second behavior and a symbol 264 indicating “deceleration” as the second behavior are shown.
  • for a driver who has frequently “accelerated” in the past (that is, whose “acceleration” behavior has been selected many times), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 261, as shown in (a) of FIG. 26. Further, the vehicle control unit 7 determines that the first behavior is “acceleration”, based on the driving characteristic of the driver being a “high acceleration” driving characteristic, and causes the notification unit 92 to execute the display of (a) of FIG. 26.
  • in (b) of FIG. 26, a symbol 265 indicating that the driving characteristic of the driver is a “many lane changes” driving characteristic is shown. Further, a symbol 266 indicating “lane change”, which is the first behavior, is shown in the first mode (for example, in the first color). Further, a symbol 267 indicating “acceleration”, which is a second behavior, and a symbol 268 indicating “deceleration”, which is also a second behavior, are shown.
  • for a driver who has frequently changed lanes in the past (that is, whose so-called “lane change” behavior has been selected many times), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 265, as shown in (b) of FIG. 26. Further, the vehicle control unit 7 determines that the first behavior is “lane change”, based on the driving characteristic of the driver being a “many lane changes” driving characteristic, and causes the notification unit 92 to execute the display of (b) of FIG. 26.
  • the symbol 231 and the like may instead indicate the type of driver model selected from the driver's operation history. For example, for a driver model applied to drivers who often select “deceleration”, the vehicle control unit 7 causes the notification unit 92 to execute the display including the symbol 231 as shown in (a) of FIG. 23, and determines the first behavior to be “deceleration”. For a driver model applied to drivers who often select “acceleration”, the vehicle control unit 7 causes the notification unit 92 to execute the display including the symbol 261 as shown in (a) of FIG. 26, and determines the first behavior to be “acceleration”. For a driver model applied to drivers who often select “lane change”, the vehicle control unit 7 causes the notification unit 92 to execute the display including the symbol 265 as shown in (b) of FIG. 26, and determines the first behavior to be “lane change”.
  • the driver's past driving history can be learned, and the result can be reflected in the determination of the future behavior.
  • since the vehicle control unit controls the vehicle, the driving characteristics (driving preferences) of the driver can be learned and reflected in the control of the vehicle.
  • automatic driving can thereby be controlled with the operation timings and operation amounts that the driver or the passenger prefers, without deviating from the feel of the actual driver's manual driving, so that unnecessary operation interventions by the driver during automatic driving can be suppressed.
  • a server device such as a cloud server may execute a function similar to the function executed by the vehicle control unit 7.
  • the storage unit 8 may exist not in the vehicle 1 but in a server apparatus such as a cloud server.
  • the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior with reference to the driver model stored in the storage unit 8.
  • the vehicle control unit 7 acquires feature amount information indicating the driving characteristics of the driver, the storage unit 8 stores the feature amount information, and based on the feature amount information stored in the storage unit 8, the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the behavior of the vehicle selected by the driver with the frequency of each selected behavior.
  • the vehicle control unit 7 determines a group of drivers who select a similar behavior among a plurality of drivers, and constructs a driver model for each group and for each driving environment of the vehicle.
  • the vehicle control unit 7 calculates, for each group of drivers who perform similar operations, the average value of the frequencies of the behaviors selected by the drivers in that group, and constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the behavior of the vehicle selected by the drivers with those average values.
  • based on the behaviors of the vehicle selected by other drivers whose tendencies are similar to the tendency of the behaviors selected by a specific driver, the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the behavior of the vehicle selected by that specific driver with the frequency of each selected behavior.
  • the vehicle control unit 7 can construct a driver model more suitable for the driving tendency of the driver, and can perform more appropriate automatic driving for the driver based on the constructed driver model.
  • (Modified example of the driver model)
  • the driver model described above models the tendency of the operations (behaviors) selected by the driver for each driving environment based on information such as the frequency of each operation, but the present invention is not limited to this.
  • the driver model may be constructed based on a travel history in which environmental parameters indicating the travel environments (that is, situations) traveled in the past are associated with the operations (behaviors) actually selected by the driver in those travel environments.
  • options can be determined without going through the procedure of separately detecting and classifying the driving environment and inputting (storing) the classification result into the driver model.
  • specifically, the driving environment differences as shown in FIG. 23 and FIG. 24 are acquired as environmental parameters and directly input (stored) into the driver model; as a result, “acceleration”, “deceleration”, and “lane change” are the options in FIG. 23, while “acceleration” and “deceleration” are the options in FIG. 24.
  • the driver model described below may be referred to as a situation database.
  • FIG. 27 is a diagram illustrating an example of a travel history.
  • FIG. 27 shows a travel history in which the environmental parameters indicating the travel environments in which the vehicle driven by the driver x has traveled in the past are associated with the operations (behaviors) actually selected by the driver in those travel environments.
  • the environmental parameters (a) to (c) of the travel history shown in FIG. 27 indicate, for example, the driving environments at the times when the behavior of the vehicle was presented to the driver as shown in (b) of FIG. 8, (b) of FIG. 5, and (b) of FIG. 7, respectively.
  • the environmental parameter of the travel history is obtained from sensing information and infrastructure information.
  • Sensing information is information detected by a vehicle sensor or radar.
  • the infrastructure information includes GPS information, map information, information acquired through road-to-vehicle communication, and the like.
  • the environmental parameters of the travel history shown in FIG. 27 include “host vehicle information”; “preceding vehicle information” indicating information on a vehicle traveling ahead in the lane in which the host vehicle travels; “side lane information” indicating information on a lane beside the lane in which the host vehicle travels; “merging lane information” indicating information on a merging lane when there is a merging lane at the position where the host vehicle is traveling; and “position information” indicating the position of the host vehicle and its surroundings.
  • “information on own vehicle” includes information on the speed Va of the own vehicle.
  • the “preceding vehicle information” includes information on the relative speed Vba of the preceding vehicle with respect to the own vehicle, the inter-vehicle distance DRba between the preceding vehicle and the own vehicle, and the rate of change RSb of the size of the preceding vehicle.
  • the speed Va of the host vehicle is detected by a speed sensor of the host vehicle.
  • the relative speed Vba and the inter-vehicle distance DRba are detected by a sensor or a radar.
  • “Information on the side lane” includes information on a side rear vehicle c traveling behind the host vehicle in the side lane, information on a side front vehicle d traveling ahead of the host vehicle in the side lane, and information on the remaining side lane length DRda of the host vehicle.
  • the information on the side rear vehicle includes information on the relative speed Vca of the side rear vehicle with respect to the own vehicle, the inter-head distance Dca between the side rear vehicle and the own vehicle, and the change rate Rca of the inter-head distance.
  • the inter-head distance Dca between the side rear vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the side rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the side rear vehicle). The inter-head distance may also be calculated from the inter-vehicle distance and the vehicle lengths, and the inter-vehicle distance may be substituted for the inter-head distance.
  • the relative speed Vca and the inter-head distance Dca are detected by a sensor or a radar.
  • the information on the side front vehicle includes information on the relative speed Vda of the side front vehicle with respect to the host vehicle, the distance Dda between the head of the side front vehicle and the host vehicle, and the change rate Rda of the head distance.
  • the inter-head distance Dda between the side front vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the side front vehicle, measured in the direction along the traveling direction of the host vehicle (and the side front vehicle).
  • the remaining side lane length DRda of the host vehicle is a parameter indicating the degree of possibility of a lane change to the side lane. Specifically, when the distance between the front end portion (vehicle head) of the host vehicle and the rear end portion of the side front vehicle, measured in the direction along the traveling direction of the host vehicle (and the side front vehicle), is longer than the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, the remaining side lane length DRda is set to that distance; when that distance is shorter than DRba, DRda is set to DRba. The remaining side lane length DRda of the host vehicle is detected by a sensor or a radar.
  • the information on the merging lane includes information on the relative speed Vma of the merging vehicle with respect to the own vehicle, the distance Dma between the merging vehicle and the own vehicle, and the rate of change Rma of the inter-vehicle distance.
  • the inter-head distance Dma between the merging vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the merging vehicle, measured in the direction along the traveling direction of the host vehicle (and the merging vehicle).
  • the relative speed Vma and the inter-head distance Dma are detected by a sensor or a radar.
  • the numerical values of the speed, distance, and change rate described above are classified into a plurality of levels, and numerical values indicating the classified levels are stored. Note that the numerical values of the speed, the distance, and the change rate may be stored as they are without being classified into levels.
  • the position information includes “position information of the host vehicle”, “number of travel lanes”, “travel lane of the host vehicle”, “distance to the start/end point of a merging section”, “distance to the start/end point of a branch section”, “distance to the start/end point of a construction section”, “distance to the start/end point of a lane-reduction section”, “distance to a traffic accident occurrence point”, and the like.
  • FIG. 27 shows information on “travel lane of own vehicle” (travel lane of FIG. 27) and “distance to start / end points of merge section” as examples of position information.
  • in the column “distance to the start/end point of a merging section”, when the start/end point of a merging section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. When no start/end point of a merging section exists within the predetermined distance, “0” is stored in this column.
  • in the column “distance to the start/end point of a branch section”, when the start/end point of a branch section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored; otherwise, “0” is stored. Likewise, in the column “distance to the start/end point of a construction section”, when the start/end point of a construction section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored; otherwise, “0” is stored.
  • in the column “distance to the start/end point of a lane-reduction section”, when the start/end point of a lane-reduction section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored; otherwise, “0” is stored.
  • in the column “distance to a traffic accident occurrence point”, when a traffic accident occurrence point exists within a predetermined distance, the distance to that point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored; otherwise, “0” is stored.
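The "0 when out of range, otherwise a level" encoding used for these distance columns could be sketched like this; the range, the number of levels, and the function name are assumed values for illustration only:

```python
def distance_level(distance_m, max_range_m=1000.0, n_levels=4):
    """Return 0 when no point exists within range (distance is None or
    beyond max_range_m); otherwise a level 1..n_levels, where a smaller
    distance maps to a lower level."""
    if distance_m is None or distance_m > max_range_m:
        return 0
    width = max_range_m / n_levels
    return min(int(distance_m // width) + 1, n_levels)
```

The same helper could serve every start/end-point column (merging, branch, construction, lane reduction, accident point) with column-specific ranges.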
  • the position information may include information on which lanes of the road on which the vehicle is traveling are merge lanes, branch lanes, construction lanes, reduced lanes, and accident lanes.
  • the travel history shown in FIG. 27 is merely an example, and the present invention is not limited to this.
  • the travel history may further include “information on the left side lane” on the opposite side.
  • the left lane information includes information on a left rear vehicle traveling behind the host vehicle in the left lane, information on a left front vehicle traveling ahead of the host vehicle in the left lane, and information on the remaining left lane length DRda of the host vehicle.
  • the information on the left rear vehicle includes information on the relative speed Vfa of the left rear vehicle with respect to the host vehicle, the head distance Dfa between the left rear vehicle and the host vehicle, and the change rate Rfa of the head head distance.
  • the inter-head distance Dfa between the left rear vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the left rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the left rear vehicle).
  • the information on the left front vehicle includes information on the relative speed Vga of the left front vehicle with respect to the own vehicle, the distance Dga between the left front vehicle and the own vehicle, and the rate of change Rga of the head distance.
  • the inter-head distance Dga between the left front vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the left front vehicle, measured in the direction along the traveling direction of the host vehicle (and the left front vehicle).
  • the travel history shown in FIG. 27 may include “rear vehicle information” indicating information on the rear vehicle traveling behind the host vehicle in the travel lane.
  • The information on the rear vehicle includes the relative speed Vea of the rear vehicle with respect to the host vehicle, the head-to-head distance Dea between the rear vehicle and the host vehicle, and the rate of change Rea of the head-to-head distance.
  • The head-to-head distance Dea between the rear vehicle and the host vehicle is the distance between the front end (vehicle head) of the host vehicle and the front end (vehicle head) of the rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the rear vehicle).
  • the relative speed Vea and the inter-head distance Dea are detected by a sensor or a radar.
  • Regardless of whether the head-to-head distance can be measured, the measurable inter-vehicle distance, or an approximate value obtained by adding a predetermined vehicle length to the inter-vehicle distance, may be used as an alternative to the head-to-head distance. The head-to-head distance may also be calculated by adding the length of each recognized vehicle type to the inter-vehicle distance.
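As an illustrative sketch (not the patented implementation), the substitution described above might be written as follows; the function name, the type-to-length table, and the 5 m default are assumptions made for the example.

```python
# Hypothetical sketch of approximating the head-to-head distance when only the
# gap (inter-vehicle distance) is measurable: add a predetermined vehicle
# length, or the length of the recognized vehicle type. All names and lengths
# here are illustrative assumptions, not values from the patent.

# Assumed typical vehicle lengths in meters, keyed by recognized type.
TYPE_LENGTH_M = {"car": 4.5, "truck": 12.0, "bus": 11.0}

def head_to_head_distance(gap_m, vehicle_type=None, default_length_m=5.0):
    """Approximate the head-to-head distance from the measurable gap.

    If a camera sensor recognized the vehicle type, its typical length is
    added; otherwise a predetermined default length is used.
    """
    length = TYPE_LENGTH_M.get(vehicle_type, default_length_m)
    return gap_m + length

print(head_to_head_distance(30.0))           # default length: 35.0
print(head_to_head_distance(30.0, "truck"))  # recognized type: 42.0
```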
  • the traveling history may include various other information related to the traveling environment of the vehicle.
  • the travel history may include information on the size or type of the preceding vehicle, the side vehicle, the joining vehicle, or the relative position with respect to the host vehicle.
  • The type of a vehicle approaching from behind may be recognized by a camera sensor, and when that vehicle is an emergency vehicle, information indicating so may be included. This makes it possible to notify the driver that the information is being reported in order to respond to an emergency vehicle.
  • As described with reference to FIG. 22, the travel history may also include numerical values representing, in steps, the amount of steering wheel, brake, or accelerator operation, information on passengers, and so on.
  • the behaviors selected during the automatic driving may be aggregated, or the behaviors actually performed by the driver during the manual driving may be aggregated. This makes it possible to collect travel histories according to operating conditions such as automatic driving or manual driving.
  • the environmental parameter included in the travel history indicates the travel environment when the behavior of the vehicle is presented to the driver.
  • Alternatively, it may indicate the travel environment at the time the driver selected the behavior.
  • Both the environmental parameter indicating the travel environment when the behavior of the vehicle was presented to the driver and the environmental parameter indicating the travel environment when the driver selected the behavior may be included in the travel history.
  • When generating the overhead views shown in (a) of FIG. 2, (a) of FIG. 5, (a) of FIG. 6, (a) of FIG. 7, (a) of FIG. 8, and so on, the vehicle control unit 7 may do the following. That is, it may generate, as notification information, at least one of information on the environmental parameter with a high degree of contribution that causes the selection of the first behavior and the second behavior, and information related to that environmental parameter (for example, an icon), and cause the notification unit 92 to notify the notification information, for example by showing it on the overhead view.
  • In this case, for example, the vehicle control unit 7 may display, in the overhead view, a region of raised luminance or changed color between the preceding vehicle and the host vehicle, thereby causing the notification unit 92 to notify the notification information.
  • the vehicle control unit 7 may display an icon indicating that the contribution degree of the inter-vehicle distance DRba or the change rate RSb is high in the area between the preceding vehicle and the host vehicle as the notification information. Further, the vehicle control unit 7 causes the notification unit 92 to draw a line segment connecting the preceding vehicle and the host vehicle as notification information on the overhead view, or to notify line segments connecting all the surrounding vehicles and the host vehicle. The line segment connecting the preceding vehicle and the host vehicle may be emphasized on the overhead view.
  • The vehicle control unit 7 may also realize an AR (Augmented Reality) display by displaying, as notification information, a region of raised luminance or of a different color between the preceding vehicle and the host vehicle in the viewpoint image seen from the driver, instead of the overhead view.
  • the vehicle control unit 7 may cause the notification unit 92 to display an AR indicating an environmental parameter having a high contribution degree in the region between the preceding vehicle and the host vehicle as notification information in the viewpoint image.
  • The vehicle control unit 7 may display, as an AR display, a line segment connecting the preceding vehicle and the host vehicle in the viewpoint image as notification information, or may display line segments connecting all the surrounding vehicles and the host vehicle in the viewpoint image as notification information while emphasizing the line segment connecting the preceding vehicle and the host vehicle.
  • the vehicle control unit 7 may generate, as notification information, an image that highlights a preceding vehicle that is a target of an environmental parameter with a high contribution, and may display the image on the notification unit 92.
  • Alternatively, the vehicle control unit 7 may generate, as notification information in the overhead view or the AR display, information indicating the direction of the preceding vehicle or the like that is the target of the environmental parameter with a high degree of contribution, and display that information on or near the host vehicle.
  • Instead of notifying information on an environmental parameter with a high degree of contribution or information related to it, the vehicle control unit 7 may, for example, lower the display luminance of preceding vehicles and the like that are targets of environmental parameters with low degrees of contribution, making them inconspicuous; the relatively conspicuous information on the environmental parameter with a high degree of contribution, or information related to it, may thereby be generated as notification information and displayed on the notification unit 92.
  • The driver model includes a clustering type, constructed by clustering the driving histories of a plurality of drivers, and an individually adaptive type, constructed from a plurality of driving histories similar to the driving history of a specific driver (for example, driver x).
  • In the clustering-type driver model construction method, the driving history of each driver, as shown in FIG. 27, is aggregated in advance. A driver model is then constructed by grouping a plurality of drivers whose travel histories have a high degree of similarity, that is, a plurality of drivers with similar driving operation tendencies.
  • The similarity between travel histories can be determined, for example, from the correlation value between vectors whose elements are the environmental parameter values and the behavior values.
  • For example, when the correlation value calculated from the travel histories of driver a and driver b is higher than a predetermined value, the travel histories of driver a and driver b are placed in one group. The calculation of similarity is not limited to this.
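The grouping step might be sketched as below. The patent fixes only the use of a correlation value as the similarity, so the greedy grouping strategy, the 0.9 threshold, and the example vectors are all assumptions for illustration.

```python
# Sketch of the clustering-type grouping: drivers whose travel-history vectors
# correlate above a threshold are placed in the same group. The greedy
# strategy, the threshold, and the data are illustrative assumptions.
import math

def correlation(u, v):
    """Pearson correlation between two equal-length numeric vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv) if su and sv else 0.0

def group_drivers(histories, threshold=0.9):
    """Greedily group drivers whose histories correlate above the threshold."""
    groups = []
    for driver, vec in histories.items():
        for group in groups:
            # Compare against the first member as the group representative.
            if correlation(vec, histories[group[0]]) > threshold:
                group.append(driver)
                break
        else:
            groups.append([driver])
    return groups

histories = {
    "a": [0.5, 0.2, 0.1, 1.0],
    "b": [0.52, 0.21, 0.12, 0.98],  # similar tendency to driver a
    "c": [0.9, 0.1, 0.8, 0.2],      # different tendency
}
print(group_drivers(histories))  # [['a', 'b'], ['c']]
```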
  • The individually adaptive driver model construction method aggregates the driving histories of a plurality of drivers in advance, as shown in FIG. 27.
  • the difference from the clustering type is that a driver model is constructed for each driver.
  • In constructing it, the driving history of driver y is compared with the driving histories of the other drivers, and the driving histories of a plurality of drivers with high similarity are extracted.
  • an individually adaptive driver model for the driver y is constructed from the extracted driving histories of the plurality of drivers.
  • driver model (situation database) based on the travel history shown in FIG. 27 is not limited to the clustering type or the individual adaptation type, and may be configured to include the travel history of all drivers, for example.
  • A driver model in which the driving histories of four drivers a to d are aggregated is used for driver x.
  • the driver model is constructed by the vehicle control unit 7.
  • FIG. 28 is a diagram showing a method of using the driver model in this modification.
  • FIG. 28A shows the environmental parameters indicating the current travel environment of the vehicle driven by driver x.
  • FIG. 28B is an example of a driver model for the driver x.
  • the behavior (operation) with respect to the environmental parameter indicating the current driving environment is blank.
  • The vehicle control unit 7 acquires the environmental parameters at predetermined intervals and, using any one of them as a trigger, determines the next behavior from the driver model shown in FIG. 28B.
  • As a trigger, for example, an environmental parameter indicating that the operation of the vehicle needs to be changed may be used, such as when the distance to the start point of the merging section falls below a predetermined distance, or when the relative speed with respect to the preceding vehicle falls below a predetermined value.
  • The vehicle control unit 7 compares the environmental parameters shown in FIG. 28A with the environmental parameters of each travel history in the driver model shown in FIG. 28B, and determines the behavior associated with the most similar environmental parameters as the first behavior.
  • some behaviors associated with other similar environmental parameters are determined as second behaviors.
  • Whether the environmental parameters are similar can be determined from the correlation value of the vectors whose elements are the numerical values of the environmental parameters. For example, the correlation value calculated from the vector whose elements are the numerical values of the environmental parameters shown in FIG. 28A and the vector whose elements are the numerical values of the environmental parameters shown in FIG. 28B is higher than a predetermined value. The environmental parameters are determined to be similar. Note that the method for determining whether the environmental parameters are similar is not limited to this.
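The first/second-behavior selection described above might be sketched as follows. The correlation value as the similarity measure follows the text; the 0.9 threshold and all numeric vectors are illustrative assumptions.

```python
# Sketch of choosing first and second behaviors: the behavior tied to the most
# similar stored environmental-parameter vector becomes the first behavior;
# behaviors tied to other sufficiently similar vectors become second behaviors.
# The threshold and all example values are illustrative assumptions.
import math

def similarity(u, v):
    """Correlation value used as the similarity between parameter vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv) if su and sv else 0.0

def select_behaviors(current, model, threshold=0.9):
    """model: list of (environmental_parameter_vector, behavior) pairs."""
    ranked = sorted(model, key=lambda e: similarity(current, e[0]), reverse=True)
    first = ranked[0][1]
    seconds = [b for vec, b in ranked[1:] if similarity(current, vec) > threshold]
    return first, seconds

current = [0.1, 0.5, 0.9]
model = [
    ([0.1, 0.5, 0.9], "decelerate"),   # most similar -> first behavior
    ([0.2, 0.4, 0.9], "lane change"),  # similar -> second behavior
    ([0.9, 0.2, 0.1], "accelerate"),   # dissimilar -> excluded
]
first, seconds = select_behaviors(current, model)
print(first, seconds)  # decelerate ['lane change']
```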
  • the storage unit 8 stores information indicating a safe driving criterion, and the vehicle control unit 7 determines whether or not the driving history satisfies this criterion. Furthermore, the vehicle control unit 7 may register a travel history that satisfies this criterion in the database, and may not register a travel history that does not satisfy this criterion in the database.
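The criterion-based registration might look like the following minimal sketch; the criterion fields (minimum headway, maximum speed) and their values are assumptions, since the patent does not specify what the safe-driving criterion contains.

```python
# Minimal sketch of registering only travel histories that satisfy a stored
# safe-driving criterion. The criterion fields and values are illustrative
# assumptions, not taken from the patent.
SAFE_CRITERION = {"min_headway_s": 2.0, "max_speed_kmh": 120.0}

def register(history, database):
    """Append the history to the database only if it meets the criterion."""
    ok = (history["headway_s"] >= SAFE_CRITERION["min_headway_s"]
          and history["speed_kmh"] <= SAFE_CRITERION["max_speed_kmh"])
    if ok:
        database.append(history)
    return ok

database = []
register({"headway_s": 2.5, "speed_kmh": 100.0}, database)  # registered
register({"headway_s": 1.0, "speed_kmh": 140.0}, database)  # rejected
print(len(database))  # 1
```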
  • In this way, the vehicle control unit 7 can accurately determine the next behavior without determining the specific travel environment, that is, without labeling the travel environment.
  • the driver model may be constructed from a travel history in which a behavior selected by the driver during automatic driving and an environment parameter indicating a travel environment when the behavior is presented are associated with each other.
  • the driver model may be constructed from a travel history in which a behavior selected by the driver during automatic driving and an environmental parameter indicating a travel environment when the behavior is performed by the vehicle are associated with each other.
  • Alternatively, an environmental parameter indicating a future travel environment may be predicted from the environmental parameters indicating the current travel environment; then, among the environmental parameters indicating the travel environment when the vehicle performed a behavior selected by the driver, the behavior associated with the environmental parameter most similar to the predicted parameter may be determined as the first behavior, and some behaviors associated with other similar environmental parameters may be determined as second behaviors.
  • the above prediction is performed, for example, by extrapolating environmental parameters at a future time from environmental parameters indicating the driving environment at the current time and a time before the current time.
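The extrapolation above can be sketched in a few lines; the fixed one-step horizon and the parameter values are illustrative assumptions.

```python
# Sketch of the extrapolation: predict each environmental parameter linearly
# from its value at the current time and at a prior time. The fixed time step
# and the example values are illustrative assumptions.

def extrapolate(prev, curr, steps_ahead=1):
    """next = curr + (curr - prev) * steps_ahead, applied per parameter."""
    return [c + (c - p) * steps_ahead for p, c in zip(prev, curr)]

# Relative speed rising, headway shrinking -> predicted future environment.
print(extrapolate([10.0, 50.0], [12.0, 45.0]))  # [14.0, 40.0]
```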
  • The driver model (situation database) may be constructed from both a travel history in which a behavior selected by the driver during automatic driving is associated with an environmental parameter indicating the travel environment when the behavior was presented, and a travel history in which a behavior selected by the driver during automatic driving is associated with an environmental parameter indicating the travel environment when the vehicle performed the behavior.
  • In this case, for example, both travel histories are stored in a format like that shown in FIG. 28B, and the vehicle control unit 7 determines the next behavior from them.
  • Here, the vehicle control unit 7 may give priority between the two and, for example, preferentially determine the next behavior from the travel history in which the behavior selected by the driver during automatic driving is associated with the environmental parameter indicating the travel environment when the vehicle performed the behavior.
  • a server device such as a cloud server may execute a function similar to the function executed by the vehicle control unit 7.
  • Since the storage unit 8 accumulates an enormous amount of data as travel histories are stored, it may be located in a server device such as a cloud server instead of in the vehicle 1.
  • the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior with reference to the driver model stored in the storage unit 8.
  • In a configuration in which the storage unit 8 is provided in a cloud server, it is desirable to provide a cache in case the storage unit 8 cannot be accessed because of a drop in communication speed or a communication disconnection.
  • FIG. 29 is a block diagram showing an example of cache arrangement.
  • the vehicle control unit 7 stores the travel history in the storage unit 8 through the communication unit 291 and holds a part of the driver model (situation database) stored in the storage unit 8 in the cache 292 through the communication unit 291.
  • the vehicle control unit 7 accesses the driver model of the cache 292.
  • As methods for creating the cache at this time, a method of limiting by the presence or absence of environmental parameters, a method of using position information, a method of processing the data, and the like are conceivable. Each is described below.
  • The vehicle control unit 7 extracts, from the travel environments stored in the storage unit 8, only those travel environments that have the same environmental parameters as the current one, sorts them, and holds them in the cache 292.
  • the vehicle control unit 7 updates the primary cache at the timing when the environmental parameter obtained from the detected situation is changed. By doing so, the vehicle control unit 7 can extract a similar surrounding situation even if the communication speed decreases.
  • the environmental parameters for determining whether or not there is a change may be all of the environmental parameters listed above, or some of the environmental parameters.
  • a primary cache and a secondary cache may be prepared in the cache 292.
  • In this case, the vehicle control unit 7 holds travel environments having the same environmental parameters in the primary cache. In addition, the vehicle control unit 7 holds, in the secondary cache, at least one of the travel environments obtained by adding one environmental parameter to a travel environment held in the primary cache and the travel environments obtained by removing one environmental parameter from a travel environment held in the primary cache.
  • the vehicle control unit 7 can extract a similar situation using only the data in the cache 292 even if a temporary communication interruption occurs.
  • For example, the vehicle control unit 7 extracts, from the storage unit 8 in which all travel environments (situations) are stored, the travel environments in which only the side front vehicle 302 exists (travel environments having only the same environmental parameters), and stores them in the primary cache 304.
  • The vehicle control unit 7 also extracts, from the storage unit 8, travel environments in which one vehicle other than the side front vehicle 302 has been added (travel environments in which one environmental parameter has been added to the same environmental parameters), or travel environments without the side front vehicle 302, and stores them in the secondary cache 305.
  • When the surrounding situation 303 changes, the vehicle control unit 7 copies the travel environments corresponding to the changed surrounding situation 303 from the secondary cache 305 to the primary cache 304, extracts from the storage unit 8 travel environments in which one environmental parameter has been added to, or removed from, the travel environments corresponding to the changed surrounding situation 303, and stores them in the secondary cache 305, thereby updating the secondary cache 305. As a result, the vehicle control unit 7 can smoothly extract a similar surrounding situation by comparing surrounding situations.
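The two-level cache above might be sketched by modeling each stored travel environment as the set of environmental parameters present in it. The frozenset representation, the helper names, and the parameter names are all assumptions for illustration.

```python
# Sketch of the primary/secondary cache: the primary cache holds stored travel
# environments whose parameter set matches the current one exactly; the
# secondary cache holds environments with one parameter added or removed.
# The set-based model and all names are illustrative assumptions.

def build_caches(current_params, storage):
    """Primary: same parameters; secondary: one parameter added or removed."""
    current = frozenset(current_params)
    primary = [env for env in storage if env == current]
    # A symmetric difference of size 1 means exactly one parameter was added
    # to, or removed from, the current environment.
    secondary = [env for env in storage if len(env ^ current) == 1]
    return primary, secondary

storage = [
    frozenset({"side_front_vehicle"}),
    frozenset({"side_front_vehicle", "preceding_vehicle"}),  # one added
    frozenset(),                                             # one removed
    frozenset({"preceding_vehicle", "rear_vehicle"}),        # too different
]
primary, secondary = build_caches({"side_front_vehicle"}, storage)
print(len(primary), len(secondary))  # 1 2
```

When the surrounding situation changes, the same two functions can simply be re-run with the new parameter set, mirroring the copy-and-refresh step described above.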
  • When position information is used, the vehicle control unit 7 can extract from the storage unit 8 those travel environments (situations) whose positions, indicated by the position information, fall within a certain range centered on the vehicle position, and store them in the cache 292.
  • the vehicle control unit 7 updates the cache 292 when the position indicated by the position information corresponding to the traveling environment is out of the certain range. By doing so, the vehicle control unit 7 can extract a similar ambient situation as long as the position is within a certain range even if communication is interrupted for a long time.
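A minimal sketch of this position-based cache follows; the flat x/y coordinates and the 1000 m radius are illustrative assumptions (a real implementation would use geodetic positions).

```python
# Sketch of the position-based cache: hold only stored travel environments
# whose recorded position lies within a fixed radius of the vehicle, and
# rebuild the cache once the vehicle leaves that range. Coordinates and the
# radius are illustrative assumptions.
import math

def build_position_cache(vehicle_pos, storage, radius_m=1000.0):
    """Keep environments whose position is within radius_m of the vehicle."""
    return [env for env in storage
            if math.dist(vehicle_pos, env["pos"]) <= radius_m]

storage = [
    {"pos": (0.0, 500.0), "behavior": "decelerate"},   # within range
    {"pos": (3000.0, 0.0), "behavior": "accelerate"},  # out of range
]
cache = build_position_cache((0.0, 0.0), storage)
print(len(cache))  # 1
```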
  • the storage unit 8 stores operation histories including environmental parameters.
  • The vehicle control unit 7 divides each environmental parameter into ranges of a certain width, creating a mesh in a multidimensional space. The vehicle control unit 7 then creates a table that counts, for each type, the behaviors contained in each mesh cell.
  • The vehicle control unit 7 maps the environmental parameters included in the operation history onto a plane, as shown in FIG. 31A, and divides each axis at fixed intervals, thereby dividing the plane into a plurality of blocks; each block is called a mesh.
  • the vehicle control unit 7 counts the number of behaviors included in each mesh for each type (for example, types such as acceleration, deceleration, lane change, and overtaking).
  • FIG. 31B shows a table in which the number of behaviors included in each mesh is counted for each type.
  • The vehicle control unit 7 holds these contents in the cache 292. Then, when extracting a similar surrounding situation by comparing surrounding situations, the vehicle control unit 7 determines which mesh the detected environmental parameters fall in, selects the behavior with the largest count among the behaviors included in that mesh, and decides to notify the selected behavior.
  • For example, when the vehicle control unit 7 determines that the detected environmental parameters are located in the third mesh, it decides to notify the behavior with the largest count among the behaviors included in the third mesh (here, “acceleration”).
  • With this approach, the cache 292 can be updated at any time, and the capacity of the cache 292 can be kept constant.
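The mesh table of FIG. 31 might be sketched as follows; the bin width, the two-parameter example, and the history values are illustrative assumptions.

```python
# Sketch of the mesh table: quantize each environmental parameter into
# fixed-width bins, count behaviors per mesh cell, then pick the most frequent
# behavior of the cell containing the detected parameters. The bin width and
# data are illustrative assumptions.
from collections import Counter, defaultdict

def mesh_key(params, bin_width=10.0):
    """Map a parameter vector to its mesh cell (a tuple of bin indices)."""
    return tuple(int(p // bin_width) for p in params)

def build_mesh_table(history, bin_width=10.0):
    """Count behaviors per mesh cell, per behavior type."""
    table = defaultdict(Counter)
    for params, behavior in history:
        table[mesh_key(params, bin_width)][behavior] += 1
    return table

def choose_behavior(table, params, bin_width=10.0):
    """Return the most frequent behavior of the cell containing params."""
    counts = table.get(mesh_key(params, bin_width))
    return counts.most_common(1)[0][0] if counts else None

history = [
    ([12.0, 35.0], "acceleration"),
    ([14.0, 33.0], "acceleration"),
    ([11.0, 38.0], "lane change"),
    ([55.0, 5.0], "deceleration"),
]
table = build_mesh_table(history)
print(choose_behavior(table, [13.0, 36.0]))  # acceleration
```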
  • As described above, in the driver model extension of the fourth embodiment, the vehicle control unit 7 acquires feature amount information indicating the driving characteristics of the driver, including information on past travel environments, and the storage unit 8 stores that feature amount information. When it is determined that the behavior of the vehicle needs to be changed, the vehicle control unit 7 determines, from the feature amount information stored in the storage unit 8, information similar to the feature amounts indicating the driving characteristics of the driver that include the newly acquired travel environment information, and notifies the behavior corresponding to the determined information.
  • The feature amount information indicating the driving characteristics of the driver, including past travel environment information, is at least one of feature amount information at the time the behavior of the vehicle was presented to the driver and feature amount information at the time the driver selected the behavior.
  • When the feature amount information indicating the driving characteristics of the driver, including past travel environment information, comprises both feature amount information at the time the behavior of the vehicle was presented to the driver and feature amount information at the time the driver selected the behavior, information similar to the feature amounts indicating the driving characteristics of the driver that include the newly acquired travel environment information is determined from both sets of feature amount information, and the behavior corresponding to the determined information is notified.
  • In the driver model extension of the fourth embodiment, when the feature amount information indicating the driving characteristics of the driver, including past travel environment information, comprises both feature amount information at the time the behavior of the vehicle was presented to the driver and feature amount information at the time the driver selected the behavior, information similar to the feature amounts indicating the driving characteristics of the driver that include the newly acquired travel environment information is determined preferentially from the feature amount information at the time the driver selected the behavior, and the behavior corresponding to the determined information is notified.
  • The feature amount information indicating the driving characteristics of the driver, including past travel environment information, may be feature amount information indicating the driving characteristics of the driver during automatic driving of the vehicle, during manual driving, or during both.
  • the vehicle control unit 7 can construct a driver model more suitable for the driving tendency of the driver, and can perform more appropriate automatic driving for the driver based on the constructed driver model.
  • the parameter indicating the driving environment By associating the parameter indicating the driving environment with the behavior, it is possible to accurately determine the next behavior without requiring processing for determining a specific driving environment, that is, without labeling the driving environment.
  • In recent years, development related to automatic driving of automobiles has been advancing. For automatic driving, the automation levels defined in 2013 by the NHTSA (National Highway Traffic Safety Administration) are: no automation (level 0), automation of specific functions (level 1), automation of complex functions (level 2), semi-automatic driving (level 3), and fully automatic driving (level 4).
  • Level 1 is a driving support system that automatically performs one of acceleration, deceleration, and steering, and level 2 is a driving support system that automatically performs two or more of acceleration, deceleration, and steering in coordination. In either case, the driver remains involved in the driving operation.
  • Level 4 is a fully automatic driving system that automatically performs all of acceleration, deceleration, and steering, and the driver is not involved in the driving operation.
  • Level 3 is a semi-automatic driving system in which acceleration, deceleration, and steering are all performed automatically, and the driver performs driving operations as necessary.
  • This embodiment proposes a device (hereinafter referred to as a “driving support device”) that controls an HMI (Human Machine Interface) for exchanging information on the automatic driving of a vehicle with an occupant of the vehicle (for example, the driver).
  • The “behavior” of the vehicle in the following description corresponds to the “behavior” of the vehicle in the descriptions of the first to fourth embodiments, and includes operating states such as steering and braking while the vehicle is traveling or stopped, during automatic or manual driving, as well as control content related to automatic driving control.
  • “Action” is, for example, constant speed running, acceleration, deceleration, temporary stop, stop, lane change, course change, right / left turn, parking, and the like.
  • In the fifth embodiment, processing for further improving the accuracy of estimating the next action is described for the individually adaptive driver model described in the fourth embodiment.
  • In the fourth embodiment, after collecting a travel history for each driver, the operation frequency distribution of each driver is analyzed, travel histories of other drivers similar to the travel history of the target driver are selected, and a driver model is generated based on the selected travel histories. That is, a driver model adapted to the individual is generated by grouping per driver.
  • However, the driving behavior of a driver may change depending on whether a passenger is present and on the passenger's state. For example, a driver may change lanes when there is no passenger, but decelerate without changing lanes when there is a passenger.
  • To cope with this, in the fifth embodiment, a travel history is collected for each combination of driver and passenger, a travel history of another combination similar to the travel history of the target combination is selected, and a driver model is generated based on the selected travel history. That is, the processing executed per driver in the fourth embodiment is executed per combination of driver and passenger, subdividing the unit of processing. Improving the accuracy of action estimation requires more driver travel histories, and the processing load increases; therefore, the processing here is assumed to be performed on a cloud server.
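The subdivision by driver/passenger combination might be sketched as follows; the field names, passenger-state labels, and example values are assumptions for illustration.

```python
# Sketch of aggregating travel histories per (driver, passenger state)
# combination, so that a driver model can be built per combination rather
# than per driver. All names and values are illustrative assumptions.
from collections import defaultdict

histories = defaultdict(list)

def record(driver, passenger_state, environment, behavior):
    """Aggregate one observation under its driver/passenger combination."""
    histories[(driver, passenger_state)].append((environment, behavior))

# The same driver in a similar environment may exhibit a different behavior
# depending on whether a passenger is present.
record("x", "alone", {"headway_s": 1.5}, "lane change")
record("x", "with_passenger", {"headway_s": 1.5}, "decelerate")
print(len(histories))  # 2 separate combinations
```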
  • FIG. 32 is a block diagram showing a configuration of the vehicle 1000, and shows a configuration related to automatic driving.
  • the vehicle 1000 can travel in the automatic driving mode, and includes a notification device 1002, an input device 1004, a wireless device 1008, a driving operation unit 1010, a detection unit 1020, an automatic driving control device 1030, and a driving support device 1040.
  • Each device shown in FIG. 32 may be connected by wired communication such as a dedicated line or CAN (Controller Area Network), or by wired or wireless communication such as USB (Universal Serial Bus), Ethernet (registered trademark), Wi-Fi (registered trademark), or Bluetooth (registered trademark).
  • Vehicle 1000 corresponds to vehicle 1 in the first to fourth embodiments.
  • The notification device 1002 corresponds to the information notification device 9 in FIGS. 1 and 13, the input device 1004 corresponds to the operation unit 51 in FIG. 1 and the input unit 102 in FIG. 13, and the detection unit 1020 corresponds to the detection unit 6 in FIGS. 1 and 13.
  • the automatic driving control device 1030 and the driving support device 1040 correspond to the vehicle control unit 7 in FIGS. 1 and 13.
  • the description of the configuration described in the first to fourth embodiments will be omitted as appropriate.
  • the notification device 1002 notifies the driver of information related to the traveling of the vehicle 1000.
  • The notification device 1002 may be a display unit that displays information, such as a car navigation system, a head-up display, or a center display installed in the vehicle, or a light-emitting body installed around the steering wheel, a pillar, the dashboard, or the meter panel. It may also be a speaker that converts information into sound to notify the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel).
  • the notification device 1002 may be a combination of these.
  • the input device 1004 is a user interface device that receives an operation input by an occupant.
  • For example, the input device 1004 receives information related to the automatic driving of the host vehicle input by the driver, and outputs the received information to the driving support device 1040 as an operation signal.
  • FIG. 33 schematically shows the interior of the vehicle 1000 in FIG.
  • the notification device 1002 may be a head-up display (HUD) 1002a or a center display 1002b.
  • the input device 1004 may be the first operation unit 1004a provided on the steering 1011 or the second operation unit 1004b provided between the driver seat and the passenger seat. Note that the notification device 1002 and the input device 1004 may be integrated, and may be implemented as a touch panel display, for example.
  • the vehicle 1000 may be provided with a speaker 1006 that presents information related to automatic driving to the occupant by voice.
  • the driving support device 1040 may display an image indicating information related to automatic driving on the notification device 1002 and output a sound indicating information related to automatic driving from the speaker 1006 together with or instead of the information.
  • the wireless device 1008 corresponds to a mobile phone communication system, WMAN (Wireless Metropolitan Area Network), and the like, and performs wireless communication with a device (not shown) outside the vehicle 1000.
  • the driving operation unit 1010 includes a steering 1011, a brake pedal 1012, an accelerator pedal 1013, and a blinker switch 1014.
  • The steering 1011 corresponds to the steering wheel in FIGS. 1 and 13, the brake pedal 1012 to the brake pedal 2 in FIGS. 1 and 13, the accelerator pedal 1013 to the accelerator pedal 3 in FIGS. 1 and 13, and the winker switch 1014 to the winker lever 4 in FIGS. 1 and 13.
  • Steering 1011, brake pedal 1012, accelerator pedal 1013, and winker switch 1014 can be electronically controlled by a steering ECU, a brake ECU, an engine ECU, a motor ECU, and a winker controller.
  • the steering ECU, the brake ECU, the engine ECU, and the motor ECU drive the actuator in accordance with a control signal supplied from the automatic operation control device 1030.
  • the blinker controller turns on or off the blinker lamp according to a control signal supplied from the automatic operation control device 1030.
  • Detecting unit 1020 detects the surrounding state and running state of vehicle 1000. As described in part in the first to fourth embodiments, for example, the detection unit 1020 detects the speed of the vehicle 1000, the relative speed of the preceding vehicle with respect to the vehicle 1000, the distance between the vehicle 1000 and the preceding vehicle, and the side lane with respect to the vehicle 1000. The relative speed of the vehicle, the distance between the vehicle 1000 and the vehicle in the side lane, and the position information of the vehicle 1000 are detected.
  • the detection unit 1020 outputs various types of detected information (hereinafter referred to as “detection information”) to the automatic driving control device 1030 and the driving support device 1040. Details of the detection unit 1020 will be described later.
  • the automatic driving control device 1030 is an automatic driving controller that implements an automatic driving control function, and determines the behavior of the vehicle 1000 in automatic driving.
  • the automatic operation control device 1030 includes a control unit 1031, a storage unit 1032, and an I / O unit (input / output unit) 1033.
  • the configuration of the control unit 1031 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources.
  • the storage unit 1032 includes a nonvolatile recording medium such as a flash memory.
  • the I / O unit 1033 executes communication control according to various communication formats. For example, the I / O unit 1033 outputs information related to automatic driving to the driving support device 1040 and inputs a control command from the driving support device 1040. Further, the I / O unit 1033 inputs detection information from the detection unit 1020.
  • the control unit 1031 applies the control command input from the driving support device 1040 and the various information collected from the detection unit 1020 or the various ECUs to the automatic driving algorithm, and calculates control values for the automatically controlled targets, such as the traveling direction of the vehicle 1000.
  • the control unit 1031 transmits the calculated control value to each control target ECU or controller. In this embodiment, it is transmitted to the steering ECU, the brake ECU, the engine ECU, and the winker controller. In the case of an electric vehicle or a hybrid car, the control value is transmitted to the motor ECU instead of or in addition to the engine ECU.
  • the driving support device 1040 is an HMI controller that executes an interface function between the vehicle 1000 and the driver, and includes a control unit 1041, a storage unit 1042, and an I / O unit 1043.
  • the control unit 1041 executes various data processing such as HMI control.
  • the control unit 1041 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources.
  • the storage unit 1042 is a storage area that stores data that is referred to or updated by the control unit 1041. For example, it is realized by a non-volatile recording medium such as a flash memory.
  • the I / O unit 1043 executes various communication controls according to various communication formats.
  • the I / O unit 1043 includes an operation input unit 1050, an image output unit 1051, a detection information input unit 1052, a command IF (Interface) 1053, and a communication IF 1056.
  • the operation input unit 1050 receives an operation signal from the input device 1004 by the operation of the driver, the occupant, or the user outside the vehicle made to the input device 1004, and outputs it to the control unit 1041.
  • the image output unit 1051 outputs the image data generated by the control unit 1041 to the notification device 1002 for display.
  • the detection information input unit 1052 receives, as a result of the detection process by the detection unit 1020, information indicating the current surrounding state and running state of the vehicle 1000 (hereinafter referred to as "detection information") from the detection unit 1020, and outputs it to the control unit 1041.
  • the command IF 1053 executes an interface process with the automatic driving control apparatus 1030, and includes a behavior information input unit 1054 and a command output unit 1055.
  • the behavior information input unit 1054 receives information regarding the automatic driving of the vehicle 1000 transmitted from the automatic driving control device 1030 and outputs the information to the control unit 1041.
  • the command output unit 1055 receives from the control unit 1041 a control command that instructs the automatic driving control device 1030 to specify the mode of automatic driving, and transmits the control command to the automatic driving control device 1030.
  • the communication IF 1056 executes interface processing with the wireless device 1008.
  • the communication IF 1056 outputs the data received from the control unit 1041 to the wireless device 1008, which transmits it to a device outside the vehicle.
  • the communication IF 1056 receives data from a device outside the vehicle transferred by the wireless device 1008 and outputs the data to the control unit 1041.
  • the automatic driving control device 1030 and the driving support device 1040 are configured as separate devices.
  • the automatic driving control device 1030 and the driving support device 1040 may be integrated into one controller.
  • one automatic driving control device may be configured to have both functions of the automatic driving control device 1030 and the driving support device 1040 of FIG.
  • FIG. 34 is a block diagram showing a detailed configuration of the detection unit 1020 and the detection information input unit 1052.
  • the detection unit 1020 includes a first detection unit 1060 and a second detection unit 1062, and the detection information input unit 1052 includes a first input unit 1070 and a second input unit 1072.
  • the first detection unit 1060 includes a position information acquisition unit 1021, a sensor 1022, a speed information acquisition unit 1023, and a map information acquisition unit 1024.
  • the second detection unit 1062 includes a driver sensing unit 1064 and a passenger sensing unit 1066.
  • the first detection unit 1060 mainly detects the surrounding state and the running state of the vehicle 1000.
  • the first detection unit 1060 outputs the detected information (hereinafter referred to as “first detection information”) to the first input unit 1070.
  • the first input unit 1070 inputs the first detection information from the first detection unit 1060.
  • the second detection unit 1062 mainly detects information on the driver who is on the vehicle 1000 and the passenger.
  • the second detection unit 1062 outputs the detected information (hereinafter referred to as “second detection information”) to the second input unit 1072.
  • the second input unit 1072 inputs the second detection information from the second detection unit 1062. Note that the combination of the first detection information and the second detection information or one of them corresponds to the detection information described above.
  • the position information acquisition unit 1021 of the first detection unit 1060 acquires the current position of the vehicle 1000 from the GPS receiver.
  • the sensor 1022 is a generic name for various sensors for detecting the situation outside the vehicle and the state of the vehicle 1000.
  • a camera, a millimeter wave radar, a LIDAR (Light Detection and Ranging or Laser Imaging Detection and Ranging), a temperature sensor, an atmospheric pressure sensor, a humidity sensor, an illuminance sensor, and the like are mounted as sensors for detecting the situation outside the vehicle.
  • the situation outside the vehicle includes a road condition in which the host vehicle travels including lane information, an environment including weather, a situation around the host vehicle, and other vehicles in the vicinity (such as other vehicles traveling in the adjacent lane). Any information outside the vehicle that can be detected by the sensor may be used.
  • an acceleration sensor, a gyro sensor, a geomagnetic sensor, a tilt sensor, and the like are mounted as sensors for detecting the state of the vehicle 1000.
  • Speed information acquisition unit 1023 acquires the current speed of vehicle 1000 from the vehicle speed sensor.
  • the map information acquisition unit 1024 acquires map information around the current position of the vehicle 1000 from the map database.
  • the map database may be recorded on a recording medium in the vehicle 1000, or may be downloaded from a map server via a network when used.
  • the driver sensing unit 1064 of the second detection unit 1062 authenticates the driver sitting on the driver's seat of the vehicle 1000.
  • a camera capable of imaging a driver's face seated in the driver's seat of the vehicle 1000 is installed in the vehicle, and the driver's face is imaged by the camera.
  • the driver sensing unit 1064 previously holds information regarding the driver's face that may be seated in the driver's seat of the vehicle 1000.
  • the information related to the driver's face is, for example, a face image, feature point information of the face image, or the like.
  • the driver sensing unit 1064 identifies an individual driver seated in the driver's seat by comparing the image captured by the camera with information related to the driver's face.
  • alternatively, a TOF (Time Of Flight) sensor or a fingerprint sensor may be installed in the vehicle, and the driver sensing unit 1064 may identify the individual driver seated in the driver's seat based on the information acquired by such a sensor. The driver sensing unit 1064 outputs the identified driver information as the second detection information.
  • the passenger sensing unit 1066 authenticates passengers seated in the passenger seat and rear seat of the vehicle 1000.
  • a seating sensor is installed in each seat, and the passenger sensing unit 1066 identifies the presence or absence of a passenger based on information acquired by the seating sensor.
  • for example, the passenger sensing unit 1066 identifies that there is a passenger in the passenger seat and no passenger in the rear seat.
  • alternatively, a camera capable of capturing the faces of the passengers seated in the passenger seat and the rear seat may be installed in the vehicle, and the passenger sensing unit 1066 may specify the presence or absence of passengers and information related to the passengers based on the images captured by the camera.
  • information related to a passenger includes age/sex, personal identification, and occupant state (drowsiness, car sickness).
  • the TOF sensor is installed in the vehicle, and based on the information acquired by the TOF sensor, the passenger sensing unit 1066 may specify the presence / absence of the passenger and information related to the passenger.
  • the fellow passenger sensing unit 1066 outputs the presence / absence of the identified fellow passenger as second detection information. Further, when the passenger sensing unit 1066 specifies information related to the passenger, the passenger sensing unit 1066 outputs the information as the second detection information.
  • FIG. 35 is a block diagram illustrating a detailed configuration of the control unit 1041.
  • the control unit 1041 includes a detection unit 1100, a travel history generation unit 1102, a transmission unit 1104, an inquiry unit 1106, a travel history acquisition unit 1108, a driver model generation unit 1110, a determination unit 1112, a confirmation unit 1114, a screen generation unit 1116, an instruction Part 1118.
  • the detection unit 1100 is connected to a door opening / closing sensor of the vehicle 1000 and is also connected to the second input unit 1072.
  • the detection unit 1100 is notified from the opening / closing sensor of the timing at which the door is opened / closed. Since a known technique may be used for detection of the opening / closing timing by the opening / closing sensor, description thereof is omitted here.
  • the detection unit 1100 receives the second detection information from the second input unit 1072 when receiving notification of the opening / closing timing.
  • the detection unit 1100 may input the second detection information from the second input unit 1072 when the passenger's state changes.
  • the detection unit 1100 detects the individual driver of the vehicle 1000 by inputting the second detection information.
  • the detection unit 1100 also detects the presence or absence of a passenger of the vehicle 1000 by inputting the second detection information. Further, the detection unit 1100 may detect information regarding the passengers of the vehicle 1000.
  • the travel history generation unit 1102 is connected to the first input unit 1070, the detection unit 1100, and the instruction unit 1118. Although details will be described later, when the instruction unit 1118 instructs the automatic driving control apparatus 1030 to perform the next action, the instruction unit 1118 notifies the travel history generation unit 1102 of the instructed action. Here, the next action is selected by the driver, and the actions are, for example, “deceleration”, “acceleration”, “constant speed running”, “right lane change”, and the like. When the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs the first detection information from the first input unit 1070 and the information from the detection unit 1100.
  • the traveling history generation unit 1102 derives environmental parameters based on various information included in the first detection information.
  • the environmental parameters include, for example, "the speed Va of the host vehicle", "the relative speed Vba of the preceding vehicle with respect to the host vehicle", "the inter-vehicle distance DRba between the preceding vehicle and the host vehicle", "the inter-head distance Dca between the side rear vehicle and the host vehicle", "the rate of change Rca of the inter-head distance", "the relative speed Vda of the side front vehicle with respect to the host vehicle", "the inter-head distance Dda between the side front vehicle and the host vehicle", "the rate of change Rda of the inter-head distance", "the remaining side lane length DRda of the host vehicle", and "the relative speed Vma of the merging vehicle with respect to the host vehicle".
  • the travel history generated by the travel history generation unit 1102 will be described below for each type of information input from the detection unit 1100.
  • the following types of information are assumed: (1) the individual driver and the presence or absence of passengers, (2) the individual driver and the passengers' age/sex, (3) the individual driver and the individual passengers, (4) the individual driver, the presence or absence of a passenger, and the passenger's state, and (5) the individual driver alone.
  • Each travel history is shown in FIG. 36, FIG. 37, and FIG. FIG. 36 shows a data structure of a travel history generated by the travel history generation unit 1102, and FIG. 37 shows another data structure of a travel history generated by the travel history generation unit 1102.
  • for (1), the travel history generation unit 1102 generates the travel history shown in FIG. 36(a). Specifically, at the timing when the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs, as information from the detection unit 1100, the driver's name, the presence or absence of a passenger in the front passenger seat, and the number of passengers in the rear seat.
  • in FIG. 36(a), the driver's name is indicated as "A" or "B"; when there is a passenger in the passenger seat, "○" is indicated, and when there is none, "×" is indicated; the number of passengers in the rear seat is indicated as "0" or "1".
  • the travel history generation unit 1102 also inputs values such as "Va" as the travel history at that timing. Furthermore, the travel history generation unit 1102 collects the input information, the values, and the action indicated in the notification from the instruction unit 1118, for example "deceleration", and stores them in one row of FIG. 36(a). That is, the travel history generation unit 1102 generates a travel history in which the environmental parameters indicating a travel environment in which the vehicle 1000 traveled in the past are associated with the action selected by the driver under those environmental parameters. At that time, the travel history is generated for each combination of the driver and the presence or absence of passengers.
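The per-combination bookkeeping described above can be sketched in code. This is an illustrative model only: the field names and the reduced set of environmental parameters are hypothetical, not the patent's actual data structure.

```python
# Illustrative sketch of the per-combination travel history of case (1).
# Field names (va, vba, drba) and the reduced parameter set are hypothetical.
from dataclasses import dataclass

@dataclass
class TravelHistoryEntry:
    va: float     # speed of the host vehicle
    vba: float    # relative speed of the preceding vehicle
    drba: float   # inter-vehicle distance to the preceding vehicle
    action: str   # action selected by the driver, e.g. "deceleration"

# travel history keyed by (driver name, passenger-seat occupied, rear-seat count)
travel_history: dict = {}

def record(driver: str, front: bool, rear: int, params: dict, action: str) -> None:
    key = (driver, front, rear)
    entry = TravelHistoryEntry(params["va"], params["vba"], params["drba"], action)
    travel_history.setdefault(key, []).append(entry)

# one row per selected action, grouped under the current combination
record("A", False, 0, {"va": 60.0, "vba": -5.0, "drba": 30.0}, "deceleration")
record("A", True, 1, {"va": 40.0, "vba": 2.0, "drba": 50.0}, "acceleration")
```

Each call to `record` corresponds to one row of the figure: the environmental parameters at the moment an action was instructed, filed under the current driver/passenger combination.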
  • for (2), the travel history generation unit 1102 generates the travel history shown in FIG. 36(b). Specifically, at the timing when the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs the driver's name and the passengers' age/sex as information from the detection unit 1100.
  • the age/sex of the passengers is indicated as, for example, "30s/female" or "30s/female, boy".
  • the former indicates that there is one passenger, and the latter indicates that there are two passengers.
  • such passenger age and sex can be said to be information related to the passengers.
  • the travel history generation unit 1102 collects the driver's name, the passengers' age/sex, the values related to the travel history, and the action indicated in the notification from the instruction unit 1118, and stores them in one row of FIG. 36(b). In other words, the travel history generation unit 1102 generates a travel history for each combination of the driver, the presence or absence of passengers detected in the past, and the information related to the passengers.
  • for (3), the travel history generation unit 1102 generates the travel history shown in FIG. 37(a). Specifically, at the timing when the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs, as information from the detection unit 1100, the driver's name, the name of the passenger in the passenger seat, and the name of the passenger in the rear seat. In FIG. 37(a), the names of the passengers are indicated as "B", "C", and "D". By confirming the passengers' names, the number of passengers is also clarified. As in the case of (1), the travel history generation unit 1102 collects the driver's name, the name of the passenger in the passenger seat, the name of the passenger in the rear seat, the values related to the travel history, and the action indicated in the notification from the instruction unit 1118, and stores them in one row of FIG. 37(a). Here again, the travel history is generated in the travel history generation unit 1102 for each combination of the driver, the presence or absence of passengers detected in the past, and the information related to the passengers.
  • for (4), the travel history generation unit 1102 generates the travel history shown in FIG. 37(b). Specifically, at the timing when the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs, as information from the detection unit 1100, the driver's name, the presence or absence of a passenger in the passenger seat, and the passenger's state. In FIG. 37(b), the passenger's state is indicated as "normal", "sleeping", or "car sickness". The number of passengers is also clarified by confirming the passengers' states.
  • the travel history generation unit 1102 collects the driver's name, the presence or absence of a passenger in the passenger seat, the passenger's state, the values related to the travel history, and the action indicated in the notification from the instruction unit 1118, and stores them in one row of FIG. 37(b). Here again, the travel history is generated for each combination of the driver, the presence or absence of passengers detected in the past, and the information related to the passengers.
  • for (5), the travel history generation unit 1102 generates a travel history by executing the processing of (1) to (4) with the passenger-related part omitted. That is, the travel history generation unit 1102 generates, for each driver, a travel history in which the environmental parameters indicating a travel environment in which the vehicle 1000 traveled in the past are associated with the action selected by the driver under those environmental parameters.
  • as described above, the travel history generation unit 1102 generates a travel history using the presence or absence of passengers, the passengers' age/sex, the passengers' names, the passengers' states, and the like; the travel history may also be generated by combining these arbitrarily.
  • travel history generation section 1102 outputs the travel history to transmission section 1104 and inquiry section 1106.
  • the transmission unit 1104 inputs the travel history from the travel history generation unit 1102.
  • the transmission unit 1104 notifies a cloud server (not shown) of the travel history update via the communication IF 1056 and the wireless device 1008.
  • the cloud server is provided outside the vehicle 1000 in order to collect travel histories generated in the driving support device 1040 installed in each of the plurality of vehicles 1000. That is, the cloud server collectively manages the travel histories generated in each of the plurality of driving support devices 1040, but for convenience of explanation, the travel histories stored in the cloud server are referred to as “total travel histories”.
  • when the cloud server receives the travel history update notification, it requests the wireless device 1008 to transmit the travel history.
  • before transmission, the transmission unit 1104 assigns identification information (hereinafter referred to as "ID") to each combination of the driver's name and the presence or absence of passengers included in the travel history.
  • FIG. 38 shows an outline of processing in the transmission unit 1104.
  • FIG. 38(a) shows the data structure of the travel history input to the transmission unit 1104, which is the same as FIG. 36(a). FIG. 38(b) shows the correspondence between the combinations of the driver's name and the presence or absence of passengers on the one hand and the IDs on the other.
  • for example, when the driver's name is "A", the passenger in the passenger seat is "none", and the passenger in the rear seat is "none", the ID "0001" is associated.
  • when the driver's name is "A", the passenger in the passenger seat is "present", and the number of passengers in the rear seat is "1", the ID "0003" is associated.
  • the IDs are defined so as not to overlap among the plurality of driving support devices 1040.
  • here, when a combination in which the driver's name is "B", the passenger in the passenger seat is "none", and the passenger in the rear seat is "none" is added by updating the travel history, and no ID has yet been assigned to that combination, the transmission unit 1104 assigns a new ID "0004" to the combination.
  • note that in the case of (2), the passengers' age/sex is included in the combination, and in the case of (5), only the driver's name is used instead of a combination.
  • the transmission unit 1104 uses the correspondence shown in FIG. 38(b) to replace each combination shown in FIG. 38(a) with its ID.
  • FIG. 39 shows another outline of the processing in the transmission unit 1104. As illustrated, each combination is replaced with an ID. By using such IDs, even the information related to driver "A" is separated into three pieces of information with IDs "0001" to "0003".
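The ID assignment performed by the transmission unit 1104 might be sketched as follows; the sequential four-digit numbering scheme is an assumption chosen to match the examples "0001" to "0004" above.

```python
# Hedged sketch of ID assignment: each new (driver, front-passenger,
# rear-passenger) combination gets the next unused four-digit ID, so that
# "A with no passengers" and "A with passengers" become distinct IDs, as in
# FIG. 38(b). The numbering scheme is illustrative, not from the patent.

id_table: dict = {}

def assign_id(driver: str, front: str, rear: str) -> str:
    key = (driver, front, rear)
    if key not in id_table:
        id_table[key] = f"{len(id_table) + 1:04d}"  # "0001", "0002", ...
    return id_table[key]

assert assign_id("A", "none", "none") == "0001"
assign_id("A", "none", "one person")                 # -> "0002"
assert assign_id("A", "present", "one person") == "0003"
# an updated travel history introducing driver "B" gets a fresh ID
assert assign_id("B", "none", "none") == "0004"
# re-querying an existing combination returns the same ID
assert assign_id("A", "none", "none") == "0001"
```

Replacing each combination with its ID before transmission separates one driver's data into distinct anonymous streams, as in FIG. 39.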
  • the transmission unit 1104 outputs the travel history replaced with the ID (hereinafter also referred to as “travel history”) to the communication IF 1056.
  • the communication IF 1056 causes the wireless device 1008 to transmit a travel history to the cloud server. At that time, only the updated portion of the travel history may be transmitted.
  • the cloud server adds the received travel history to the total travel history.
  • the inquiry unit 1106 inputs the travel history from the travel history generation unit 1102.
  • the inquiry unit 1106 also inputs information from the detection unit 1100.
  • the information input here is a combination of the name of the current driver and the presence or absence of the current passenger.
  • the inquiry unit 1106 extracts a travel history for a combination of the name of the current driver and the presence or absence of a current passenger from the travel history generated by the travel history generation unit 1102.
  • FIG. 40 shows an outline of processing in the inquiry unit 1106.
  • FIG. 40(a) shows the data structure of the travel history input to the inquiry unit 1106, which is the same as FIG. 36(a).
  • FIG. 40(b) shows the result of extracting the travel history for the current combination from the travel history shown in FIG. 40(a).
  • the inquiry unit 1106 transmits an inquiry signal for retrieving a travel history similar to the extracted travel history from the total travel history to the cloud server via the communication IF 1056 and the wireless device 1008.
  • the inquiry signal includes the extracted traveling history (hereinafter referred to as “inquiry traveling history”).
  • when the cloud server receives the inquiry signal, it acquires the inquiry travel history from the inquiry signal.
  • the cloud server searches and acquires a travel history similar to the inquiry travel history from the total travel history. More specifically, the cloud server extracts one action and an environment parameter corresponding to the action from the inquiry travel history.
  • the extracted environmental parameters are referred to as “first environmental parameters”.
  • the cloud server acquires a plurality of environmental parameters corresponding to the extracted behavior from the total travel history.
  • each of the acquired plurality of environment parameters is referred to as a “second environment parameter”.
  • the cloud server calculates a correlation value of vectors having the numerical value of the first environmental parameter and the numerical value of one second environmental parameter as elements. If the correlation value is larger than a threshold value (hereinafter referred to as “in-server threshold value”), the cloud server specifies an ID corresponding to the second environmental parameter and all the environmental parameters to which the ID is assigned. Obtained from the total travel history. On the other hand, if the correlation value is equal to or less than the intra-server threshold value, the cloud server does not execute acquisition. The cloud server executes such a process for each of the acquired plurality of second environment parameters, and also executes such a process for other actions included in the inquiry travel history. As a result, the cloud server acquires one or more environmental parameters similar to the inquiry travel history.
  • a plurality of IDs may be mixed in the acquired environmental parameters.
  • the cloud server collects the acquired environmental parameters as “similar travel history”. In that case, the action corresponding to each environmental parameter is also included.
  • the similar running history has a data structure as shown in FIG. 39, for example.
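The similarity search described above could be sketched as follows. This is a simplified model: Pearson correlation and the threshold value 0.9 are assumptions (the patent only speaks of a correlation value exceeding the in-server threshold), and the row format `(id, environmental parameters, action)` is hypothetical.

```python
# Simplified sketch of the cloud server's similarity search: for an action in
# the inquiry travel history, the correlation between the first environmental
# parameter vector and each second environmental parameter vector is computed;
# when it exceeds the in-server threshold, every row carrying that row's ID is
# collected into the similar travel history.
import math

def correlation(x, y):
    # Pearson correlation of two equal-length vectors (an assumed metric)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def similar_history(first_param, action, total_history, threshold=0.9):
    # total_history: list of (id, env_params, action) rows (hypothetical layout)
    matched_ids = {row_id for row_id, params, act in total_history
                   if act == action and correlation(first_param, params) > threshold}
    # all rows carrying a matched ID are taken, whatever their action
    return [row for row in total_history if row[0] in matched_ids]
```

Note that the whole history of a matched ID is returned, reflecting the description that the server acquires "all the environmental parameters to which the ID is assigned".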
  • the acquisition unit 1108 acquires a similar traveling history from the cloud server via the wireless device 1008 and the communication IF 1056 as a response to the inquiry by the inquiry unit 1106.
  • the similar travel history is a travel history similar to the travel history for the combination of the current driver and the current presence / absence of a passenger.
  • in the case of (2), the passengers' age/sex is included in the combination.
  • the driver model generation unit 1110 inputs the similar travel history from the acquisition unit 1108.
  • the driver model generation unit 1110 generates a driver model based on the similar traveling history.
  • the driver model generation unit 1110 generates a driver model by combining an inquiry travel history and a similar travel history.
  • FIG. 41 shows a data structure of a driver model generated by the driver model generation unit 1110. As shown in the figure, IDs, environmental parameters, and actions included in the similar traveling history are combined. Here, the ID is not shown, and the part where the environmental parameter and the action are combined corresponds to the inquiry travel history. Note that an ID may not be included in the driver model.
  • the driver model generation unit 1110 may generate a driver model by averaging the numerical values of the environmental parameters in the same action in the inquiry travel history and the similar travel history.
  • the driver model generation unit 1110 outputs the driver model to the determination unit 1112.
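The averaging variant of driver model generation mentioned above might look like the following sketch, where the parameter vectors of rows sharing the same action are averaged element-wise; the data layout is an assumption.

```python
# Sketch of driver model generation by averaging: the numerical environmental
# parameters of rows with the same action (across the inquiry travel history
# and the similar travel history) are averaged element-wise, yielding one
# representative parameter vector per action. Layout is illustrative.
from collections import defaultdict

def build_driver_model(rows):
    # rows: list of (env_params, action) pairs
    grouped = defaultdict(list)
    for params, action in rows:
        grouped[action].append(params)
    # element-wise mean of the parameter vectors for each action
    return {action: [sum(col) / len(col) for col in zip(*vecs)]
            for action, vecs in grouped.items()}

model = build_driver_model([
    ([60.0, 30.0], "deceleration"),
    ([62.0, 32.0], "deceleration"),
    ([40.0, 50.0], "acceleration"),
])
# model["deceleration"] == [61.0, 31.0]
```

The non-averaged variant would instead keep every row, as in the FIG. 41 data structure.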
  • the determination unit 1112 inputs a driver model from the driver model generation unit 1110. Further, the determination unit 1112 receives the first detection information from the first input unit 1070. The determination unit 1112 derives the current environment parameter based on various types of information included in the first detection information. Since the environmental parameters are as described above, description thereof is omitted here. The determination unit 1112 calculates a correlation value of a vector whose elements are the value of the environmental parameter shown in each row of the driver model shown in FIG. 41 and the value of the current environmental parameter. Further, the determination unit 1112 repeatedly executes the calculation of the correlation value while changing the row of the driver model shown in FIG. As a result, a plurality of correlation values corresponding to each row of the driver model shown in FIG. 41 are derived.
  • the determination unit 1112 selects a maximum correlation value from among a plurality of correlation values, and then selects an action indicated in a row corresponding to the selected correlation value as an “action candidate”. The selection of an action candidate corresponds to determining the next action.
  • alternatively, the determination unit 1112 may set a threshold value in advance and select, from among the plurality of correlation values, those larger than the threshold value.
  • in that case, the determination unit 1112 takes statistics of the actions indicated in the selected rows, and sets the "first action candidate", "second action candidate", ..., "Nth action candidate" in descending order of frequency. Note that an upper limit may be set on the number of action candidates.
  • the determination unit 1112 outputs one or more action candidates to the confirmation unit 1114.
  • the confirmation unit 1114 is connected to the behavior information input unit 1054, and inputs information related to automatic driving from the automatic driving control device 1030 via the behavior information input unit 1054. In the information related to automatic driving, the next action of the vehicle 1000 is indicated. The next action (hereinafter referred to as “automatic action”) is determined by the automatic driving algorithm in the automatic driving control apparatus 1030. For this reason, the automatic behavior may not match the driver's sense.
  • the confirmation unit 1114 also inputs one or more action candidates from the determination unit 1112. The confirmation unit 1114 outputs these to the screen generation unit 1116 in order for the driver to select one of the automatic behavior and one or more behavior candidates.
  • the screen generation unit 1116 inputs automatic behavior and one or more behavior candidates from the confirmation unit 1114.
  • the screen generation unit 1116 generates an image in which these are collected.
  • FIG. 42 shows a screen generated by the screen generation unit 1116.
  • an action image 1200 is arranged at the center of the screen.
  • the screen generation unit 1116 stores in advance the contents of a plurality of types of automatic actions and images corresponding to them, and generates an action image 1200 by selecting an image corresponding to the input automatic action.
  • a first action candidate image 1202a and a second action candidate image 1202b are arranged on the right side of the screen.
  • the first action candidate image 1202a and the second action candidate image 1202b are collectively referred to as action candidate images 1202.
  • the first action candidate image 1202a is generated from the first action candidate, and the second action candidate image 1202b is generated from the second action candidate. The screen generation unit 1116 generates them in the same manner as the action image 1200.
  • the screen generation unit 1116 outputs the generated screen image to the image output unit 1051 as image data.
  • the image output unit 1051 displays the screen of the action candidate image 1202 by outputting the image data to the notification device 1002.
  • the notification device 1002 displays the screen shown in FIG. 42. Using the input device 1004, the driver selects one of the behavior image 1200, the first behavior candidate image 1202a, and the second behavior candidate image 1202b.
  • the operation input unit 1050 inputs the selection result as an operation signal from the input device 1004 and outputs the selection result to the control unit 1041.
  • the confirmation unit 1114 inputs the selection result from the operation input unit 1050.
  • the confirmation unit 1114 confirms selection of the first action candidate if the selection result is the first action candidate image 1202a, and confirms selection of the second action candidate if the selection result is the second action candidate image 1202b.
  • the confirmation unit 1114 confirms the selection of the automatic action if the selection result is the action image 1200.
  • the confirmation unit 1114 also confirms selection of the automatic behavior when no selection result has been input within a certain period after the one or more behavior candidates were output to the screen generation unit 1116.
  • the confirmation unit 1114 outputs the selected action candidate to the instruction unit 1118.
  • when a behavior candidate is input from the confirmation unit 1114, the instruction unit 1118 instructs the automatic driving control device 1030, via the command output unit 1055, to perform the action corresponding to that candidate. Specifically, the instruction unit 1118 outputs the input behavior candidate to the command output unit 1055.
  • when a behavior candidate is input from the instruction unit 1118, the command output unit 1055 outputs a control command corresponding to that candidate to the automatic driving control device 1030.
  • the automatic driving control device 1030 controls the automatic driving of the vehicle 1000 with the behavior candidate as the next action. Therefore, even when the automatic behavior indicates "deceleration", if the behavior candidate "right lane change" is selected, the vehicle 1000 travels according to "right lane change" as the next action.
  • when instructing the automatic driving control device 1030 of the next action, the instruction unit 1118 also notifies the travel history generation unit 1102 of the instructed action.
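The confirmation logic above — wait for the driver's selection and fall back to the automatic behavior once a certain period elapses — might be sketched as follows. The queue-based interface and timeout value are assumptions for illustration, not the patent's implementation.

```python
import queue

def confirm_action(selection_queue, automatic_behavior, behavior_candidates,
                   timeout_sec=10.0):
    """Wait for the driver's selection; if none arrives within the
    fixed period, confirm the automatic behavior determined by the
    automatic driving algorithm.

    selection_queue -- channel on which the operation input unit posts
                       the driver's selection
    """
    try:
        selected = selection_queue.get(timeout=timeout_sec)
    except queue.Empty:
        # No input within the fixed period: default to the automatic behavior.
        return automatic_behavior
    if selected in behavior_candidates:
        # The driver overrode the automatic behavior with a candidate,
        # e.g. "right lane change" instead of "deceleration".
        return selected
    # Selecting the action image (or anything else) keeps the automatic behavior.
    return automatic_behavior

q = queue.Queue()
q.put("right lane change")
next_action = confirm_action(q, "deceleration",
                             ["right lane change", "acceleration"])
```

Here the driver's "right lane change" selection wins over the automatic "deceleration", matching the override behavior described in the text.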
  • FIG. 43 is a flowchart illustrating a detection procedure performed by the second detection unit 1062.
  • FIG. 43A is a flowchart showing a first detection procedure.
  • the second detection unit 1062 executes driver personal authentication, passenger personal authentication, age / gender detection, or seating sensing (S1002).
  • the detection unit 1100 acquires and stores driver / passenger information (S1004).
  • otherwise, steps S1002 and S1004 are skipped.
  • the second detection unit 1062 detects the passenger's state (normal / sleepiness / car sickness) (S1010).
  • the detection unit 1100 updates the passenger's state (S1014).
  • otherwise, step S1014 is skipped.
  • FIG. 44 is a sequence diagram showing a registration procedure by the driving support apparatus 1040.
  • the driving support device 1040 generates a travel history (S1050).
  • the driving support device 1040 transmits a travel history update notification to the cloud server (S1052).
  • the cloud server transmits a travel history request to the driving support device 1040 (S1054).
  • the driving support device 1040 replaces the ID of the travel history (S1056) and transmits the travel history registration (S1058).
  • the cloud server stores the travel history (S1060).
  • the cloud server transmits the travel history registration result to the driving support device 1040 (S1062).
  • FIG. 45 is a flowchart showing the transmission procedure performed by the transmission unit 1104. If the travel history has been updated (Y in S1100), the transmission unit 1104 acquires the updated travel history (S1102). If there is a condition with no registered ID (Y in S1104), the transmission unit 1104 assigns a new ID (S1106); if every condition already has an ID (N in S1104), step S1106 is skipped. The transmission unit 1104 then replaces the IDs (S1108). If the travel history has not been updated (N in S1100), steps S1102 to S1108 are skipped.
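A minimal sketch of the ID handling in this transmission procedure: assigning a new ID to any unregistered combination and replacing local identifiers before registration. The uuid-based IDs, record layout, and class name are illustrative assumptions, not the patent's.

```python
import uuid

class TravelHistoryTransmitter:
    """Replace local combination IDs with registered IDs before
    transmitting the travel history, as in steps S1104-S1108."""

    def __init__(self):
        self.id_map = {}  # local combination ID -> registered ID

    def replace_ids(self, travel_history):
        replaced = []
        for record in travel_history:
            local_id = record["combination_id"]
            if local_id not in self.id_map:
                # Unregistered combination: assign a new ID (S1106).
                self.id_map[local_id] = uuid.uuid4().hex
            new_record = dict(record)
            new_record["combination_id"] = self.id_map[local_id]  # S1108
            replaced.append(new_record)
        return replaced

tx = TravelHistoryTransmitter()
out = tx.replace_ids([
    {"combination_id": "driver1+no_passenger", "behavior": "decelerate"},
    {"combination_id": "driver1+no_passenger", "behavior": "accelerate"},
])
```

Replacing the locally meaningful identifier with an opaque ID lets the server manage each combination without receiving the driver's identity directly.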
  • FIG. 46 is a sequence diagram illustrating a driver model generation procedure performed by the driving support device 1040.
  • the driving support device 1040 generates a travel history (S1150).
  • the driving support device 1040 extracts the inquiry travel history (S1152).
  • the driving support device 1040 transmits an inquiry signal to the cloud server (S1154).
  • the cloud server extracts a similar travel history (S1156).
  • the cloud server transmits a similar travel history to the driving support device 1040 (S1158).
  • the driving support device 1040 generates a driver model (S1160).
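One plausible reading of the similar-history extraction (S1156) and driver model generation (S1160) is sketched below. Treating environmental parameters as numeric vectors and using cosine similarity as the similarity measure are both assumptions for illustration; the patent does not fix the measure.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def extract_similar_histories(inquiry, all_histories, threshold=0.9):
    """Server side (S1156): return travel-history records whose
    environmental parameter vectors resemble the inquiry history."""
    return [h for h in all_histories
            if cosine_similarity(h["env"], inquiry["env"]) >= threshold]

def generate_driver_model(similar_histories):
    """Device side (S1160): average the environmental parameters of
    the similar records, per selected behavior, into a driver model."""
    grouped = {}
    for h in similar_histories:
        grouped.setdefault(h["behavior"], []).append(h["env"])
    return {behavior: [sum(col) / len(col) for col in zip(*envs)]
            for behavior, envs in grouped.items()}

inquiry = {"env": [1.0, 0.0]}
histories = [{"env": [1.0, 0.1], "behavior": "decelerate"},
             {"env": [0.0, 1.0], "behavior": "accelerate"}]
model = generate_driver_model(extract_similar_histories(inquiry, histories))
```

Only the record resembling the inquiry survives the extraction, so the resulting model reflects drivers whose driving situations match the current driver's.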
  • FIG. 47 is a flowchart showing a travel history update procedure performed by the travel history generation unit 1102.
  • the determination unit 1112 determines the next action (S1200). When the determined action is selected (Y in S1202), the travel history generation unit 1102 updates the travel history (S1204). If the determined action is not selected (N in S1202), the process ends.
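The determination and update steps of FIG. 47 could be sketched as below, assuming the driver model maps each behavior to an averaged environmental-parameter vector (an illustrative assumption, consistent with the sketch style used here rather than the patent's exact data structures).

```python
def determine_next_action(driver_model, current_env):
    """S1200: pick the behavior whose model vector is closest
    (squared Euclidean distance) to the current environmental
    parameters of the vehicle."""
    return min(driver_model,
               key=lambda b: sum((m - c) ** 2
                                 for m, c in zip(driver_model[b], current_env)))

def update_travel_history(travel_history, env, action, selected):
    """S1202-S1204: update the travel history only when the
    determined action was actually selected by the driver."""
    if selected:
        travel_history.append({"env": env, "behavior": action})
    return travel_history

model = {"decelerate": [1.0, 0.0], "accelerate": [0.0, 1.0]}
action = determine_next_action(model, [0.9, 0.1])
history = update_travel_history([], [0.9, 0.1], action, selected=True)
```

Recording only the selections the driver actually made keeps the travel history, and therefore future driver models, aligned with the driver's real preferences.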
  • since the driver model is generated based on a travel history similar to the travel history for the current driver, a driver model suited to the current driver can be generated. Further, since the next action is determined based on this driver model and the current environmental parameters of the vehicle, the determination accuracy can be improved. When the driver model is generated based on a travel history similar to the travel history for the combination of the current driver and the current presence/absence of a passenger, the accuracy of the driver model can be improved. When information related to the current passenger is also included in the combination, the accuracy of the driver model can be improved further.
  • since a travel history similar to the travel history for the current driver is acquired from the server, the search for similar travel histories can be delegated to the server, which reduces the amount of processing on the device. Moreover, since the travel history is transmitted to the server, the travel histories generated in various driving support devices can be accumulated there, which improves the search accuracy for similar travel histories. In addition, since an ID identifying each combination is attached to the travel history, management at the server is facilitated. Finally, since an image showing the next action is displayed, the driver can be notified of the next action.
  • according to the present embodiment, the next action is determined based on a driver model generated from a travel history similar to the travel history for the current driver and on the current environmental parameters of the vehicle, so the accuracy of determining the next action can be improved.
  • a computer that realizes the above-described functions by a program includes input devices such as a keyboard, mouse, and touch pad; output devices such as a display and speakers; a CPU (Central Processing Unit); ROM (Read Only Memory); RAM (Random Access Memory); a storage device such as a hard disk device or SSD (Solid State Drive); a reading device that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disc Read Only Memory) or USB (Universal Serial Bus) memory; and a network card for communicating via a network. These units are connected by a bus.
  • the reading device reads the program from the recording medium on which the program is recorded, and stores the program in the storage device.
  • the network card communicates with a server apparatus connected to the network and stores in the storage device a program, downloaded from the server apparatus, for realizing the functions of the respective devices.
  • the CPU copies the program stored in the storage device to the RAM, and sequentially reads out and executes the instructions included in the program from the RAM, thereby realizing the functions of the respective devices.
  • an aspect of the present invention is a driving support device. This device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action selected by the driver in response to that environmental parameter.
  • the driving support apparatus further includes an acquisition unit that acquires a travel history similar to the travel history for the current driver among the travel histories generated by the travel history generation unit.
  • the driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current travel environment of the vehicle.
  • according to this aspect, the next action is determined based on a driver model suited to the current driver and the current environmental parameters of the vehicle, so the determination accuracy can be improved.
  • the driving support device may further include a detection unit that detects the presence or absence of a passenger in the vehicle.
  • the travel history generation unit generates a travel history for each driver and for each presence/absence of a passenger detected in the past by the detection unit, and the acquisition unit may acquire a travel history similar to the travel history for the combination of the current driver and the current presence/absence of a passenger detected by the detection unit.
  • in this case, since the driver model is generated based on a travel history similar to the travel history for the combination of the current driver and the current presence/absence of a passenger, the accuracy of the driver model can be improved.
  • the detection unit may also detect information related to the passengers of the vehicle, and the travel history generation unit may generate a travel history for each driver, for each presence/absence of a passenger, and for each piece of passenger information detected in the past by the detection unit. The acquisition unit may acquire a travel history similar to the travel history for the combination of the current driver, the current presence/absence of a passenger detected by the detection unit, and information related to the current passenger.
  • in this case, since the driver model is generated based on a travel history similar to the travel history for the combination of the current driver, the current presence/absence of a passenger, and information related to the current passenger, the accuracy of the driver model can be improved further.
  • the driving support device may further include an inquiry unit that executes an inquiry to the server based on the driving history for the current driver among the driving histories generated by the driving history generation unit.
  • the acquisition unit may acquire a travel history similar to the travel history for the current driver from the server as a response to the inquiry by the inquiry unit. In this case, since a travel history similar to the travel history for the current driver is acquired from the server, the amount of processing can be reduced.
  • the driving support device may further include an inquiry unit that executes an inquiry to the server based on the travel history for the combination of the current driver and the current presence/absence of a passenger among the travel histories generated by the travel history generation unit.
  • the acquisition unit may acquire a travel history similar to the travel history for the combination of the current driver and the presence or absence of the current passenger from the server as a response to the inquiry by the inquiry unit. In this case, since a travel history similar to the travel history for the current driver is acquired from the server, the amount of processing can be reduced.
  • the driving support device may further include an inquiry unit that executes an inquiry to the server based on the travel history for the combination of the current driver, the current presence/absence of a passenger, and information related to the current passenger, among the travel histories generated by the travel history generation unit.
  • the acquisition unit may acquire, as a response to the inquiry by the inquiry unit, a travel history similar to the travel history for the combination of the current driver, the current presence/absence of a passenger, and information related to the current passenger. In this case, since a travel history similar to the travel history for the current driver is acquired from the server, the amount of processing can be reduced.
  • the driving support device may further include a transmission unit that transmits the travel history generated by the travel history generation unit to the server.
  • traveling histories generated in various driving support devices can be accumulated in the server.
  • the driving support device may further include a transmission unit that transmits the travel history generated by the travel history generation unit to the server.
  • the transmission unit may add identification information for identifying each combination in the travel history. In this case, since identification information for identifying each combination is given, management on the server can be facilitated.
  • the driving support device may further include an image output unit that causes the notification device to display an image indicating the next action determined by the determination unit. In this case, the driver can be notified of the next action.
  • another aspect of the present invention is an automatic driving control device.
  • this device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action selected by the driver in response to that environmental parameter.
  • the automatic driving control device further includes an acquisition unit that acquires a travel history similar to the travel history for the current driver among the travel histories generated by the travel history generation unit, and a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit.
  • the automatic driving control device further includes a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current travel environment of the vehicle, and an automatic driving control unit that controls automatic driving of the vehicle based on the next action determined by the determination unit.
  • according to this aspect, the next action is determined based on a driver model suited to the current driver and the current environmental parameters of the vehicle, so the determination accuracy can be improved.
  • Still another aspect of the present invention is a vehicle.
  • this vehicle includes a driving support device, and the driving support device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action selected by the driver in response to that environmental parameter.
  • the driving support apparatus further includes an acquisition unit that acquires a travel history similar to the travel history for the current driver among the travel histories generated by the travel history generation unit.
  • the driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current travel environment of the vehicle.
  • according to this aspect, the next action is determined based on a driver model suited to the current driver and the current environmental parameters of the vehicle, so the determination accuracy can be improved.
  • Still another aspect of the present invention is a driving support method.
  • this method includes a step of generating, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action selected by the driver in response to that environmental parameter.
  • the method further includes a step of acquiring a travel history similar to the travel history for the current driver among the generated travel histories, a step of generating a driver model based on the acquired travel history, and a step of determining the next action based on the generated driver model and an environmental parameter indicating the current travel environment of the vehicle.
  • the management of the total travel history and the extraction of the similar travel history are performed in the cloud server.
  • the present invention is not limited thereto, and for example, these processes may be performed in the driving support device 1040.
  • the driving support devices 1040 mounted in the plurality of vehicles 1000 exchange travel histories with one another, and each driving support device 1040 generates the overall travel history itself. According to this modification, a cloud server is no longer required.
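A minimal sketch of this modification, in which each driving support device 1040 merges the travel histories received from other vehicles into its own overall travel history. The record layout and the deduplication key are assumptions for illustration.

```python
def merge_travel_histories(local_history, received_histories):
    """Build the overall travel history locally by merging histories
    received from other vehicles, so no cloud server is needed.
    Records are deduplicated by a (combination_id, env, behavior) key."""
    seen = set()
    merged = []
    all_records = list(local_history)
    for history in received_histories:
        all_records.extend(history)
    for record in all_records:
        key = (record["combination_id"], tuple(record["env"]), record["behavior"])
        if key not in seen:
            seen.add(key)
            merged.append(record)
    return merged
```

Deduplication matters here because the same record can circulate between vehicles more than once during the exchange.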
  • the driving support method according to the present invention and the driving support device, automatic driving control device, vehicle, and program using the driving support method are suitable for transmitting information to the driver.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a technology for improving the accuracy of determining the next action. A travel history generation unit (1102) generates, for each driver, a travel history in which environmental parameters indicating the travel environment through which a vehicle traveled in the past are associated with the action selected by the driver in response to those environmental parameters. An acquisition unit (1108) acquires, from among the travel histories generated by the travel history generation unit (1102), a travel history similar to the travel history for the current driver. A driver model generation unit (1110) generates a driver model based on the travel history acquired by the acquisition unit (1108). A determination unit (1112) determines the next action based on the driver model generated by the driver model generation unit (1110) and environmental parameters indicating the current travel environment of the vehicle.
PCT/JP2016/002048 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method WO2016170763A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP16782787.2A EP3269609B1 (fr) 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method
US15/564,702 US10919540B2 (en) 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, driving control device, vehicle, and recording medium using said method
CN201680021986.8A CN107531252B (zh) 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and storage medium using said method
EP20177508.7A EP3738854A1 (fr) 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2015-087069 2015-04-21
JP2015087069 2015-04-21
JP2015-099474 2015-05-14
JP2015099474 2015-05-14
JP2015-119139 2015-06-12
JP2015119139 2015-06-12
JP2015252667A JP6761967B2 (ja) Driving support method, and driving support device, automatic driving control device, vehicle, and program using the same
JP2015-252667 2015-12-24

Publications (1)

Publication Number Publication Date
WO2016170763A1 true WO2016170763A1 (fr) 2016-10-27

Family

ID=57143812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/002048 WO2016170763A1 (fr) Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method

Country Status (1)

Country Link
WO (1) WO2016170763A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107139921A (zh) * 2017-04-05 2017-09-08 Geely Automobile Research Institute (Ningbo) Co., Ltd. Steering anti-collision method and system for a vehicle
CN109131340A (zh) * 2017-06-15 2019-01-04 Hitachi, Ltd. Active vehicle performance tuning based on driver behavior
CN109747621A (zh) * 2017-11-02 2019-05-14 Honda Motor Co., Ltd. Vehicle control device
CN110325422A (zh) * 2017-02-23 2019-10-11 Panasonic Intellectual Property Management Co., Ltd. Information processing system, information processing method, program, and recording medium
CN110446645A (zh) * 2017-04-07 2019-11-12 Hitachi Automotive Systems, Ltd. Vehicle control device
CN111201554A (zh) * 2017-10-17 2020-05-26 Honda Motor Co., Ltd. Travel model generation system, vehicle in travel model generation system, processing method, and program
JP2020086801A (ja) * 2018-11-22 2020-06-04 Mitsubishi Electric Corporation Automatic driving control device and automatic driving control method
WO2021012528A1 (fr) * 2019-07-25 2021-01-28 Ping An Technology (Shenzhen) Co., Ltd. Driving safety assistance method and apparatus, vehicle, and readable storage medium
CN112513950A (zh) * 2018-08-27 2021-03-16 Hitachi Automotive Systems, Ltd. Update system and electronic control device
WO2022107442A1 (fr) * 2020-11-20 2022-05-27 DENSO Corporation HMI control device and driving control device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007122579A (ja) * 2005-10-31 2007-05-17 Equos Research Co Ltd Vehicle control device
JP2012113631A (ja) * 2010-11-26 2012-06-14 Toyota Motor Corp Driving support system and driving support management center
JP2013069020A (ja) * 2011-09-21 2013-04-18 Nissan Motor Co Ltd Eco-driving support device
JP2015081057A (ja) * 2013-10-24 2015-04-27 Nissan Motor Co., Ltd. Vehicle display device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007122579A (ja) * 2005-10-31 2007-05-17 Equos Research Co Ltd Vehicle control device
JP2012113631A (ja) * 2010-11-26 2012-06-14 Toyota Motor Corp Driving support system and driving support management center
JP2013069020A (ja) * 2011-09-21 2013-04-18 Nissan Motor Co Ltd Eco-driving support device
JP2015081057A (ja) * 2013-10-24 2015-04-27 Nissan Motor Co., Ltd. Vehicle display device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3269609A4 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110325422A (zh) * 2017-02-23 2019-10-11 Panasonic Intellectual Property Management Co., Ltd. Information processing system, information processing method, program, and recording medium
CN107139921B (zh) * 2017-04-05 2019-10-29 Geely Automobile Research Institute (Ningbo) Co., Ltd. Steering anti-collision method and system for a vehicle
CN107139921A (zh) * 2017-04-05 2017-09-08 Geely Automobile Research Institute (Ningbo) Co., Ltd. Steering anti-collision method and system for a vehicle
CN110446645B (zh) * 2017-04-07 2022-09-20 Hitachi Astemo, Ltd. Vehicle control device
CN110446645A (zh) * 2017-04-07 2019-11-12 Hitachi Automotive Systems, Ltd. Vehicle control device
CN109131340B (zh) * 2017-06-15 2021-07-16 Hitachi, Ltd. Active vehicle performance tuning based on driver behavior
CN109131340A (zh) * 2017-06-15 2019-01-04 Hitachi, Ltd. Active vehicle performance tuning based on driver behavior
CN111201554A (zh) * 2017-10-17 2020-05-26 Honda Motor Co., Ltd. Travel model generation system, vehicle in travel model generation system, processing method, and program
CN109747621A (zh) * 2017-11-02 2019-05-14 Honda Motor Co., Ltd. Vehicle control device
CN112513950A (zh) * 2018-08-27 2021-03-16 Hitachi Automotive Systems, Ltd. Update system and electronic control device
JP2020086801A (ja) * 2018-11-22 2020-06-04 Mitsubishi Electric Corporation Automatic driving control device and automatic driving control method
WO2021012528A1 (fr) * 2019-07-25 2021-01-28 Ping An Technology (Shenzhen) Co., Ltd. Driving safety assistance method and apparatus, vehicle, and readable storage medium
WO2022107442A1 (fr) * 2020-11-20 2022-05-27 DENSO Corporation HMI control device and driving control device

Similar Documents

Publication Publication Date Title
JP6761967B2 (ja) Driving support method, and driving support device, automatic driving control device, vehicle, and program using the same
JP6447929B2 (ja) Information processing system, information processing method, and program
WO2018079392A1 (fr) Information processing system, information processing method, and program
JP6895634B2 (ja) Information processing system, information processing method, and program
JP6074553B1 (ja) Information processing system, information processing method, and program
WO2016170763A1 (fr) Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method
WO2016170764A1 (fr) Driving assistance method, and driving assistance device, driving control device, vehicle, and driving assistance program using said method
WO2016170773A1 (fr) Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16782787

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15564702

Country of ref document: US

REEP Request for entry into the european phase

Ref document number: 2016782787

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE