WO2016170763A1 - Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program - Google Patents

Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program Download PDF

Info

Publication number
WO2016170763A1
WO2016170763A1 (PCT/JP2016/002048)
Authority
WO
WIPO (PCT)
Prior art keywords
driver
vehicle
unit
travel history
driving
Prior art date
Application number
PCT/JP2016/002048
Other languages
French (fr)
Japanese (ja)
Inventor
勝長 辻
森 俊也
江村 恒一
渉 仲井
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2015252667A (granted as JP6761967B2)
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Priority to US15/564,702 (granted as US10919540B2)
Priority to EP20177508.7A (published as EP3738854A1)
Priority to CN201680021986.8A (granted as CN107531252B)
Priority to EP16782787.2A (granted as EP3269609B1)
Publication of WO2016170763A1


Classifications

    • B60K31/00 Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
    • B60K35/00 Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; electric constitutive elements
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W50/10 Interpretation of driver requests or demands
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G08G1/16 Anti-collision systems

Definitions

  • The present invention relates to a vehicle, a driving support method used in the vehicle, a driving support device using the method, an automatic driving control device, and a driving support program.
  • Patent Document 1 discloses a travel control device that, when the host vehicle performs automatic steering control or automatic acceleration/deceleration control, allows the driver to visually recognize the operating state of that control.
  • The present invention provides a driving support method capable of solving at least one of the above problems during fully automatic or partially automatic driving, together with a driving support device, an automatic driving control device, a vehicle, and a driving support program using the method.
  • According to one aspect of the present invention, a driving support device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action the driver selected under that environmental parameter.
  • The driving support device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history of the current driver.
  • The driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the generated driver model and an environmental parameter indicating the current travel environment of the vehicle.
  • Another aspect of the present invention is an automatic driving control device.
  • This device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action the driver selected under that environmental parameter.
  • The automatic driving control device further includes an acquisition unit that acquires, from among the generated travel histories, a travel history similar to the travel history of the current driver.
  • The automatic driving control device further includes a driver model generation unit that generates a driver model based on the acquired travel history, a determination unit that determines the next action based on the generated driver model and an environmental parameter indicating the current travel environment of the vehicle, and an automatic driving control unit that controls the automatic driving of the vehicle based on the next action determined by the determination unit.
  • Still another aspect of the present invention is a vehicle.
  • This vehicle includes a driving support device, and the driving support device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action the driver selected under that environmental parameter.
  • The driving support device further includes an acquisition unit that acquires, from among the generated travel histories, a travel history similar to the travel history of the current driver.
  • The driving support device further includes a driver model generation unit that generates a driver model based on the acquired travel history, and a determination unit that determines the next action based on the generated driver model and an environmental parameter indicating the current travel environment of the vehicle.
  • Still another aspect of the present invention is a driving support method.
  • This method includes a step of generating, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle traveled in the past is associated with the action the driver selected under that environmental parameter, and a step of acquiring, from among the generated travel histories, a travel history similar to the travel history of the current driver.
  • The driving support method further includes a step of generating a driver model based on the acquired travel history, and a step of determining the next action based on the generated driver model and an environmental parameter indicating the current travel environment of the vehicle.
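The four steps of this method can be sketched in Python. Everything below is a hypothetical illustration: the patent leaves the similarity measure and the form of the driver model open, so the mean-vector model and squared-distance metric here are assumptions, as are all names.

```python
from collections import defaultdict

def generate_travel_history(records):
    """Group (driver_id, env_params, action) records into per-driver histories."""
    history = defaultdict(list)
    for driver_id, env, action in records:
        history[driver_id].append((env, action))
    return history

def similarity(h1, h2):
    """Toy similarity: negative squared distance between mean environment vectors."""
    def mean_env(h):
        envs = [e for e, _ in h]
        return [sum(x) / len(envs) for x in zip(*envs)]
    m1, m2 = mean_env(h1), mean_env(h2)
    return -sum((a - b) ** 2 for a, b in zip(m1, m2))

def acquire_similar_history(histories, current_driver):
    """Acquire the stored history most similar to the current driver's own history."""
    own = histories[current_driver]
    others = {d: h for d, h in histories.items() if d != current_driver}
    return max(others.values(), key=lambda h: similarity(own, h))

def build_driver_model(history):
    """Driver model: mean environment vector per selected action."""
    by_action = defaultdict(list)
    for env, action in history:
        by_action[action].append(env)
    return {a: [sum(x) / len(envs) for x in zip(*envs)]
            for a, envs in by_action.items()}

def determine_next_action(model, current_env):
    """Choose the action whose modeled environment is closest to the current one."""
    def dist(v):
        return sum((a - b) ** 2 for a, b in zip(v, current_env))
    return min(model, key=lambda a: dist(model[a]))
```

For example, if a similar driver decelerated under environment parameters close to the current ones, `determine_next_action` returns "decelerate" for the current driver as well.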
  • According to the present invention, information can be appropriately transmitted from the vehicle to the occupant, enabling comfortable automatic driving in which the operations of the vehicle and the driver are less likely to conflict during fully automatic or partially automatic driving.
  • FIG. 1 is a block diagram showing the main configuration of a vehicle including the information notification device according to Embodiment 1 of the present invention; a further figure explains a first example of the travel environment together with the corresponding display of the notification unit and operation of the operation unit.
  • FIG. 10 is a diagram illustrating display on a touch panel in Embodiment 2.
  • FIG. 32 is a block diagram illustrating the configuration of a vehicle according to a fifth embodiment. A further figure schematically shows the interior of the vehicle of FIG. 32. Further block diagrams show the detailed structure of the detection unit and the detection information input unit of FIG. 32, and the detailed structure of the control unit of FIG. 32.
  • FIG. 36 is a diagram illustrating the data structure of a travel history generated by the travel history generation unit of FIG. 35.
  • FIG. 36 is a diagram showing another data structure of a travel history generated by the travel history generation unit of FIG. 35, together with an outline of the processing applied to it.
  • FIG. 33 is a sequence diagram illustrating the procedure by which the driving support apparatus of FIG. 32 generates a driver model, together with a flowchart showing the procedure by which the travel history generation unit updates the travel history.
  • FIG. 1 is a block diagram showing a main configuration of a vehicle 1 including an information notification device according to Embodiment 1 of the present invention.
  • The vehicle 1 is a vehicle that can automatically perform all or part of its driving control without requiring operation by the driver.
  • The vehicle 1 includes a brake pedal 2, an accelerator pedal 3, a winker (turn-signal) lever 4, a steering wheel 5, a detection unit 6, a vehicle control unit 7, a storage unit 8, and an information notification device 9.
  • The brake pedal 2 receives a brake operation by the driver and decelerates the vehicle 1.
  • The brake pedal 2 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of deceleration of the vehicle 1.
  • The accelerator pedal 3 receives an accelerator operation by the driver and accelerates the vehicle 1. The accelerator pedal 3 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of acceleration of the vehicle 1.
  • The winker lever 4 receives a lever operation by the driver and turns on a direction indicator (not shown) of the vehicle 1.
  • The winker lever 4 may also receive a control result from the vehicle control unit 7, change to a state corresponding to the direction indicated by the vehicle 1, and turn on the direction indicator (not shown) of the vehicle 1.
  • The steering wheel 5 receives a steering operation by the driver and changes the traveling direction of the vehicle 1. The steering wheel 5 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the change in the traveling direction of the vehicle 1.
  • The steering wheel 5 has an operation unit 51.
  • The operation unit 51 is provided on the front surface of the steering wheel 5 (the surface facing the driver) and receives an input operation from the driver.
  • The operation unit 51 is, for example, a device such as a button, a touch panel, or a grip sensor.
  • The operation unit 51 outputs information on the input operation received from the driver to the vehicle control unit 7.
  • The detection unit 6 detects the travel state of the vehicle 1 and the situation around the vehicle 1, and outputs the detected travel state and surrounding situation information to the vehicle control unit 7.
  • The detection unit 6 includes a position information acquisition unit 61, a sensor 62, a speed information acquisition unit 63, and a map information acquisition unit 64.
  • The position information acquisition unit 61 acquires the position of the vehicle 1 as travel state information, for example by GPS (Global Positioning System) positioning.
  • The sensor 62 detects the situation around the vehicle 1, such as obstacles around the vehicle 1, from the positions of other vehicles around the vehicle 1, lane position information, the type of each other vehicle, whether it is a preceding vehicle, the speed of the other vehicle, and the speed of the host vehicle, and computes the predicted time to collision (TTC: Time To Collision).
  • The speed information acquisition unit 63 acquires the speed and traveling direction of the vehicle 1 as travel state information from a speed sensor or the like (not shown).
  • The map information acquisition unit 64 acquires, as map information, information about the surroundings of the vehicle 1, such as the road on which the vehicle 1 travels, merging points with other vehicles on the road, the lane currently being traveled, and the positions of intersections.
  • The sensor 62 is constituted by a millimeter-wave radar, a laser radar, a camera, or a combination thereof.
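The TTC mentioned above can be illustrated with the standard gap-over-closing-speed definition. The patent does not give its exact formula, so the function below is a minimal sketch, not a quotation of the sensor 62's actual computation.

```python
def time_to_collision(gap_m, own_speed_mps, other_speed_mps):
    """Predicted time to collision (TTC) with a preceding vehicle.

    gap_m: distance along the lane to the other vehicle, in meters.
    The closing speed is the host vehicle's speed minus the preceding
    vehicle's speed; if the gap is not closing, no collision is predicted.
    """
    closing = own_speed_mps - other_speed_mps
    if closing <= 0:
        return float("inf")  # not closing in: no predicted collision
    return gap_m / closing
```

For example, a 30 m gap closed at 10 m/s gives a TTC of 3 s.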
  • The storage unit 8 is a storage device such as a ROM (Read-Only Memory), a RAM (Random-Access Memory), a hard disk drive, or an SSD (Solid-State Drive), and stores correspondences between the current travel environment and the candidates for the behavior that the vehicle may take next (after a first predetermined time has elapsed).
  • The current travel environment is an environment determined by the position of the vehicle 1, the road on which the vehicle 1 is traveling, the positions and speeds of other vehicles around the vehicle 1, and the like.
  • Depending on the position and speed of another vehicle, it may be judged, for example, that the other vehicle could cut in during acceleration or deceleration, or that a collision could occur one second later. In this way the behavior of the other vehicle can be predicted, and the travel environment can be grasped in more detail and more accurately.
  • A behavior candidate is a candidate for the behavior that the vehicle 1 can take next (after the first predetermined time) in the current travel environment.
  • For example, for a travel environment in which there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of that lane, and a lane change to the right of the current lane is possible, the storage unit 8 stores in advance three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right.
  • Likewise, for a travel environment in which a vehicle traveling ahead in the same lane as the vehicle 1 (hereinafter, the "preceding vehicle") travels more slowly than the vehicle 1 and a lane change to an adjacent lane is possible, the storage unit 8 stores in advance three behavior candidates: overtaking the preceding vehicle, changing lanes to the adjacent lane, and decelerating to follow the preceding vehicle.
  • The storage unit 8 may also store a priority for each behavior candidate. For example, it may store the number of times each behavior was actually adopted in the same travel environment in the past, and set a higher priority for behaviors adopted more frequently.
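The environment-to-candidates mapping with adoption-count priorities described above can be sketched as follows. The class and the environment keys are hypothetical names for illustration; the patent does not specify how the storage unit 8 encodes environments.

```python
from collections import Counter

class BehaviorStore:
    """Sketch of the storage unit 8: maps a travel-environment key to its
    behavior candidates and counts how often each candidate was adopted."""

    def __init__(self):
        self.candidates = {}      # env_key -> list of behavior candidates
        self.adopted = Counter()  # (env_key, behavior) -> adoption count

    def register(self, env_key, behaviors):
        """Store in advance the behavior candidates for an environment."""
        self.candidates[env_key] = list(behaviors)

    def record_adoption(self, env_key, behavior):
        """Count a behavior actually adopted in this environment."""
        self.adopted[(env_key, behavior)] += 1

    def candidates_by_priority(self, env_key):
        """Candidates sorted so the most frequently adopted comes first."""
        return sorted(self.candidates[env_key],
                      key=lambda b: -self.adopted[(env_key, b)])
```

In the merging-lane example, after "lane change to the right" has been adopted more often than the other candidates, it is returned first.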
  • The vehicle control unit 7 can be realized, for example, as part of an LSI circuit or as an electronic control unit (ECU) that controls the vehicle.
  • The vehicle control unit 7 controls the vehicle based on the travel state and surrounding situation information acquired from the detection unit 6, and controls the brake pedal 2, the accelerator pedal 3, the winker lever 4, and the information notification device 9 according to the result of that vehicle control.
  • The objects controlled by the vehicle control unit 7 are not limited to these.
  • The vehicle control unit 7 determines the current travel environment based on the travel state and surrounding situation information. Various conventionally proposed methods can be used for this determination.
  • For example, the vehicle control unit 7 determines from the travel state and surrounding situation information that the current travel environment is one in which "there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of that lane, and a lane change to the right of the current lane is possible."
  • Similarly, the vehicle control unit 7 may determine from the time series of the travel environment that "a vehicle traveling ahead in the same lane as the vehicle 1 is traveling more slowly than the vehicle 1, and a lane change to the adjacent lane is possible."
  • The vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to give notice of information on the travel environment indicating the travel state and the surrounding situation. The vehicle control unit 7 also reads from the storage unit 8 the behavior candidates that the vehicle 1 can take next (after the first predetermined time has elapsed) in the determined travel environment.
  • The vehicle control unit 7 determines which of the read behavior candidates is most suitable for the current travel environment, and sets the behavior most suitable for the current travel environment as the first behavior.
  • The first behavior may be the behavior the vehicle is currently executing, that is, a continuation of the current behavior.
  • The vehicle control unit 7 sets the behavior candidates other than the first behavior, which the driver can select instead, as the second behavior.
  • The vehicle control unit 7 may set the most suitable behavior as the first behavior using a conventional technique that determines the most suitable behavior from the travel state and surrounding situation information.
  • Alternatively, the vehicle control unit 7 may set a preset behavior among the plurality of candidates as the most suitable one; it may store information on the previously selected behavior in the storage unit 8 and determine that behavior as the most suitable; or it may store in the storage unit 8 the number of times each behavior has been selected in the past and determine the most frequently selected behavior as the most suitable.
  • The vehicle control unit 7 then causes the notification unit 92 of the information notification device 9 to notify the driver of information on the first behavior and the second behavior.
  • The vehicle control unit 7 may cause the notification unit 92 to notify the first and second behavior information simultaneously with the travel state and surrounding situation information.
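One of the variants above, selecting the most frequently chosen candidate as the first behavior and the remainder as the second behavior, can be sketched as below; the function name and the dictionary of selection counts are illustrative assumptions.

```python
def select_behaviors(candidates, selection_counts):
    """Split behavior candidates into the first behavior (here: the one
    selected most often in the past) and the second behavior (the rest,
    offered to the driver as selectable alternatives)."""
    first = max(candidates, key=lambda b: selection_counts.get(b, 0))
    second = [b for b in candidates if b != first]
    return first, second
```

With the merging-lane candidates, a frequently chosen lane change becomes the highlighted first behavior while acceleration and deceleration remain selectable.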
  • The vehicle control unit 7 also acquires information on operations that the operation unit 51 receives from the driver. After giving notice of the first and second behaviors, the vehicle control unit 7 determines whether the operation unit 51 has accepted an operation within a second predetermined time. This operation is, for example, an operation of selecting one of the behaviors included in the second behavior.
  • If the operation unit 51 does not accept an operation within the second predetermined time, the vehicle control unit 7 controls the vehicle so as to execute the first behavior, and controls the brake pedal 2, the accelerator pedal 3, and the winker lever 4 according to the vehicle control result.
  • If the operation unit 51 accepts an operation within the second predetermined time, the vehicle control unit 7 performs the control corresponding to the accepted operation.
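The timeout behavior just described, execute the driver's selection if one arrives within the second predetermined time, otherwise fall back to the first behavior, can be sketched as follows. `poll_input` is a hypothetical hook standing in for the operation unit 51; the polling loop is an illustrative simplification.

```python
import time

def await_driver_selection(first_behavior, second_behaviors, poll_input,
                           second_predetermined_time_s):
    """Return the driver's selection if made within the second predetermined
    time; otherwise fall back to the first behavior.

    poll_input() is a hypothetical hook that returns a selected behavior
    or None if the driver has not pressed a button yet."""
    deadline = time.monotonic() + second_predetermined_time_s
    while time.monotonic() < deadline:
        choice = poll_input()
        if choice == first_behavior or choice in second_behaviors:
            return choice  # driver selected in time: execute this behavior
        time.sleep(0.01)
    return first_behavior  # timeout: execute the most suitable behavior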
  • The information notification device 9 acquires various information related to the travel of the vehicle 1 from the vehicle control unit 7 and gives notice of the acquired information.
  • The information notification device 9 includes an information acquisition unit 91 and a notification unit 92.
  • The information acquisition unit 91 acquires various information related to the travel of the vehicle 1 from the vehicle control unit 7. For example, when the vehicle control unit 7 determines that the behavior of the vehicle 1 may be updated, the information acquisition unit 91 acquires the first behavior information and the second behavior information from the vehicle control unit 7.
  • The information acquisition unit 91 may store the acquired information in a storage unit (not shown) and read the stored information as necessary.
  • The notification unit 92 notifies the driver of information related to the travel of the vehicle 1.
  • The notification unit 92 may be a display unit that displays information, such as a car navigation system installed in the vehicle, a head-up display, or a center display, or a light emitter such as a light-emitting diode (LED) installed in the steering wheel 5 or a pillar.
  • The notification unit 92 may be a speaker that converts information into sound and notifies the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel 5).
  • The notification unit 92 may also be a combination of these.
  • The notification unit 92 is a notification device.
  • The notification unit 92 includes, for example, a head-up display (HUD), an LCD (Liquid Crystal Display), an HMD (Head-Mounted Display or Helmet-Mounted Display), smart glasses (a glasses-type display), or another dedicated display.
  • The HUD may project onto, for example, the windshield of the vehicle 1, or onto a separately provided glass or plastic surface (for example, a combiner).
  • The windshield here may be, for example, the front windshield, a side window, or the rear window of the vehicle 1.
  • The HUD may also be a transmissive display provided on the surface of, or inside, the windshield.
  • The transmissive display is, for example, a transmissive organic EL (electroluminescence) display, or a transparent display using glass that emits light when irradiated with light of a specific wavelength.
  • The driver can view the display on the transmissive display while simultaneously viewing the background.
  • The notification unit 92 may thus be a display medium that transmits light. In any case, an image is displayed on the notification unit 92.
  • The notification unit 92 notifies the driver of travel-related information acquired from the vehicle control unit 7 via the information acquisition unit 91.
  • For example, the notification unit 92 notifies the driver of information on the first behavior and the second behavior acquired from the vehicle control unit 7.
  • FIG. 2 is a diagram explaining a first example of the travel environment, together with the corresponding display of the notification unit 92 and operation of the operation unit 51.
  • (a) of FIG. 2 is an overhead view showing the travel environment of the vehicle 1. Specifically, it shows a travel environment in which there is a merging lane ahead of the lane in which the vehicle 1 travels and a lane change to the right of that lane is possible.
  • The vehicle control unit 7 determines from the travel state and surrounding situation information that the travel environment is the one shown in (a) of FIG. 2. The vehicle control unit 7 may also generate the overhead view shown in (a) of FIG. 2 and cause the notification unit 92 to present the generated overhead view in addition to the information on the first and second behaviors.
  • (b) of FIG. 2 shows an example of the display of the notification unit 92 for the travel environment shown in (a) of FIG. 2.
  • In the display range of the notification unit 92, options concerning the behavior of the vehicle 1 are displayed on the right side, and information for switching to manual driving is displayed on the left side.
  • The first behavior is the "lane change" shown in the highlighted display area 29b among the display areas 29a to 29c and 29g.
  • The second behaviors are the "acceleration" and "deceleration" shown in the display areas 29a and 29c, respectively.
  • The display area 29g displays "end automatic driving", which indicates switching to manual driving.
  • (c) of FIG. 2 shows an example of the operation unit 51 provided on the steering wheel 5.
  • The operation unit 51 includes operation buttons 51a to 51d provided on the right side of the steering wheel 5 and operation buttons 51e to 51h provided on the left side of the steering wheel 5.
  • The number, shape, and other properties of the operation unit 51 provided on the steering wheel 5 are not limited to these.
  • The display areas 29a to 29c shown in (b) of FIG. 2 correspond to the operation buttons 51a to 51c, and the display area 29g corresponds to the operation button 51g.
  • To select any of the contents displayed in a display area, the driver presses the operation button corresponding to that area. For example, to select the behavior "acceleration" displayed in the display area 29a, the driver presses the operation button 51a.
  • Although only character information is displayed in each display area in (b) of FIG. 2, symbols or icons related to the driving of the vehicle may be displayed, as described below, so that the driver can grasp the display contents at a glance.
  • FIG. 3 is a diagram showing another example of display by the notification unit 92. As shown in FIG. 3, both character information and symbols indicating that information are displayed in the display areas 39a to 39c and 39g. Symbols alone may also be displayed.
  • FIG. 4 is a flowchart showing a processing procedure of information notification processing in the present embodiment.
  • FIG. 5 is a diagram illustrating a first example of a traveling environment and display control for the first example.
  • First, the detection unit 6 detects the travel state of the vehicle (step S11). Next, the detection unit 6 detects the situation around the vehicle (step S12). The detection unit 6 outputs the detected travel state and surrounding situation information to the vehicle control unit 7.
  • The vehicle control unit 7 determines the current travel environment from the travel state and surrounding situation information (step S13).
  • For example, the vehicle control unit 7 determines that the current travel environment is "a travel environment in which there is a merging lane ahead of the lane in which the vehicle 1 travels, a vehicle is merging from the left side of that lane, and a lane change to the right of the current lane is possible."
  • The vehicle control unit 7 then causes the notification unit 92 of the information notification device 9 to give notice of the determined travel environment information (step S14).
  • The vehicle control unit 7 outputs the determined travel environment information to the information acquisition unit 91.
  • The notification unit 92 acquires the travel environment information from the information acquisition unit 91 and displays it as character information 59.
  • Instead of displaying the travel environment information on the notification unit 92, the vehicle control unit 7 may notify the driver of it as sound through a speaker or the like. In this way the information can be reliably transmitted to the driver even when the driver is not looking at, or has overlooked, the display or monitor.
  • The vehicle control unit 7 determines whether the determined travel environment creates a possibility of updating the behavior; if it determines that an update is possible, it further determines the first behavior and the second behavior (step S15). Whether the behavior may need updating is judged from whether the travel environment has changed.
  • Behaviors to be executed after the update include, for example, decelerating when a collision with another vehicle or the like becomes possible, changing speed when the preceding vehicle disappears during ACC (Adaptive Cruise Control), and changing lanes when the adjacent lane becomes free. Whether to update is determined using conventional technology.
  • The vehicle control unit 7 reads from the storage unit 8 the behavior candidates that the vehicle 1 can take next (after the first predetermined time has elapsed) in the determined travel environment. It then determines which of these candidates is most suitable for the current travel environment, sets the most suitable behavior as the first behavior, and sets the remaining candidates as the second behavior.
  • In the example above, the vehicle control unit 7 reads from the storage unit 8 the three behavior candidates of acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right. Based on the speed of the vehicle merging from the left and the situation in the right lane of the vehicle 1, it determines that a lane change to the right is the most suitable behavior, sets that behavior as the first behavior, and sets the remaining candidates as the second behavior.
  • the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the first behavior and the second behavior (step S16).
  • For example, the notification unit 92 highlights the character information “lane change”, which is the first behavior information, in the display area 59b, and displays “acceleration” and “deceleration”, which are the second behavior information, in the display areas 59a and 59c, respectively.
  • the vehicle control unit 7 determines whether or not the operation unit 51 has received an operation from the driver within the second predetermined time (step S17).
  • For example, the vehicle control unit 7 sets the time from when it determines that the current travel environment is the one illustrated in FIG. 5A until arrival at the merge point as the first predetermined time. The vehicle control unit 7 then sets a second predetermined time, shorter than the first predetermined time, as the time during which an operation for the next behavior to be performed at the merge point can be accepted.
  • When the operation unit 51 receives an operation from the driver within the second predetermined time (YES in step S17), the vehicle control unit 7 determines whether the received operation is an operation for terminating automatic driving or a behavior selection operation (a so-called update) (step S18).
  • each display area of the notification unit 92 and each operation button of the operation unit 51 correspond to each other.
  • To select the end of automatic driving in FIG. 5B, the driver presses the operation button 51g shown in (c) of FIG. 2. To select a behavior, the driver presses one of the operation buttons 51a to 51c shown in (c) of FIG. 2.
  • the vehicle control unit 7 terminates the automatic driving when the operation received by the operating unit 51 is an operation for terminating the automatic driving (that is, when it is detected that the operation button 51g is pressed) (step S19).
  • When the operation received by the operation unit 51 is a behavior selection operation (that is, when any of the operation buttons 51a to 51c is pressed), the vehicle control unit 7 controls the vehicle 1 to execute the behavior corresponding to the pressed operation button (step S20).
  • When the operation unit 51 does not accept an operation from the driver within the second predetermined time (NO in step S17), the vehicle control unit 7 controls the vehicle 1 to execute the first behavior (step S21).
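The flow of steps S15 and S17–S21 above can be sketched as follows. This is a minimal sketch with hypothetical function and label names; the suitability scoring and the real-time wait for the second predetermined time are abstracted away, so it is not the patent's implementation.

```python
def determine_behaviors(candidates, suitability):
    """Step S15 sketch: the candidate judged most suitable for the current
    traveling environment becomes the first behavior; the remaining
    candidates become the second behavior (scoring itself is out of scope)."""
    first = max(candidates, key=lambda c: suitability[c])
    second = [c for c in candidates if c != first]
    return first, second

def resolve_driver_operation(first_behavior, operation):
    """Steps S17-S21 sketch: `operation` is the driver operation received
    within the second predetermined time, or None if none arrived in time."""
    if operation == "end_automatic_driving":   # button 51g -> terminate (step S19)
        return "terminate_automatic_driving"
    if operation is not None:                  # buttons 51a-51c -> selected behavior (step S20)
        return operation
    return first_behavior                      # timeout -> execute the first behavior (step S21)
```

For example, with the merge scenario's three candidates, a timeout executes the first behavior, while a button press overrides it.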
  • FIG. 6 is a diagram showing the first example of the driving environment together with another display control for it. FIG. 6(a) is the same as FIG. 5(a), but the display control in FIG. 6(b) differs from that in FIG. 5(b).
  • As in the case of FIG. 5, the vehicle control unit 7 reads from the storage unit 8, for the traveling environment illustrated in (a) of FIG. 6, three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right of the vehicle 1. At that time, it is assumed that the storage unit 8 stores the lane change to the right of the vehicle 1 as the behavior with the highest priority.
  • the vehicle control unit 7 causes the notification unit 92 to notify the traveling environment information and the first behavior information.
  • the vehicle control unit 7 generates character information 69 indicating information on the driving environment and information on the first behavior, and causes the notification unit 92 to display the character information 69.
  • The vehicle control unit 7 causes the display areas 69a and 69c to display prompts asking the driver whether to adopt or reject the first behavior.
  • Further, the vehicle control unit 7 displays “automatic driving end”, which indicates that switching to manual driving is possible, in the display area 69g.
  • Here, the vehicle control unit 7 highlights “YES”, which corresponds to adopting the first behavior. Which of “YES” and “NO” is highlighted may be determined in advance; alternatively, the option selected last time may be highlighted, or the number of times each option has been selected in the past may be stored in the storage unit 8 and the notification unit 92 may highlight the option with the larger count.
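The highlight choice just described (a predetermined default, or the option chosen more often in the past) could look like this in outline; the function name and the count dictionary standing in for the storage unit 8 are illustrative assumptions.

```python
def option_to_highlight(past_counts, default="YES"):
    """Return which of "YES"/"NO" the notification unit should highlight:
    the option selected more often in the past, or a predetermined default
    when there is no selection history yet."""
    if not past_counts:
        return default
    # highlight the option with the larger past-selection count
    return max(past_counts, key=past_counts.get)
```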
  • In this way, the vehicle control unit 7 can appropriately notify the driver of information.
  • FIG. 7 is a diagram showing a second example of the driving environment and display control for the second example.
  • FIG. 7A is an overhead view showing the traveling environment.
  • The traveling environment shown in FIG. 7A is the same as those of FIG. 5A and FIG. 6A in that there is a merging path ahead, but differs from them in that a traveling vehicle exists on the right side of the vehicle 1. In such a case, the vehicle control unit 7 determines that a lane change cannot be performed.
  • When the vehicle control unit 7 determines that the traveling environment of the vehicle 1 is the one shown in FIG. 7A, it causes the notification unit 92 to display the determined traveling environment information as the character information 79.
  • Of the three behavior candidates read from the storage unit 8 (acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change to the right of the vehicle 1), the vehicle control unit 7 selects only the acceleration and deceleration of the vehicle 1, because the lane change to the right of the vehicle 1 cannot be performed.
  • In addition, the vehicle control unit 7 predicts that the vehicle 1 would come too close to the merging vehicle if it proceeded at the current speed, and determines that deceleration of the vehicle 1 is the most suitable behavior, that is, the first behavior.
  • Here, the most suitable behavior is determined using a conventional technique that judges suitability from information on the traveling state and the surrounding situation. Alternatively, which behavior is most suitable may be determined in advance, information on the behavior selected last time may be stored in the storage unit 8 and that behavior determined to be the most suitable, or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the behavior with the largest count determined to be the most suitable.
  • the vehicle control unit 7 displays “Deceleration” as the first behavior in the display area 79c, and displays “Acceleration” as the second behavior in the display area 79a. Further, the vehicle control unit 7 causes the display area 79g to display “automatic driving end” indicating switching to manual driving.
  • In this way, the vehicle control unit 7 can notify the driver of the behavior most suitable for the current traveling environment as the first behavior.
  • Note that the information on the first behavior may be arranged at the top and the information on the second behavior at the bottom, with selection functions assigned to the operation buttons 51a and 51c, respectively. Alternatively, acceleration behavior information may be arranged at the top, deceleration behavior information at the bottom, right-lane-change behavior information on the right, and left-lane-change behavior information on the left, with selection functions assigned to the corresponding operation buttons 51b, 51d, and so on. These arrangements may be switchable, and it may be indicated separately whether the display uses the behavior-priority arrangement or the operation-priority arrangement. Furthermore, the display size of the first behavior information may be increased and that of the second behavior information decreased. In addition, by arranging the behavior information displays to correspond to the front, rear, left, and right behaviors of the vehicle, the driver can recognize and operate them intuitively.
  • FIG. 8 is a diagram showing a third example of the driving environment and display control for it.
  • FIG. 8A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 8A shows a travel environment in which the preceding vehicle travels at a slower speed than the vehicle 1 and the lane can be changed to the adjacent lane.
  • the vehicle control unit 7 determines that the traveling environment is a traveling environment as shown in FIG. 8A based on information on the traveling state and the surrounding situation. In this case, the vehicle control unit 7 causes the notification unit 92 to display the determined traveling environment information as character information 89.
  • As behavior candidates corresponding to the determined traveling environment, the vehicle control unit 7 reads three candidates from the storage unit 8: overtaking the preceding vehicle, changing lanes to the adjacent lane, and decelerating the vehicle 1 to follow the preceding vehicle.
  • The vehicle control unit 7 then determines that the behavior in which the vehicle 1 decelerates and follows the preceding vehicle is the most suitable behavior, that is, the first behavior, because, for example, the speed after deceleration would remain above a predetermined value.
  • the most suitable behavior is determined using a conventional technique that determines the most suitable behavior based on information on the driving state and the surrounding situation. Further, which behavior is most suitable may be determined in advance, or information on the behavior selected last time may be stored in the storage unit 8, and the behavior may be determined as the most suitable behavior. Then, the number of times each behavior has been selected in the past may be stored in the storage unit 8, and the behavior with the largest number of times may be determined as the most suitable behavior.
  • In this case, the vehicle control unit 7 highlights the character information “follow”, indicating the first behavior, in the display area 89c, and displays “overtake” and “change lane”, indicating the second behavior, in the display areas 89a and 89b, respectively. Further, the vehicle control unit 7 causes the display area 89g to display “automatic driving end”, indicating switching to manual driving.
  • Note that the information on the first behavior may be arranged at the top and the information on the second behavior at the bottom, with selection functions assigned to the operation buttons 51a and 51c, respectively. Alternatively, overtaking behavior information may be arranged at the top, following behavior information at the bottom, right-lane-change behavior information on the right, and left-lane-change behavior information on the left, with selection functions assigned to the corresponding operation buttons 51b, 51d, and so on. These arrangements may be switchable, and it may be indicated separately whether the display uses the behavior-priority arrangement or the operation-priority arrangement. Furthermore, the display size of the first behavior information may be increased and that of the second behavior information decreased.
  • FIG. 9 is a diagram showing a fourth example of the driving environment and display control for it.
  • FIG. 9A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 9A illustrates a traveling environment in which the number of lanes decreases ahead in the same lane as the vehicle 1.
  • the vehicle control unit 7 determines that the traveling environment is a traveling environment as shown in FIG. 9A based on information on the traveling state and the surrounding situation. In this case, the vehicle control unit 7 causes the notification unit 92 to display the determined traveling environment information as the character information 99.
  • The vehicle control unit 7 reads from the storage unit 8 two behavior candidates corresponding to the determined travel environment: changing lanes to the adjacent lane and keeping the current lane.
  • Because the TTC to the lane-reduction point is shorter than a predetermined value, the vehicle control unit 7 determines that changing lanes to the adjacent lane is the most suitable behavior, that is, the first behavior.
  • which of the two behavior candidates is the most suitable behavior is determined using a conventional technique for determining the most suitable behavior based on information on the driving state and the surrounding situation. Further, which behavior is most suitable may be determined in advance, or information on the behavior selected last time may be stored in the storage unit 8, and the behavior may be determined as the most suitable behavior. Then, the number of times each behavior has been selected in the past may be stored in the storage unit 8, and the behavior with the largest number of times may be determined as the most suitable behavior.
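The TTC threshold test in this example can be sketched as follows. The distance, speed, and threshold names are illustrative assumptions; the patent does not give a formula, so this is only one plausible reading of "TTC to the lane-reduction point".

```python
def first_behavior_at_lane_end(distance_to_lane_end_m, speed_mps, ttc_threshold_s):
    """If the time to reach the lane-reduction point (distance / speed)
    falls below a predetermined value, the lane change to the adjacent
    lane becomes the first behavior; otherwise staying in lane does."""
    ttc = distance_to_lane_end_m / speed_mps
    return "lane_change" if ttc < ttc_threshold_s else "as_is"
```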
  • In this case, the vehicle control unit 7 highlights the character information “lane change”, indicating the first behavior, in the display area 99b, and displays the character information “as is”, indicating the second behavior, in the display area 99c. Further, the vehicle control unit 7 causes the display area 99g to display “automatic driving end”, indicating switching to manual driving.
  • the first behavior information may be arranged above, the second behavior information may be arranged below, and a selection function may be assigned to each of the operation buttons 51a and 51c.
  • Alternatively, right-lane-change behavior information may be arranged on the right side and left-lane-change behavior information on the left side, with selection functions assigned to the operation buttons 51c, 51b, and 51d, respectively.
  • It may also be indicated whether the display uses the behavior-priority arrangement or the operation-priority arrangement.
  • the display size of the first behavior information may be increased and the display size of the second behavior information may be decreased.
  • different functions are assigned to the display areas according to different traveling environments, so that information notification or operation can be performed in a small area.
  • In the above description, the vehicle control unit 7 causes the notification unit 92 to notify the behavior in accordance with information on the traveling environment and the surrounding situation, but the present invention is not limited to this; the notification unit 92 may be caused to notify the behavior on other occasions as well.
  • FIG. 10 is a diagram showing a fifth example of the driving environment and display control for it.
  • FIG. 10A is an overhead view showing the traveling environment of the vehicle 1. Specifically, FIG. 10A shows a traveling environment in which the vehicle 1 can change lanes to the left or right.
  • The driving environment shown in (a) of FIG. 10 differs from those of (a) of FIGS. 5 to 9 in that it is one where normal driving is possible without changing lanes or accelerating or decelerating the vehicle.
  • the vehicle control unit 7 does not have to display the information on the driving environment on the notification unit 92 as character information.
  • In this case, the vehicle control unit 7 reads the behavior candidates for normal travel from the storage unit 8.
  • In the storage unit 8, four behavior candidates (acceleration of the vehicle 1, deceleration of the vehicle 1, a lane change to the right of the vehicle 1, and a lane change to the left of the vehicle 1) are stored in association with the normal traveling environment shown in (a) of FIG. 10.
  • the vehicle control unit 7 reads out these and displays them on the display areas 109a to 109d of the notification unit 92, respectively.
  • Further, the vehicle control unit 7 displays “automatic driving end”, indicating switching to manual driving, in the display area 109g, and highlights “cancel”, indicating cancellation of the behavior update, in the display area 109e.
  • In the above description, the display on the notification unit 92 has been described as character information, but the present invention is not limited to this. For example, symbols indicating the behavior may be displayed visually to the driver. In the following, display using such symbols is described, taking the displays for FIGS. 5 and 7 as examples.
  • FIG. 11 is a diagram showing another display control for the first example of the traveling environment shown in FIG.
  • In the first example (FIG. 5), the first behavior described above is a lane change to the right of the vehicle 1, and the second behavior is acceleration or deceleration of the vehicle 1.
  • In this case, a symbol 111 indicating “lane change”, the first behavior, is displayed large in the center, while the symbols for the second behavior, including deceleration of the vehicle 1, are displayed small to the right; a symbol 114 indicating the end of automatic driving is displayed small to the left.
  • If no operation to change the behavior is received from the driver, the lane change is performed.
  • FIG. 12 is a diagram showing another display control for the second example of the traveling environment shown in FIG.
  • In the second example (FIG. 7), the lane cannot be changed. Therefore, for example, “deceleration of the vehicle 1” is set as the first behavior and “acceleration of the vehicle 1” as the second behavior.
  • In this case, a symbol 121 indicating “deceleration of the vehicle 1”, the first behavior, is displayed large in the center, and a symbol 122 indicating “acceleration of the vehicle 1”, the second behavior, is displayed small to the right.
  • a symbol 123 indicating the end of automatic driving is displayed small on the left.
  • Here, it is assumed that the operation unit 51 receives an operation for selecting “acceleration of the vehicle 1” from the driver.
  • In this case, a symbol 122′ indicating “acceleration of the vehicle 1”, now the first behavior, is displayed large in the center, and a symbol 121′ indicating “deceleration of the vehicle 1”, now the second behavior, is displayed small to the right.
  • the driver can grasp the behavior performed by the vehicle and other behaviors that can be selected, and can continue the automatic driving with a sense of security. Alternatively, the driver can give instructions to the car smoothly.
  • the option to be notified to the notification unit, that is, the second behavior can be made variable according to the traveling environment.
  • FIG. 13 is a block diagram showing a main configuration of the vehicle 1 including the information notification device according to Embodiment 2 of the present invention.
  • the same components as those in FIG. 1 are denoted by the same reference numerals as those in FIG.
  • a vehicle 1 shown in FIG. 13 is provided with a touch panel 10 instead of the operation unit 51 of the steering wheel 5.
  • the touch panel 10 is a device composed of a liquid crystal panel or the like capable of displaying information and receiving input, and is connected to the vehicle control unit 7.
  • the touch panel 10 includes a display unit 101 that displays information based on control by the vehicle control unit 7 and an input unit 102 that receives an operation from a driver or the like and outputs the received operation to the vehicle control unit 7.
  • display control of the touch panel 10 will be described.
  • display control when the vehicle 1 is traveling in the center of three lanes and the lane can be changed to either the right lane or the left lane will be described.
  • FIG. 14 is a diagram for explaining a display on the touch panel 10 according to the second embodiment.
  • FIG. 14A shows an initial display of the display unit 101 of the touch panel 10.
  • When the vehicle control unit 7 determines that the lane can be changed to either the right lane or the left lane, it causes the display unit 101 of the touch panel 10 to execute the display shown in FIG. 14A.
  • the display “Touch” in the display area 121 indicates that the touch panel 10 is in a mode in which a touch operation by the driver can be received.
  • When the driver touches the display area 121, the input unit 102 accepts the operation and outputs information indicating that the operation has been performed to the vehicle control unit 7.
  • Upon receiving this information, the vehicle control unit 7 causes the display unit 101 to execute the display shown in FIG. 14B and causes the notification unit 92 to execute the display shown in FIG. 14C.
  • FIG. 14B shows a display area 121a on which “Move” indicating an operation for instructing the vehicle 1 to move is displayed. Further, FIG. 14B shows display areas 121b to 121d indicating that the vehicle 1 can travel in each of the three lanes. The display areas 121b to 121d correspond to traveling in the lane indicated by arrows X, Y, and Z in FIG. 14C, respectively.
  • the display areas in FIG. 14B and the corresponding arrows in FIG. 14C have the same mode (for example, color or arrangement). As a result, the display is easier to understand for the driver.
  • The behavior to be performed by the vehicle, as determined by vehicle control, may be displayed in a manner that distinguishes it from the behaviors selectable by the driver.
  • the driver selects the behavior of the vehicle 1 by touching the display area corresponding to the lane to be traveled among the display areas 121b to 121d.
  • the input unit 102 accepts a driver's behavior selection operation and outputs information on the selected behavior to the vehicle control unit 7.
  • the vehicle control unit 7 controls the vehicle 1 to execute the selected behavior.
  • the vehicle 1 travels in the lane that the driver wants to travel.
  • the driver may perform a swipe operation on the touch panel 10 instead of the touch operation.
  • For example, when the driver wants to change to the lane indicated by the arrow X in FIG. 14C, the driver performs a rightward swipe operation on the touch panel 10.
  • In this case, the input unit 102 receives the swipe operation and outputs information indicating its content to the vehicle control unit 7. The vehicle control unit 7 then controls the vehicle 1 to perform the selected behavior, the lane change to the lane indicated by the arrow X.
  • Furthermore, the driver may utter “behavior selection” or the like by voice. This makes it possible to operate while viewing only the HUD, without looking at the touch panel at hand.
  • When a display area of the touch panel is selected, the display mode of the corresponding lane may be changed so that the driver can confirm which lane is being selected before committing to it. For example, at the moment the display area 121b is touched, the lane X may be drawn thicker; if the driver releases immediately, lane X is not selected and its thickness returns to the original. Likewise, at the moment the display area 121c is touched, the lane Y may be drawn thicker; if that state is maintained for a while, lane Y is selected, and its blinking notifies the driver that the selection has been decided. This allows selection and confirmation operations to be performed without looking at one's hands.
  • vehicle control functions such as acceleration, deceleration, overtaking, and the like may be assigned to the display area according to the driving environment.
  • the driver can perform an intuitive operation.
  • the touch panel can freely change the number, shape, color, and the like of display areas for receiving operations, the degree of freedom of the user interface is improved.
  • The configuration according to the present embodiment is the configuration of FIG. 1 described in Embodiment 1, in which the operation unit 51 further includes a grip sensor that detects whether the driver has gripped the steering wheel 5.
  • FIG. 15 is a diagram illustrating the display of the notification unit 92 according to Embodiment 3 of the present invention.
  • FIG. 15 shows an example of the display in a driving environment in which a vehicle traveling ahead in the same lane as the vehicle 1 travels at a slower speed than the vehicle 1 and a lane change to the adjacent lane is possible.
  • When the vehicle control unit 7 determines that the traveling environment is the one illustrated in FIG. 8A, it first causes the notification unit 92 to execute the display illustrated in FIG. 15A.
  • In FIG. 15A, the symbol 131 indicating “overtake”, which is the first behavior, is displayed in a first mode (for example, a first color).
  • When the second predetermined time has elapsed since the symbol 131 was displayed, the vehicle control unit 7 causes the notification unit 92 to display the symbol 131 in a second mode different from the first mode (for example, a second color different from the first color).
  • the second predetermined time is the same as the second predetermined time described in the first embodiment.
  • Until the second predetermined time elapses, the driver can select the second behavior; once the symbol 131 changes to the second mode, selection of the second behavior becomes impossible.
  • FIG. 15A also shows a steering-wheel-shaped symbol 132 indicating that the second behavior can be selected.
  • While the symbol 132 is displayed, the second behavior is displayed when the driver grips the steering wheel 5.
  • Although the symbol 132 is a display indicating that the second behavior can be selected, the fact that the second behavior is selectable may instead be indicated to the driver by the symbol 131 being displayed in the first mode. In this case, the symbol 132 need not be displayed.
  • the symbol 133 is an auxiliary display that indicates to the driver that the vehicle is traveling in automatic driving, but the symbol 133 may not be displayed.
  • When the driver grips the steering wheel 5, the grip sensor detects this and outputs information on the detection result to the vehicle control unit 7.
  • In this case, the vehicle control unit 7 causes the notification unit 92 to execute the display shown in FIG. 15B.
  • In FIG. 15B, as in FIG. 15A, the symbol 131 indicating “overtake”, the first behavior, is shown in the first mode (for example, the first color). In addition, a symbol 134 indicating “lane change” and a symbol 135 indicating “deceleration”, both second behaviors, are shown.
  • In this state, the driver changes from the first behavior to a second behavior by operating the operation unit 51 of the steering wheel 5. For example, by pressing the operation button 51a or the operation button 51c of the operation unit 51 (see (c) of FIG. 2), the driver updates the behavior to “lane change” (symbol 134) or “deceleration” (symbol 135).
  • FIG. 15B also shows a symbol 136 indicating that the vehicle control unit 7 is learning the behavior of the vehicle 1.
  • While the symbol 136 is displayed, the vehicle control unit 7 learns the behavior selected by the driver.
  • the symbol 136 may not be displayed. In addition, learning may always be performed.
  • The vehicle control unit 7 stores the behavior selected by the driver in the storage unit 8, and when the same driving environment occurs again, displays the stored behavior on the notification unit 92 as the first behavior.
  • Alternatively, the vehicle control unit 7 may store in the storage unit 8 the number of times each behavior has been selected, and display the most frequently selected behavior on the notification unit 92 as the first behavior.
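The learning described above (remember the behavior the driver selected for a driving environment and propose it as the first behavior next time) might look like this in outline. The class and method names are hypothetical stand-ins, with a dictionary playing the role of the storage unit 8.

```python
class BehaviorLearner:
    """Sketch: remember the behavior the driver last selected for each
    traveling environment and propose it as the first behavior when the
    same environment occurs again."""

    def __init__(self):
        self._store = {}  # environment -> last selected behavior

    def record(self, environment, behavior):
        self._store[environment] = behavior

    def first_behavior(self, environment, default):
        # fall back to the normally determined behavior when nothing
        # has been learned for this environment yet
        return self._store.get(environment, default)
```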
  • FIG. 15B also shows a symbol 137 indicating that automatic driving is not being performed.
  • The vehicle control unit 7 waits for the driver to select the behavior to be performed after the first predetermined time has elapsed.
  • When the driver performs the selection operation, the vehicle control unit 7 receives information on the selection operation and causes the notification unit 92 to execute the display shown in FIG. 15C.
  • In FIG. 15C, a symbol 134′ indicating “lane change” is shown in the first mode.
  • When the driver selects “lane change”, the vehicle control unit 7 determines that the selected behavior is the behavior to be performed next, and causes the notification unit 92 to display the symbol 134′ indicating “lane change” in the first mode.
  • The symbol 131′ in FIG. 15C is the symbol 131 that was displayed as the first behavior in FIG. 15B, now displayed in the position previously occupied by the symbol 134.
  • When the driver presses one of the operation buttons twice in succession, the vehicle control unit 7 receives information on that operation and causes the notification unit 92 to change from the display shown in FIG. 15C back to the display shown in FIG. 15B.
  • In this way, in accordance with the driver's operations, the vehicle control unit 7 changes the display of the notification unit 92 from FIG. 15A to FIG. 15B and further to FIG. 15C until the second predetermined time elapses.
  • The vehicle control unit 7 causes the notification unit 92 to display the display illustrated in FIG. 15D after the second predetermined time has elapsed since the notification unit 92 first executed the display illustrated in FIG. 15A.
  • When information indicating that the driver has released the steering wheel 5 is acquired from the grip sensor, the vehicle control unit 7 may cause the notification unit 92 to display the display shown in FIG. 15D before the second predetermined time elapses.
  • In this way, the vehicle control unit 7 changes the display of the notification unit 92 so that candidates for other behaviors can be confirmed only when the driver wants to update the next behavior.
  • the display visually recognized by the driver can be reduced, and the driver's troublesomeness can be reduced.
  • The driver model is obtained by modeling the tendencies of a driver's operations for each driving environment based on information on the frequency of each operation.
  • For example, the driver model is constructed by aggregating the travel histories of a plurality of drivers.
  • the driving history of the driver is, for example, a history in which the behavior frequency actually selected by the driver among the behavior candidates corresponding to each driving environment is aggregated for each behavior candidate.
  • FIG. 16 is a diagram showing an example of a travel history.
  • FIG. 16 shows that, in one traveling environment, the driver x has selected the behavior candidates “decelerate”, “accelerate”, and “lane change” three times, once, and five times, respectively.
  • It also shows that, in another traveling environment, the driver x has selected the behavior candidates “follow”, “overtake”, and “lane change” twice, twice, and once, respectively. The same applies to the driver y.
  • the driving history of the driver may aggregate behaviors selected during automatic driving or may aggregate behaviors actually performed by the driver during manual driving. This makes it possible to collect travel histories according to operating conditions such as automatic driving or manual driving.
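A travel history like the one in FIG. 16 is essentially a per-environment tally of selected behaviors. A minimal sketch with hypothetical names (the dictionary stands in for whatever store the system actually uses):

```python
from collections import defaultdict

class TravelHistory:
    """Sketch of a travel history as in FIG. 16: for each driving
    environment, count how often each behavior candidate was selected
    (during automatic driving, or actually performed during manual driving)."""

    def __init__(self):
        self._counts = defaultdict(lambda: defaultdict(int))

    def record(self, environment, behavior):
        self._counts[environment][behavior] += 1

    def frequencies(self, environment):
        # plain dict of behavior -> selection count for one environment
        return dict(self._counts[environment])
```

Recording driver x's selections near a merging lane, for example, reproduces one row of FIG. 16.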
  • Driver models include a clustering type, constructed by clustering the travel histories of a plurality of drivers, and an individually-adapted type, in which the driver model of a specific driver (for example, driver x) is constructed from a plurality of travel histories similar to the travel history of that driver.
  • In the clustering-type construction method, the travel histories of a plurality of drivers are first aggregated as shown in FIG. 16. Then, a plurality of drivers whose travel histories are highly similar to one another, that is, a plurality of drivers with similar driving operation tendencies, are grouped to construct the driver model.
  • FIG. 17 is a diagram illustrating a clustering type driver model construction method.
  • FIG. 17 shows the travel histories of the drivers a to f in a table format. It shows that the model A is constructed from the travel histories of the drivers a to c, and the model B from those of the drivers d to f.
  • The similarity between travel histories may be determined, for example, by treating the frequencies (the numerical values) in the travel histories of the driver a and the driver b as frequency distributions, calculating a correlation value between the two distributions, and using the calculated correlation value as the similarity.
  • when, for example, the calculated correlation value is higher than a predetermined value, the driving histories of the driver a and the driver b are set as one group.
  • the calculation of similarity is not limited to this.
  • the degree of similarity may be calculated based on the number of the most frequently matched behaviors in the driving histories of the driver a and the driver b.
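The correlation-based similarity between two travel histories described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the flat list layout is an assumption (each index stands for one (driving environment, behavior) pair):

```python
import math

def similarity(hist_a, hist_b):
    # Treat each travel history as a frequency distribution and use the
    # Pearson correlation between the two distributions as the similarity.
    n = len(hist_a)
    mean_a = sum(hist_a) / n
    mean_b = sum(hist_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(hist_a, hist_b))
    sd_a = math.sqrt(sum((a - mean_a) ** 2 for a in hist_a))
    sd_b = math.sqrt(sum((b - mean_b) ** 2 for b in hist_b))
    return cov / (sd_a * sd_b)

# Each list holds the selection frequencies of one driver, aligned so the
# same index means the same (driving environment, behavior) pair.
driver_a = [3, 1, 5, 2, 2, 1]
driver_b = [4, 1, 6, 2, 3, 1]
```

Drivers whose mutual similarity exceeds a predetermined value would then be placed in the same group.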
  • the clustering type driver model is constructed by, for example, calculating the average of each frequency in the driving history of drivers in each group.
  • FIG. 18 is a diagram illustrating an example of a built clustering driver model.
  • the average frequency of each group is derived by calculating the average of the respective frequencies.
  • the clustering type driver model is constructed with an average frequency for the behavior determined for each driving environment.
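A minimal sketch of this averaging step, assuming the travel history is kept as per-environment behavior frequencies (the data shapes and environment names are illustrative, not from the patent):

```python
def build_cluster_model(group_histories):
    # group_histories: one dict per driver in the group, mapping a driving
    # environment to {behavior: selection frequency}.
    collected = {}
    for history in group_histories:
        for env, freqs in history.items():
            for behavior, f in freqs.items():
                collected.setdefault(env, {}).setdefault(behavior, []).append(f)
    # The model holds, per environment, the average frequency of each behavior.
    return {env: {b: sum(fs) / len(fs) for b, fs in freqs.items()}
            for env, freqs in collected.items()}

model_a = build_cluster_model([
    {"approaching merge": {"decelerate": 3, "accelerate": 1, "lane change": 5}},
    {"approaching merge": {"decelerate": 5, "accelerate": 1, "lane change": 3}},
])
```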
  • FIG. 19 is a diagram illustrating another example of the constructed clustering type driver model. As shown in FIG. 19, the most frequent behavior is selected for each traveling environment, and a driver model is constructed from the selected behavior.
  • the driver model as shown in FIG. 18 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
  • the vehicle control unit 7 calculates the similarity between the driving history of the driver y and the driving history of each model of the driver model, and determines which model is most suitable for the driver y. For example, in the case of the driving history of the driver y shown in FIG. 16 and the driver model shown in FIG. 18, the vehicle control unit 7 determines that the model B is most suitable for the driver y.
  • during actual automatic traveling, the vehicle control unit 7 determines that, in each traveling environment of the model B, the behavior with the highest frequency is the behavior most suitable for the driver y, that is, the first behavior.
  • thereby, based on the model B shown in FIG. 18, the vehicle control unit 7 can determine “follow” as the first behavior.
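As a sketch, selecting the best-matching model for the driver y and then taking the most frequent behavior might look like this. The distance measure (sum of absolute frequency differences, standing in for the correlation-based similarity) and the data are illustrative assumptions:

```python
def choose_model(models, driver_history):
    # Pick the model whose behavior frequencies best match the driver's own
    # travel history (smaller total difference = more similar).
    def distance(hist):
        return sum(abs(hist[k] - driver_history.get(k, 0)) for k in hist)
    return min(models, key=lambda name: distance(models[name]))

def first_behavior(env_freqs):
    # The behavior with the highest (average) frequency in the current
    # driving environment is presented as the first behavior.
    return max(env_freqs, key=env_freqs.get)

models = {
    "model A": {"decelerate": 4, "follow": 1, "lane change": 2},
    "model B": {"decelerate": 1, "follow": 4, "lane change": 2},
}
```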
  • in the individual adaptive type driver model construction method, the driving histories of a plurality of drivers as shown in FIG. 16 are aggregated in advance.
  • the difference from the clustering type is that a driver model is constructed for each driver.
  • here, the driver y will be described as an example.
  • the driving histories of a plurality of drivers having high similarity to the driving history of the driver y are extracted from the driving histories of the plurality of drivers collected. Then, a driver model of the driver y is constructed from the extracted driving histories of the plurality of drivers.
  • FIG. 20 is a diagram showing a method for constructing an individual adaptive driver model.
  • the driving histories of the drivers a to f are shown in a table format, as in FIG. 17. FIG. 20 shows that the driver model of the driver y is constructed from the driving history of the driver y shown in FIG. 16 and the driving histories of the drivers c to e having high similarity.
  • the individual adaptive driver model is constructed by calculating the average of each frequency in the extracted driving history of each driver.
  • FIG. 21 is a diagram illustrating an example of a constructed individual adaptive driver model.
  • the average frequency of each behavior is derived for each driving environment.
  • the individually adaptive driver model for the driver y is constructed with an average frequency of behavior corresponding to each traveling environment.
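The extraction-and-averaging step above can be sketched as follows. The similarity measure (negative sum of absolute differences, standing in for the correlation-based one) and the top-k cutoff are illustrative assumptions:

```python
def build_individual_model(all_histories, target_history, top_k=3):
    # Extract the top_k travel histories most similar to the target
    # driver's history, then average their frequencies per behavior.
    def sim(hist):
        return -sum(abs(hist[k] - target_history.get(k, 0)) for k in hist)
    nearest = sorted(all_histories, key=sim, reverse=True)[:top_k]
    behaviors = nearest[0].keys()
    return {b: sum(h[b] for h in nearest) / len(nearest) for b in behaviors}

histories = [
    {"decelerate": 2, "accelerate": 6},   # dissimilar driver
    {"decelerate": 5, "accelerate": 1},   # similar driver
    {"decelerate": 4, "accelerate": 2},   # similar driver
]
model_y = build_individual_model(histories, {"decelerate": 5, "accelerate": 1}, top_k=2)
```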
  • the driver model of the driver y as shown in FIG. 21 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
  • in actual automatic driving, the vehicle control unit 7 determines that, in each driving environment of the driver model of the driver y, the behavior with the highest frequency is the behavior most suitable for the driver y, that is, the first behavior.
  • thereby, based on the driver model shown in FIG. 21, the vehicle control unit 7 can determine “lane change” as the first behavior.
  • in general, the actual operation (for example, the magnitude of acceleration or deceleration, or the amount of operation of the steering wheel) for one behavior (for example, lane change) differs from driver to driver.
  • the vehicle control unit 7 extracts a feature amount indicating the driving characteristics of the driver from the operation content of each part of the vehicle 1 of the driver, and stores it in the storage unit 8.
  • the feature amount includes, for example, a feature amount related to speed, a feature amount related to steering, a feature amount related to operation timing, a feature amount related to outside-vehicle sensing, a feature amount related to in-vehicle sensing, and the like.
  • the feature quantity related to speed includes, for example, the speed, acceleration, and deceleration of the vehicle, and these feature quantities are acquired from a speed sensor or the like that the vehicle has.
  • the feature amount related to steering includes, for example, the steering angle, angular velocity, and angular acceleration of the steering, and these feature amounts are acquired from the steering wheel 5.
  • the feature quantities related to the operation timing include, for example, the operation timing of the brake, accelerator, blinker lever, and steering wheel. These feature quantities are acquired from the brake pedal 2, the accelerator pedal 3, the blinker lever 4, and the steering wheel 5, respectively.
  • the feature amount related to outside-vehicle sensing includes, for example, a distance between vehicles in front, side, and rear, and these feature amounts are acquired from the sensor 62.
  • the feature amount related to in-vehicle sensing is, for example, personal recognition information indicating who the driver is and who the passenger is, and these feature amounts are acquired from a camera or the like installed in the vehicle.
  • the vehicle control unit 7 detects that the driver has manually changed the lane.
  • the detection is performed by setting rules for the operation time-series pattern of a lane change in advance and analyzing operation time-series data acquired from CAN (Controller Area Network) information or the like. On detection, the vehicle control unit 7 acquires the feature amounts described above.
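A minimal sketch of such rule-based detection follows. The event names, their order, and the time window are assumptions for illustration; they are not specified in the text:

```python
def detect_lane_change(events, window=8.0):
    # Rule-based detection of a manual lane change from operation
    # time-series data (e.g. derived from CAN information). `events` is a
    # time-ordered list of (timestamp, event_name) pairs; the detection
    # succeeds when the whole pattern occurs within the time window.
    pattern = ["blinker_on", "steer_toward_lane", "steer_back", "blinker_off"]
    idx, start = 0, None
    for t, name in events:
        if name == pattern[idx]:
            if idx == 0:
                start = t
            idx += 1
            if idx == len(pattern):
                return (t - start) <= window
    return False

events = [(0.0, "blinker_on"), (1.5, "steer_toward_lane"),
          (4.0, "steer_back"), (5.0, "blinker_off")]
```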
  • the vehicle control unit 7 stores the feature amount in the storage unit 8 for each driver, and constructs a driving characteristic model.
  • the vehicle control unit 7 may construct the above-described driver model based on the feature amounts for each driver. That is, the vehicle control unit 7 extracts a feature value related to speed, a feature value related to steering, a feature value related to operation timing, a feature value related to outside-vehicle sensing, and a feature value related to in-vehicle sensing, and stores them in the storage unit 8. Then, based on the stored feature values, a driver model may be constructed.
  • FIG. 22 is a diagram showing an example of the driving characteristic model.
  • FIG. 22 shows the feature values in a tabular format for each driver.
  • FIG. 22 also shows the number of times each behavior has been selected in the past for each driver. Although only a part of the feature amount is described, any or all of the above may be described.
  • the numerical value of speed is a numerical value indicating the actual speed in stages.
  • the numerical values for the steering wheel, brake, and accelerator are numerical values that indicate the operation amount in stages. These numerical values are obtained, for example, by calculating an average value of speed, steering wheel, brake, and accelerator operation amounts within a predetermined period in the past and expressing the average value stepwise.
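The stepwise representation described above can be sketched as follows; the averaging and the bucket boundaries are illustrative assumptions:

```python
def to_level(values, bounds=(0.2, 0.4, 0.6, 0.8)):
    # Average an operation amount (speed, steering wheel, brake, or
    # accelerator) over a past period and express the average stepwise
    # as a level from 1 to len(bounds) + 1.
    avg = sum(values) / len(values)
    level = 1
    for bound in bounds:
        if avg >= bound:
            level += 1
    return level
```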
  • the vehicle control unit 7 selects, from the driving characteristic models shown in FIG. 22, the driving characteristic model corresponding to the driver, the behavior, and the passenger, according to who the driver is, what kind of behavior is executed, and who the passenger is.
  • the vehicle control unit 7 causes the vehicle 1 to travel at a speed corresponding to the selected driving characteristic model, and controls the vehicle 1 by a combination of the steering wheel, brake, and accelerator operation amounts and their timing. Thereby, automatic driving suited to the driver and the passenger can be realized.
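A minimal sketch of selecting a driving characteristic model keyed by driver, behavior, and passenger; all names and values here are hypothetical, not taken from the patent:

```python
# Hypothetical driving characteristic models keyed by who is driving, which
# behavior will be executed, and who the passengers are.
driving_characteristic_models = {
    ("driver x", "acceleration", "alone"):  {"speed": 8, "accelerator": 9},
    ("driver x", "acceleration", "family"): {"speed": 4, "accelerator": 5},
}

def select_model(driver, behavior, passenger):
    # Select the model matching the current driver, behavior, and passenger;
    # the vehicle would then be controlled with the amounts and timing it holds.
    return driving_characteristic_models[(driver, behavior, passenger)]
```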
  • FIG. 23 is a diagram illustrating the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 23 shows the display for the first example of the traveling environment shown in FIG. 5.
  • FIG. 23A is a display of the notification unit 92 in a state in which the vehicle is performing normal travel that does not require a lane change or acceleration/deceleration of the vehicle.
  • FIG. 23A shows a symbol 231 indicating that the driving characteristic of the driver is “high deceleration” driving characteristic and a symbol 232 indicating that the driver is currently in automatic driving.
  • the vehicle control unit 7 determines the driving characteristic of the driver based on, for example, the number of times each behavior included in the driving characteristic model shown in FIG. 22 has been selected in the past. In this case, for a driver whose driving characteristics show frequent “deceleration” (that is, a driver who has selected the “deceleration” behavior many times), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 231 as shown in FIG. 23A.
  • when the vehicle control unit 7 determines that the driving environment is that of the first example illustrated in FIG. 5, it determines the first behavior to be “deceleration” based on the driver's “high deceleration” driving characteristic, and causes the notification unit 92 to execute the display of FIG. 23B.
  • a symbol 233 indicating “deceleration” which is the first behavior is shown in a first mode (for example, a first color). Further, a symbol 234 indicating “acceleration” as the second behavior and a symbol 235 indicating “lane change” as the second behavior are shown.
  • when the driver selects the behavior “acceleration”, the vehicle control unit 7 causes the notification unit 92 to execute the display of (c) of FIG. 23.
  • a symbol 234′ indicating “acceleration”, which is the selected behavior, is shown in the first mode. Further, the symbol 233′ is the symbol 233, which was displayed as the first behavior in FIG. 23B, displayed anew in place of the symbol 234.
  • after a second predetermined time has elapsed since the notification unit 92 executed the display illustrated in FIG. 23C, the vehicle control unit 7 causes the notification unit 92 to execute the display illustrated in FIG. 23D.
  • a symbol 234′ indicating “acceleration” selected by the driver is displayed in the second mode as the next behavior.
  • when it is determined that the next behavior to be taken is “acceleration”, the vehicle control unit 7 reads out the feature amounts corresponding to the “acceleration” behavior included in the driving characteristic model, and controls the vehicle 1 so as to perform “acceleration” reflecting those feature amounts.
  • FIG. 24 is a diagram illustrating the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 24 shows the display for the second example of the traveling environment shown in FIG. 7. In FIG. 24, components common to FIG. 23 are given the same reference numerals as in FIG. 23, and detailed description thereof is omitted.
  • FIG. 24 is a diagram in which the symbol 235 indicating “lane change” is deleted from FIG. 23.
  • as described above, in the second example (FIG. 7), unlike the first example (FIG. 5), another vehicle is traveling to the right of the vehicle 1, and therefore the lane cannot be changed. Accordingly, “lane change” is not displayed in FIGS. 24B and 24C. Further, in the example of FIG. 24C, “acceleration” is selected as in the case of FIG. 23C, so the vehicle control unit 7, as in FIG. 23, reads out the feature amounts corresponding to the “acceleration” behavior included in the driving characteristic model and controls the vehicle 1 so as to perform “acceleration” reflecting those feature amounts.
  • FIG. 25 is a diagram illustrating the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 25 shows the display for the third example of the travel environment shown in FIG. 8.
  • (a) in FIG. 25 is the same as (a) in FIG. 23.
  • when the vehicle control unit 7 determines that the driving environment is that of the third example illustrated in FIG. 8, it determines the first behavior to be “deceleration” based on the driver's “high deceleration” driving characteristic, and causes the notification unit 92 to execute the display of FIG. 25B.
  • a symbol 251 indicating "deceleration” which is the first behavior is shown in a first mode (for example, a first color). Further, a symbol 252 indicating “passing” that is the second behavior and a symbol 253 indicating “lane change” that is the second behavior are shown.
  • when the driver selects the behavior “passing”, the vehicle control unit 7 causes the notification unit 92 to execute the display of (c) of FIG. 25.
  • FIG. 25 (c) shows a symbol 252 'indicating "passing" which is the selected behavior in the first mode.
  • the symbol 251′ is the symbol 251, which was displayed as the first behavior in FIG. 25B, displayed anew in place of the symbol 252.
  • after a second predetermined time has elapsed since the notification unit 92 executed the display shown in FIG. 25C, the vehicle control unit 7 causes the notification unit 92 to execute the display shown in FIG. 25D.
  • a symbol 252′ indicating “passing” selected by the driver is displayed in the second mode as the next behavior.
  • the vehicle control unit 7 reads out the feature amounts corresponding to the “passing” behavior included in the driving characteristic model, and controls the vehicle 1 so as to perform “passing” reflecting those feature amounts.
  • FIG. 26 is a diagram for explaining the display of the notification unit 92 according to the fourth embodiment of the present invention.
  • FIG. 26 shows the display for the first example of the travel environment shown in FIG. 5. FIG. 26A shows an example in which the driving characteristic of the driver is the “high acceleration” driving characteristic, and FIG. 26B shows an example in which the driving characteristic of the driver is the “many lane changes” driving characteristic.
  • FIG. 26 (a) shows a symbol 261 indicating that the driving characteristic of the driver is a driving characteristic with “high acceleration”. Further, a symbol 262 indicating “acceleration” which is the first behavior is shown in the first mode (for example, the first color). Further, a symbol 263 indicating “lane change” as the second behavior and a symbol 264 indicating “deceleration” as the second behavior are shown.
  • for a driver whose driving characteristics show frequent “acceleration” in the past (that is, a driver who has selected the “acceleration” behavior many times), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 261 as shown in FIG. 26A. Further, the vehicle control unit 7 determines the first behavior to be “acceleration” based on the driver's “high acceleration” driving characteristic, and causes the notification unit 92 to execute the display of FIG. 26A.
  • FIG. 26 (b) shows a symbol 265 indicating that the driving characteristic of the driver is a driving characteristic with “many lane changes”. Further, a symbol 266 indicating “lane change” as the first behavior is shown in the first mode (for example, the first color). Further, a symbol 267 indicating “lane change” as the second behavior and a symbol 268 indicating “deceleration” as the second behavior are shown.
  • for a driver whose driving characteristics show frequent “lane changes” in the past (that is, a driver who has selected the “lane change” behavior many times), the vehicle control unit 7 causes the notification unit 92 to execute a display including the symbol 265 as shown in FIG. 26B.
  • the vehicle control unit 7 determines the first behavior to be “lane change” based on the driver's “many lane changes” driving characteristic, and causes the notification unit 92 to execute the display of FIG. 26B.
  • the symbol 231 may indicate the type of the driver model selected from the driver's operation history.
  • for example, for a driver model applied to drivers who often select “deceleration”, the notification unit 92 is caused to execute the display including the symbol 231 as shown in FIG. 23A, and the first behavior is determined to be “deceleration”.
  • for a driver model applied to drivers who often select “acceleration”, the display including the symbol 261 as shown in FIG. 26A is executed by the notification unit 92, and the first behavior is determined to be “acceleration”.
  • for a driver model applied to drivers who often select “lane change”, the display including the symbol 265 as shown in FIG. 26B is executed by the notification unit 92, and the first behavior is determined to be “lane change”.
  • the driver's past driving history can be learned, and the result can be reflected in the determination of the future behavior.
  • further, when the vehicle control unit controls the vehicle, the driving characteristics (driving preferences) of the driver can be learned and reflected in the control of the vehicle.
  • as a result, automatic driving can be controlled with the operation timing or operation amount preferred by the driver or passenger, and unnecessary operation intervention by the driver during automatic driving can be suppressed without deviating from the feeling of the driver's actual manual driving.
  • a server device such as a cloud server may execute a function similar to the function executed by the vehicle control unit 7.
  • the storage unit 8 may exist not in the vehicle 1 but in a server apparatus such as a cloud server.
  • the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior with reference to the driver model stored in the storage unit 8.
  • as described above, the vehicle control unit 7 acquires feature amount information indicating the driving characteristics of the driver, the storage unit 8 stores the feature amount information, and, based on the feature amount information stored in the storage unit 8, the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the behavior of the vehicle selected by the driver with the frequency of each selected behavior.
  • the vehicle control unit 7 determines a group of drivers who select a similar behavior among a plurality of drivers, and constructs a driver model for each group and for each driving environment of the vehicle.
  • the vehicle control unit 7 calculates, for each group of drivers who perform similar operations, the average value of the frequencies of the behaviors selected by each driver, and constructs, for each driving environment of the vehicle, a driver model indicating the tendency of the behavior of the vehicle selected by the driver with the calculated average values.
  • the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the behavior of the vehicle selected by a specific driver with the frequency of each selected behavior, based on the behaviors of the vehicle selected by other drivers whose tendencies are similar to the tendency of the behavior of the vehicle selected by the specific driver.
  • the vehicle control unit 7 can construct a driver model more suitable for the driving tendency of the driver, and can perform more appropriate automatic driving for the driver based on the constructed driver model.
  • (Modified example of the driver model)
  • the driver model described above models the tendency of the operations (behaviors) selected by the driver for each driving environment based on information such as the frequency of each operation, but the present invention is not limited to this.
  • the driver model may be constructed based on a travel history in which environmental parameters indicating travel environments (that is, situations) traveled in the past and the operations (behaviors) actually selected by the driver in those travel environments are associated with each other.
  • options can be determined without going through the procedure of separately detecting and classifying the driving environment and inputting (storing) the classification result into the driver model.
  • specifically, the driving environment differences as shown in FIG. 23 and FIG. 24 are acquired as environment parameters and directly input (stored) in the driver model, so that “acceleration”, “deceleration”, and “lane change” become options in FIG. 23, while “acceleration” and “deceleration” become options in FIG. 24.
  • the driver model described below may be referred to as a situation database.
  • FIG. 27 is a diagram illustrating an example of a travel history.
  • FIG. 27 shows a travel history in which environment parameters indicating travel environments in which the vehicle driven by the driver x has traveled in the past and the operations (behaviors) actually selected by the driver in those travel environments are associated with each other.
  • the environmental parameters (a) to (c) of the travel history shown in FIG. 27 indicate, for example, the driving environments at the times when behaviors of the vehicle were presented to the driver as shown in FIG. 8(b), FIG. 5(b), and FIG. 7(b), respectively.
  • the environmental parameter of the travel history is obtained from sensing information and infrastructure information.
  • Sensing information is information detected by a vehicle sensor or radar.
  • the infrastructure information includes GPS information, map information, information acquired through road-to-vehicle communication, and the like.
  • the environmental parameters of the travel history shown in FIG. 27 include “own vehicle information”, “preceding vehicle information” indicating information on a vehicle traveling ahead in the lane in which the own vehicle travels, “side lane information” indicating information on a lane beside the lane in which the own vehicle travels, “merging lane information” indicating information on a merging lane when there is a merging lane at the position where the own vehicle is traveling, and “position information” indicating the position of the own vehicle and its surroundings.
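One way such a travel-history row could be represented is sketched below; the class and field names beyond the symbols Va, Vba, DRba, and RSb appearing in the text are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class TravelHistoryRow:
    # One row of the travel history: the sensed environmental parameters
    # paired with the operation (behavior) the driver actually selected.
    # The integer fields hold stepwise level values rather than raw units.
    va: int        # own-vehicle speed level (Va)
    vba: int       # relative speed of the preceding vehicle (Vba)
    drba: int      # inter-vehicle distance to the preceding vehicle (DRba)
    rsb: int       # rate of change of the preceding vehicle's size (RSb)
    behavior: str  # e.g. "lane change", "decelerate"

row = TravelHistoryRow(va=2, vba=-1, drba=4, rsb=1, behavior="lane change")
```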
  • “information on own vehicle” includes information on the speed Va of the own vehicle.
  • the “preceding vehicle information” includes information on the relative speed Vba of the preceding vehicle with respect to the own vehicle, the inter-vehicle distance DRba between the preceding vehicle and the own vehicle, and the rate of change RSb of the size of the preceding vehicle.
  • the speed Va of the host vehicle is detected by a speed sensor of the host vehicle.
  • the relative speed Vba and the inter-vehicle distance DRba are detected by a sensor or a radar.
  • “information on the side lane” includes information on the side rear vehicle c traveling behind the host vehicle in the side lane, information on the side front vehicle d traveling ahead of the host vehicle in the side lane, and information on the remaining side lane length DRda of the host vehicle.
  • the information on the side rear vehicle includes information on the relative speed Vca of the side rear vehicle with respect to the own vehicle, the inter-head distance Dca between the side rear vehicle and the own vehicle, and the change rate Rca of the inter-head distance.
  • the inter-head distance Dca between the side rear vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the side rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the side rear vehicle). The inter-head distance may be calculated from the inter-vehicle distance and the vehicle length, or the inter-vehicle distance may be substituted for the inter-head distance.
  • the relative speed Vca and the inter-head distance Dca are detected by a sensor or a radar.
  • the information on the side front vehicle includes information on the relative speed Vda of the side front vehicle with respect to the host vehicle, the distance Dda between the head of the side front vehicle and the host vehicle, and the change rate Rda of the head distance.
  • the inter-head distance Dda between the side front vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the side front vehicle, measured in the direction along the traveling direction of the host vehicle (and the side front vehicle).
  • the remaining side lane length DRda of the host vehicle is a parameter indicating the possibility of a lane change to the side lane. Specifically, when the distance between the front end portion (vehicle head) of the host vehicle and the rear end portion of the side front vehicle, measured in the direction along the traveling direction of the host vehicle (and the side front vehicle), is longer than the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, the remaining side lane length DRda is that distance, and when that distance is shorter than DRba, DRda is set to DRba. The remaining side lane length DRda of the host vehicle is detected by a sensor or a radar.
  • the information on the merging lane includes information on the relative speed Vma of the merging vehicle with respect to the own vehicle, the distance Dma between the merging vehicle and the own vehicle, and the rate of change Rma of the inter-vehicle distance.
  • the inter-head distance Dma between the merging vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the merging vehicle, measured in the direction along the traveling direction of the host vehicle (and the merging vehicle).
  • the relative speed Vma and the inter-head distance Dma are detected by a sensor or a radar.
  • the numerical values of the speed, distance, and change rate described above are classified into a plurality of levels, and numerical values indicating the classified levels are stored. Note that the numerical values of the speed, the distance, and the change rate may be stored as they are without being classified into levels.
  • the location information includes “location information of own vehicle”, “number of driving lanes”, “traveling lane of own vehicle”, “distance to start / end points of merge section”, “distance to start / end points of branch section” ”,“ Distance to construction section start / end point ”,“ Distance to lane decrease section start / end point ”,“ Distance to traffic accident occurrence point ”, etc.
  • FIG. 27 shows information on “travel lane of own vehicle” (travel lane of FIG. 27) and “distance to start / end points of merge section” as examples of position information.
  • in the “distance to start/end points of merge section” column, when a start/end point of a merge section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. If there is no start/end point of a merge section within the predetermined distance, “0” is stored in the “distance to start/end points of merge section” column.
  • in the “distance to start/end points of branch section” column, when a start/end point of a branch section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. If there is no start/end point of a branch section within the predetermined distance, “0” is stored in that column. Likewise, in the “distance to construction section start/end point” column, when a construction section start/end point exists within a predetermined distance, the distance to it is classified into a plurality of predetermined levels and the numerical value of the classified level is stored; when there is no construction section start/end point within the predetermined distance, “0” is stored in that column.
  • in the “distance to lane decrease section start/end point” column, when a start/end point of a lane decrease section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. When there is no lane decrease section start/end point within the predetermined distance, “0” is stored in the column.
  • in the “distance to the traffic accident occurrence point” column, when a traffic accident occurrence point exists within a predetermined distance, the distance to that point is classified into a plurality of predetermined levels, and the numerical value of the classified level is stored. If there is no traffic accident occurrence point within the predetermined distance, “0” is stored in the column.
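The level storage described above for merge, branch, construction, lane-decrease, and accident points can be sketched as follows; the level boundaries (here in meters) are illustrative assumptions:

```python
def point_distance_level(distance, bounds=(50, 100, 200, 400)):
    # Store the distance to a merge/branch/construction/lane-decrease/
    # accident point as a level; 0 means no such point exists within the
    # predetermined distance (here, the largest bound).
    if distance is None or distance > bounds[-1]:
        return 0
    for level, bound in enumerate(bounds, start=1):
        if distance <= bound:
            return level
```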
  • the position information may include information on which lanes of the road on which the vehicle is traveling are merge lanes, branch lanes, construction lanes, reduced lanes, and accident lanes.
  • the travel history shown in FIG. 27 is merely an example, and the present invention is not limited to this.
  • the travel history may further include “information on the left side lane” on the opposite side.
  • the left lane information includes information on the left rear vehicle traveling behind the host vehicle in the left lane, information on the left front vehicle traveling ahead of the host vehicle in the left lane, and information on the remaining left lane length DRda of the host vehicle.
  • the information on the left rear vehicle includes information on the relative speed Vfa of the left rear vehicle with respect to the host vehicle, the head distance Dfa between the left rear vehicle and the host vehicle, and the change rate Rfa of the head head distance.
  • the inter-head distance Dfa between the left rear vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the left rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the left rear vehicle).
  • the information on the left front vehicle includes information on the relative speed Vga of the left front vehicle with respect to the own vehicle, the distance Dga between the left front vehicle and the own vehicle, and the rate of change Rga of the head distance.
  • the inter-head distance Dga between the left front vehicle and the host vehicle is the distance between the front end portion (vehicle head) of the host vehicle and the front end portion (vehicle head) of the left front vehicle, measured in the direction along the traveling direction of the host vehicle (and the left front vehicle).
  • the travel history shown in FIG. 27 may include “rear vehicle information” indicating information on the rear vehicle traveling behind the host vehicle in the travel lane.
  • The information on the rear vehicle includes the relative speed Vea of the rear vehicle with respect to the host vehicle, the head-to-head distance Dea between the rear vehicle and the host vehicle, and the rate of change Rea of the head-to-head distance.
  • The head-to-head distance Dea between the rear vehicle and the host vehicle is the distance between the front end (vehicle head) of the host vehicle and the front end (vehicle head) of the rear vehicle, measured along the traveling direction of the host vehicle (and the rear vehicle).
  • the relative speed Vea and the inter-head distance Dea are detected by a sensor or a radar.
  • When the head-to-head distance cannot be measured, the measurable inter-vehicle distance, or an approximate value obtained by adding a predetermined vehicle length to the inter-vehicle distance, may be used in its place.
  • Alternatively, the head-to-head distance may be calculated by adding the length of each recognized vehicle type to the inter-vehicle distance. Regardless of whether the head-to-head distance itself can be measured, either of these substitutes may be used.
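This substitution is simple arithmetic; a minimal sketch follows. The vehicle-type table and its length values are illustrative assumptions, not values from the specification.

```python
# Illustrative, assumed lengths per recognized vehicle type (not from the spec).
VEHICLE_LENGTH_M = {"passenger_car": 4.5, "truck": 12.0, "motorcycle": 2.2}

def approx_head_distance(inter_vehicle_gap_m, vehicle_type="passenger_car"):
    # Head-to-head distance is approximated as the measurable
    # bumper-to-bumper gap plus the length of the recognized vehicle type.
    return inter_vehicle_gap_m + VEHICLE_LENGTH_M[vehicle_type]
```

For example, a measured 20 m gap to a recognized truck would give an approximate head-to-head distance of 32 m.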
  • the traveling history may include various other information related to the traveling environment of the vehicle.
  • the travel history may include information on the size or type of the preceding vehicle, the side vehicle, the joining vehicle, or the relative position with respect to the host vehicle.
  • The type of a vehicle approaching from behind may be recognized by a camera sensor, and when that vehicle is an emergency vehicle, information indicating that it is an emergency vehicle may be included. This makes it possible to notify the driver that the information is being reported in order to respond to an emergency vehicle.
  • Numerical values representing the steering wheel, brake, and accelerator operation amounts in steps, or information on passengers, as described with reference to FIG. 22, may also be included in the travel history.
  • the behaviors selected during the automatic driving may be aggregated, or the behaviors actually performed by the driver during the manual driving may be aggregated. This makes it possible to collect travel histories according to operating conditions such as automatic driving or manual driving.
  • the environmental parameter included in the travel history indicates the travel environment when the behavior of the vehicle is presented to the driver.
  • Alternatively, the environmental parameter may indicate the travel environment at the time when the driver selected the behavior.
  • Both an environmental parameter indicating the travel environment when the behavior of the vehicle was presented to the driver and an environmental parameter indicating the travel environment when the driver selected the behavior may be included in the travel history.
  • When generating the overhead views shown in (a) of FIG. 2, (a) of FIG. 5, (a) of FIG. 6, (a) of FIG. 7, (a) of FIG. 8, and (a) of FIG. 9, the vehicle control unit 7 may do the following: generate, as notification information, at least one of information on an environmental parameter with a high degree of contribution and information related to that environmental parameter (for example, an icon) that caused the selection of the first behavior and the second behavior, and provide the notification information to the notification unit 92 by, for example, showing it on the overhead view.
  • For example, the vehicle control unit 7 may display, in the overhead view, a region between the preceding vehicle and the host vehicle with raised luminance or a changed color, thereby providing the notification information to the notification unit 92.
  • The vehicle control unit 7 may also display, as notification information, an icon indicating that the contribution degree of the inter-vehicle distance DRba or the rate of change RSb is high in the region between the preceding vehicle and the host vehicle. Further, the vehicle control unit 7 may have the notification unit 92 draw, as notification information, a line segment connecting the preceding vehicle and the host vehicle on the overhead view, or draw line segments connecting all the surrounding vehicles and the host vehicle while emphasizing the line segment connecting the preceding vehicle and the host vehicle.
  • Alternatively, in the viewpoint image seen from the driver rather than in the overhead view, the vehicle control unit 7 may realize an AR (Augmented Reality) display by displaying, as notification information, a region between the preceding vehicle and the host vehicle with raised luminance or a different color.
  • the vehicle control unit 7 may cause the notification unit 92 to display an AR indicating an environmental parameter having a high contribution degree in the region between the preceding vehicle and the host vehicle as notification information in the viewpoint image.
  • The vehicle control unit 7 may display, as AR notification information in the viewpoint image, a line segment connecting the preceding vehicle and the host vehicle, or may display line segments connecting all the surrounding vehicles and the host vehicle while emphasizing the line segment connecting the preceding vehicle and the host vehicle.
  • the vehicle control unit 7 may generate, as notification information, an image that highlights a preceding vehicle that is a target of an environmental parameter with a high contribution, and may display the image on the notification unit 92.
  • The vehicle control unit 7 may generate, as notification information, information indicating the direction of the preceding vehicle or the like that is the target of an environmental parameter with a high contribution in the overhead view or the AR display, and display that information at the host vehicle or in its vicinity.
  • Instead of notifying information on an environmental parameter with a high contribution or information related to it, the vehicle control unit 7 may lower the display luminance of a preceding vehicle or the like that is the target of an environmental parameter with a low contribution, making it inconspicuous; the relatively conspicuous information on the high-contribution environmental parameter, or information related to it, may then be generated as notification information and displayed on the notification unit 92.
  • The driver model includes a clustering type, constructed by clustering the travel histories of a plurality of drivers, and an individually adaptive type, in which a driver model for a specific driver (for example, driver x) is constructed from a plurality of travel histories similar to the travel history of that driver.
  • In the clustering-type construction method, the travel history of each driver, as shown in FIG. 27, is aggregated in advance. A driver model is then constructed by grouping a plurality of drivers whose travel histories have a high degree of similarity, that is, a plurality of drivers with similar driving operation tendencies.
  • The similarity between travel histories can be determined, for example, from the correlation value between vectors whose elements are the environmental parameter values and the behavior values.
  • For example, when the correlation value calculated from the travel histories of driver a and driver b is higher than a predetermined value, the travel histories of driver a and driver b are placed in one group. Note that the calculation of the similarity is not limited to this method.
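The grouping step above can be sketched as follows. This is a minimal illustration, assuming Pearson correlation as the similarity measure and a greedy grouping strategy; the function names and threshold are hypothetical, not from the specification.

```python
import math

def correlation(u, v):
    # Pearson correlation between two equal-length vectors whose elements
    # are environmental parameter values and behavior values.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

def group_drivers(histories, threshold):
    # Greedily place a driver into the first existing group whose
    # representative history correlates above the threshold.
    groups = []
    for driver, vec in histories.items():
        for group in groups:
            if correlation(vec, histories[group[0]]) > threshold:
                group.append(driver)
                break
        else:
            groups.append([driver])
    return groups
```

For instance, histories {"a": [1, 2, 3, 4], "b": [2, 4, 6, 8], "c": [4, 3, 2, 1]} with a 0.9 threshold yield the groups [["a", "b"], ["c"]]: drivers a and b vary together, while c varies oppositely.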
  • The individually adaptive construction method aggregates the travel histories of a plurality of drivers, as shown in FIG. 27, in the same way as the clustering type.
  • The difference from the clustering type is that a driver model is constructed for each driver.
  • For example, the travel history of driver y is compared with the travel histories of other drivers, and the travel histories of a plurality of drivers with high similarity are extracted.
  • An individually adaptive driver model for driver y is then constructed from the extracted travel histories of these drivers.
  • driver model (situation database) based on the travel history shown in FIG. 27 is not limited to the clustering type or the individual adaptation type, and may be configured to include the travel history of all drivers, for example.
  • For example, a driver model in which the travel histories of four drivers a to d are aggregated is used for driver x.
  • the driver model is constructed by the vehicle control unit 7.
  • FIG. 28 is a diagram showing a method of using the driver model in this modification.
  • (a) of FIG. 28 shows environmental parameters indicating the current travel environment of the vehicle driven by driver x.
  • FIG. 28B is an example of a driver model for the driver x.
  • The behavior (operation) corresponding to the environmental parameters indicating the current travel environment is blank.
  • The vehicle control unit 7 acquires environmental parameters at predetermined intervals and, using any one of them as a trigger, determines the next behavior from the driver model shown in (b) of FIG. 28.
  • As the trigger, for example, an environmental parameter indicating that the operation of the vehicle needs to be changed may be used, such as when the distance to the start point of the merging section becomes equal to or less than a predetermined distance, or when the relative speed with respect to the preceding vehicle becomes equal to or less than a predetermined value.
  • The vehicle control unit 7 compares the environmental parameters shown in (a) of FIG. 28 with the environmental parameters of each travel history in the driver model shown in (b) of FIG. 28.
  • The behavior associated with the most similar environmental parameters is determined to be the first behavior.
  • Some behaviors associated with other similar environmental parameters are determined to be second behaviors.
  • Whether environmental parameters are similar can be determined from the correlation value of vectors whose elements are the numerical values of the environmental parameters. For example, when the correlation value calculated from the vector whose elements are the numerical values of the environmental parameters shown in (a) of FIG. 28 and the vector whose elements are the numerical values of the environmental parameters shown in (b) of FIG. 28 is higher than a predetermined value, the environmental parameters are determined to be similar. Note that the method for determining whether environmental parameters are similar is not limited to this.
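The selection of the first and second behaviors described above can be sketched as follows. This is a minimal illustration assuming Pearson correlation as the similarity measure; the function names and the number of second behaviors are hypothetical.

```python
import math

def correlation(u, v):
    # Pearson correlation between two environmental-parameter vectors.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

def pick_behaviors(current, model, num_second=2):
    # model: list of (env_vector, behavior) pairs from the driver model.
    # The best-matching stored environment supplies the first behavior;
    # the next-most-similar ones supply the second behaviors.
    ranked = sorted(model, key=lambda rec: correlation(current, rec[0]),
                    reverse=True)
    first = ranked[0][1]
    second = [behavior for _, behavior in ranked[1:1 + num_second]]
    return first, second
```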
  • the storage unit 8 stores information indicating a safe driving criterion, and the vehicle control unit 7 determines whether or not the driving history satisfies this criterion. Furthermore, the vehicle control unit 7 may register a travel history that satisfies this criterion in the database, and may not register a travel history that does not satisfy this criterion in the database.
  • In this way, the vehicle control unit 7 can accurately determine the next behavior without determining the specific travel environment, that is, without labeling the travel environment.
  • the driver model may be constructed from a travel history in which a behavior selected by the driver during automatic driving and an environment parameter indicating a travel environment when the behavior is presented are associated with each other.
  • the driver model may be constructed from a travel history in which a behavior selected by the driver during automatic driving and an environmental parameter indicating a travel environment when the behavior is performed by the vehicle are associated with each other.
  • Further, an environmental parameter indicating a future travel environment may be predicted from the environmental parameters indicating the current travel environment.
  • Among the environmental parameters indicating the travel environment when the vehicle performed a behavior selected by the driver, the behavior associated with the environmental parameter most similar to the predicted parameter may be determined to be the first behavior, and some behaviors associated with other similar environmental parameters may be determined to be second behaviors.
  • the above prediction is performed, for example, by extrapolating environmental parameters at a future time from environmental parameters indicating the driving environment at the current time and a time before the current time.
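The extrapolation mentioned above can be sketched as a simple linear forecast. This is one minimal assumption about the method; the specification does not fix the extrapolation scheme, and the names below are illustrative.

```python
def extrapolate(prev_params, curr_params, horizon=1.0):
    # Linear extrapolation: assume each environmental parameter keeps
    # changing at its most recent rate for `horizon` more time steps.
    return [c + (c - p) * horizon
            for p, c in zip(prev_params, curr_params)]
```

For example, if one parameter (say, a head-to-head distance) was 10 m one step ago and is 8 m now, the one-step-ahead forecast is 6 m.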
  • The driver model (situation database) may be constructed from both a travel history in which a behavior selected by the driver during automatic driving is associated with environmental parameters indicating the travel environment when the behavior was presented, and a travel history in which a behavior selected by the driver during automatic driving is associated with environmental parameters indicating the travel environment when the vehicle performed the behavior.
  • both travel histories are stored in a format as shown in FIG. 28B, and the vehicle control unit 7 determines the next behavior from them.
  • At that time, the vehicle control unit 7 may give priority between the two; for example, it may determine the next behavior preferentially from the travel history that associates the behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment when the vehicle performed the behavior.
  • a server device such as a cloud server may execute a function similar to the function executed by the vehicle control unit 7.
  • Since the storage unit 8 accumulates an enormous amount of data as travel histories are collected, it may be provided in a server device such as a cloud server instead of in the vehicle 1.
  • the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior with reference to the driver model stored in the storage unit 8.
  • In a configuration in which the storage unit 8 is provided in a cloud server, it is desirable to provide a cache in case the storage unit 8 cannot be accessed due to a decrease in communication speed or a communication disconnection.
  • FIG. 29 is a block diagram showing an example of cache arrangement.
  • the vehicle control unit 7 stores the travel history in the storage unit 8 through the communication unit 291 and holds a part of the driver model (situation database) stored in the storage unit 8 in the cache 292 through the communication unit 291.
  • the vehicle control unit 7 accesses the driver model of the cache 292.
  • As methods for creating the cache at this time, a method of limiting by the presence or absence of environmental parameters, a method of using position information, and a method of processing data are conceivable. Each is described below.
  • In the method of limiting by the presence or absence of environmental parameters, the vehicle control unit 7 extracts from the travel environments stored in the storage unit 8 only those having the same environmental parameters, sorts them, and holds them in the cache 292.
  • The vehicle control unit 7 updates the cache at the timing when an environmental parameter obtained from the detected situation changes. In this way, the vehicle control unit 7 can extract a similar surrounding situation even if the communication speed decreases.
  • the environmental parameters for determining whether or not there is a change may be all of the environmental parameters listed above, or some of the environmental parameters.
  • a primary cache and a secondary cache may be prepared in the cache 292.
  • In this case, the vehicle control unit 7 holds travel environments having the same environmental parameters in the primary cache. Further, the vehicle control unit 7 holds in the secondary cache at least one of a travel environment in which one environmental parameter has been added to the travel environment held in the primary cache and a travel environment in which one environmental parameter has been removed from it.
  • the vehicle control unit 7 can extract a similar situation using only the data in the cache 292 even if a temporary communication interruption occurs.
  • For example, the vehicle control unit 7 extracts the travel environment in which only the side front vehicle 302 exists (the travel environment having only the same environmental parameters) from the storage unit 8, in which all travel environments (situations) are stored, and stores it in the primary cache 304.
  • Further, the vehicle control unit 7 extracts from the storage unit 8 a travel environment in which one vehicle other than the side front vehicle 302 has been added (a travel environment in which one environmental parameter has been added to the same environmental parameters) or a travel environment without the side front vehicle 302 (a travel environment in which one environmental parameter has been removed), and stores these in the secondary cache 305.
  • When the surrounding situation 303 changes, the vehicle control unit 7 copies the travel environment corresponding to the changed surrounding situation 303 from the secondary cache 305 to the primary cache 304, extracts from the storage unit 8 travel environments in which one environmental parameter has been added to, or removed from, the changed surrounding situation 303, and stores them in the secondary cache 305, thereby updating the secondary cache 305. As a result, the vehicle control unit 7 can smoothly extract a similar surrounding situation by comparing surrounding situations.
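The primary/secondary split above can be sketched by representing each stored situation as the set of environmental parameters present. The set representation, record shape, and names below are illustrative assumptions, not the patented data format.

```python
def build_caches(current_features, database):
    # current_features: set of parameters present now, e.g. {"side_front_vehicle"}.
    # database: iterable of (feature_set, behavior) records (the storage unit).
    primary, secondary = [], []
    for features, behavior in database:
        differing = len(features ^ current_features)  # symmetric difference
        if differing == 0:
            primary.append((features, behavior))      # same parameters
        elif differing == 1:
            secondary.append((features, behavior))    # one added or one removed
    return primary, secondary
```

When the surroundings change, calling build_caches again with the new feature set reproduces the copy-and-update step described above: former secondary entries matching the new situation land in the primary cache.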
  • In the method using position information, the vehicle control unit 7 can extract from the storage unit 8 the travel environments (situations) whose position information indicates a position within a certain range centered on the vehicle position, and store them in the cache 292.
  • The vehicle control unit 7 updates the cache 292 when the position indicated by the position information corresponding to a travel environment falls outside that range. In this way, the vehicle control unit 7 can extract a similar surrounding situation as long as the position stays within the range, even if communication is interrupted for a long time.
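A minimal sketch of this position-based filter, assuming planar coordinates in meters (the record field names are illustrative):

```python
import math

def cache_by_position(vehicle_xy, database, radius_m):
    # Keep only stored situations recorded within radius_m of the
    # current vehicle position (flat-plane distance approximation).
    x0, y0 = vehicle_xy
    return [rec for rec in database
            if math.hypot(rec["x"] - x0, rec["y"] - y0) <= radius_m]
```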
  • the storage unit 8 stores operation histories including environmental parameters.
  • In the method of processing data, the vehicle control unit 7 divides the axis of each environmental parameter into fixed ranges, creating a mesh in a multidimensional space, and then creates a table in which the behaviors contained in each mesh are counted for each type.
  • For example, the vehicle control unit 7 maps the environmental parameters included in the operation history onto a plane as shown in (a) of FIG. 31, and divides the plane into a plurality of blocks by dividing each axis into fixed ranges. Each block is called a mesh.
  • the vehicle control unit 7 counts the number of behaviors included in each mesh for each type (for example, types such as acceleration, deceleration, lane change, and overtaking).
  • FIG. 31B shows a table in which the number of behaviors included in each mesh is counted for each type.
  • The vehicle control unit 7 holds this content in the cache 292. Then, when extracting a similar surrounding situation by comparing surrounding situations, the vehicle control unit 7 determines in which mesh the detected environmental parameters are located, selects the behavior with the largest count among the behaviors included in that mesh, and determines the behavior for notification as the selected behavior.
  • For example, when the vehicle control unit 7 determines that the detected environmental parameters are located in the third mesh, it determines, as the behavior for notification, the behavior with the largest count among the behaviors included in the third mesh (here, "acceleration").
  • With this method, the cache 292 may be updated at any time, and the capacity of the cache 292 can be kept constant.
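The mesh-and-count method above can be sketched as follows. The bin width and function names are illustrative assumptions; the specification only requires fixed ranges per axis and per-type counts.

```python
from collections import Counter

def build_mesh_table(history, bin_width):
    # history: list of (env_params, behavior); env_params is a tuple of numbers.
    # Dividing each parameter axis into fixed-width bins yields a mesh cell
    # index; behaviors are counted per cell and per type.
    table = {}
    for params, behavior in history:
        cell = tuple(int(p // bin_width) for p in params)
        table.setdefault(cell, Counter())[behavior] += 1
    return table

def behavior_for(table, params, bin_width):
    # Locate the mesh cell of the detected parameters and return the
    # most frequent behavior in that cell (None if the cell is empty).
    cell = tuple(int(p // bin_width) for p in params)
    counts = table.get(cell)
    return counts.most_common(1)[0][0] if counts else None
```

Because the table only stores one counter per occupied cell, its size is bounded by the number of cells, which is how the cache capacity stays constant.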
  • As described above, in this modification, the vehicle control unit 7 acquires feature amount information indicating the driving characteristics of the driver, including information on the past travel environment, and the storage unit 8 stores this feature amount information. When the behavior of the vehicle is to be changed, the vehicle control unit 7 determines, from the feature amount information stored in the storage unit 8, information similar to the feature amount indicating the driving characteristics of the driver including the newly acquired information on the travel environment, and notifies the behavior corresponding to the determined information.
  • The feature amount information indicating the driving characteristics of the driver, including past travel environment information, is at least one of feature amount information at the time when the behavior of the vehicle was presented to the driver and feature amount information at the time when the driver selected the behavior.
  • When the feature amount information includes both the feature amount information at the time when the behavior of the vehicle was presented to the driver and the feature amount information at the time when the driver selected the behavior, information similar to the feature amount indicating the driving characteristics of the driver, including the newly acquired travel environment information, is determined from both, and the behavior corresponding to the determined information is notified.
  • In the driver model extension of the fourth embodiment, when the feature amount information indicating the driving characteristics of the driver, including past travel environment information, includes both the feature amount information at the time when the behavior of the vehicle was presented to the driver and the feature amount information at the time when the driver selected the behavior, information similar to the feature amount indicating the driving characteristics of the driver, including the newly acquired travel environment information, is determined preferentially from the feature amount information at the time when the driver selected the behavior, and the behavior corresponding to the determined information is notified.
  • The feature amount information indicating the driving characteristics of the driver, including past travel environment information, may be feature amount information indicating the driving characteristics of the driver during automatic driving of the vehicle, during manual driving, or during both.
  • the vehicle control unit 7 can construct a driver model more suitable for the driving tendency of the driver, and can perform more appropriate automatic driving for the driver based on the constructed driver model.
  • By associating the parameters indicating the travel environment with the behavior, the next behavior can be accurately determined without requiring processing for determining a specific travel environment, that is, without labeling the travel environment.
  • In recent years, development related to automatic driving of automobiles has advanced. Regarding automatic driving, the automation levels defined in 2013 by the NHTSA (National Highway Traffic Safety Administration) are: no automation (level 0), automation of specific functions (level 1), automation of complex functions (level 2), semi-automatic operation (level 3), and fully automatic operation (level 4).
  • Level 1 is a driving support system that automatically performs one of acceleration, deceleration, and steering.
  • Level 2 is a driving support system that automatically performs two or more of acceleration, deceleration, and steering in a coordinated manner. In both cases, the driver remains involved in the driving operation.
  • Level 4 is a fully automatic driving system that automatically performs all of acceleration, deceleration, and steering, and the driver is not involved in the driving operation.
  • Level 3 is a semi-automatic driving system in which acceleration, deceleration, and steering are all performed automatically, and the driver performs driving operations as necessary.
  • In the fifth embodiment, a device (hereinafter also referred to as a "driving support device") that controls an HMI (Human Machine Interface) for exchanging information on the automatic driving of a vehicle with an occupant (for example, the driver) is proposed.
  • The "behavior" of the vehicle in the following description corresponds to the "behavior" of the vehicle in the description of the first to fourth embodiments, and includes operating states such as steering and braking while the vehicle is running or stopped, as well as control contents related to automatic driving control, in automatic driving or manual driving.
  • The "behavior" is, for example, constant-speed traveling, acceleration, deceleration, temporary stop, stop, lane change, course change, right or left turn, parking, and the like.
  • In the fifth embodiment, a process for further improving the accuracy of estimating the next action is described for the individually adaptive driver model described in the fourth embodiment.
  • In the fourth embodiment, after collecting the travel history of each driver, the operation frequency distribution of each driver is analyzed, and the travel histories of other drivers similar to the travel history of the target driver are selected.
  • A driver model is generated based on the selected travel histories. That is, a driver model adapted to the individual is generated by grouping for each driver.
  • However, the driving behavior of a driver may change depending on the presence or absence of a passenger and on the passenger's state. For example, the driver may change lanes when there is no passenger but decelerate without changing lanes when there is a passenger.
  • Therefore, in the fifth embodiment, the travel history is collected for each combination of driver and passenger, and the travel histories of other combinations similar to the travel history of the target combination are selected.
  • A driver model is generated based on the selected travel histories. That is, the processing executed for each driver in the fourth embodiment is executed for each combination of driver and passenger, so the unit of processing is subdivided. Since improving the accuracy of action estimation requires more travel histories and increases the processing load, it is assumed here that the processing is performed on a cloud server.
  • FIG. 32 is a block diagram showing a configuration of the vehicle 1000, and shows a configuration related to automatic driving.
  • the vehicle 1000 can travel in the automatic driving mode, and includes a notification device 1002, an input device 1004, a wireless device 1008, a driving operation unit 1010, a detection unit 1020, an automatic driving control device 1030, and a driving support device 1040.
  • Each device shown in FIG. 32 may be connected by wired communication such as a dedicated line or CAN (Controller Area Network), or by wired or wireless communication such as USB (Universal Serial Bus), Ethernet (registered trademark), Wi-Fi (registered trademark), or Bluetooth (registered trademark).
  • Vehicle 1000 corresponds to vehicle 1 in the first to fourth embodiments.
  • the notification device 1002 corresponds to the information notification device 9 in FIGS. 1 and 13
  • The input device 1004 corresponds to the operation unit 51 in FIG. 1 and the input unit 102 in FIG. 13, and the detection unit 1020 corresponds to the detection unit 6 in FIGS. 1 and 13.
  • the automatic driving control device 1030 and the driving support device 1040 correspond to the vehicle control unit 7 in FIGS. 1 and 13.
  • the description of the configuration described in the first to fourth embodiments will be omitted as appropriate.
  • the notification device 1002 notifies the driver of information related to the traveling of the vehicle 1000.
  • The notification device 1002 may be a display unit that displays information, such as a car navigation system, a head-up display, or a center display installed in the vehicle.
  • It may also be a light-emitting body installed at the steering wheel, a pillar, the dashboard, the meter panel, or the like.
  • It may further be a speaker that converts information into sound to notify the driver, or a vibrating body provided at a position that the driver can sense (for example, the driver's seat or the steering wheel).
  • the notification device 1002 may be a combination of these.
  • the input device 1004 is a user interface device that receives an operation input by an occupant.
  • the input device 1004 receives information related to automatic driving of the host vehicle input by the driver.
  • The input device 1004 outputs the received information to the driving support device 1040 as an operation signal.
  • FIG. 33 schematically shows the interior of the vehicle 1000 in FIG.
  • the notification device 1002 may be a head-up display (HUD) 1002a or a center display 1002b.
  • The input device 1004 may be a first operation unit 1004a provided on the steering 1011, or a second operation unit 1004b provided between the driver's seat and the passenger seat. Note that the notification device 1002 and the input device 1004 may be integrated, for example implemented as a touch panel display.
  • the vehicle 1000 may be provided with a speaker 1006 that presents information related to automatic driving to the occupant by voice.
  • the driving support device 1040 may display an image indicating information related to automatic driving on the notification device 1002 and output a sound indicating information related to automatic driving from the speaker 1006 together with or instead of the information.
  • the wireless device 1008 corresponds to a mobile phone communication system, WMAN (Wireless Metropolitan Area Network), and the like, and performs wireless communication with a device (not shown) outside the vehicle 1000.
  • the driving operation unit 1010 includes a steering 1011, a brake pedal 1012, an accelerator pedal 1013, and a blinker switch 1014.
  • The steering 1011 corresponds to the steering wheel in FIGS. 1 and 13, the brake pedal 1012 to the brake pedal 2 in FIGS. 1 and 13, the accelerator pedal 1013 to the accelerator pedal 3 in FIGS. 1 and 13, and the winker switch 1014 to the winker lever 4 in FIGS. 1 and 13.
  • Steering 1011, brake pedal 1012, accelerator pedal 1013, and winker switch 1014 can be electronically controlled by a steering ECU, a brake ECU, an engine ECU, a motor ECU, and a winker controller.
  • the steering ECU, the brake ECU, the engine ECU, and the motor ECU drive the actuator in accordance with a control signal supplied from the automatic operation control device 1030.
  • the blinker controller turns on or off the blinker lamp according to a control signal supplied from the automatic operation control device 1030.
  • Detecting unit 1020 detects the surrounding state and running state of vehicle 1000. As described in part in the first to fourth embodiments, for example, the detection unit 1020 detects the speed of the vehicle 1000, the relative speed of the preceding vehicle with respect to the vehicle 1000, the distance between the vehicle 1000 and the preceding vehicle, and the side lane with respect to the vehicle 1000. The relative speed of the vehicle, the distance between the vehicle 1000 and the vehicle in the side lane, and the position information of the vehicle 1000 are detected.
  • the detection unit 1020 outputs various types of detected information (hereinafter referred to as “detection information”) to the automatic driving control device 1030 and the driving support device 1040. Details of the detection unit 1020 will be described later.
  • the automatic driving control device 1030 is an automatic driving controller that implements an automatic driving control function, and determines the behavior of the vehicle 1000 in automatic driving.
  • the automatic driving control device 1030 includes a control unit 1031, a storage unit 1032, and an I / O unit (input / output unit) 1033.
  • the configuration of the control unit 1031 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources.
  • the storage unit 1032 includes a nonvolatile recording medium such as a flash memory.
  • the I / O unit 1033 executes communication control according to various communication formats. For example, the I / O unit 1033 outputs information related to automatic driving to the driving support device 1040 and inputs a control command from the driving support device 1040. Further, the I / O unit 1033 inputs detection information from the detection unit 1020.
  • the control unit 1031 applies the control command input from the driving support device 1040 and the various information collected from the detection unit 1020 or the various ECUs to an automatic driving algorithm, and calculates control values for targets of automatic control such as the traveling direction of the vehicle 1000.
  • the control unit 1031 transmits the calculated control value to each control target ECU or controller. In this embodiment, the control value is transmitted to the steering ECU, the brake ECU, the engine ECU, and the blinker controller. In the case of an electric vehicle or a hybrid car, the control value is transmitted to the motor ECU instead of or in addition to the engine ECU.
  • the driving support device 1040 is an HMI controller that executes an interface function between the vehicle 1000 and the driver, and includes a control unit 1041, a storage unit 1042, and an I / O unit 1043.
  • the control unit 1041 executes various data processing such as HMI control.
  • the control unit 1041 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources.
  • the storage unit 1042 is a storage area that stores data that is referred to or updated by the control unit 1041. For example, it is realized by a non-volatile recording medium such as a flash memory.
  • the I / O unit 1043 executes various communication controls according to various communication formats.
  • the I / O unit 1043 includes an operation input unit 1050, an image output unit 1051, a detection information input unit 1052, a command IF (Interface) 1053, and a communication IF 1056.
  • the operation input unit 1050 receives, from the input device 1004, an operation signal produced by an operation of the driver, an occupant, or a user outside the vehicle, and outputs it to the control unit 1041.
  • the image output unit 1051 outputs the image data generated by the control unit 1041 to the notification device 1002 for display.
  • the detection information input unit 1052 receives, from the detection unit 1020, information indicating the current surrounding state and running state of the vehicle 1000 (hereinafter referred to as “detection information”) resulting from the detection process of the detection unit 1020, and outputs it to the control unit 1041.
  • the command IF 1053 executes an interface process with the automatic driving control apparatus 1030, and includes a behavior information input unit 1054 and a command output unit 1055.
  • the behavior information input unit 1054 receives information regarding the automatic driving of the vehicle 1000 transmitted from the automatic driving control device 1030 and outputs the information to the control unit 1041.
  • the command output unit 1055 receives from the control unit 1041 a control command that instructs the automatic driving control device 1030 to specify the mode of automatic driving, and transmits the control command to the automatic driving control device 1030.
  • the communication IF 1056 executes interface processing with the wireless device 1008.
  • the communication IF 1056 receives the data output from the control unit 1041 and causes the wireless device 1008 to transmit the data to a device outside the vehicle.
  • the communication IF 1056 receives data from a device outside the vehicle transferred by the wireless device 1008 and outputs the data to the control unit 1041.
  • the automatic driving control device 1030 and the driving support device 1040 are configured as separate devices.
  • the automatic driving control device 1030 and the driving support device 1040 may be integrated into one controller.
  • in other words, a single automatic driving control device may be configured to have both the functions of the automatic driving control device 1030 and those of the driving support device 1040.
  • FIG. 34 is a block diagram showing a detailed configuration of the detection unit 1020 and the detection information input unit 1052.
  • the detection unit 1020 includes a first detection unit 1060 and a second detection unit 1062, and the detection information input unit 1052 includes a first input unit 1070 and a second input unit 1072.
  • the first detection unit 1060 includes a position information acquisition unit 1021, a sensor 1022, a speed information acquisition unit 1023, and a map information acquisition unit 1024.
  • the second detection unit 1062 includes a driver sensing unit 1064 and a passenger sensing unit 1066.
  • the first detection unit 1060 mainly detects the surrounding state and the running state of the vehicle 1000.
  • the first detection unit 1060 outputs the detected information (hereinafter referred to as “first detection information”) to the first input unit 1070.
  • the first input unit 1070 inputs the first detection information from the first detection unit 1060.
  • the second detection unit 1062 mainly detects information on the driver who is on the vehicle 1000 and the passenger.
  • the second detection unit 1062 outputs the detected information (hereinafter referred to as “second detection information”) to the second input unit 1072.
  • the second input unit 1072 inputs the second detection information from the second detection unit 1062. Note that the combination of the first detection information and the second detection information or one of them corresponds to the detection information described above.
  • the position information acquisition unit 1021 of the first detection unit 1060 acquires the current position of the vehicle 1000 from the GPS receiver.
  • the sensor 1022 is a generic name for various sensors for detecting the situation outside the vehicle and the state of the vehicle 1000.
  • a camera, a millimeter wave radar, a LIDAR (Light Detection and Ranging or Laser Imaging Detection and Ranging), a temperature sensor, an atmospheric pressure sensor, a humidity sensor, an illuminance sensor, and the like are mounted as sensors for detecting the situation outside the vehicle.
  • the situation outside the vehicle includes the condition of the road on which the host vehicle travels, including lane information, the environment including weather, the situation around the host vehicle, and nearby vehicles (such as other vehicles traveling in the adjacent lane). Any information outside the vehicle that the sensors can detect may be used.
  • an acceleration sensor, a gyro sensor, a geomagnetic sensor, a tilt sensor, and the like are mounted as sensors for detecting the state of the vehicle 1000.
  • Speed information acquisition unit 1023 acquires the current speed of vehicle 1000 from the vehicle speed sensor.
  • the map information acquisition unit 1024 acquires map information around the current position of the vehicle 1000 from the map database.
  • the map database may be recorded on a recording medium in the vehicle 1000, or may be downloaded from a map server via a network when used.
  • the driver sensing unit 1064 of the second detection unit 1062 authenticates the driver sitting on the driver's seat of the vehicle 1000.
  • a camera capable of imaging the face of the driver seated in the driver's seat of the vehicle 1000 is installed in the vehicle, and the camera images the driver's face.
  • the driver sensing unit 1064 holds in advance information regarding the faces of drivers who may sit in the driver's seat of the vehicle 1000.
  • the information related to the driver's face is, for example, a face image, feature point information of the face image, or the like.
  • the driver sensing unit 1064 identifies an individual driver seated in the driver's seat by comparing the image captured by the camera with information related to the driver's face.
  • alternatively, a TOF (Time Of Flight) sensor or a fingerprint sensor may be installed in the vehicle, and the driver sensing unit 1064 may identify the individual driver sitting in the driver's seat based on information acquired by such a sensor. The driver sensing unit 1064 outputs the identified driver information as second detection information.
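As a rough illustration of the face-comparison step performed by the driver sensing unit 1064, the following sketch matches a captured feature vector against enrolled drivers by nearest distance. The function name, feature representation, and distance threshold are illustrative assumptions, not the patent's implementation.

```python
def identify_driver(captured_features, known_drivers, max_distance=0.6):
    # known_drivers maps a driver's name to the face-feature vector held in
    # advance (cf. the driver sensing unit 1064). Pick the enrolled face whose
    # features are closest to the captured ones; reject the match when even
    # the best distance exceeds max_distance (an unknown person is seated).
    best_name, best_dist = None, float("inf")
    for name, features in known_drivers.items():
        dist = sum((a - b) ** 2 for a, b in zip(captured_features, features)) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None
```

The identified name would then be output as second detection information.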
  • the passenger sensing unit 1066 authenticates passengers seated in the passenger seat and rear seat of the vehicle 1000.
  • a seating sensor is installed in each seat, and the passenger sensing unit 1066 identifies the presence or absence of a passenger based on information acquired by the seating sensor.
  • for example, a passenger is identified as present in the passenger seat and absent in the rear seat.
  • alternatively, a camera capable of capturing the faces of passengers sitting in the passenger seat and the rear seat may be installed in the vehicle, and the passenger sensing unit 1066 may specify, based on the images captured by the camera, the presence or absence of passengers and information related to them.
  • information related to passengers includes age/sex, personal identification, and the occupant's state (drowsiness, car sickness, and the like).
  • the TOF sensor is installed in the vehicle, and based on the information acquired by the TOF sensor, the passenger sensing unit 1066 may specify the presence / absence of the passenger and information related to the passenger.
  • the passenger sensing unit 1066 outputs the presence or absence of the identified passengers as second detection information. Further, when the passenger sensing unit 1066 specifies information related to a passenger, it also outputs that information as second detection information.
  • FIG. 35 is a block diagram illustrating a detailed configuration of the control unit 1041.
  • the control unit 1041 includes a detection unit 1100, a travel history generation unit 1102, a transmission unit 1104, an inquiry unit 1106, a travel history acquisition unit 1108, a driver model generation unit 1110, a determination unit 1112, a confirmation unit 1114, a screen generation unit 1116, an instruction Part 1118.
  • the detection unit 1100 is connected to a door opening / closing sensor of the vehicle 1000 and is also connected to the second input unit 1072.
  • the detection unit 1100 is notified from the opening / closing sensor of the timing at which the door is opened / closed. Since a known technique may be used for detection of the opening / closing timing by the opening / closing sensor, description thereof is omitted here.
  • the detection unit 1100 receives the second detection information from the second input unit 1072 when receiving notification of the opening / closing timing.
  • the detection unit 1100 may input the second detection information from the second input unit 1072 when the passenger's state changes.
  • the detection unit 1100 detects the individual driver of the vehicle 1000 by inputting the second detection information.
  • the detection unit 1100 detects the presence or absence of passengers of the vehicle 1000 by inputting the second detection information. Further, the detection unit 1100 may detect information regarding the passengers of the vehicle 1000.
  • the travel history generation unit 1102 is connected to the first input unit 1070, the detection unit 1100, and the instruction unit 1118. Although details will be described later, when the instruction unit 1118 instructs the automatic driving control device 1030 on the next action, the instruction unit 1118 notifies the travel history generation unit 1102 of the instructed action. Here, the next action is an action selected by the driver, for example, "deceleration", "acceleration", "constant-speed running", or "right lane change". When the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs the first detection information from the first input unit 1070 and the information from the detection unit 1100.
  • the traveling history generation unit 1102 derives environmental parameters based on various information included in the first detection information.
  • the environmental parameters include, for example, "the speed Va of the host vehicle", "the relative speed Vba of the preceding vehicle with respect to the host vehicle", "the inter-vehicle distance DRba between the preceding vehicle and the host vehicle", "the head-to-head distance Dca between the side rear vehicle and the host vehicle", "the rate of change Rca of that head-to-head distance", "the relative speed Vda of the side front vehicle with respect to the host vehicle", "the head-to-head distance Dda between the side front vehicle and the host vehicle", "the rate of change Rda of that head-to-head distance", "the remaining side lane length DRda of the host vehicle", "the relative speed Vma of the merging vehicle with respect to the host vehicle", and the like.
  • the travel history generated in the travel history generation unit 1102 for each type of information input from the detection unit 1100 will be described.
  • the following types of information are assumed: (1) the individual driver and the presence or absence of passengers; (2) the individual driver and the passengers' age/sex; (3) the individual driver and the individual passengers; (4) the individual driver, the presence or absence of passengers, and the passengers' state; and (5) the individual driver alone.
  • the respective travel histories are shown in FIGS. 36 and 37. FIG. 36 shows a data structure of the travel history generated by the travel history generation unit 1102, and FIG. 37 shows another data structure of the travel history generated by the travel history generation unit 1102.
  • in the case of (1), the travel history generation unit 1102 generates the travel history shown in FIG. 36(a). Specifically, as the information from the detection unit 1100 at the timing when the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs the driver's name, the presence or absence of a passenger in the front passenger seat, and the number of passengers in the rear seat.
  • in FIG. 36(a), the driver's name is indicated as "A" or "B"; a passenger present in the passenger seat is indicated by "○", an absent one by "×", and the number of passengers in the rear seat is indicated as "0" or "1".
  • in addition, the travel history generation unit 1102 inputs values such as "Va" as the travel history at that timing. Furthermore, the travel history generation unit 1102 collects the input information and values together with the action indicated in the notification from the instruction unit 1118, for example "deceleration", and stores them in one line of FIG. 36(a). That is, the travel history generation unit 1102 generates a travel history in which environmental parameters indicating travel environments through which the vehicle 1000 has traveled in the past are associated with the actions selected by the driver in response to those environmental parameters. The travel history is generated for each combination of the driver and the presence or absence of passengers.
  • in the case of (2), the travel history generation unit 1102 generates the travel history shown in FIG. 36(b). Specifically, the travel history generation unit 1102 inputs the driver's name and the passengers' age/sex as the information from the detection unit 1100 at the timing when the notification from the instruction unit 1118 is received.
  • in FIG. 36(b), the passengers' age/sex is indicated as "30s/female" or "30s/female, boy".
  • the former also indicates that there is one passenger, and the latter that there are two.
  • such age and sex of the passengers can be said to be information related to the passengers.
  • as in the case of (1), the travel history generation unit 1102 collects the driver's name, the passengers' age and sex, the values related to the travel history, and the action indicated in the notification from the instruction unit 1118, and stores them in one line of FIG. 36(b). In other words, the travel history is generated for each combination of the driver, the presence or absence of passengers detected in the past, and the information related to the passengers.
  • in the case of (3), the travel history generation unit 1102 generates the travel history shown in FIG. 37(a). Specifically, as the information from the detection unit 1100 at the timing when the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs the driver's name, the front passenger's name, and the rear passenger's name. In FIG. 37(a), the passengers' names are indicated as "B", "C", and "D". Confirming the passengers' names also clarifies the number of passengers. As in the case of (1), the travel history generation unit 1102 collects the driver's name, the front passenger's name, the rear passenger's name, the values related to the travel history, and the action indicated in the notification from the instruction unit 1118, and stores them in one line of FIG. 37(a). Here too, the travel history is generated for each combination of the driver, the presence or absence of passengers detected in the past, and the information related to the passengers.
  • in the case of (4), the travel history generation unit 1102 generates the travel history shown in FIG. 37(b). Specifically, as the information from the detection unit 1100 at the timing when the notification from the instruction unit 1118 is received, the travel history generation unit 1102 inputs the driver's name, the presence or absence of a passenger in the passenger seat, and the passenger's state. In FIG. 37(b), the passenger's state is indicated as "normal", "sleeping", or "car sickness". Confirming the passenger's state also clarifies the number of passengers.
  • as before, the travel history generation unit 1102 collects the driver's name, the presence or absence of a passenger in the passenger seat, the passenger's state, the values related to the travel history, and the action indicated in the notification from the instruction unit 1118, and stores them in one line of FIG. 37(b). Here too, the travel history is generated for each combination of the driver, the presence or absence of passengers detected in the past, and the information related to the passengers.
  • in the case of (5), the travel history generation unit 1102 generates a travel history by executing the processes of (1) to (4) with the passenger-related parts omitted. That is, the travel history generation unit 1102 generates, for each driver, a travel history in which environmental parameters indicating travel environments through which the vehicle 1000 has traveled in the past are associated with the actions selected by the driver in response to those environmental parameters.
  • as described above, the travel history generation unit 1102 generates the travel history using the presence or absence of passengers, the passengers' age/sex, the passengers' names, the passengers' state, and the like; the travel history may also be generated by combining these arbitrarily.
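The record keeping described in cases (1) to (5) can be sketched as follows. The class and field names are illustrative assumptions; only the row contents (driver/passenger combination, environmental parameters, selected action) come from the text.

```python
from dataclasses import dataclass

@dataclass
class TravelHistoryEntry:
    """One row of the travel history of FIG. 36(a)."""
    driver: str            # e.g. "A"
    front_passenger: bool  # presence/absence of a front-seat passenger
    rear_passengers: int   # number of rear-seat passengers
    env: dict              # environmental parameters, e.g. {"Va": 30.0}
    action: str            # action selected by the driver, e.g. "deceleration"

class TravelHistoryGenerator:
    def __init__(self):
        self.history = []

    def record(self, driver, front, rear, env, action):
        # Called at the timing of the notification of the selected action:
        # store the driver/passenger combination, the environmental
        # parameters at that moment, and the chosen action as one row.
        self.history.append(TravelHistoryEntry(driver, front, rear, env, action))

gen = TravelHistoryGenerator()
gen.record("A", True, 1, {"Va": 30.0, "Vba": -2.0, "DRba": 25.0}, "deceleration")
```

Variants (2) to (4) would replace the presence/absence fields with the passengers' age/sex, names, or state, and variant (5) would keep only the driver field.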
  • travel history generation section 1102 outputs the travel history to transmission section 1104 and inquiry section 1106.
  • the transmission unit 1104 inputs the travel history from the travel history generation unit 1102.
  • the transmission unit 1104 notifies a cloud server (not shown) of the travel history update via the communication IF 1056 and the wireless device 1008.
  • the cloud server is provided outside the vehicle 1000 in order to collect travel histories generated in the driving support device 1040 installed in each of the plurality of vehicles 1000. That is, the cloud server collectively manages the travel histories generated in each of the plurality of driving support devices 1040, but for convenience of explanation, the travel histories stored in the cloud server are referred to as “total travel histories”.
  • upon receiving the travel history update notification, the cloud server transmits to the wireless device 1008 a request to transmit the travel history.
  • in response, the transmission unit 1104 assigns identification information (hereinafter referred to as "ID") to each combination of the driver's name and the presence or absence of passengers contained in the travel history.
  • FIG. 38 shows an outline of processing in the transmission unit 1104.
  • FIG. 38(a) shows the data structure of the travel history input to the transmission unit 1104, which is the same as FIG. 36(a). FIG. 38(b) shows the correspondence between the combinations of the driver's name and the presence or absence of passengers and the IDs.
  • for example, when the driver's name is "A", the passenger in the passenger seat is "none", and the passenger in the rear seat is "none", the ID "0001" is associated with the combination.
  • when the driver's name is "A", the passenger in the passenger seat is "present", and the number of passengers in the rear seat is "1", the ID "0003" is associated with the combination.
  • the IDs are defined so as not to overlap among the plurality of driving support devices 1040.
  • here, when a combination in which the driver's name is "B", the passenger in the passenger seat is "none", and the passenger in the rear seat is "none" is added by an update of the travel history and no ID has yet been assigned to that combination, the transmission unit 1104 assigns a new ID "0004" to the combination.
  • depending on the type of information used, the passengers' age and sex may also be included in the combination, or the combination may include only the driver's name.
  • the transmission unit 1104 uses the correspondence shown in FIG. 38(b) to replace each combination shown in FIG. 38(a) with its ID.
  • FIG. 39 shows another outline of processing in the transmission unit 1104. As illustrated, each combination is replaced with an ID. By using such IDs, even the information relating to the driver "A" is separated into three sets of information with the IDs "0001" to "0003".
  • the transmission unit 1104 outputs the travel history replaced with the ID (hereinafter also referred to as “travel history”) to the communication IF 1056.
  • the communication IF 1056 causes the wireless device 1008 to transmit a travel history to the cloud server. At that time, only the updated portion of the travel history may be transmitted.
  • the cloud server adds the received travel history to the total travel history.
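The ID replacement of FIGS. 38 and 39 can be sketched as below. The four-digit IDs and the in-memory table are illustrative assumptions; in practice the IDs must not overlap among driving support devices (for example by adding a device-specific prefix).

```python
id_table = {}

def id_for(combination):
    # Assign the next unused ID the first time a combination appears,
    # as when ("B", none, none) is newly given the ID "0004".
    if combination not in id_table:
        id_table[combination] = f"{len(id_table) + 1:04d}"
    return id_table[combination]

def replace_with_ids(history):
    # history rows: (driver, front passenger, rear passengers, env, action).
    # Return rows with the combination replaced by its ID, ready for upload
    # to the cloud server.
    return [(id_for((d, f, r)), env, action) for d, f, r, env, action in history]

rows = [
    ("A", False, 0, {"Va": 30.0}, "deceleration"),
    ("A", True, 1, {"Va": 40.0}, "acceleration"),
    ("A", False, 0, {"Va": 35.0}, "constant-speed running"),
]
uploaded = replace_with_ids(rows)
```

Note that the two rows sharing the same combination receive the same ID, so information on one driver is split per passenger combination, as in FIG. 39.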
  • the inquiry unit 1106 inputs the travel history from the travel history generation unit 1102.
  • the inquiry unit 1106 also inputs information from the detection unit 1100.
  • the information input here is a combination of the name of the current driver and the presence or absence of the current passenger.
  • the inquiry unit 1106 extracts a travel history for a combination of the name of the current driver and the presence or absence of a current passenger from the travel history generated by the travel history generation unit 1102.
  • FIG. 40 shows an outline of processing in the inquiry unit 1106.
  • FIG. 40(a) shows the data structure of the travel history input to the inquiry unit 1106, which is the same as FIG. 36(a).
  • FIG. 40(b) shows the result of extracting the travel history for the current combination from the travel history shown in FIG. 40(a).
  • the inquiry unit 1106 transmits an inquiry signal for retrieving a travel history similar to the extracted travel history from the total travel history to the cloud server via the communication IF 1056 and the wireless device 1008.
  • the inquiry signal includes the extracted traveling history (hereinafter referred to as “inquiry traveling history”).
  • when the cloud server receives the inquiry signal, it acquires the inquiry travel history from the inquiry signal.
  • the cloud server searches and acquires a travel history similar to the inquiry travel history from the total travel history. More specifically, the cloud server extracts one action and an environment parameter corresponding to the action from the inquiry travel history.
  • the extracted environmental parameters are referred to as “first environmental parameters”.
  • the cloud server acquires a plurality of environmental parameters corresponding to the extracted behavior from the total travel history.
  • each of the acquired plurality of environment parameters is referred to as a “second environment parameter”.
  • next, the cloud server calculates the correlation value of vectors whose elements are the numerical values of the first environmental parameter and the numerical values of one second environmental parameter. If the correlation value is larger than a threshold value (hereinafter referred to as the "in-server threshold value"), the cloud server specifies the ID corresponding to that second environmental parameter and acquires, from the total travel history, all the environmental parameters to which that ID is assigned. On the other hand, if the correlation value is equal to or less than the in-server threshold value, the cloud server does not execute the acquisition. The cloud server executes this process for each of the acquired plurality of second environmental parameters, and also for the other actions included in the inquiry travel history. As a result, the cloud server acquires one or more environmental parameters similar to the inquiry travel history.
  • a plurality of IDs may be mixed in the acquired environmental parameter.
  • the cloud server collects the acquired environmental parameters as “similar travel history”. In that case, the action corresponding to each environmental parameter is also included.
  • the similar travel history has a data structure as shown in FIG. 39, for example.
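The server-side search described above can be sketched as follows. The patent does not fix the correlation formula, so a plain Pearson correlation is assumed here, and the sketch is simplified: it keeps only the matching rows, whereas the server in the text additionally pulls all rows carrying each matching ID.

```python
import math

def correlation(p, q):
    # Correlation value of two environmental-parameter vectors
    # (Pearson correlation, assumed as a stand-in for the actual formula).
    n = len(p)
    mp, mq = sum(p) / n, sum(q) / n
    cov = sum((a - mp) * (b - mq) for a, b in zip(p, q))
    sp = math.sqrt(sum((a - mp) ** 2 for a in p))
    sq = math.sqrt(sum((b - mq) ** 2 for b in q))
    return cov / (sp * sq) if sp and sq else 0.0

def similar_travel_history(inquiry, total, threshold=0.9):
    # inquiry: (action, env_vector) pairs from the inquiry travel history;
    # total: (row_id, env_vector, action) rows of the total travel history.
    # Keep total rows whose env vector, for the same action, correlates
    # above the in-server threshold.
    result = []
    for q_action, q_env in inquiry:
        for row_id, env, action in total:
            if action == q_action and correlation(q_env, env) > threshold:
                result.append((row_id, env, action))
    return result
```

The collected rows, with their actions, correspond to the "similar travel history" returned to the vehicle.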
  • the acquisition unit 1108 acquires a similar traveling history from the cloud server via the wireless device 1008 and the communication IF 1056 as a response to the inquiry by the inquiry unit 1106.
  • the similar travel history is a travel history similar to the travel history for the combination of the current driver and the current presence or absence of passengers; the passengers' age and sex may also be included in the combination.
  • the driver model generation unit 1110 inputs the similar travel history from the acquisition unit 1108.
  • the driver model generation unit 1110 generates a driver model based on the similar traveling history.
  • the driver model generation unit 1110 generates a driver model by combining an inquiry travel history and a similar travel history.
  • FIG. 41 shows a data structure of the driver model generated by the driver model generation unit 1110. As shown in the figure, the IDs, environmental parameters, and actions included in the similar travel history are combined. The rows in which no ID is shown but an environmental parameter and an action are combined correspond to the inquiry travel history. Note that the IDs need not be included in the driver model.
  • the driver model generation unit 1110 may generate a driver model by averaging the numerical values of the environmental parameters in the same action in the inquiry travel history and the similar travel history.
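The averaging variant of driver model generation can be sketched as below, assuming rows are represented as (env_vector, action) pairs with a fixed parameter order; the function name is illustrative.

```python
def build_driver_model(inquiry_history, similar_history):
    # Combine the inquiry travel history with the similar travel history and
    # average the environmental-parameter vectors per action, as in the
    # averaging variant described in the text.
    grouped = {}
    for env, action in inquiry_history + similar_history:
        grouped.setdefault(action, []).append(env)
    model = {}
    for action, envs in grouped.items():
        model[action] = [sum(values) / len(values) for values in zip(*envs)]
    return model

model = build_driver_model(
    [([30.0, -2.0], "deceleration")],
    [([34.0, -4.0], "deceleration"), ([50.0, 1.0], "acceleration")],
)
```

In the non-averaging variant of FIG. 41, the rows would simply be concatenated instead of averaged per action.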
  • the driver model generation unit 1110 outputs the driver model to the determination unit 1112.
  • the determination unit 1112 inputs the driver model from the driver model generation unit 1110 and receives the first detection information from the first input unit 1070. The determination unit 1112 derives the current environmental parameters based on the various types of information included in the first detection information; since the environmental parameters are as described above, their description is omitted here. The determination unit 1112 then calculates the correlation value between a vector whose elements are the values of the environmental parameters shown in one row of the driver model of FIG. 41 and a vector whose elements are the values of the current environmental parameters, and repeats this calculation while changing the row of the driver model. As a result, a plurality of correlation values, one for each row of the driver model shown in FIG. 41, are derived.
  • the determination unit 1112 selects a maximum correlation value from among a plurality of correlation values, and then selects an action indicated in a row corresponding to the selected correlation value as an “action candidate”. The selection of an action candidate corresponds to determining the next action.
  • alternatively, the determination unit 1112 may set a threshold value in advance and select, from the plurality of correlation values, those larger than the threshold value.
  • the determination unit 1112 then takes statistics of the actions shown in the selected rows and sets, in descending order of frequency, the "first action candidate", "second action candidate", ..., "Nth action candidate". An upper limit may be set on the number of action candidates.
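Both candidate-selection strategies of the determination unit 1112 can be sketched together; a plain Pearson correlation again stands in for the unspecified correlation formula, and the function names are illustrative.

```python
from collections import Counter

def _corr(p, q):
    # Pearson correlation between two environmental-parameter vectors
    # (assumed stand-in for the correlation value used in the text).
    n = len(p)
    mp, mq = sum(p) / n, sum(q) / n
    cov = sum((a - mp) * (b - mq) for a, b in zip(p, q))
    sp = sum((a - mp) ** 2 for a in p) ** 0.5
    sq = sum((b - mq) ** 2 for b in q) ** 0.5
    return cov / (sp * sq) if sp and sq else 0.0

def action_candidates(model_rows, current_env, threshold=None, limit=3):
    # model_rows: (env_vector, action) pairs, one per driver-model row.
    # Without a threshold, return the action of the single best-correlated
    # row; with one, count actions among rows above the threshold and return
    # the first, second, ... candidates in descending order of frequency.
    scored = [(_corr(env, current_env), action) for env, action in model_rows]
    if threshold is None:
        return [max(scored, key=lambda s: s[0])[1]]
    counts = Counter(action for c, action in scored if c > threshold)
    return [action for action, _ in counts.most_common(limit)]
```

The returned list corresponds to the one or more action candidates passed on to the confirmation unit 1114.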
  • the determination unit 1112 outputs one or more action candidates to the confirmation unit 1114.
  • the confirmation unit 1114 is connected to the behavior information input unit 1054, and inputs information related to automatic driving from the automatic driving control device 1030 via the behavior information input unit 1054. In the information related to automatic driving, the next action of the vehicle 1000 is indicated. The next action (hereinafter referred to as “automatic action”) is determined by the automatic driving algorithm in the automatic driving control apparatus 1030. For this reason, the automatic behavior may not match the driver's sense.
  • the confirmation unit 1114 also inputs one or more action candidates from the determination unit 1112. The confirmation unit 1114 outputs these to the screen generation unit 1116 in order for the driver to select one of the automatic behavior and one or more behavior candidates.
  • the screen generation unit 1116 inputs automatic behavior and one or more behavior candidates from the confirmation unit 1114.
  • the screen generation unit 1116 generates an image in which these are collected.
  • FIG. 42 shows a screen generated by the screen generation unit 1116.
  • an action image 1200 is arranged at the center of the screen.
  • the screen generation unit 1116 stores in advance the contents of a plurality of types of automatic actions and images corresponding to them, and generates an action image 1200 by selecting an image corresponding to the input automatic action.
  • a first action candidate image 1202a and a second action candidate image 1202b are arranged on the right side of the screen.
  • the first action candidate image 1202a and the second action candidate image 1202b are collectively referred to as action candidate images 1202.
  • the first action candidate image 1202a is generated from the first action candidate, and the second action candidate image 1202b is generated from the second action candidate.
  • the screen generation unit 1116 generates the action candidate images 1202 in the same manner as the action image 1200.
  • the screen generation unit 1116 outputs the generated screen image to the image output unit 1051 as image data.
  • the image output unit 1051 outputs the image data to the notification device 1002, causing the screen containing the action candidate images 1202 to be displayed.
  • the notification device 1002 displays the screen shown in FIG. 42. Using the input device 1004, the driver selects any one of the behavior image 1200, the first behavior candidate image 1202a, and the second behavior candidate image 1202b.
  • the operation input unit 1050 inputs the selection result as an operation signal from the input device 1004 and outputs the selection result to the control unit 1041.
  • the confirmation unit 1114 inputs the selection result from the operation input unit 1050.
  • the confirmation unit 1114 confirms selection of the first action candidate if the selection result is the first action candidate image 1202a, and confirms selection of the second action candidate if the selection result is the second action candidate image 1202b.
  • the confirmation unit 1114 confirms the selection of the automatic action if the selection result is the action image 1200.
  • the confirmation unit 1114 also confirms selection of the automatic behavior when no selection result is input within a certain period after the one or more behavior candidates were output to the screen generation unit 1116.
  • the confirmation unit 1114 outputs the selected action candidate to the instruction unit 1118.
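The confirmation logic above (the driver's choice wins; the automatic behavior is the default both when the action image 1200 is chosen and when the timeout expires) might be sketched as follows. The function name `confirm_selection` and the encoding of the tapped image as an index are hypothetical choices for illustration.

```python
def confirm_selection(selection, automatic_action, candidates, timed_out=False):
    """Return the action whose selection the confirmation unit confirms.

    selection: index of the tapped image (0 = action image 1200,
               1 = first candidate image 1202a, 2 = second candidate
               image 1202b, ...), or None if nothing was tapped.
    If the driver taps the action image, or no selection arrives before
    the timeout, the automatic action is confirmed.
    """
    if timed_out or selection is None or selection == 0:
        return automatic_action
    return candidates[selection - 1]

# The driver picks the first candidate although "deceleration" is planned:
print(confirm_selection(1, "deceleration", ["right lane change", "acceleration"]))
# right lane change

# No input within the fixed period: the automatic behavior is confirmed.
print(confirm_selection(None, "deceleration", ["right lane change"], timed_out=True))
# deceleration
```

The confirmed action is then passed to the instruction unit 1118, which forwards it via the command output unit 1055.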
  • when an action candidate is input from the confirmation unit 1114, the instruction unit 1118 instructs the automatic driving control device 1030, via the command output unit 1055, to perform the action corresponding to that candidate. Specifically, the instruction unit 1118 outputs the input action candidate to the command output unit 1055.
  • when an action candidate is input from the instruction unit 1118, the command output unit 1055 outputs a control command corresponding to the action candidate to the automatic driving control device 1030.
  • the automatic driving control device 1030 controls the automatic driving of the vehicle 1000 with the action candidate as the next action. Therefore, even when "deceleration" is indicated as the automatic behavior, if the action candidate "right lane change" is selected, the vehicle 1000 travels according to "right lane change" as the next action.
  • the instruction unit 1118 notifies the traveling history generation unit 1102 of the instructed action when the next action is instructed to the automatic driving control apparatus 1030.
  • FIG. 43 is a flowchart illustrating a detection procedure performed by the second detection unit 1062.
  • FIG. 43A is a flowchart showing a first detection procedure.
  • the second detection unit 1062 executes driver personal authentication, passenger personal authentication, age / gender detection, or seating sensing (S1002).
  • the detection unit 1100 acquires and stores driver / passenger information (S1004).
  • Step 1002 and Step 1004 are skipped.
  • the second detection unit 1062 detects the passenger's state (normal / sleepiness / car sickness) (S1010).
  • the detection unit 1100 updates the passenger's state (S1014).
  • step 1014 is skipped.
  • FIG. 44 is a sequence diagram showing a registration procedure by the driving support apparatus 1040.
  • the driving support device 1040 generates a travel history (S1050).
  • the driving support device 1040 transmits a travel history update notification to the cloud server (S1052).
  • the cloud server transmits a travel history request to the driving support device 1040 (S1054).
  • the driving support device 1040 replaces the IDs of the travel history (S1056) and transmits a travel history registration request (S1058).
  • the cloud server stores the travel history (S1060).
  • the cloud server transmits the travel history registration result to the driving support device 1040 (S1062).
  • FIG. 45 is a flowchart showing the transmission procedure performed by the transmission unit 1104. If the travel history has been updated (Y in S1100), the transmission unit 1104 acquires the updated travel history (S1102). If there is a combination whose ID is unregistered (Y in S1104), the transmission unit 1104 assigns it a new ID (S1106); if not (N in S1104), step 1106 is skipped. The transmission unit 1104 then replaces the IDs (S1108). If the travel history has not been updated (N in S1100), steps 1102 to 1108 are skipped.
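As a rough illustration of the ID replacement in S1056 and S1106 to S1108, the sketch below anonymizes each driver/passenger combination before transmission, so that the server can manage histories per combination without receiving personal information. The class name, the use of random UUIDs, and the history row layout are assumptions for illustration only.

```python
import uuid

class TransmissionUnit:
    """Sketch of the FIG. 45 ID-replacement step (hypothetical names).

    Each driver/passenger combination that has no registered ID is
    assigned a new one; the combination is then replaced by its ID in
    every row before the travel history is sent to the server.
    """
    def __init__(self):
        self.id_table = {}  # combination -> assigned ID

    def replace_ids(self, travel_history):
        out = []
        for combination, env_params, action in travel_history:
            if combination not in self.id_table:               # unregistered (Y in S1104)
                self.id_table[combination] = uuid.uuid4().hex  # assign new ID (S1106)
            out.append((self.id_table[combination], env_params, action))  # S1108
        return out

tx = TransmissionUnit()
history = [(("driver A", "no passenger"), {"speed": 60}, "deceleration"),
           (("driver A", "no passenger"), {"speed": 80}, "right lane change")]
replaced = tx.replace_ids(history)
# Both rows share the same combination, hence the same anonymous ID:
print(replaced[0][0] == replaced[1][0])  # True
```

Once assigned, an ID stays stable across updates, which is what makes per-combination management at the server straightforward.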
  • FIG. 46 is a sequence diagram illustrating a driver model generation procedure performed by the driving support device 1040.
  • the driving support device 1040 generates a travel history (S1150).
  • the driving support device 1040 extracts the inquiry travel history (S1152).
  • the driving support device 1040 transmits an inquiry signal to the cloud server (S1154).
  • the cloud server extracts a similar travel history (S1156).
  • the cloud server transmits a similar travel history to the driving support device 1040 (S1158).
  • the driving support device 1040 generates a driver model (S1160).
  • FIG. 47 is a flowchart showing a travel history update procedure performed by the travel history generation unit 1102.
  • the determination unit 1112 determines the next action (S1200). When the determined action is selected (Y in S1202), the travel history generation unit 1102 updates the travel history (S1204). If the determined action is not selected (N in S1202), the process ends.
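The S1200 to S1204 flow reduces to a small conditional update: a row is appended to the travel history only when the driver actually selected the action that the determination unit determined. The names and data layout below are illustrative assumptions.

```python
def maybe_update_history(travel_history, determined_action, selected_action, env_params):
    """Sketch of the FIG. 47 update procedure (illustrative names).

    Appends (environmental parameters, action) only when the driver's
    selection matches the determined action; otherwise the history is
    left unchanged and the process ends.
    """
    if selected_action == determined_action:                  # Y in S1202
        travel_history.append((env_params, selected_action))  # S1204
    return travel_history

history = []
maybe_update_history(history, "right lane change", "right lane change", {"speed": 60})
maybe_update_history(history, "deceleration", "acceleration", {"speed": 80})  # N: no update
print(len(history))  # 1
```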
  • Since the driver model is generated based on a travel history similar to the travel history for the current driver, a driver model suitable for the current driver can be generated. Further, since the next action is determined based on the driver model suitable for the current driver and the current environmental parameters of the vehicle, the determination accuracy can be improved. Further, since the driver model is generated based on a travel history similar to the travel history for the combination of the current driver and the current presence/absence of a passenger, the accuracy of the driver model can be improved. In that case, since the driver model is generated based on a travel history similar to the travel history for the combination of the current driver, the current presence/absence of a passenger, and information related to the current passenger, the accuracy of the driver model can be further improved.
  • Since a travel history similar to the travel history for the current driver is acquired from the server, the search for similar travel histories can be delegated to the server, which reduces the amount of processing on the vehicle side. Moreover, since the travel history is transmitted to the server, travel histories generated in various driving support devices can be accumulated there, which improves the accuracy of the search for similar travel histories. In addition, since an ID identifying each combination is assigned to the travel history, management at the server is facilitated. Finally, since an image showing the next action is displayed, the driver can be notified of the next action.
  • The next action is determined based on the driver model generated from a travel history similar to the travel history for the current driver and on the current environmental parameters of the vehicle, so the accuracy of determining the next action can be improved.
  • A computer that realizes the functions described above by a program includes input devices such as a keyboard, mouse, and touch pad; output devices such as a display and speakers; a CPU (Central Processing Unit); ROM (Read Only Memory); RAM (Random Access Memory); a storage device such as a hard disk drive or SSD (Solid State Drive); a reading device that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disc Read Only Memory) or USB (Universal Serial Bus) memory; and a network card for communicating via a network. These units are connected by a bus.
  • the reading device reads the program from the recording medium on which the program is recorded, and stores the program in the storage device.
  • the network card communicates with a server device connected to the network and stores a program downloaded from the server device in the storage device.
  • the CPU copies the program stored in the storage device to the RAM, and sequentially reads out and executes the instructions included in the program from the RAM, thereby realizing the functions of the respective devices.
  • a driving support device according to one aspect of the present invention includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other.
  • the driving support apparatus further includes an acquisition unit that acquires a travel history similar to the travel history for the current driver among the travel histories generated by the travel history generation unit.
  • the driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current driving environment of the vehicle.
  • with this configuration, the determination accuracy can be improved.
  • the driving support device may further include a detection unit that detects the presence or absence of a passenger in the vehicle.
  • the travel history generation unit generates a travel history for each driver and for each presence/absence of a passenger detected in the past by the detection unit, and the acquisition unit may acquire a travel history similar to the travel history for the combination of the current driver and the current presence/absence of a passenger detected by the detection unit.
  • since the driver model is generated based on a travel history similar to the travel history for the combination of the current driver and the current presence/absence of a passenger, the accuracy of the driver model can be improved.
  • the detection unit may also detect information related to the passengers of the vehicle, and the travel history generation unit may generate the travel history for each driver, for each presence/absence of a passenger, and for each piece of passenger information detected in the past by the detection unit. The acquisition unit may then acquire a travel history similar to the travel history for the combination of the current driver, the current presence/absence of a passenger detected by the detection unit, and information related to the current passenger.
  • since the driver model is generated based on a travel history similar to the travel history for the combination of the current driver, the current presence/absence of a passenger, and information related to the current passenger, the accuracy of the driver model can be further improved.
  • the driving support device may further include an inquiry unit that executes an inquiry to the server based on the driving history for the current driver among the driving histories generated by the driving history generation unit.
  • the acquisition unit may acquire a travel history similar to the travel history for the current driver from the server as a response to the inquiry by the inquiry unit. In this case, since a travel history similar to the travel history for the current driver is acquired from the server, the amount of processing can be reduced.
  • the driving support device may further include an inquiry unit that executes an inquiry to the server based on the travel history for the combination of the current driver and the current presence/absence of a passenger, among the travel histories generated by the travel history generation unit.
  • the acquisition unit may acquire, from the server, a travel history similar to the travel history for the combination of the current driver and the current presence/absence of a passenger as a response to the inquiry by the inquiry unit. In this case, since the similar travel history is acquired from the server, the amount of processing can be reduced.
  • the driving support device may further include an inquiry unit that executes an inquiry to the server based on the travel history for the combination of the current driver, the current presence/absence of a passenger, and information related to the current passenger, among the travel histories generated by the travel history generation unit.
  • the acquisition unit may acquire, from the server, a travel history similar to the travel history for the combination of the current driver, the current presence/absence of a passenger, and information related to the current passenger as a response to the inquiry by the inquiry unit. In this case, since the similar travel history is acquired from the server, the amount of processing can be reduced.
  • the driving support device may further include a transmission unit that transmits the travel history generated by the travel history generation unit to the server.
  • traveling histories generated in various driving support devices can be accumulated in the server.
  • the driving support device may further include a transmission unit that transmits the travel history generated by the travel history generation unit to the server.
  • the transmission unit may add identification information for identifying each combination in the travel history. In this case, since identification information for identifying each combination is given, management on the server can be facilitated.
  • the driving support device may further include an image output unit that causes the notification device to display an image indicating the next action determined by the determination unit. In this case, the driver can be notified of the next action.
  • Another aspect of the present invention is an automatic driving control device.
  • This device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other.
  • the automatic driving control device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history for the current driver, and a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit.
  • the automatic driving control device further includes a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current traveling environment of the vehicle, and an automatic driving control unit that controls automatic driving of the vehicle based on the next action determined by the determination unit.
  • with this configuration, the determination accuracy can be improved.
  • Still another aspect of the present invention is a vehicle.
  • This vehicle includes a driving support device, and the driving support device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other.
  • the driving support apparatus further includes an acquisition unit that acquires a travel history similar to the travel history for the current driver among the travel histories generated by the travel history generation unit.
  • the driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current driving environment of the vehicle.
  • with this configuration, the determination accuracy can be improved.
  • Still another aspect of the present invention is a driving support method.
  • This method includes a step of generating, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other.
  • the method further includes a step of acquiring, from among the generated travel histories, a travel history similar to the travel history for the current driver; a step of generating a driver model based on the acquired travel history; and a step of determining the next action based on the generated driver model and an environmental parameter indicating the current driving environment of the vehicle.
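The four steps of the method (generate travel histories, acquire similar ones, build a driver model, determine the next action) can be sketched end to end. The similarity measure and the majority-vote driver model below are illustrative stand-ins, since the concrete similarity calculation and model construction are left to the embodiments; all names are hypothetical.

```python
from collections import Counter, defaultdict

def similarity(h1, h2):
    """Crude similarity between two travel histories: the fraction of
    positions where the selected action matches (an illustrative stand-in
    for the patent's correlation over environmental parameters)."""
    matches = sum(1 for (_, a1), (_, a2) in zip(h1, h2) if a1 == a2)
    return matches / max(len(h1), len(h2), 1)

def acquire_similar(all_histories, current_driver, top_k=1):
    """Step 2: acquire the travel histories most similar to the current
    driver's own travel history."""
    mine = all_histories[current_driver]
    others = [h for d, h in all_histories.items() if d != current_driver]
    return sorted(others, key=lambda h: similarity(mine, h), reverse=True)[:top_k]

def generate_driver_model(histories):
    """Step 3: build a driver model as a lookup from driving situation to
    the most frequently selected action in the similar histories."""
    votes = defaultdict(Counter)
    for h in histories:
        for env, action in h:
            votes[env][action] += 1
    return {env: c.most_common(1)[0][0] for env, c in votes.items()}

def determine_next_action(model, current_env):
    """Step 4: determine the next action from the model and the current
    environmental parameter (None if the situation is unknown)."""
    return model.get(current_env)

all_histories = {
    "driver A": [("slow car ahead", "right lane change"), ("merge ahead", "deceleration")],
    "driver B": [("slow car ahead", "right lane change"), ("merge ahead", "deceleration")],
    "driver C": [("slow car ahead", "deceleration"), ("merge ahead", "acceleration")],
}
model = generate_driver_model(acquire_similar(all_histories, "driver A"))
print(determine_next_action(model, "slow car ahead"))  # right lane change
```

Here driver B's history is the most similar to driver A's, so the model for driver A is built from it, matching the claim that the model suits the current driver.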
  • in the embodiments described above, the management of the overall travel history and the extraction of similar travel histories are performed in the cloud server.
  • the present invention is not limited thereto, and for example, these processes may be performed in the driving support device 1040.
  • the driving support devices 1040 mounted in the plurality of vehicles 1000 may exchange travel histories with one another, so that each driving support device 1040 generates the overall travel history itself. According to this modification, a cloud server becomes unnecessary.
  • the driving support method according to the present invention and the driving support device, automatic driving control device, vehicle, and program using the driving support method are suitable for transmitting information to the driver.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided is a technology for improving accuracy when determining the next action. A travel history generation unit (1102) generates, for each driver, a travel history obtained by associating environmental parameters indicating the travel environment through which a vehicle has traveled in the past, and an action selected by the driver in response to the environmental parameters. An acquisition unit (1108) acquires a travel history similar to the travel history of the current driver from among the travel histories generated by the travel history generation unit (1102). A driver model generation unit (1110) generates a driver model on the basis of the travel history acquired by the acquisition unit (1108). A determination unit (1112) determines the next action on the basis of the driver model generated by the driver model generation unit (1110) and the environmental parameters indicating the current vehicle travel environment.

Description

Driving support method, driving support device using the same, automatic driving control device, vehicle, and driving support program
 The present invention relates to a vehicle, a driving support method provided in the vehicle, a driving support device using the method, an automatic driving control device, and a driving support program.
 In recent years, various technologies have been proposed and put into practical use for vehicles capable of traveling by manual driving, in which the driver performs driving operations based on the surroundings of the vehicle or its running state (for example, the speed of the host vehicle or control information for steering, accelerator, brake, turn indicators, and actuators), and by automatic driving, in which some or all of the driving operations are performed automatically, as well as for vehicles capable of fully automatic driving.
 For example, Patent Document 1 discloses a travel control device that, when the host vehicle is under automatic steering control or automatic acceleration/deceleration control, allows the driver to visually recognize the operating state of that control.
Japanese Patent Laid-Open No. 2005-67483
 However, during automatic driving (including both fully automatic and partially automatic driving), the driver entrusts driving to the vehicle, so the relationship of trust between the vehicle and the driver is extremely important, and appropriate information must be conveyed between the vehicle and the driver (occupants). Patent Document 1 notifies the driver of only the current operating state.
 This leads to a first problem: during automatic driving, if the driver is notified only of the vehicle's current behavior (operating state or control content) and knows nothing about the behavior about to be performed (for example, the lane change, acceleration, or deceleration the vehicle intends to execute, particularly before merging, before entering an intersection, when an emergency vehicle is nearby, or when a surrounding vehicle has taken or is about to take some action), the driver feels great anxiety.
 There is also a second problem: during fully automatic driving, the driver is likely to be engaged in activities other than monitoring the driving, so even if only the current operating state is suddenly displayed, the driver cannot grasp the current surroundings or running state of the vehicle, cannot respond immediately when trying to issue a driving instruction of his or her own will, and thus cannot give instructions to the vehicle smoothly.
 A third problem is that, with only the current operating state being reported, the driver cannot immediately switch over even when attempting to perform manual driving directly.
 A fourth problem is that even if the vehicle takes the same action, the timing or amount of operation differs from driver to driver and occupant to occupant, and is likely to deviate from the driver's own sense when driving manually; in the worst case, this may induce unnecessary operational intervention by the driver during automatic driving.
 The present invention provides a driving support method capable of solving at least one of the above problems during fully automatic or partially automatic driving, and a driving support device, automatic driving control device, vehicle, and driving support program using the method.
 To solve the above problems, a driving support device according to one aspect of the present invention includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other. The driving support device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history for the current driver. The driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current driving environment of the vehicle.
 Another aspect of the present invention is an automatic driving control device. This device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other. The automatic driving control device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history for the current driver. The automatic driving control device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current driving environment of the vehicle, and an automatic driving control unit that controls automatic driving of the vehicle based on the next action determined by the determination unit.
 Still another aspect of the present invention is a vehicle. This vehicle includes a driving support device, and the driving support device includes a travel history generation unit that generates, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other. The driving support device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history for the current driver. The driving support device further includes a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit, and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and an environmental parameter indicating the current driving environment of the vehicle.
 Still another aspect of the present invention is a driving support method. This method includes a step of generating, for each driver, a travel history in which an environmental parameter indicating a travel environment in which the vehicle has traveled in the past and the action selected by the driver with respect to that environmental parameter are associated with each other, and a step of acquiring, from among the generated travel histories, a travel history similar to the travel history for the current driver. The driving support method further includes a step of generating a driver model based on the acquired travel history, and a step of determining the next action based on the generated driver model and an environmental parameter indicating the current driving environment of the vehicle.
 Any combination of the above components, and any conversion of the expression of the present invention among a device, a system, a method, a program, a recording medium on which the program is recorded, a vehicle equipped with the device, and the like, are also effective as aspects of the present invention.
 According to the present invention, information can be appropriately conveyed from the vehicle to the occupants so that, in fully automatic or partially automatic driving, comfortable automatic driving is possible in which the operations of the vehicle and the driver are unlikely to conflict.
A block diagram showing the configuration of the main part of a vehicle including an information notification device according to Embodiment 1 of the present invention.
A diagram explaining a first example of a travel environment, the corresponding display on the notification unit, and operation of the operation unit.
A diagram showing another example of the display on the notification unit.
A flowchart showing the procedure of the information notification process in the embodiment.
A diagram showing a first example of a travel environment and the display control for it.
A diagram showing the first example of a travel environment and another display control for it.
A diagram showing a second example of a travel environment and the display control for it.
A diagram showing a third example of a travel environment and the display control for it.
A diagram showing a fourth example of a travel environment and the display control for it.
A diagram showing a fifth example of a travel environment and the display control for it.
A diagram showing another display control for the first example of the travel environment shown in FIG. 5.
A diagram showing another display control for the second example of the travel environment shown in FIG. 7.
本発明の実施の形態2に係る情報報知装置を含む車両の要部構成を示すブロック図である。It is a block diagram which shows the principal part structure of the vehicle containing the information alerting device which concerns on Embodiment 2 of this invention. 実施の形態2におけるタッチパネルの表示を説明する図である。10 is a diagram illustrating display on a touch panel in Embodiment 2. FIG. 本発明の実施の形態3における報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part in Embodiment 3 of this invention. 走行履歴の一例を示す図である。It is a figure which shows an example of a driving history. クラスタリング型のドライバモデルの構築方法を示す図である。It is a figure which shows the construction method of a clustering type driver model. 構築されたクラスタリング型のドライバモデルの一例を示す図である。It is a figure which shows an example of the constructed | assembled clustering type driver model. 構築されたクラスタリング型のドライバモデルの別の一例を示す図である。It is a figure which shows another example of the constructed | assembled clustering type driver model. 個別適応型のドライバモデルの構築方法を示す図である。It is a figure which shows the construction method of the individual adaptive type driver model. 構築された個別適応型のドライバモデルの一例を示す図である。It is a figure which shows an example of the constructed | assembled individual adaptation type driver model. 運転特性モデルの一例を示す図である。It is a figure which shows an example of a driving | running characteristic model. 本発明の実施の形態4における報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part in Embodiment 4 of this invention. 本発明の実施の形態4における報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part in Embodiment 4 of this invention. 本発明の実施の形態4における報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part in Embodiment 4 of this invention. 本発明の実施の形態4における報知部の表示を説明する図である。It is a figure explaining the display of the alerting | reporting part in Embodiment 4 of this invention. 走行履歴の一例を示す図である。It is a figure which shows an example of a driving history. 
本変形例におけるドライバモデルの使用方法を示す図である。It is a figure which shows the usage method of the driver model in this modification. 本変形例におけるキャッシュの配置の一例を示すブロック図である。It is a block diagram which shows an example of arrangement | positioning of the cache in this modification. 本変形例におけるキャッシュの作成方法の一例を示す図である。It is a figure which shows an example of the creation method of the cache in this modification. 本変形例におけるキャッシュの作成方法の一例を示す図である。It is a figure which shows an example of the creation method of the cache in this modification. 実施の形態5に係る車両の構成を示すブロック図である。FIG. 10 is a block diagram illustrating a configuration of a vehicle according to a fifth embodiment. 図32の車両の室内を模式的に示す図である。It is a figure which shows typically the interior of the vehicle of FIG. 図32の検出部および検出情報入力部の詳細な構成を示すブロック図である。It is a block diagram which shows the detailed structure of the detection part of FIG. 32, and a detection information input part. 図32の制御部の詳細な構成を示すブロック図である。It is a block diagram which shows the detailed structure of the control part of FIG. 図35の走行履歴生成部において生成される走行履歴のデータ構造を示す図である。FIG. 36 is a diagram illustrating a data structure of a travel history generated by a travel history generation unit in FIG. 35. 図35の走行履歴生成部において生成される走行履歴の別のデータ構造を示す図である。FIG. 36 is a diagram showing another data structure of a travel history generated by the travel history generation unit in FIG. 35. 図35の送信部における処理概要を示す図である。It is a figure which shows the process outline | summary in the transmission part of FIG. 図35の送信部における別の処理概要を示す図である。It is a figure which shows another process outline | summary in the transmission part of FIG. 図35の問い合せ部における処理概要を示す図である。It is a figure which shows the process outline | summary in the inquiry part of FIG. 図35のドライバモデル生成部において生成されるドライバモデルのデータ構造を示す図である。It is a figure which shows the data structure of the driver model produced | generated in the driver model production | generation part of FIG. 
図35の画面生成部において生成される画面を示す図である。It is a figure which shows the screen produced | generated in the screen production | generation part of FIG. 図34の第2検出部による検出手順を示すフローチャートである。It is a flowchart which shows the detection procedure by the 2nd detection part of FIG. 図32の運転支援装置による登録手順を示すシーケンス図である。It is a sequence diagram which shows the registration procedure by the driving assistance device of FIG. 図35の送信部による送信手順を示すフローチャートである。It is a flowchart which shows the transmission procedure by the transmission part of FIG. 図32の運転支援装置によるドライバモデルの生成手順を示すシーケンス図である。FIG. 33 is a sequence diagram illustrating a procedure for generating a driver model by the driving support apparatus of FIG. 32. 図35の走行履歴生成部による走行履歴の更新手順を示すフローチャートである。It is a flowchart which shows the update procedure of the travel history by the travel history production | generation part of FIG.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. Each embodiment described below is merely an example, and the present invention is not limited to these embodiments.
(Embodiment 1)
FIG. 1 is a block diagram showing the main configuration of a vehicle 1 including an information notification device according to Embodiment 1 of the present invention. The vehicle 1 is a vehicle capable of automatically performing all or part of its driving control without requiring any operation by the driver.
The vehicle 1 includes a brake pedal 2, an accelerator pedal 3, a winker lever 4, a steering wheel 5, a detection unit 6, a vehicle control unit 7, a storage unit 8, and an information notification device 9.

The brake pedal 2 receives a brake operation by the driver and decelerates the vehicle 1. The brake pedal 2 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of deceleration of the vehicle 1. The accelerator pedal 3 receives an accelerator operation by the driver and accelerates the vehicle 1. The accelerator pedal 3 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the degree of acceleration of the vehicle 1. The winker lever 4 receives a lever operation by the driver and turns on a direction indicator (not shown) of the vehicle 1. The winker lever 4 may also receive a control result from the vehicle control unit 7, change to a state corresponding to the direction indicated by the vehicle 1, and turn on the direction indicator (not shown) of the vehicle 1.

The steering wheel 5 receives a steering operation by the driver and changes the traveling direction of the vehicle 1. The steering wheel 5 may also receive a control result from the vehicle control unit 7 and change by an amount corresponding to the change in the traveling direction of the vehicle 1. The steering wheel 5 has an operation unit 51.

The operation unit 51 is provided on the front surface of the steering wheel 5 (the surface facing the driver) and receives input operations from the driver. The operation unit 51 is, for example, a device such as a button, a touch panel, or a grip sensor. The operation unit 51 outputs information on the input operations received from the driver to the vehicle control unit 7.
The detection unit 6 detects the traveling state of the vehicle 1 and the situation around the vehicle 1, and outputs information on the detected traveling state and surrounding situation to the vehicle control unit 7.

The detection unit 6 includes a position information acquisition unit 61, a sensor 62, a speed information acquisition unit 63, and a map information acquisition unit 64.

The position information acquisition unit 61 acquires position information of the vehicle 1 by GPS (Global Positioning System) positioning or the like as traveling state information.
The sensor 62 detects the situation around the vehicle 1: from the positions of other vehicles around the vehicle 1 and lane position information, it detects the position of each other vehicle and its type (for example, whether it is a preceding vehicle); from the speed of another vehicle and the speed of the host vehicle, it detects the predicted time to collision (TTC: Time To Collision); and it also detects obstacles present around the vehicle 1.
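The predicted time to collision mentioned above can be illustrated with a minimal sketch. The function name and the constant-speed assumption are ours, not from the specification: TTC is simply the gap to the other vehicle divided by the speed at which the host vehicle is closing that gap.

```python
def time_to_collision(gap_m, own_speed_mps, other_speed_mps):
    """Predicted time to collision (TTC) with a vehicle ahead,
    assuming both vehicles keep their current speeds.

    Returns None when the host vehicle is not closing in,
    i.e. no collision is predicted.
    """
    closing_speed = own_speed_mps - other_speed_mps  # > 0 means closing
    if closing_speed <= 0:
        return None
    return gap_m / closing_speed

# Host at 25 m/s, preceding vehicle at 20 m/s, 50 m ahead -> 10 s to collision.
print(time_to_collision(50.0, 25.0, 20.0))  # -> 10.0
```

In practice a sensor fusion stack would use measured relative speed directly rather than two absolute speeds; the subtraction above stands in for that measurement.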
The speed information acquisition unit 63 acquires, as traveling state information, information such as the speed and traveling direction of the vehicle 1 from a speed sensor or the like (not shown).

The map information acquisition unit 64 acquires, as information on the situation around the vehicle 1, map information on the surroundings of the vehicle 1, such as the road on which the vehicle 1 is traveling, merging points with other vehicles on the road, the lane currently being traveled, and the positions of intersections.

The sensor 62 is constituted by a millimeter-wave radar, a laser radar, a camera, or a combination thereof.
The storage unit 8 is a storage device such as a ROM (Read Only Memory), a RAM (Random Access Memory), a hard disk drive, or an SSD (Solid State Drive), and stores correspondences between the current traveling environment and candidates for the behavior that the vehicle 1 can take next (after a first predetermined time has elapsed).

The current traveling environment is an environment determined by the position of the vehicle 1, the road on which the vehicle 1 is traveling, the positions and speeds of other vehicles around the vehicle 1, and the like. The determination may be based not only on instantaneous data but also on data before and after that point in time; for example, based on the position or speed of another vehicle, it may be determined that the other vehicle is accelerating or decelerating, or even that the other vehicle is cutting in and a collision may occur one second later. This makes it possible to predict the behavior of other vehicles and to grasp the traveling environment in more detail and more accurately. A behavior candidate is a candidate for the behavior that the vehicle 1 can take next (after the first predetermined time has elapsed) in the current traveling environment.

For example, in association with a traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, there is a vehicle merging from the left of that lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible, the storage unit 8 stores in advance three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right.

Similarly, in association with a traveling environment in which a vehicle traveling ahead in the same lane as the vehicle 1 (hereinafter referred to as a "preceding vehicle") is traveling at a lower speed than the vehicle 1 and a lane change to the adjacent lane is possible, the storage unit 8 stores in advance three behavior candidates: overtaking the preceding vehicle, changing lanes to the adjacent lane, and decelerating the vehicle 1 to follow the preceding vehicle.

Furthermore, the storage unit 8 may store a priority for each behavior candidate. For example, the storage unit 8 may store the number of times each behavior was actually adopted in the same traveling environment in the past, and set a higher priority for a behavior that has been adopted more often.
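As a rough illustration of what the storage unit 8 holds, the following sketch (the data layout, keys, and counts are our assumptions for illustration) maps a traveling-environment key to its stored behavior candidates and derives priorities from past adoption counts, a higher count meaning a higher priority:

```python
# Behavior candidates stored per traveling environment (from the two examples above).
BEHAVIOR_CANDIDATES = {
    "merge_ahead_right_change_possible": [
        "accelerate", "decelerate", "change_lane_right"],
    "slow_preceding_adjacent_free": [
        "overtake", "change_lane_adjacent", "decelerate_and_follow"],
}

# Number of times each behavior was actually adopted in that environment in the past.
ADOPTION_COUNTS = {
    "merge_ahead_right_change_possible": {
        "accelerate": 2, "decelerate": 5, "change_lane_right": 9},
}

def candidates_by_priority(environment):
    """Return the stored candidates, most frequently adopted first."""
    counts = ADOPTION_COUNTS.get(environment, {})
    return sorted(BEHAVIOR_CANDIDATES[environment],
                  key=lambda b: counts.get(b, 0), reverse=True)

print(candidates_by_priority("merge_ahead_right_change_possible"))
# -> ['change_lane_right', 'decelerate', 'accelerate']
```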
The vehicle control unit 7 can be implemented, for example, as an LSI circuit or as part of an electronic control unit (ECU) that controls the vehicle. The vehicle control unit 7 controls the vehicle based on the traveling state and surrounding situation information acquired from the detection unit 6, and controls the brake pedal 2, the accelerator pedal 3, the winker lever 4, and the information notification device 9 in accordance with the vehicle control result. The targets controlled by the vehicle control unit 7 are not limited to these.

First, the vehicle control unit 7 determines the current traveling environment based on the traveling state and surrounding situation information. Various conventionally proposed methods can be used for this determination.

For example, based on the traveling state and surrounding situation information, the vehicle control unit 7 determines that the current traveling environment is a "traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, there is a vehicle merging from the left of the lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible."

Also, for example, based on the traveling state and surrounding situation information, the vehicle control unit 7 determines that the time series of the traveling environment is a "traveling environment in which a vehicle traveling ahead in the same lane as the vehicle 1 is traveling at a lower speed than the vehicle 1 and a lane change to the adjacent lane is possible."

The vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify information on the traveling environment indicating the traveling state and surrounding situation. The vehicle control unit 7 also reads from the storage unit 8 the behavior candidates that the vehicle 1 can take next (after the first predetermined time has elapsed) for the determined traveling environment.
From the read behavior candidates, the vehicle control unit 7 determines which behavior is most suitable for the current traveling environment and sets the most suitable behavior as the first behavior. The first behavior may be the same behavior the vehicle is currently performing, that is, a continuation of the current behavior. The vehicle control unit 7 then sets, as second behaviors (behaviors different from the behavior to be performed), the candidates other than the first behavior that the driver can have performed in the current traveling environment.

For example, the vehicle control unit 7 may set the most suitable behavior as the first behavior using a conventional technique for determining the most suitable behavior based on traveling state and surrounding situation information.

Alternatively, the vehicle control unit 7 may set a preset behavior among the plurality of behavior candidates as the most suitable behavior; it may store information on the previously selected behavior in the storage unit 8 and determine that behavior to be the most suitable; or it may store in the storage unit 8 the number of times each behavior has been selected in the past and determine the most frequently selected behavior to be the most suitable.
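The last of these options, choosing the most frequently selected candidate as the first behavior and treating the remaining candidates as second behaviors, could be sketched as follows. The function and variable names are illustrative only:

```python
def split_behaviors(candidates, selection_counts):
    """Pick the most frequently selected candidate as the first behavior;
    the remaining candidates become the second behaviors (in stored order)."""
    first = max(candidates, key=lambda b: selection_counts.get(b, 0))
    second = [b for b in candidates if b != first]
    return first, second

candidates = ["accelerate", "decelerate", "change_lane_right"]
counts = {"change_lane_right": 9, "decelerate": 5, "accelerate": 2}
first, second = split_behaviors(candidates, counts)
print(first)   # change_lane_right
print(second)  # ['accelerate', 'decelerate']
```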
The vehicle control unit 7 then causes the notification unit 92 of the information notification device 9 to notify the information on the first behavior and the second behaviors. When the vehicle control unit 7 determines that there is no second behavior, it causes the notification unit 92 to notify only the first behavior.

The vehicle control unit 7 may cause the notification unit 92 to notify the information on the first behavior and the second behaviors simultaneously with the information on the traveling state and surrounding situation.

Furthermore, the vehicle control unit 7 acquires information on operations received by the operation unit 51 from the driver. After notifying the first behavior and the second behaviors, the vehicle control unit 7 determines whether the operation unit 51 has received an operation within a second predetermined time. This operation is, for example, an operation of selecting one behavior from among the second behaviors.

If the operation unit 51 does not receive an operation within the second predetermined time, the vehicle control unit 7 controls the vehicle so as to execute the first behavior, and controls the brake pedal 2, the accelerator pedal 3, and the winker lever 4 in accordance with the vehicle control result.

If the operation unit 51 receives an operation within the second predetermined time, the vehicle control unit 7 performs control corresponding to the received operation.
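The decision rule just described, execute the first behavior on timeout, otherwise follow the driver's selection, can be condensed into a small sketch. The fallback for an input that matches neither case is our assumption, not stated in the specification:

```python
def decide_behavior(first_behavior, second_behaviors, driver_choice):
    """Resolve which behavior to execute after notification.

    driver_choice is the behavior the driver selected via the operation
    unit 51 within the second predetermined time, or None if no operation
    was received before the timeout.
    """
    if driver_choice is None:
        return first_behavior        # timeout: execute the first behavior
    if driver_choice in second_behaviors:
        return driver_choice         # driver picked one of the second behaviors
    return first_behavior            # assumed fallback for any other input

print(decide_behavior("change_lane_right", ["accelerate", "decelerate"], None))
print(decide_behavior("change_lane_right", ["accelerate", "decelerate"], "decelerate"))
```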
The information notification device 9 acquires various kinds of information related to the traveling of the vehicle 1 from the vehicle control unit 7 and notifies the acquired information. The information notification device 9 includes an information acquisition unit 91 and a notification unit 92.

The information acquisition unit 91 acquires various kinds of information related to the traveling of the vehicle 1 from the vehicle control unit 7. For example, when the vehicle control unit 7 determines that the behavior of the vehicle 1 may be updated, the information acquisition unit 91 acquires the first behavior information and the second behavior information from the vehicle control unit 7.

The information acquisition unit 91 temporarily stores the acquired information in a storage unit (not shown), reads the stored information from that storage unit as necessary, and outputs it to the notification unit 92.
The notification unit 92 notifies the driver of information related to the traveling of the vehicle 1. The notification unit 92 may be a display unit that displays information, such as a car navigation system installed in the vehicle, a head-up display, a center display, or a light emitter such as an LED (Light Emitting Diode) installed in the steering wheel 5 or a pillar. The notification unit 92 may also be a speaker that converts information into sound and notifies the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel 5). The notification unit 92 may also be a combination of these.

In the following description, the notification unit 92 is assumed to be a notification device.

In this case, the notification unit 92 is, for example, a head-up display (HUD), an LCD (Liquid Crystal Display), an HMD (Head-Mounted Display or Helmet-Mounted Display), smart glasses (an eyeglass-type display), or another dedicated display. The HUD may be, for example, the windshield of the vehicle 1, or a separately provided glass or plastic surface (for example, a combiner). The windshield may be, for example, the front windshield, or a side window or the rear window of the vehicle 1.

Furthermore, the HUD may be a transmissive display provided on the surface of or inside the windshield. Here, a transmissive display is, for example, a transmissive organic EL (Electroluminescence) display, or a transparent display using glass that emits light when irradiated with light of a specific wavelength. The driver can view the display on the transmissive display while simultaneously viewing the background. In this way, the notification unit 92 may be a display medium that transmits light. In either case, an image is displayed on the notification unit 92.

The notification unit 92 notifies the driver of the traveling-related information acquired from the vehicle control unit 7 via the information acquisition unit 91. For example, the notification unit 92 notifies the driver of the first behavior and second behavior information acquired from the vehicle control unit 7.
Here, specific display contents and operations performed on the operation unit 51 will be described.

FIG. 2 is a diagram illustrating a first example of a traveling environment, the corresponding display of the notification unit 92, and the operation of the operation unit 51.

(a) of FIG. 2 is an overhead view showing the traveling environment of the vehicle 1. Specifically, (a) of FIG. 2 shows a traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, there is a vehicle merging from the left of the lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible.

Based on the traveling state and surrounding situation information, the vehicle control unit 7 determines that the traveling environment is the one shown in (a) of FIG. 2. The vehicle control unit 7 may also generate the overhead view shown in (a) of FIG. 2 and cause the notification unit 92 to notify the generated overhead view in addition to the first behavior and second behavior information.
(b) of FIG. 2 shows an example of the display of the notification unit 92 for the traveling environment shown in (a) of FIG. 2. Within the display range of the notification unit 92, options related to the behavior of the vehicle 1 are displayed on the right side, and information for switching to manual driving is displayed on the left side.

The first behavior is "lane change," shown in the highlighted display area 29b among the display areas 29a to 29c and 29g. The second behaviors are "acceleration" and "deceleration," shown in the display areas 29a and 29c, respectively. The display area 29g displays "end automatic driving," which indicates switching to manual driving.

(c) of FIG. 2 shows an example of the operation unit 51 provided on the steering wheel 5. The operation unit 51 has operation buttons 51a to 51d provided on the right side of the steering wheel 5 and operation buttons 51e to 51h provided on the left side of the steering wheel 5. The number, shape, and the like of the operation units 51 provided on the steering wheel 5 are not limited to these.
In this embodiment, the display areas 29a to 29c shown in (b) of FIG. 2 correspond to the operation buttons 51a to 51c, respectively, and the display area 29g corresponds to the operation button 51g.

In this configuration, when selecting any of the contents displayed in the display areas, the driver presses the operation button corresponding to that display area. For example, to select the behavior "acceleration" displayed in the display area 29a, the driver presses the operation button 51a.
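The button-to-display-area correspondence can be sketched as two small lookup tables (the table layout is our illustration; the pairings follow (b) and (c) of FIG. 2 as described above):

```python
# Correspondence between steering-wheel buttons and display areas.
BUTTON_TO_AREA = {"51a": "29a", "51b": "29b", "51c": "29c", "51g": "29g"}

# What each display area shows in the first example of FIG. 2.
AREA_CONTENT = {
    "29a": "acceleration",
    "29b": "lane change",
    "29c": "deceleration",
    "29g": "end automatic driving",
}

def selected_content(button):
    """Return the displayed content selected when the driver presses a button."""
    return AREA_CONTENT[BUTTON_TO_AREA[button]]

print(selected_content("51a"))  # acceleration
```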
In (b) of FIG. 2, only character information is displayed in each display area, but as described next, symbols or icons related to the driving of the vehicle may be displayed. This allows the driver to grasp the display contents at a glance.
FIG. 3 is a diagram showing another example of the display in the notification unit 92. As shown in FIG. 3, both character information and symbols indicating that information are displayed in the display areas 39a to 39c and 39g. Alternatively, only symbols may be displayed.

Next, the flow of display control will be described using a specific traveling environment as an example.

FIG. 4 is a flowchart showing the processing procedure of the information notification processing in this embodiment. FIG. 5 is a diagram showing the first example of the traveling environment and the display control for it.
As shown in FIG. 4, the detection unit 6 detects the traveling state of the vehicle (step S11). Next, the detection unit 6 detects the situation around the vehicle (step S12). The detection unit 6 outputs the detected information on the traveling state of the vehicle and the situation around the vehicle to the vehicle control unit 7.

Next, the vehicle control unit 7 determines the current traveling environment based on the traveling state and surrounding situation information (step S13). In the example of (a) of FIG. 5, the vehicle control unit 7 determines that the current traveling environment is a "traveling environment in which there is a merging lane ahead of the lane in which the vehicle 1 is traveling, there is a vehicle merging from the left of the lane, and a lane change to the right of the lane in which the vehicle 1 is traveling is possible."

Thereafter, the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the determined traveling environment information (step S14). In the example of (b) of FIG. 5, the vehicle control unit 7 outputs the determined traveling environment information to the information acquisition unit 91. The notification unit 92 acquires the traveling environment information from the information acquisition unit 91 and displays it as character information 59. Instead of displaying the traveling environment information on the notification unit 92, the vehicle control unit 7 may notify the driver of the traveling environment information as sound through a speaker or the like. This ensures that the information is reliably conveyed to the driver even when the driver is not looking at, or has overlooked, the display or monitor.
Next, the vehicle control unit 7 determines whether the determined traveling environment has a possibility of a behavior update, and if it is determined that there is such a possibility, it further determines the first behavior and the second behaviors (step S15). Whether the traveling environment has a possibility of a behavior update is determined by whether the traveling environment has changed. Behaviors to be performed after the update include, for example, decelerating when a collision with another vehicle or the like may occur, changing speed when the preceding vehicle disappears during ACC (Adaptive Cruise Control), and changing lanes when the adjacent lane becomes free. Whether to update is determined using conventional techniques.

In this case, the vehicle control unit 7 reads from the storage unit 8 the behavior candidates that the vehicle 1 can take next (after the first predetermined time has elapsed) for the determined traveling environment. The vehicle control unit 7 then determines which of the behavior candidates is most suitable for the current traveling environment, and sets the most suitable behavior as the first behavior. The vehicle control unit 7 then sets the behavior candidates other than the first behavior as the second behaviors.

In the example of (b) of FIG. 5, the vehicle control unit 7 reads from the storage unit 8 three behavior candidates: acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right. The vehicle control unit 7 then determines that a lane change of the vehicle 1 to the right is the most suitable behavior based on the speed of the vehicle merging from the left and the situation in the lane to the right of the vehicle 1, and sets that behavior as the first behavior. The vehicle control unit 7 then sets the behavior candidates other than the first behavior as the second behaviors.

Next, the vehicle control unit 7 causes the notification unit 92 of the information notification device 9 to notify the first behavior and the second behaviors (step S16). In the example of (b) of FIG. 5, the notification unit 92 displays the character information "lane change," which is the first behavior information, highlighted in the display area 59b, and displays "acceleration" and "deceleration," which are the second behavior information, in the display areas 59a and 59c, respectively.
 次に、車両制御部7は、第2の所定時間内に操作部51が運転者からの操作を受けつけたか否かを判定する(ステップS17)。 Next, the vehicle control unit 7 determines whether or not the operation unit 51 has received an operation from the driver within the second predetermined time (step S17).
 例えば、車両制御部7は、現時点での走行環境が図5の(a)に示す走行環境であると判定してから、合流ポイントに到達するまでの時間を第1の所定時間と設定する。そして、車両制御部7は、第1の所定時間よりも短い第2の所定時間を、合流ポイントまでに実行される次の挙動に対する操作の受付が可能な時間として設定する。 For example, the vehicle control unit 7 sets the time from when it is determined that the current travel environment is the travel environment illustrated in FIG. 5A to the arrival at the junction point as the first predetermined time. And the vehicle control part 7 sets 2nd predetermined time shorter than 1st predetermined time as time when reception of operation with respect to the next behavior performed by a merge point is possible.
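The relationship between the two time windows can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names and the 0.6 ratio between the second and first predetermined times are assumptions introduced for the example.

```python
# Hypothetical sketch of the two time windows used in steps S15-S17:
# the first predetermined time runs until the merging point is reached;
# the second, shorter window is the period in which driver operations
# are accepted. The 0.6 ratio is an illustrative assumption.

def time_to_merge_point(distance_m: float, speed_mps: float) -> float:
    """First predetermined time: time remaining until the merging point."""
    if speed_mps <= 0:
        raise ValueError("speed must be positive")
    return distance_m / speed_mps

def operation_window(first_predetermined_s: float, ratio: float = 0.6) -> float:
    """Second predetermined time: a shorter window for accepting operations."""
    if not 0.0 < ratio < 1.0:
        raise ValueError("ratio must keep the window shorter than the first time")
    return first_predetermined_s * ratio

first = time_to_merge_point(distance_m=300.0, speed_mps=25.0)  # 12.0 s to merge
second = operation_window(first)                               # shorter window
```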
If the operation unit 51 receives an operation from the driver within the second predetermined time (YES in step S17), the vehicle control unit 7 determines whether the received operation is an operation to end automatic driving or a behavior selection operation (a so-called update) (step S18).
As described with reference to FIG. 2, each display area of the notification unit 92 corresponds to an operation button of the operation unit 51. To select "end automatic driving" in FIG. 5(b), the driver presses the operation button 51g shown in FIG. 2(c). To select a behavior, the driver presses one of the operation buttons 51a to 51c shown in FIG. 2(c).
If the operation received by the operation unit 51 is an operation to end automatic driving (that is, if pressing of the operation button 51g is detected), the vehicle control unit 7 ends automatic driving (step S19). If the received operation is a behavior selection operation (that is, if one of the operation buttons 51a to 51c is pressed), the vehicle control unit 7 controls the vehicle 1 to execute the behavior corresponding to the pressed button (step S20).
If the operation unit 51 receives no operation from the driver within the second predetermined time (NO in step S17), the vehicle control unit 7 controls the vehicle 1 to execute the first behavior (step S21).
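The branching of steps S17 to S21 can be summarized in code. This is a hedged sketch under stated assumptions: the candidate names, the suitability scores, and the `decide_behavior` interface are illustrative, not taken from the patent.

```python
# Minimal sketch of steps S17-S21: rank the candidates into a first behavior
# and second behaviors, then either honor a driver operation received within
# the second predetermined time or fall back to the first behavior.

def decide_behavior(candidates, rank, driver_operation):
    """Return ('end', None) for step S19, or ('execute', behavior) otherwise."""
    first = max(candidates, key=rank)           # most suitable -> first behavior
    second = [c for c in candidates if c != first]
    if driver_operation is None:                # NO in step S17 -> step S21
        return ("execute", first)
    if driver_operation == "end_autonomous":    # step S18 -> step S19
        return ("end", None)
    if driver_operation == first or driver_operation in second:
        return ("execute", driver_operation)    # step S20
    raise ValueError("operation does not match any candidate")

# Illustrative suitability scores for the FIG. 5 merging-lane environment.
suitability = {"accelerate": 0.2, "decelerate": 0.3, "change_lane_right": 0.9}
print(decide_behavior(suitability, suitability.get, None))
# -> ('execute', 'change_lane_right')
print(decide_behavior(suitability, suitability.get, "decelerate"))
# -> ('execute', 'decelerate')
```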
FIG. 6 shows the first example of the traveling environment together with another display control for it. FIG. 6(a) is the same as FIG. 5(a), but the display control of FIG. 6(b) differs from that of FIG. 5(b).
As in the case described with reference to FIG. 5(b), the vehicle control unit 7 reads from the storage unit 8 three candidate behaviors for the traveling environment shown in FIG. 6(a): acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right. Here, it is assumed that the storage unit 8 stores the lane change to the right as the highest-priority behavior.
In this case, the vehicle control unit 7 causes the notification unit 92 to report the traveling-environment information and the first-behavior information. In FIG. 6(b), the vehicle control unit 7 generates character information 69 representing the traveling-environment information and the first-behavior information and causes the notification unit 92 to display it.
The vehicle control unit 7 then causes display areas 69a and 69c to show a prompt asking the driver whether to adopt the first behavior. It also causes display area 69g to show "end automatic driving", indicating that switching to manual driving is possible.
Here, the vehicle control unit 7 highlights "YES", which corresponds to adopting the first behavior. Which of "YES" and "NO" is highlighted may be determined in advance; alternatively, the option selected last time may be highlighted, or the number of times each option has been selected in the past may be stored in the storage unit 8 and the notification unit 92 may highlight the more frequently selected option.
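The three highlighting policies just described can be sketched as below. The policy names and the list-based history are assumptions for illustration; the patent only specifies that past selections may be counted in the storage unit 8.

```python
# Illustrative sketch of the three policies for choosing which of
# "YES"/"NO" to highlight: a fixed default, the previously selected
# option, or the historically most frequent option.

from collections import Counter

def option_to_highlight(policy, history, default="YES"):
    """history: list of past selections, most recent last (may be empty)."""
    if policy == "fixed" or not history:
        return default
    if policy == "last_selected":
        return history[-1]
    if policy == "most_frequent":
        return Counter(history).most_common(1)[0][0]
    raise ValueError(f"unknown policy: {policy}")

assert option_to_highlight("fixed", ["NO"]) == "YES"
assert option_to_highlight("last_selected", ["YES", "NO"]) == "NO"
assert option_to_highlight("most_frequent", ["YES", "NO", "YES"]) == "YES"
```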
By learning the behaviors selected in the past in this way, the vehicle control unit 7 can notify the driver of information appropriately. Moreover, fewer items need to be displayed on the notification unit 92 than in the case of FIG. 5(b), reducing the annoyance to the driver.
FIG. 7 shows a second example of the traveling environment together with display control for it. FIG. 7(a) is an overhead view of the traveling environment. The traveling environment of FIG. 7(a) is the same as those of FIGS. 5(a) and 6(a) in that there is a merging lane ahead, but differs in that another vehicle is traveling to the right of the vehicle 1. In such a case, the vehicle control unit 7 determines that a lane change cannot be performed.
When the vehicle control unit 7 determines that the traveling environment of the vehicle 1 is as shown in FIG. 7(a), it causes the notification unit 92 to display the determined traveling-environment information as character information 79, as shown in FIG. 7(b).
Furthermore, of the three candidate behaviors read from the storage unit 8 (acceleration of the vehicle 1, deceleration of the vehicle 1, and a lane change of the vehicle 1 to the right), the vehicle control unit 7 selects only acceleration and deceleration, because the lane change to the right cannot be performed.
The vehicle control unit 7 also predicts that, if the vehicle continues at the current speed, it will come too close to the merging vehicle, and therefore determines that deceleration of the vehicle 1 is the most suitable behavior, that is, the first behavior.
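The selection in FIG. 7 combines a feasibility filter with a safety check, and can be sketched as follows. The candidate names, the 30-meter safety gap, and both function signatures are illustrative assumptions, not the patented method.

```python
# Hedged sketch of the FIG. 7 case: the right-lane-change candidate is
# removed because the adjacent lane is occupied, and deceleration is
# chosen when the predicted gap to the merging vehicle at the current
# speed falls below an assumed safety margin.

def feasible(candidates, right_lane_occupied):
    """Filter out candidates that cannot be performed in this environment."""
    return [c for c in candidates
            if not (c == "change_lane_right" and right_lane_occupied)]

def first_behavior(candidates, predicted_gap_m, safe_gap_m=30.0):
    """Pick deceleration when the vehicle would get too close to the merger."""
    if "decelerate" in candidates and predicted_gap_m < safe_gap_m:
        return "decelerate"
    return candidates[0]

cands = feasible(["accelerate", "decelerate", "change_lane_right"],
                 right_lane_occupied=True)
assert cands == ["accelerate", "decelerate"]
assert first_behavior(cands, predicted_gap_m=18.0) == "decelerate"
```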
Which of the three candidate behaviors is most suitable is determined using conventional techniques that evaluate suitability based on information about the traveling state and the surrounding conditions. Alternatively, the most suitable behavior may be determined in advance; the behavior selected last time may be stored in the storage unit 8 and treated as the most suitable; or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the most frequently selected behavior treated as the most suitable.
Thereafter, the vehicle control unit 7 displays "decelerate" as the first behavior in display area 79c and "accelerate" as the second behavior in display area 79a. It also causes display area 79g to show "end automatic driving", indicating switching to manual driving.
Through such display control, the vehicle control unit 7 can, in accordance with the traveling environment, notify the driver of the behavior most suitable for that environment as the first behavior.
The first-behavior information may be arranged at the top and the second-behavior information at the bottom, with selection functions assigned to operation buttons 51a and 51c, respectively. Alternatively, acceleration-behavior information may be placed at the top, deceleration-behavior information at the bottom, right-lane-change information on the right, and left-lane-change information on the left, with selection functions assigned to operation buttons 51a, 51c, 51b, and 51d, respectively. These arrangements may be switchable, with a separate indication of whether the behavior-priority arrangement or the operation-priority arrangement is in use. Furthermore, the display size of the first-behavior information may be enlarged and that of the second-behavior information reduced. Arranging the behavior-information display to correspond to the forward/backward and left/right motion of the vehicle enables intuitive recognition and operation by the driver.
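The direction-based layout just described can be sketched as a mapping. Both dictionaries and the helper are illustrative assumptions; the patent describes the layouts only in terms of the display and buttons 51a to 51d.

```python
# Hedged sketch of the two layouts described above: in the behavior-priority
# arrangement, each screen position mirrors the vehicle's motion (up =
# accelerate, right = right lane change, ...), and each position is tied
# to one of the operation buttons 51a-51d.

OPERATION_PRIORITY = {           # first behavior on top, second below
    "up": "first_behavior",
    "down": "second_behavior",
}
BEHAVIOR_PRIORITY = {            # layout mirrors the vehicle's motion
    "up": "accelerate",
    "down": "decelerate",
    "right": "change_lane_right",
    "left": "change_lane_left",
}
BUTTON_FOR_POSITION = {"up": "51a", "down": "51c", "right": "51b", "left": "51d"}

def button_for_behavior(layout, behavior):
    """Find which operation button selects the given behavior in a layout."""
    for position, assigned in layout.items():
        if assigned == behavior:
            return BUTTON_FOR_POSITION[position]
    raise KeyError(behavior)

assert button_for_behavior(BEHAVIOR_PRIORITY, "change_lane_right") == "51b"
assert button_for_behavior(OPERATION_PRIORITY, "first_behavior") == "51a"
```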
Next, examples of traveling environments other than one with a merging lane ahead will be described.
FIG. 8 shows a third example of the traveling environment together with display control for it. FIG. 8(a) is an overhead view of the traveling environment of the vehicle 1. Specifically, FIG. 8(a) shows a traveling environment in which the preceding vehicle travels more slowly than the vehicle 1 and a lane change to the adjacent lane is possible.
Based on information about the traveling state and the surrounding conditions, the vehicle control unit 7 determines that the traveling environment is as shown in FIG. 8(a). In this case, it causes the notification unit 92 to display the determined traveling-environment information as character information 89.
The vehicle control unit 7 also reads from the storage unit 8 three candidate behaviors corresponding to the determined traveling environment: overtaking the preceding vehicle, changing to the adjacent lane, and decelerating the vehicle 1 to follow the preceding vehicle.
Then, because, for example, the speed after decelerating to follow the preceding vehicle remains above a predetermined value and is therefore acceptable, the vehicle control unit 7 determines that decelerating the vehicle 1 to follow the preceding vehicle is the most suitable behavior, that is, the first behavior.
Which of the three candidate behaviors is most suitable is determined using conventional techniques that evaluate suitability based on information about the traveling state and the surrounding conditions. Alternatively, the most suitable behavior may be determined in advance; the behavior selected last time may be stored in the storage unit 8 and treated as the most suitable; or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the most frequently selected behavior treated as the most suitable.
Furthermore, as shown in FIG. 8(b), the vehicle control unit 7 displays the character information "follow", representing the first behavior, highlighted in display area 89c, and displays "overtake" and "lane change", representing the second behaviors, in display areas 89a and 89b, respectively. It also causes display area 89g to show "end automatic driving", indicating switching to manual driving.
The first-behavior information may be arranged at the top and the second-behavior information at the bottom, with selection functions assigned to operation buttons 51a and 51c, respectively. Alternatively, overtaking-behavior information may be placed at the top, following-behavior information at the bottom, right-lane-change information on the right, and left-lane-change information on the left, with selection functions assigned to operation buttons 51a, 51c, 51b, and 51d, respectively. These arrangements may be switchable, with a separate indication of whether the behavior-priority arrangement or the operation-priority arrangement is in use. Furthermore, the display size of the first-behavior information may be enlarged and that of the second-behavior information reduced.
FIG. 9 shows a fourth example of the traveling environment together with display control for it. FIG. 9(a) is an overhead view of the traveling environment of the vehicle 1. Specifically, FIG. 9(a) shows a traveling environment in which the number of lanes decreases ahead of the vehicle 1 in its own lane.
Based on information about the traveling state and the surrounding conditions, the vehicle control unit 7 determines that the traveling environment is as shown in FIG. 9(a). In this case, it causes the notification unit 92 to display the determined traveling-environment information as character information 99.
The vehicle control unit 7 also reads from the storage unit 8 two candidate behaviors corresponding to the determined traveling environment: changing to the adjacent lane and keeping the current lane.
Then, because, for example, the TTC (time to collision) to the point where the lane ends is shorter than a predetermined value, the vehicle control unit 7 determines that changing to the adjacent lane is the most suitable behavior, that is, the first behavior.
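The TTC test just described can be sketched in a few lines. The 5-second threshold and the function names are assumptions for illustration; the patent only states that the TTC is compared with a predetermined value.

```python
# Sketch of the FIG. 9 decision: when the time to reach the point where
# the lane ends falls below an assumed threshold, the lane change becomes
# the first behavior; otherwise the current lane is kept.

def ttc_to_lane_end(distance_m: float, speed_mps: float) -> float:
    """Time until the vehicle reaches the point where its lane ends."""
    if speed_mps <= 0:
        return float("inf")      # stopped or reversing: not approaching it
    return distance_m / speed_mps

def first_behavior(distance_m: float, speed_mps: float,
                   threshold_s: float = 5.0) -> str:
    if ttc_to_lane_end(distance_m, speed_mps) < threshold_s:
        return "change_lane"
    return "keep_lane"

assert first_behavior(distance_m=80.0, speed_mps=20.0) == "change_lane"  # TTC 4 s
assert first_behavior(distance_m=200.0, speed_mps=20.0) == "keep_lane"   # TTC 10 s
```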
Which of the two candidate behaviors is most suitable is determined using conventional techniques that evaluate suitability based on information about the traveling state and the surrounding conditions. Alternatively, the most suitable behavior may be determined in advance; the behavior selected last time may be stored in the storage unit 8 and treated as the most suitable; or the number of times each behavior has been selected in the past may be stored in the storage unit 8 and the most frequently selected behavior treated as the most suitable.
Furthermore, as shown in FIG. 9(b), the vehicle control unit 7 displays the character information "lane change", representing the first behavior, highlighted in display area 99b, and displays "keep as is", representing the second behavior, in display area 99c. It also causes display area 99g to show "end automatic driving", indicating switching to manual driving.
The first-behavior information may be arranged at the top and the second-behavior information at the bottom, with selection functions assigned to operation buttons 51a and 51c, respectively. Alternatively, do-nothing-behavior information may be placed at the bottom, right-lane-change information on the right, and left-lane-change information on the left, with selection functions assigned to operation buttons 51c, 51b, and 51d, respectively. These arrangements may be switchable, with a separate indication of whether the behavior-priority arrangement or the operation-priority arrangement is in use. Furthermore, the display size of the first-behavior information may be enlarged and that of the second-behavior information reduced. As shown in FIGS. 7, 8, and 9, assigning different functions to the display areas depending on the traveling environment allows information to be reported and operations to be performed within a small area.
In the description above, the vehicle control unit 7 causes the notification unit 92 to report behaviors in accordance with information about the traveling environment and the surrounding conditions, but the present invention is not limited to this. For example, the notification unit 92 may be made to report behaviors when the driver performs a predetermined operation.
FIG. 10 shows a fifth example of the traveling environment together with display control for it. FIG. 10(a) is an overhead view of the traveling environment of the vehicle 1. Specifically, FIG. 10(a) shows a traveling environment in which the vehicle 1 can change lanes to either the left or the right.
Unlike the environments of FIGS. 5(a) to 9(a), the traveling environment shown in FIG. 10(a) allows normal travel requiring no lane change, acceleration, or deceleration. In this case, as shown by display 109 in FIG. 10(b), the vehicle control unit 7 need not display the traveling-environment information on the notification unit 92 as character information.
If the driver presses one of the operation buttons of the operation unit 51 while no character information is displayed on the notification unit 92 in this way, the vehicle control unit 7 reads the candidate behaviors for normal travel from the storage unit 8.
Specifically, the storage unit 8 stores four candidate behaviors associated with the normal-travel environment shown in FIG. 10(a): acceleration of the vehicle 1, deceleration of the vehicle 1, a lane change of the vehicle 1 to the right, and a lane change of the vehicle 1 to the left. The vehicle control unit 7 reads these and displays them in display areas 109a to 109d of the notification unit 92, respectively.
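The storage unit 8, as used across the examples of FIGS. 5 to 10, behaves like a lookup from traveling environment to candidate behaviors. The sketch below models that lookup; the dictionary keys and behavior names are illustrative assumptions.

```python
# Hedged sketch of the storage unit 8 as an environment -> candidates table,
# covering the traveling environments of FIGS. 5-10. Keys and behavior
# names are assumptions introduced for this example.

CANDIDATES_BY_ENVIRONMENT = {
    "merging_lane_ahead":     ["accelerate", "decelerate", "change_lane_right"],
    "slow_preceding_vehicle": ["overtake", "change_lane", "follow"],
    "lane_ends_ahead":        ["change_lane", "keep_lane"],
    "normal_travel":          ["accelerate", "decelerate",
                               "change_lane_right", "change_lane_left"],
}

def read_candidates(environment: str) -> list:
    """Mimic the vehicle control unit 7 reading candidates from storage."""
    return list(CANDIDATES_BY_ENVIRONMENT.get(environment, []))

assert len(read_candidates("normal_travel")) == 4          # FIG. 10 case
assert read_candidates("lane_ends_ahead") == ["change_lane", "keep_lane"]
```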
The vehicle control unit 7 also causes display area 109g to show "end automatic driving", indicating switching to manual driving, and displays "cancel", indicating cancellation of the behavior update, highlighted in display area 109e.
According to the present embodiment described above, the candidates for the behavior to be executed next can be reported to the driver effectively, and the driver can be prompted to select a preferable behavior.
Instead of selecting the behavior to be executed, the driver may directly perform a manual operation such as turning the steering wheel. This allows the driver to switch quickly to manual driving operation at his or her own will.
[Modification]
In the embodiment described above, the display on the notification unit 92 has been described as character information, but the present invention is not limited to this. For example, behaviors may be presented to the driver visually using symbols. In the following, display using symbols is described, taking the displays for FIGS. 5 and 7 as examples.
FIG. 11 shows another display control for the first example of the traveling environment shown in FIG. 5. In this example, the first behavior described above is a lane change of the vehicle 1 to the right, and the second behaviors are acceleration of the vehicle 1 and deceleration of the vehicle 1.
In this case, a symbol 111 representing "lane change", the first behavior, is displayed large in the center; a symbol 112 representing "acceleration of the vehicle 1" and a symbol 113 representing "deceleration of the vehicle 1", the second behaviors, are displayed small on the right. A symbol 114 indicating the end of automatic driving is displayed small on the left.
If no instruction from the driver to change the behavior of the vehicle 1 is received, the lane change is performed.
FIG. 12 shows another display control for the second example of the traveling environment shown in FIG. 7. In this example, unlike the first example, another vehicle is traveling to the right of the vehicle 1, so a lane change is not possible. Therefore, for example, "deceleration of the vehicle 1" is set as the first behavior and "acceleration of the vehicle 1" as the second behavior.
In this case, as shown in FIG. 12(a), a symbol 121 representing "deceleration of the vehicle 1", the first behavior, is displayed large in the center, and a symbol 122 representing "acceleration of the vehicle 1", the second behavior, is displayed small on the right. A symbol 123 indicating the end of automatic driving is displayed small on the left.
Suppose now that the operation unit 51 receives from the driver an operation selecting "acceleration of the vehicle 1". In this case, as shown in FIG. 12(b), a symbol 122' representing "acceleration of the vehicle 1", now the first behavior, is displayed large in the center, and a symbol 121' representing "deceleration of the vehicle 1", now the second behavior, is displayed small on the right.
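The FIG. 12 update, in which a selected second behavior is promoted to first behavior and the old first behavior is demoted, can be sketched as follows. The data structures are assumptions introduced for illustration.

```python
# Hedged sketch of the FIG. 12(a) -> 12(b) transition: when the driver
# selects one of the second behaviors, it becomes the new first behavior
# (shown large in the center) and the previous first behavior joins the
# second behaviors (shown small).

def apply_selection(first, seconds, selected):
    """Return (new_first, new_seconds) after the driver's selection."""
    if selected not in seconds:
        raise ValueError("selection must be one of the second behaviors")
    new_seconds = [b for b in seconds if b != selected] + [first]
    return selected, new_seconds

# FIG. 12(a): decelerate is first, accelerate is second; driver picks accelerate.
first, seconds = apply_selection("decelerate", ["accelerate"], "accelerate")
assert first == "accelerate"        # FIG. 12(b): symbol 122' large in center
assert seconds == ["decelerate"]    # FIG. 12(b): symbol 121' small on right
```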
According to the present embodiment described above, the candidates for the behavior to be executed next can be reported to the driver effectively, and the driver can be prompted to select a preferable behavior. At the same time, the driver can grasp the behavior the vehicle will execute and the other selectable behaviors, and can continue automatic driving with a sense of security; the driver can also give instructions to the vehicle smoothly.
Further, according to the present embodiment, the options reported by the notification unit, that is, the second behaviors, can be varied according to the traveling environment.
(Embodiment 2)
In Embodiment 1, a configuration was described in which operations corresponding to the display of the notification unit 92 are performed with the operation unit 51 provided on the steering wheel 5. In the present embodiment, a configuration in which a touch panel is provided instead of the operation unit 51 on the steering wheel 5 will be described.
FIG. 13 is a block diagram showing the main configuration of the vehicle 1 including the information notification device according to Embodiment 2 of the present invention. In FIG. 13, components common to FIG. 1 are given the same reference numerals as in FIG. 1, and their detailed description is omitted. The vehicle 1 shown in FIG. 13 is provided with a touch panel 10 instead of the operation unit 51 of the steering wheel 5.
The touch panel 10 is a device, such as a liquid crystal panel, capable of displaying information and receiving input, and is connected to the vehicle control unit 7. The touch panel 10 has a display unit 101 that displays information under the control of the vehicle control unit 7 and an input unit 102 that receives operations from the driver or others and outputs the received operations to the vehicle control unit 7.
Next, display control of the touch panel 10 will be described. Here, display control is described for the case where the vehicle 1 is traveling in the center of three lanes and can change to either the right lane or the left lane.
FIG. 14 illustrates the display of the touch panel 10 in Embodiment 2. FIG. 14(a) shows the initial display of the display unit 101 of the touch panel 10. When the vehicle control unit 7 determines that the vehicle 1 can change to either the right lane or the left lane, it causes the display unit 101 of the touch panel 10 to show the display of FIG. 14(a). Here, "Touch" in display area 121 indicates that the touch panel 10 is in a mode in which it can accept a touch operation by the driver.
 運転者は、図14の(a)に示す表示において、表示領域121をタッチするタッチ操作を行う場合、入力部102は、この操作を受けつけて、この操作が行われたことを示す情報を車両制御部7へ出力する。車両制御部7は、この情報を受けつけると、図14の(b)に示す表示を表示部101に表示させ、また、図14の(c)に示す表示を報知部92に表示させる。 When the driver performs a touch operation of touching the display area 121 in the display shown in FIG. 14A, the input unit 102 receives this operation and outputs information indicating that this operation has been performed to the vehicle control unit 7. Upon receiving this information, the vehicle control unit 7 causes the display unit 101 to display the screen shown in FIG. 14B and causes the notification unit 92 to display the screen shown in FIG. 14C.
 図14の(b)には、車両1へ移動を指示する操作であることを示す「Move」と表示された表示領域121aが示されている。また、図14の(b)には、車両1が3車線のそれぞれを走行可能であることを示す表示領域121b~121dが示されている。なお、表示領域121b~121dは、それぞれ、図14の(c)に矢印X、Y、Zで示される車線での走行と対応する。 FIG. 14B shows a display area 121a on which “Move” indicating an operation for instructing the vehicle 1 to move is displayed. Further, FIG. 14B shows display areas 121b to 121d indicating that the vehicle 1 can travel in each of the three lanes. The display areas 121b to 121d correspond to traveling in the lane indicated by arrows X, Y, and Z in FIG. 14C, respectively.
 また、図14の(b)の各表示領域と、図14の(c)の対応する矢印とは、態様(例えば、色あるいは配置など)を一致させる。これにより、運転者により理解しやすい表示となる。 Further, the display areas in FIG. 14B and the corresponding arrows in FIG. 14C have the same mode (for example, color or arrangement). As a result, the display is easier to understand for the driver.
 さらに、矢印X、Y、Zで示される車線の太さなどを変えて、車両制御が判定した車両が実施する挙動と他に運転者が選択可能な挙動が区別できるように表示してもよい。 Further, the thickness or the like of the lanes indicated by the arrows X, Y, and Z may be varied so that the behavior that the vehicle control has determined the vehicle will implement can be distinguished from the other behaviors selectable by the driver.
 運転者は、表示領域121b~121dのうち、走行したい車線に対応する表示領域に触れることによって、車両1の挙動の選択を行う。この場合、入力部102は、運転者の挙動の選択操作を受けつけて、選択された挙動の情報を車両制御部7へ出力する。そして、車両制御部7は、選択された挙動を実行するよう車両1を制御する。これにより、運転者が走行したい車線を車両1が走行することになる。 The driver selects the behavior of the vehicle 1 by touching the display area corresponding to the lane to be traveled among the display areas 121b to 121d. In this case, the input unit 102 accepts a driver's behavior selection operation and outputs information on the selected behavior to the vehicle control unit 7. Then, the vehicle control unit 7 controls the vehicle 1 to execute the selected behavior. As a result, the vehicle 1 travels in the lane that the driver wants to travel.
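As a rough sketch of the flow just described — the input unit 102 maps a touched display area to a behavior, which the vehicle control unit then executes — the area-to-lane correspondence of FIG. 14 could look like this in Python (all names such as `AREA_TO_LANE` and `select_behavior` are hypothetical illustrations, not part of the patent):

```python
# Display areas 121b-121d correspond to the lanes indicated by
# arrows X, Y, and Z in FIG. 14(c). Hypothetical mapping.
AREA_TO_LANE = {"121b": "X", "121c": "Y", "121d": "Z"}

def select_behavior(touched_area: str, current_lane: str) -> str:
    """Return the behavior to execute when the driver touches an area."""
    lane = AREA_TO_LANE[touched_area]
    if lane == current_lane:
        return "keep lane"          # vehicle already travels this lane
    return f"change to lane {lane}"  # vehicle control performs the change

print(select_behavior("121b", "Y"))  # change to lane X
print(select_behavior("121c", "Y"))  # keep lane
```

A swipe gesture (as described below for lane X) could feed the same function by resolving the swipe direction to a display area first.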
 なお、運転者は、タッチパネル10に対して、タッチ操作の代わりに、スワイプ操作を行ってもよい。例えば、図14に示す例において、運転者が図14の(c)の矢印Xで示される車線への変更を行いたい場合、運転者は、タッチパネル10において右方へのスワイプ操作を行う。 Note that the driver may perform a swipe operation on the touch panel 10 instead of the touch operation. For example, in the example shown in FIG. 14, when the driver wants to change to the lane indicated by the arrow X in FIG. 14C, the driver performs a right swipe operation on the touch panel 10.
 この場合、入力部102は、スワイプ操作を受けつけ、スワイプ操作の内容を示す情報を車両制御部7へ出力する。そして、車両制御部7は、選択された挙動である矢印Xで示される車線への車線変更を実行するよう車両1を制御する。 In this case, the input unit 102 receives the swipe operation and outputs information indicating the content of the swipe operation to the vehicle control unit 7. Then, the vehicle control unit 7 controls the vehicle 1 to execute the lane change to the lane indicated by the arrow X, which is the selected behavior.
 さらに、車両1へ移動を指示する操作であることを示す「Move」と表示された表示領域121aが示されるときに、音声で「挙動選択」などと発話してもよい。これにより、手元のタッチパネルを見ることなく、HUDの表示のみで操作が可能となる。 Furthermore, when the display area 121a displaying "Move", which indicates an operation for instructing the vehicle 1 to move, is shown, the driver may utter a voice command such as "behavior selection". This makes operation possible using only the HUD display, without looking at the touch panel at hand.
 また、タッチ操作あるいはスワイプ操作の際に、選択したタッチパネルの表示領域に対応する車線の表示態様を変更し、どの車線を選択しようとしているのか選択前に確認できるようにしてもよい。例えば、表示領域121bをタッチした瞬間に、車線Xの太さが拡大し、すぐに手を離せば車線Xは選択されず車線Xの太さが元の大きさに戻り、表示領域121cにタッチを移動した瞬間に、車線Yの太さが拡大し、しばらくその状態を保持すると、車線Yが選択され、車線Yが点滅することで決定されたことを伝えても良い。これにより、手元を目視せずに選択あるいは決定の操作ができる。 Further, during a touch operation or swipe operation, the display mode of the lane corresponding to the touched display area on the touch panel may be changed so that the driver can confirm, before the selection is made, which lane is about to be selected. For example, at the moment the display area 121b is touched, lane X may be thickened; if the finger is released immediately, lane X is not selected and its thickness returns to the original size. At the moment the touch moves to the display area 121c, lane Y may be thickened, and if that state is held for a while, lane Y is selected and blinks to convey that the decision has been made. This allows selection and confirmation operations without looking at the hand.
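The touch-and-hold interaction above (preview on touch, cancel on quick release, confirm on hold) can be sketched as a small decision function; the hold threshold and all names are illustrative assumptions:

```python
HOLD_TO_SELECT = 0.5  # seconds; hypothetical confirmation threshold

def resolve_touch(area: str, hold_time: float):
    """Preview the lane on touch; confirm only if held long enough."""
    lane = {"121b": "X", "121c": "Y", "121d": "Z"}[area]
    if hold_time >= HOLD_TO_SELECT:
        return ("selected", lane)   # lane blinks to signal the decision
    return ("cancelled", lane)      # lane thickness returns to original

print(resolve_touch("121c", 0.8))  # ('selected', 'Y')
print(resolve_touch("121b", 0.1))  # ('cancelled', 'X')
```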
 なお、実施の形態1と同様に、加速、減速、追越し、そのままなどの車両制御機能を、走行環境に応じて、表示領域に割り当てても良い。 Note that, similarly to the first embodiment, vehicle control functions such as acceleration, deceleration, overtaking, and the like may be assigned to the display area according to the driving environment.
 以上説明した本実施の形態によれば、操作部の代わりにタッチパネルを設けることにより、運転者に直感的な操作を行わせることができる。また、タッチパネルは、操作を受けつける表示領域の数、形状、色などを自由に変更させることができるため、ユーザインタフェースの自由度が向上する。 According to the present embodiment described above, by providing a touch panel instead of the operation unit, the driver can perform an intuitive operation. In addition, since the touch panel can freely change the number, shape, color, and the like of display areas for receiving operations, the degree of freedom of the user interface is improved.
 (実施の形態3) (Embodiment 3)
 実施の形態1では、第1の挙動と第2の挙動が同時に表示される場合について説明した。本実施の形態では、まず、報知部92に第1の挙動が表示され、運転者の操作を受けつけた場合に、第2の挙動が表示される構成について説明する。 In the first embodiment, the case where the first behavior and the second behavior are displayed simultaneously has been described. In the present embodiment, a configuration will be described in which the first behavior is first displayed on the notification unit 92, and the second behavior is displayed when a driver's operation is received.
 本実施の形態に係る構成は、実施の形態1で説明した図1の構成において、操作部51に運転者がステアリングホイール5を握ったか否かを検出するグリップセンサがさらに含まれた構成となる。 The configuration according to the present embodiment is the configuration of FIG. 1 described in the first embodiment, in which the operation unit 51 further includes a grip sensor that detects whether or not the driver has gripped the steering wheel 5.
 図15は、本発明の実施の形態3における報知部92の表示を説明する図である。図15には、図8の(a)に示した場合と同様に、車両1と同一車線の前方を走行する車両が車両1よりも遅い速度で走行し、かつ、隣の車線への車線変更が可能な走行環境における表示の例が示されている。 FIG. 15 is a diagram illustrating the display of the notification unit 92 according to Embodiment 3 of the present invention. FIG. 15 shows an example of the display in a traveling environment in which, as in the case shown in FIG. 8A, a vehicle traveling ahead in the same lane as the vehicle 1 is traveling slower than the vehicle 1, and a lane change to the adjacent lane is possible.
 車両制御部7は、走行環境が、図8の(a)に示した走行環境であると判定すると、まず、報知部92に図15の(a)に示す表示を実行させる。 When the vehicle control unit 7 determines that the traveling environment is the traveling environment illustrated in FIG. 8A, the vehicle control unit 7 first causes the notification unit 92 to execute the display illustrated in FIG.
 図15の(a)には、第1の所定時間が経過した後に実施される挙動の候補のうち、第1の挙動である「追い越し」を示す記号131が第1の態様(例えば、第1の色)で示されている。 FIG. 15A shows, among the behavior candidates to be implemented after the first predetermined time has elapsed, the symbol 131 indicating "overtaking", which is the first behavior, displayed in a first mode (for example, a first color).
 車両制御部7は、図15の(a)に示す表示を報知部92に実行させた後、第2の所定時間が経過した場合、記号131を第1の態様から、第1の態様とは異なる第2の態様(例えば、第1の色とは異なる第2の色)で報知部92に表示させる。ここで、第2の所定時間は、実施の形態1で説明した第2の所定時間と同様のものである。 When the second predetermined time has elapsed after causing the notification unit 92 to execute the display shown in FIG. 15A, the vehicle control unit 7 causes the notification unit 92 to display the symbol 131 in a second mode different from the first mode (for example, a second color different from the first color). Here, the second predetermined time is the same as the second predetermined time described in the first embodiment.
 つまり、記号131が第1の態様で示されている間、運転者は、第2の挙動の選択が可能であるが、記号131が第2の態様に変更された場合、運転者は、第2の挙動の選択が不可能になる。 That is, while the symbol 131 is shown in the first mode, the driver can select the second behavior; once the symbol 131 has changed to the second mode, the driver can no longer select the second behavior.
 また、図15の(a)には、第2の挙動が選択可能であることを示すステアリングホイール形状の記号132が示されている。記号132が表示されている場合に運転者がステアリングホイール5を握ることによって、第2の挙動が表示される。記号132は、第2の挙動が選択可能であることを示す表示であるが、記号131が第1の態様にて表示されることによって、運転者に第2の挙動が選択可能であることを示すこととしてもよい。この場合、記号132は、表示されなくてもよい。 FIG. 15A also shows a steering-wheel-shaped symbol 132 indicating that the second behavior is selectable. When the driver grips the steering wheel 5 while the symbol 132 is displayed, the second behavior is displayed. The symbol 132 is a display indicating that the second behavior is selectable; alternatively, displaying the symbol 131 in the first mode may itself indicate to the driver that the second behavior is selectable. In this case, the symbol 132 need not be displayed.
 また、図15の(a)には、現在、自動運転中であることを示す記号133が示されている。記号133は、自動運転で走行中であることを運転者に示す補助的な表示であるが、記号133は表示されなくてもよい。 FIG. 15A also shows a symbol 133 indicating that automatic driving is currently in progress. The symbol 133 is an auxiliary display that indicates to the driver that the vehicle is traveling in automatic driving mode, but the symbol 133 need not be displayed.
 図15の(a)の表示に対して運転者がステアリングホイール5を握った場合、グリップセンサがそれを検出し、その検出結果の情報を車両制御部7へ出力する。この場合、車両制御部7は、図15の(b)に示す表示を報知部92に実行させる。 When the driver grips the steering wheel 5 with respect to the display of FIG. 15A, the grip sensor detects it and outputs information of the detection result to the vehicle control unit 7. In this case, the vehicle control unit 7 causes the notification unit 92 to execute the display shown in FIG.
 図15の(b)には、図15の(a)と同様に、第1の挙動である「追い越し」を示す記号131が第1の態様(例えば、第1の色)で示されている。また、第2の挙動である「車線変更」を示す記号134と、第2の挙動である「減速」を示す記号135が示されている。 In FIG. 15B, as in FIG. 15A, the symbol 131 indicating "overtaking", the first behavior, is shown in the first mode (for example, the first color). Also shown are a symbol 134 indicating "lane change" and a symbol 135 indicating "deceleration", which are second behaviors.
 運転者は、ステアリングホイール5の操作部51を操作することによって、第1の挙動から第2の挙動への変更を行う。例えば、運転者は、操作部51の操作ボタン51a、または、操作ボタン51c(図2の(c)参照)を押下することによって、「車線変更」(記号134)、または、「減速」(記号135)への挙動の更新を行う。 The driver changes from the first behavior to a second behavior by operating the operation unit 51 of the steering wheel 5. For example, the driver presses the operation button 51a or the operation button 51c of the operation unit 51 (see FIG. 2C) to update the behavior to "lane change" (symbol 134) or "deceleration" (symbol 135).
 また、図15の(b)には、車両制御部7が、車両1の挙動を学習中であることを示す記号136が示されている。記号136が表示されている場合、車両制御部7は、運転者が選択した挙動を学習する。記号136は表示されなくても構わない。また、学習は常に行っていても構わない。 FIG. 15B also shows a symbol 136 indicating that the vehicle control unit 7 is learning the behavior of the vehicle 1. When the symbol 136 is displayed, the vehicle control unit 7 learns the behavior selected by the driver. The symbol 136 may not be displayed. In addition, learning may always be performed.
 つまり、車両制御部7は、運転者が選択した挙動を記憶部8に記憶し、次に同様の走行環境になった場合、記憶した挙動を第1の挙動として、報知部92に表示させる。または、車両制御部7は、過去に各挙動が選択された回数を記憶部8に記憶しておき、回数が最も多い挙動を第1の挙動として、報知部92に表示させてもよい。 That is, the vehicle control unit 7 stores the behavior selected by the driver in the storage unit 8, and the next time the same traveling environment occurs, displays the stored behavior on the notification unit 92 as the first behavior. Alternatively, the vehicle control unit 7 may store in the storage unit 8 the number of times each behavior has been selected in the past, and display the most frequently selected behavior on the notification unit 92 as the first behavior.
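The learning step described here — record each selected behavior per traveling environment and propose the most frequent one as the first behavior — can be sketched as follows (hypothetical class and method names; `collections.Counter` stands in for the storage unit 8):

```python
from collections import Counter, defaultdict

class BehaviorMemory:
    """Counts behavior selections per traveling environment."""
    def __init__(self):
        self.counts = defaultdict(Counter)  # environment -> behavior counts

    def record(self, environment: str, behavior: str) -> None:
        self.counts[environment][behavior] += 1

    def first_behavior(self, environment: str) -> str:
        # the most frequently selected behavior for this environment
        return self.counts[environment].most_common(1)[0][0]

m = BehaviorMemory()
m.record("slow car ahead", "overtake")
m.record("slow car ahead", "lane change")
m.record("slow car ahead", "overtake")
print(m.first_behavior("slow car ahead"))  # overtake
```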
 また、図15の(b)には、自動運転中ではないことを示す記号137が示されている。記号137が表示されている場合、車両制御部7は、第1の所定時間経過後に行う挙動が運転者によって選択されるまで待機する。 FIG. 15B also shows a symbol 137 indicating that automatic driving is not in progress. When the symbol 137 is displayed, the vehicle control unit 7 waits until the driver selects the behavior to be performed after the first predetermined time has elapsed.
 図15の(b)に示す表示に対して、運転者が操作部51の操作ボタン51aを押下して「車線変更」を選択した場合、車両制御部7は、この選択操作の情報を受けつけ、図15(c)に示す表示を報知部92に実行させる。 When the driver presses the operation button 51a of the operation unit 51 and selects "lane change" in response to the display shown in FIG. 15B, the vehicle control unit 7 receives information on this selection operation and causes the notification unit 92 to execute the display shown in FIG. 15C.
 図15の(c)には、「車線変更」を示す記号134’が、第1の態様で示されている。車両制御部7は、「車線変更」を選択する選択操作の情報を受けつけた場合、この選択された挙動が次に行う挙動であると判定し、「車線変更」を示す記号134’を第1の態様で報知部92に表示させる。 In FIG. 15C, a symbol 134' indicating "lane change" is shown in the first mode. When the vehicle control unit 7 receives information on the selection operation selecting "lane change", it determines that the selected behavior is the behavior to be performed next, and causes the notification unit 92 to display the symbol 134' indicating "lane change" in the first mode.
 また、図15の(c)の記号131’は、図15の(b)において第1の挙動として表示されていた記号131が記号134と入れ替わって表示されたものである。 The symbol 131' in FIG. 15C is the symbol 131, which was displayed as the first behavior in FIG. 15B, displayed after swapping places with the symbol 134.
 また、図15の(c)に示す表示に対して、運転者が操作ボタンのいずれかを2度連続して押下した場合、運転者が前に行った選択操作をキャンセルできるようにしてもよい。この場合、車両制御部7は、操作ボタンのいずれかを2度連続して押下する操作の情報を受けつけ、図15の(c)に示す表示から図15の(b)に示す表示への変更を報知部92に実行させる。 Further, in response to the display shown in FIG. 15C, when the driver presses any one of the operation buttons twice in succession, the selection operation previously performed by the driver may be cancelled. In this case, the vehicle control unit 7 receives information on the operation of pressing one of the operation buttons twice in succession, and causes the notification unit 92 to change from the display shown in FIG. 15C to the display shown in FIG. 15B.
 車両制御部7は、図15の(a)に示す表示を報知部92に実行させてから、第2の所定時間が経過するまでの間に、運転者の操作に基づいて、図15の(b)、図15の(c)へと報知部92の表示を変化させる。その後、車両制御部7は、図15の(a)に示す表示を報知部92に実行させてから第2の所定時間が経過した後に、図15の(d)に示す表示を報知部92に表示させる。 During the period from when the vehicle control unit 7 causes the notification unit 92 to execute the display shown in FIG. 15A until the second predetermined time elapses, the vehicle control unit 7 changes the display of the notification unit 92 to FIG. 15B and then to FIG. 15C based on the driver's operations. Thereafter, after the second predetermined time has elapsed since the notification unit 92 executed the display shown in FIG. 15A, the vehicle control unit 7 causes the notification unit 92 to display the display shown in FIG. 15D.
 なお、車両制御部7は、運転者がステアリングホイール5から手を離したことを示す情報をグリップセンサから取得した場合に、第2の所定時間が経過する前に図15の(d)に示す表示を報知部92に表示させてもよい。 Note that, when the vehicle control unit 7 acquires from the grip sensor information indicating that the driver has released the steering wheel 5, the vehicle control unit 7 may cause the notification unit 92 to display the display shown in FIG. 15D before the second predetermined time elapses.
 ここで、図15の(d)には、次の挙動として、運転者が選択した「車線変更」を示す記号134’が第2の態様で表示され、また、自動運転で走行中であることを示す記号133が、再び、表示された状態が示されている。 Here, FIG. 15D shows a state in which the symbol 134' indicating "lane change" selected by the driver is displayed in the second mode as the next behavior, and the symbol 133 indicating that the vehicle is traveling in automatic driving is displayed again.
 以上説明した本実施の形態によれば、車両制御部7は、運転者が次にとる挙動を更新したい場合にのみ、他の挙動の候補を確認できるように、報知部92での表示を変更する。この構成により、運転者が視認する表示を減らすことができ、運転者の煩わしさを低減できる。 According to the present embodiment described above, the vehicle control unit 7 changes the display on the notification unit 92 so that the other behavior candidates can be confirmed only when the driver wants to update the behavior to be taken next. With this configuration, the amount of display the driver has to view can be reduced, and the driver's annoyance can be reduced.
 (実施の形態4) (Embodiment 4)
 上述した実施の形態において、車両1が実行しうる複数の挙動の候補のうち最も適した挙動がどれかを判定する方法についていくつか説明した。本実施の形態では、最も適した挙動を判定する方法として、予め学習により構築されたドライバモデルを用いる場合について説明する。 In the embodiments described above, several methods have been described for determining which of the plurality of behavior candidates that the vehicle 1 can execute is the most suitable. In the present embodiment, a case will be described in which a driver model constructed in advance by learning is used as the method for determining the most suitable behavior.
 ここで、ドライバモデルの構築方法について説明する。ドライバモデルは、走行環境毎の運転者による操作の傾向を各操作の頻度の情報などに基づいてモデル化したものである。ドライバモデルは、複数の運転者の走行履歴を集約し、集約した走行履歴から構築される。 Here, the construction method of the driver model is explained. The driver model is obtained by modeling the tendency of the operation by the driver for each driving environment based on information on the frequency of each operation. The driver model aggregates the traveling histories of a plurality of drivers and is constructed from the aggregated traveling histories.
 運転者の走行履歴は、例えば、各走行環境に対応する挙動の候補のうち、運転者が実際に選択した挙動の頻度が、挙動の候補毎に集約された履歴である。 The driving history of the driver is, for example, a history in which the behavior frequency actually selected by the driver among the behavior candidates corresponding to each driving environment is aggregated for each behavior candidate.
 図16は、走行履歴の一例を示す図である。図16には、運転者xが「合流路が近づく」という走行環境において、「減速」、「加速」、「車線変更」という挙動の候補を、それぞれ、3回、1回、5回選択したことが示されている。また、図16には、運転者xが「前方に低速車あり」という走行環境において、「追従」、「追い越し」、「車線変更」という挙動の候補を、それぞれ、2回、2回、1回選択したことが示されている。運転者yについても同様である。 FIG. 16 is a diagram showing an example of a travel history. FIG. 16 shows that, in the traveling environment "approaching a merging lane", the driver x selected the behavior candidates "decelerate", "accelerate", and "lane change" three times, once, and five times, respectively. FIG. 16 also shows that, in the traveling environment "low-speed vehicle ahead", the driver x selected the behavior candidates "follow", "overtaking", and "lane change" twice, twice, and once, respectively. The same applies to the driver y.
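The travel history of FIG. 16 is naturally represented as nested counts, driver → traveling environment → behavior → number of selections. A minimal sketch (driver y's counts here are hypothetical, since only driver x's are stated above):

```python
# driver -> traveling environment -> behavior -> selection count
travel_history = {
    "x": {
        "approaching merge": {"decelerate": 3, "accelerate": 1, "lane change": 5},
        "slow car ahead": {"follow": 2, "overtaking": 2, "lane change": 1},
    },
    # driver y's numbers are illustrative placeholders
    "y": {
        "approaching merge": {"decelerate": 2, "accelerate": 2, "lane change": 1},
    },
}
print(travel_history["x"]["approaching merge"]["lane change"])  # 5
```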
 運転者の走行履歴は、自動運転中に選択した挙動を集約してもよいし、運転者が手動運転中に実際に行った挙動を集約してもよい。これにより、自動運転あるいは手動運転といった運転状態に応じた走行履歴の収集ができる。 The driving history of the driver may aggregate behaviors selected during automatic driving or may aggregate behaviors actually performed by the driver during manual driving. This makes it possible to collect travel histories according to operating conditions such as automatic driving or manual driving.
 ドライバモデルには、複数の運転者の走行履歴をクラスタリングして構築するクラスタリング型と、特定の運転者(例えば、運転者x)の走行履歴と類似する複数の走行履歴から運転者xのドライバモデルを構築する個別適応型とがある。 Driver models come in two types: a clustering type, constructed by clustering the travel histories of a plurality of drivers, and an individually-adaptive type, in which a driver model for a specific driver (for example, driver x) is constructed from a plurality of travel histories similar to that driver's travel history.
 まず、クラスタリング型について説明する。クラスタリング型のドライバモデルの構築方法は、図16に示したような複数の運転者の走行履歴を予め集約する。そして、互いの走行履歴の類似度が高い複数の運転者、つまり、類似した運転操作傾向を有する複数の運転者をグループ化してドライバモデルを構築する。 First, the clustering type will be described. The clustering type driver model construction method aggregates the driving histories of a plurality of drivers as shown in FIG. Then, a driver model is constructed by grouping a plurality of drivers having a high degree of similarity between the traveling histories, that is, a plurality of drivers having similar driving operation tendencies.
 図17は、クラスタリング型のドライバモデルの構築方法を示す図である。図17には、運転者a~fの走行履歴が表形式で示されている。そして、運転者a~fの走行履歴から、モデルAが運転者a~cの走行履歴から構築され、モデルBが運転者d~fの走行履歴から構築されることが示されている。 FIG. 17 is a diagram illustrating a clustering type driver model construction method. FIG. 17 shows the travel histories of the drivers a to f in a table format. From the driving histories of the drivers a to f, it is shown that the model A is constructed from the traveling histories of the drivers a to c, and the model B is constructed from the traveling histories of the drivers d to f.
 走行履歴の類似度は、例えば、運転者aと運転者bの走行履歴における各頻度(各数値)を頻度分布として扱い、互いの頻度分布の相関値を算出し、算出した相関値を類似度としてもよい。この場合、例えば、運転者aと運転者bの走行履歴から算出した相関値が所定値よりも高い場合に、運転者aと運転者bの走行履歴を1つのグループとする。 The similarity between travel histories may be determined, for example, by treating the frequencies (numerical values) in the travel histories of the driver a and the driver b as frequency distributions, calculating the correlation value between the distributions, and using the calculated correlation value as the similarity. In this case, for example, when the correlation value calculated from the travel histories of the driver a and the driver b is higher than a predetermined value, the travel histories of the driver a and the driver b are put into one group.
 なお、類似度の算出については、これに限定されない。例えば、運転者aと運転者bの各走行履歴において、最も頻度の高い挙動が一致する数に基づいて、類似度を算出してもよい。 Note that the calculation of the similarity is not limited to this. For example, the similarity may be calculated based on the number of traveling environments in which the most frequent behaviors in the travel histories of the driver a and the driver b match.
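For the correlation-based measure just described, a sketch could look like this; the flattened frequency vectors are hypothetical travel-history values, and the 0.8 grouping threshold is illustrative:

```python
from math import sqrt

def correlation(a: list, b: list) -> float:
    """Pearson correlation between two frequency distributions."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sqrt(sum((x - ma) ** 2 for x in a))
    vb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

# frequencies over all (environment, behavior) pairs, flattened
driver_a = [3, 1, 5, 2, 2, 1]
driver_b = [3, 2, 4, 2, 2, 2]
# a high correlation puts the two drivers into the same group
print(correlation(driver_a, driver_b) > 0.8)  # True
```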
 そして、クラスタリング型のドライバモデルは、例えば、各グループ内の運転者の走行履歴において、それぞれの頻度の平均を算出することによって構築される。 The clustering type driver model is constructed by, for example, calculating the average of each frequency in the driving history of drivers in each group.
 図18は、構築されたクラスタリング型のドライバモデルの一例を示す図である。図17で示した各グループ内の運転者の走行履歴において、それぞれの頻度の平均を算出することによって、各グループの走行履歴の平均頻度を導出する。このように、クラスタリング型のドライバモデルは、走行環境毎に定められた挙動に対する平均頻度で構築される。 FIG. 18 is a diagram showing an example of the constructed clustering-type driver model. The average frequency of each group's travel history is derived by calculating the average of the frequencies in the travel histories of the drivers in each group shown in FIG. 17. In this way, the clustering-type driver model is constructed from the average frequencies of the behaviors determined for each traveling environment.
 なお、ドライバモデルは、算出した平均頻度から最も頻度の高いもののみで構築してもよい。図19は、構築されたクラスタリング型のドライバモデルの別の一例を示す図である。図19に示すように、走行環境毎に最頻の挙動が選択され、選択された挙動からドライバモデルが構築される。 Note that the driver model may be constructed using only the behavior with the highest calculated average frequency. FIG. 19 is a diagram showing another example of the constructed clustering-type driver model. As shown in FIG. 19, the most frequent behavior is selected for each traveling environment, and the driver model is constructed from the selected behaviors.
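Constructing the clustering-type model then amounts to averaging frequencies within a group (FIG. 18) and optionally keeping only the most frequent behavior per environment (FIG. 19). A sketch with hypothetical data:

```python
# travel histories of one group (e.g. drivers a-c); values are illustrative
group = [
    {"approaching merge": {"decelerate": 4, "accelerate": 1, "lane change": 7}},
    {"approaching merge": {"decelerate": 2, "accelerate": 2, "lane change": 6}},
    {"approaching merge": {"decelerate": 3, "accelerate": 0, "lane change": 5}},
]

def build_model(histories):
    """Average the selection frequencies per environment (FIG. 18 style)."""
    model = {}
    for env in histories[0]:
        model[env] = {
            b: sum(h[env][b] for h in histories) / len(histories)
            for b in histories[0][env]
        }
    return model

model_a = build_model(group)
print(model_a["approaching merge"]["decelerate"])  # 3.0
# FIG. 19 variant: keep only the most frequent behavior per environment
top = {env: max(freqs, key=freqs.get) for env, freqs in model_a.items()}
print(top["approaching merge"])  # lane change
```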
 ここで、構築したクラスタリング型のドライバモデルの使用方法について、例を挙げて説明する。 Here, the method of using the built clustering driver model will be described with an example.
 図18に示したようなドライバモデルは、予め車両1の記憶部8に記憶される。また、車両制御部7は、運転者yが過去に運転した際の走行履歴を記憶部8に記憶しておく。なお、運転者yの検知は、車内に設置されるカメラ等(図示しない)で実行される。 The driver model as shown in FIG. 18 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
 そして、車両制御部7は、運転者yの走行履歴とドライバモデルの各モデルの走行履歴との類似度を算出し、どのモデルが運転者yに最も適しているかを判定する。例えば、図16に示した運転者yの走行履歴と図18に示したドライバモデルの場合、車両制御部7は、モデルBが運転者yに最も適していると判定する。 Then, the vehicle control unit 7 calculates the similarity between the driving history of the driver y and the driving history of each model of the driver model, and determines which model is most suitable for the driver y. For example, in the case of the driving history of the driver y shown in FIG. 16 and the driver model shown in FIG. 18, the vehicle control unit 7 determines that the model B is most suitable for the driver y.
 車両制御部7は、実際の自動走行の際に、モデルBの各走行環境において、最も頻度が高い挙動が運転者yに最も適した挙動、つまり、第1の挙動であると判定する。 The vehicle control unit 7 determines that the behavior with the highest frequency is the behavior most suitable for the driver y, that is, the first behavior in each traveling environment of the model B during actual automatic traveling.
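Model selection as described — compare the driver's travel history against each model and take the best match's most frequent behavior as the first behavior — could be sketched as follows. The similarity function here is a simplified stand-in (agreement of most frequent behaviors), and all data are illustrative:

```python
def similarity(hist_a, hist_b):
    """Count environments whose most frequent behavior agrees (stand-in)."""
    return sum(
        1 for env in hist_a
        if env in hist_b
        and max(hist_a[env], key=hist_a[env].get)
        == max(hist_b[env], key=hist_b[env].get)
    )

def first_behavior(driver_history, models, environment):
    """Pick the best-matching model, then its most frequent behavior."""
    best = max(models.values(), key=lambda m: similarity(driver_history, m))
    return max(best[environment], key=best[environment].get)

models = {  # hypothetical average frequencies, as in FIG. 18
    "A": {"slow car ahead": {"follow": 1.0, "overtaking": 4.3, "lane change": 0.7}},
    "B": {"slow car ahead": {"follow": 2.3, "overtaking": 0.7, "lane change": 1.0}},
}
driver_y = {"slow car ahead": {"follow": 3, "overtaking": 1, "lane change": 0}}
print(first_behavior(driver_y, models, "slow car ahead"))  # follow
```

This mirrors the example in the text: driver y matches model B best, so "follow" is proposed as the first behavior.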
 このように、予め複数の運転者の走行履歴からドライバモデルを構築することにより、運転者により適した挙動を報知できる。 Thus, by constructing a driver model from a plurality of driver's driving histories in advance, it is possible to notify a behavior more suitable for the driver.
 例えば、図16に示すように、運転者yの走行履歴に「前方に低速車あり」という走行環境に対する挙動の頻度が0、つまり、運転者が「前方に低速車あり」という走行環境において「追従」、「追い越し」、「車線変更」という挙動を選択したことが無い場合においても、車両制御部7は、図18に示すモデルBに基づき、「前方に低速車あり」という走行環境において、「追従」を第1の挙動として判定できる。 For example, as shown in FIG. 16, even when the frequency of behaviors for the traveling environment "low-speed vehicle ahead" is 0 in the travel history of the driver y, that is, even when the driver has never selected the behaviors "follow", "overtaking", or "lane change" in the traveling environment "low-speed vehicle ahead", the vehicle control unit 7 can determine "follow" as the first behavior in the traveling environment "low-speed vehicle ahead" based on the model B shown in FIG. 18.
 次に、個別適応型について説明する。個別適応型のドライバモデルの構築方法は、クラスタリング型の場合と同様に、図16に示したような複数の運転者の走行履歴を予め集約する。ここで、クラスタリング型の場合と異なる点は、運転者毎にドライバモデルを構築する点である。以下では、運転者yに対してドライバモデルを構築する例について説明する。 Next, the individually-adaptive type will be described. As with the clustering type, the individually-adaptive driver-model construction method aggregates the travel histories of a plurality of drivers as shown in FIG. 16 in advance. The difference from the clustering type is that a driver model is constructed for each driver. Below, an example of constructing a driver model for the driver y will be described.
 まず、集約した複数の運転者の走行履歴の中から、運転者yの走行履歴と類似度が高い複数の運転者の走行履歴を抽出する。そして、抽出した複数の運転者の走行履歴から運転者yのドライバモデルを構築する。 First, the driving histories of a plurality of drivers having high similarity to the driving history of the driver y are extracted from the driving histories of the plurality of drivers collected. Then, a driver model of the driver y is constructed from the extracted driving histories of the plurality of drivers.
 図20は、個別適応型のドライバモデルの構築方法を示す図である。図20には、図17と同様に、運転者a~fの走行履歴が表形式で示されている。また、図20には、図16に示した運転者yの走行履歴と類似度が高い運転者c~eの走行履歴とから運転者yのドライバモデルが構築されることが示されている。 FIG. 20 is a diagram showing a method for constructing an individual adaptive driver model. In FIG. 20, the driving histories of the drivers a to f are shown in a table format, as in FIG. FIG. 20 shows that the driver model of the driver y is constructed from the driving history of the driver y shown in FIG. 16 and the driving histories of the drivers c to e having high similarity.
 個別適応型のドライバモデルは、抽出した各運転者の走行履歴において、それぞれの頻度の平均を算出することによって構築される。 The individual adaptive driver model is constructed by calculating the average of each frequency in the extracted driving history of each driver.
 図21は、構築された個別適応型のドライバモデルの一例を示す図である。図16に示した運転者yの走行履歴、及び、図20に示した運転者c~eの走行履歴において、走行環境毎に、各挙動の平均頻度を導出する。このように、運転者yに対する個別適応型のドライバモデルは、各走行環境に対応する挙動の平均頻度で構築される。 FIG. 21 is a diagram illustrating an example of a constructed individual adaptive driver model. In the driving history of the driver y shown in FIG. 16 and the driving history of the drivers c to e shown in FIG. 20, the average frequency of each behavior is derived for each driving environment. As described above, the individually adaptive driver model for the driver y is constructed with an average frequency of behavior corresponding to each traveling environment.
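The individually-adaptive construction — rank other drivers' histories by similarity to the target driver's, keep the closest ones, and average them together with the target's — can be sketched as follows (hypothetical names and data; the similarity measure is a simplified top-behavior agreement):

```python
def top_behavior_similarity(a, b):
    # environments in which the most frequent behavior agrees (stand-in)
    return sum(
        1 for env in a
        if env in b and max(a[env], key=a[env].get) == max(b[env], key=b[env].get)
    )

def build_personal_model(target, others, top_k=3):
    """Average the target's history with the top_k most similar histories."""
    ranked = sorted(others, key=lambda h: top_behavior_similarity(target, h),
                    reverse=True)
    selected = [target] + ranked[:top_k]
    return {
        env: {b: sum(h[env][b] for h in selected) / len(selected)
              for b in target[env]}
        for env in target
    }

target = {"approaching merge": {"decelerate": 2, "accelerate": 2, "lane change": 1}}
others = [
    {"approaching merge": {"decelerate": 4, "accelerate": 1, "lane change": 1}},
    {"approaching merge": {"decelerate": 1, "accelerate": 1, "lane change": 6}},
]
model_y = build_personal_model(target, others, top_k=1)
print(model_y["approaching merge"]["decelerate"])  # 3.0
```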
 ここで、構築した個別適応型のドライバモデルの使用方法について、例を挙げて説明する。 Here, a method of using the constructed individually-adaptive driver model will be described with an example.
 図21に示したような運転者yのドライバモデルは、予め車両1の記憶部8に記憶される。また、車両制御部7は、運転者yが過去に運転した際の走行履歴を記憶部8に記憶しておく。なお、運転者yの検知は、車内に設置されるカメラ等(図示しない)で実行される。 The driver model of the driver y as shown in FIG. 21 is stored in advance in the storage unit 8 of the vehicle 1. Further, the vehicle control unit 7 stores a travel history when the driver y has driven in the past in the storage unit 8. The driver y is detected by a camera or the like (not shown) installed in the vehicle.
 そして、車両制御部7は、実際の自動走行の際に、運転者yのドライバモデルの各走行環境において、最も頻度が高い挙動が運転者yに最も適した挙動、つまり、第1の挙動であると判定する。 Then, during actual automatic driving, the vehicle control unit 7 determines that the most frequent behavior in each traveling environment of the driver model of the driver y is the behavior most suitable for the driver y, that is, the first behavior.
 このように、予め複数の運転者の走行履歴から運転者個人のドライバモデルを構築することにより、運転者により適した挙動を報知できる。 In this way, by constructing a driver model of an individual driver from the driving histories of a plurality of drivers in advance, a behavior more suitable for the driver can be notified.
 例えば、図16に示すように、運転者yの走行履歴に「前方に低速車あり」という走行環境に対する挙動の頻度が0、つまり、運転者が「前方に低速車あり」という走行環境において「追従」、「追い越し」、「車線変更」という挙動を選択したことが無い場合においても、車両制御部7は、図21に示すドライバモデルに基づき、「前方に低速車あり」という走行環境において、「車線変更」を第1の挙動として判定できる。 For example, as shown in FIG. 16, even when the frequency of behaviors for the traveling environment "low-speed vehicle ahead" is 0 in the travel history of the driver y, that is, even when the driver has never selected the behaviors "follow", "overtaking", or "lane change" in the traveling environment "low-speed vehicle ahead", the vehicle control unit 7 can determine "lane change" as the first behavior in the traveling environment "low-speed vehicle ahead" based on the driver model shown in FIG. 21.
 次に、運転者の運転特性(運転の癖)を取得し、運転者の嗜好に応じた自動運転を行う場合について説明する。一般に、1つの挙動(例えば、車線変更)に対する実際の動作(例えば、加速、減速の大きさ、あるいは、ステアリングホイールの操作量)は、運転者毎に異なる。そのため、運転者の嗜好に応じた自動運転を行うことにより、運転者にとってより快適な走行が可能となる。 Next, the case where the driving characteristics (driving habits) of the driver are acquired and the automatic driving according to the driver's preference is performed will be described. In general, the actual operation (for example, the magnitude of acceleration, deceleration, or the amount of operation of the steering wheel) for one behavior (for example, lane change) differs for each driver. Therefore, it is possible to travel more comfortably for the driver by performing the automatic driving according to the preference of the driver.
 なお、以下の説明では、手動運転中に運転者の運転特性を取得し、取得した運転特性を自動運転の際に反映させる場合について説明するが、本発明はこれに限定されない。 In the following description, a case will be described in which the driving characteristics of the driver are acquired during manual driving and the acquired driving characteristics are reflected in automatic driving, but the present invention is not limited to this.
 車両制御部7は、運転者の車両1の各部の操作内容から、運転者の運転特性を示す特徴量を抽出し、記憶部8に記憶する。ここで、特徴量とは、例えば、速度に関する特徴量、ステアリングに関する特徴量、操作タイミングに関する特徴量、車外センシングに関する特徴量、車内センシングに関する特徴量等がある。 The vehicle control unit 7 extracts feature quantities indicating the driver's driving characteristics from the operations the driver performs on each part of the vehicle 1, and stores them in the storage unit 8. Here, the feature quantities include, for example, feature quantities related to speed, steering, operation timing, sensing outside the vehicle, and sensing inside the vehicle.
 速度に関する特徴量は、例えば、車両の速度、加速度、減速度などがあり、これらの特徴量は、車両が有する速度センサ等から取得される。 The feature quantity related to speed includes, for example, the speed, acceleration, and deceleration of the vehicle, and these feature quantities are acquired from a speed sensor or the like that the vehicle has.
 ステアリングに関する特徴量は、例えば、ステアリングの舵角、角速度、角加速度などがあり、これらの特徴量は、ステアリングホイール5から取得される。 The feature amount related to steering includes, for example, the steering angle, angular velocity, and angular acceleration of the steering, and these feature amounts are acquired from the steering wheel 5.
 操作タイミングに関する特徴量は、例えば、ブレーキ、アクセル、ウィンカレバー、ステアリングホイールの操作タイミングなどがあり、これらの特徴量は、それぞれ、ブレーキペダル2、アクセルペダル3、ウィンカレバー4、ステアリングホイール5から取得される。 The feature quantities related to operation timing include, for example, the operation timings of the brake, the accelerator, the turn-signal lever, and the steering wheel; these feature quantities are acquired from the brake pedal 2, the accelerator pedal 3, the turn-signal lever 4, and the steering wheel 5, respectively.
 車外センシングに関する特徴量は、例えば、前方、側方、後方に存在する車両との車間距離などがあり、これらの特徴量は、センサ62から取得される。 The feature amount related to outside-vehicle sensing includes, for example, a distance between vehicles in front, side, and rear, and these feature amounts are acquired from the sensor 62.
 車内センシングに関する特徴量は、例えば、運転者が誰であるか、及び、同乗者が誰であるかを示す個人認識情報であり、これらの特徴量は、車内に設置されるカメラ等から取得される。 The feature quantities related to in-vehicle sensing are, for example, personal identification information indicating who the driver is and who the passengers are; these feature quantities are acquired from a camera or the like installed in the vehicle.
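One plausible way to bundle the feature quantities listed above into a per-maneuver record (field names and units are illustrative assumptions, not the patent's):

```python
from dataclasses import dataclass

@dataclass
class DrivingFeatures:
    speed: float            # m/s, from the vehicle's speed sensor
    acceleration: float     # m/s^2
    steering_angle: float   # rad, from the steering wheel 5
    steering_rate: float    # rad/s
    gap_front: float        # m, inter-vehicle distance from sensor 62
    driver_id: str          # in-cabin camera recognition result

f = DrivingFeatures(25.0, 0.4, 0.02, 0.01, 38.5, "driver_x")
print(f.driver_id)  # driver_x
```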
 例えば、運転者が手動で車線変更を行う場合、車両制御部7は、運転者が手動で車線変更を行ったことを検知する。検知方法は、予め車線変更の操作時系列パターンをルール化しておくことにより、CAN(Controller Area Network)情報などから取得した操作時系列データを解析することで検知する。その際、車両制御部7は、上述した特徴量を取得する。車両制御部7は、運転者毎に、特徴量を記憶部8に記憶し、運転特性モデルを構築する。 For example, when the driver manually changes the lane, the vehicle control unit 7 detects that the driver has manually changed the lane. The detection method is performed by analyzing operation time series data acquired from CAN (Controller Area Network) information or the like by rule-setting an operation time series pattern for changing lanes in advance. In that case, the vehicle control part 7 acquires the feature-value mentioned above. The vehicle control unit 7 stores the feature amount in the storage unit 8 for each driver, and constructs a driving characteristic model.
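As a hedged illustration of the rule-based detection just described, the sketch below recognizes a lane change when a pre-defined operation time-series pattern appears in the CAN operation log: turn signal on, then a sustained steering deflection, then the signal released. The pattern, field names, and threshold are illustrative assumptions, not values from this specification.

```python
# Hypothetical rule for lane-change detection from CAN operation time series.
# The three-step pattern and the 5-degree threshold are assumed for
# illustration; a real system would use a calibrated, vehicle-specific rule.

def detect_lane_change(samples, steer_threshold=5.0):
    """samples: list of dicts with 'blinker' (bool) and 'steering_angle'
    (degrees), in time order. Returns True if the rule pattern matches."""
    signal_on = False
    steered = False
    for s in samples:
        if s["blinker"] and not signal_on:
            signal_on = True            # step 1: turn signal activated
        elif signal_on and abs(s["steering_angle"]) >= steer_threshold:
            steered = True              # step 2: steering deflection observed
        elif signal_on and steered and not s["blinker"]:
            return True                 # step 3: signal released -> lane change
    return False
```

When the rule matches, the control unit would additionally record the feature quantities listed above for the driver concerned.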
Note that the vehicle control unit 7 may construct the driver model described above on the basis of the feature quantities of each driver. That is, the vehicle control unit 7 extracts the feature quantities related to speed, steering, operation timing, outside-vehicle sensing, and in-vehicle sensing, and stores them in the storage unit 8. Then, on the basis of the feature quantities stored in the storage unit 8, it may construct a driver model that associates the tendency of the driver's operations with information on the frequency of each operation for each traveling environment.
FIG. 22 is a diagram showing an example of the driving characteristic model. In FIG. 22, the feature quantities are shown in tabular form for each driver. FIG. 22 also shows, for each driver, the number of times each behavior has been selected in the past. Although only some of the feature quantities are shown, any or all of those listed above may be included.
The feature quantities shown in FIG. 22 will now be described in detail. The speed values express the actual speed in stages. The values for the steering wheel, brake, and accelerator express the respective operation amounts in stages. These values are obtained, for example, by calculating the average speed and the average operation amounts of the steering wheel, brake, and accelerator over a predetermined past period, and expressing those averages in stages.
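The averaging-and-staging step above can be sketched as follows. The 1-10 scale and the linear binning are illustrative assumptions; FIG. 22 does not fix concrete bin boundaries.

```python
# Minimal sketch: average raw samples over the past period and express the
# average "in stages" as in FIG. 22. The 10-level scale is an assumption.

def to_level(values, full_scale, n_levels=10):
    """Map the average of raw samples onto a 1..n_levels stage value."""
    avg = sum(values) / len(values)
    level = round(avg / full_scale * n_levels)
    return max(1, min(n_levels, level))
```

For example, speeds of 70, 80, and 90 on a 100-unit full scale would average to 80 and be stored as level 8.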
For example, in FIG. 22, when driver x changes lanes with no passenger aboard, the speed level is 8, and the operation-amount levels of the steering wheel, brake, and accelerator are 4, 6, and 8, respectively.
During automatic driving, the vehicle control unit 7 selects, from among the driving characteristic models shown in FIG. 22, the driving characteristic model corresponding to the driver, the behavior, and the passenger, according to who the driver is, what kind of behavior is to be executed, and who the passenger is.
Then, the vehicle control unit 7 causes the vehicle 1 to travel at the speed corresponding to the selected driving characteristic model, and controls the vehicle 1 with the combination of the operation amounts and operation timings of the steering wheel, brake, and accelerator. In this way, automatic driving that matches the driver's preference can be performed. Note that the information of the driving characteristic model shown in FIG. 22 can be presented by the notification unit 92.
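The model selection described above amounts to a lookup keyed by (driver, behavior, passenger). The table contents below are made-up illustrative levels in the spirit of FIG. 22, not values taken from the figure.

```python
# Hypothetical driving-characteristic-model table keyed by
# (driver, behavior, passenger); the level values are illustrative only.

MODELS = {
    ("driver_x", "lane_change", None):     {"speed": 8, "steering": 4, "brake": 6, "accel": 8},
    ("driver_x", "lane_change", "family"): {"speed": 6, "steering": 4, "brake": 7, "accel": 6},
}

def select_model(driver, behavior, passenger):
    """Return the stored levels for this driver/behavior/passenger triple."""
    return MODELS[(driver, behavior, passenger)]
```

The control unit would then drive the vehicle at the speed level of the returned entry and apply the stored operation amounts and timings.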
FIG. 23 is a diagram for explaining the display of the notification unit 92 in the fourth embodiment of the present invention. FIG. 23 shows displays for the first example of the traveling environment shown in FIG. 5.
Part (a) of FIG. 23 shows the display of the notification unit 92 during normal traveling in which no lane change and no acceleration or deceleration of the vehicle is required. Part (a) of FIG. 23 shows a symbol 231 indicating that the driver's driving characteristic is a "frequent deceleration" characteristic, and a symbol 232 indicating that automatic driving is currently in progress.
The vehicle control unit 7 determines the driver's driving characteristic on the basis of, for example, the number of times each behavior included in the driving characteristic model shown in FIG. 22 has been selected in the past. In this case, for a driver whose driving characteristic shows frequent "deceleration" (that is, a driver who has selected the behavior "deceleration" many times), the vehicle control unit 7 causes the notification unit 92 to present a display including the symbol 231 as shown in FIG. 23.
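The judgment in the paragraph above reduces to picking the behavior with the highest past selection count. A minimal sketch, with illustrative counts:

```python
# Minimal sketch: judge the driving characteristic ("frequent deceleration",
# etc.) from past behavior-selection counts as stored in the model of FIG. 22.
from collections import Counter

def dominant_characteristic(selection_counts):
    """selection_counts: dict mapping behavior -> times selected in the past.
    Returns the most frequently selected behavior."""
    return Counter(selection_counts).most_common(1)[0][0]
```

A driver with counts {"deceleration": 12, "acceleration": 3, "lane_change": 5} would be judged to have the "frequent deceleration" characteristic, and symbol 231 would be shown.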
When the vehicle control unit 7 determines that the traveling environment is that of the first example shown in FIG. 5, the vehicle control unit 7 determines the first behavior to be "deceleration" on the basis of the driver's driving characteristic being the "frequent deceleration" characteristic, and causes the notification unit 92 to present the display of part (b) of FIG. 23.
Part (b) of FIG. 23 shows a symbol 233 indicating "deceleration", the first behavior, in a first mode (for example, a first color). It also shows a symbol 234 indicating "acceleration", a second behavior, and a symbol 235 indicating "lane change", another second behavior.
When the driver changes the behavior to "acceleration" by an operation such as that described in the first embodiment, the vehicle control unit 7 causes the notification unit 92 to present the display of part (c) of FIG. 23.
Part (c) of FIG. 23 shows a symbol 234' indicating "acceleration", the selected behavior, in the first mode. The symbol 233' is the symbol 233, which was displayed as the first behavior in part (b) of FIG. 23, now displayed with its position exchanged with the symbol 234.
Thereafter, after a second predetermined time has elapsed since the notification unit 92 presented the display shown in part (a) of FIG. 23, the vehicle control unit 7 causes the notification unit 92 to present the display shown in part (d) of FIG. 23. In part (d) of FIG. 23, the symbol 234' indicating "acceleration" selected by the driver is displayed in a second mode as the next behavior.
When the vehicle control unit 7 determines that the behavior to be taken next is "acceleration", it reads out the feature quantities corresponding to the behavior "acceleration" included in the driving characteristic model, and controls the vehicle 1 so as to perform "acceleration" reflecting those feature quantities.
FIG. 24 is a diagram for explaining the display of the notification unit 92 in the fourth embodiment of the present invention. FIG. 24 shows displays for the second example of the traveling environment shown in FIG. 7. In FIG. 24, components common to FIG. 23 are given the same reference numerals as in FIG. 23, and their detailed description is omitted. FIG. 24 corresponds to FIG. 23 with the symbol 235 indicating "lane change" removed.
As described above, in the second example (FIG. 7), unlike the first example (FIG. 5), another vehicle is traveling to the right of the vehicle 1, so a lane change is not possible. For this reason, "lane change" is not displayed in parts (b) and (c) of FIG. 24. In the example of part (c) of FIG. 24, "acceleration" is selected as in part (c) of FIG. 23, so the vehicle control unit 7, as in FIG. 23, reads out the feature quantities corresponding to the behavior "acceleration" included in the driving characteristic model and controls the vehicle 1 so as to perform "acceleration" reflecting those feature quantities.
FIG. 25 is a diagram for explaining the display of the notification unit 92 in the fourth embodiment of the present invention. FIG. 25 shows displays for the third example of the traveling environment shown in FIG. 8.
Part (a) of FIG. 25 is the same as part (a) of FIG. 23. When the vehicle control unit 7 determines that the traveling environment is that of the third example shown in FIG. 8, the vehicle control unit 7 determines the first behavior to be "deceleration" on the basis of the driver's driving characteristic being the "frequent deceleration" characteristic, and causes the notification unit 92 to present the display of part (b) of FIG. 25.
Part (b) of FIG. 25 shows a symbol 251 indicating "deceleration", the first behavior, in a first mode (for example, a first color). It also shows a symbol 252 indicating "overtaking", a second behavior, and a symbol 253 indicating "lane change", another second behavior.
When the driver changes the behavior to "overtaking" by an operation such as that described in the first embodiment, the vehicle control unit 7 causes the notification unit 92 to present the display of part (c) of FIG. 25.
Part (c) of FIG. 25 shows a symbol 252' indicating "overtaking", the selected behavior, in the first mode. The symbol 251' is the symbol 251, which was displayed as the first behavior in part (b) of FIG. 25, now displayed with its position exchanged with the symbol 252.
Thereafter, after the second predetermined time has elapsed since the notification unit 92 presented the display shown in part (a) of FIG. 25, the vehicle control unit 7 causes the notification unit 92 to present the display shown in part (d) of FIG. 25. In part (d) of FIG. 25, the symbol 252' indicating "overtaking" selected by the driver is displayed in the second mode as the next behavior.
When the vehicle control unit 7 determines that the behavior to be taken next is "overtaking", it reads out the feature quantities corresponding to the behavior "overtaking" included in the driving characteristic model, and controls the vehicle 1 so as to perform "acceleration" reflecting those feature quantities.
Next, display examples for a case where the driver's driving characteristic is not the "frequent deceleration" characteristic will be described.
FIG. 26 is a diagram for explaining the display of the notification unit 92 in the fourth embodiment of the present invention. FIG. 26 shows displays for the first example of the traveling environment shown in FIG. 5. Part (a) of FIG. 26 shows an example in which the driver's driving characteristic is a "frequent acceleration" characteristic, and part (b) of FIG. 26 shows an example in which the driver's driving characteristic is a "frequent lane changes" characteristic.
Part (a) of FIG. 26 shows a symbol 261 indicating that the driver's driving characteristic is the "frequent acceleration" characteristic. It also shows a symbol 262 indicating "acceleration", the first behavior, in a first mode (for example, a first color), a symbol 263 indicating "lane change", a second behavior, and a symbol 264 indicating "deceleration", another second behavior.
For a driver whose driving characteristic shows frequent "acceleration" in the past (that is, a driver who has selected the behavior "acceleration" many times in the past), the vehicle control unit 7 causes the notification unit 92 to present a display including the symbol 261 as in part (a) of FIG. 26. The vehicle control unit 7 also determines the first behavior to be "acceleration" on the basis of the driver's driving characteristic being the "frequent acceleration" characteristic, and causes the notification unit 92 to present the display of part (a) of FIG. 26.
Part (b) of FIG. 26 shows a symbol 265 indicating that the driver's driving characteristic is the "frequent lane changes" characteristic. It also shows a symbol 266 indicating "lane change", the first behavior, in the first mode (for example, the first color), a symbol 267 indicating "lane change", a second behavior, and a symbol 268 indicating "deceleration", another second behavior.
For a driver whose driving characteristic shows frequent "lane change" in the past (that is, a driver who has selected the behavior "lane change" many times in the past), the vehicle control unit 7 causes the notification unit 92 to present a display including the symbol 265 as in part (b) of FIG. 26. The vehicle control unit 7 determines the first behavior to be "lane change" on the basis of the driver's driving characteristic being the "frequent lane changes" characteristic, and causes the notification unit 92 to present the display of part (b) of FIG. 26.
The above description used only the driving characteristic model, but the driver model may also be taken into account; in FIGS. 23, 25, and 26, the symbol 231 may indicate the type of driver model selected from the driver's operation history. For example, for the first example of the traveling environment shown in FIG. 5, for a driver model applied to drivers who often select "deceleration", the notification unit 92 is caused to present a display including the symbol 231 as in FIG. 23, and the first behavior is determined to be "deceleration". For a driver model applied to drivers who often select "acceleration", the notification unit 92 is caused to present a display including the symbol 261 as in part (a) of FIG. 26, and the first behavior is determined to be "acceleration". For a driver model applied to drivers who often select "lane change", the notification unit 92 is caused to present a display including the symbol 265 as in part (b) of FIG. 26, and the first behavior is determined to be "lane change".
According to the present embodiment described above, when determining the future behavior of the vehicle, the driver's past traveling history can be learned and the result can be reflected in the determination of the future behavior. Furthermore, when the vehicle control unit controls the vehicle, the driving characteristics (driving preferences) of the driver can be learned and reflected in the control of the vehicle.
As a result, the vehicle can control automatic driving with the timing and operation amounts preferred by the driver or occupants, and unnecessary operational intervention by the driver during automatic driving can be suppressed without deviating from the feel of the driver's actual manual driving.
In the present invention, a server device such as a cloud server may execute functions similar to those executed by the vehicle control unit 7. The storage unit 8 may also be located not in the vehicle 1 but in a server device such as a cloud server. Alternatively, the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior by referring to the driver model stored in the storage unit 8.
As described above, in the fourth embodiment, the vehicle control unit 7 acquires feature-quantity information indicating the driving characteristics of the driver, the storage unit 8 stores that feature-quantity information, and on the basis of the feature-quantity information stored in the storage unit 8, the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the vehicle behaviors selected by the driver in terms of the frequency of each selected behavior.
The vehicle control unit 7 also determines, from among a plurality of drivers, groups of drivers who select similar behaviors, and constructs a driver model for each group and for each traveling environment of the vehicle.
The vehicle control unit 7 also calculates, for each group of drivers who perform similar operations, the average frequency of the behaviors selected by each driver, and constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the vehicle behaviors selected by the driver in terms of those calculated averages.
Further, on the basis of the vehicle behaviors selected by other drivers whose tendencies are similar to the tendency of the vehicle behaviors selected by a specific driver, the vehicle control unit 7 constructs, for each traveling environment of the vehicle, a driver model indicating the tendency of the vehicle behaviors selected by that specific driver in terms of the frequency of each selected behavior.
In this way, the vehicle control unit 7 can construct a driver model better suited to the driving tendency of the driver, and on the basis of the constructed driver model, can perform automatic driving that is more appropriate for the driver.
(Modification of the driver model)
The driver model described above models the tendency of the driver's operations (behaviors) for each traveling environment on the basis of information such as the frequency of each operation, but the present invention is not limited to this.
For example, the driver model may be constructed on the basis of a traveling history in which environmental parameters indicating the traveling environments (that is, situations) traveled in the past are associated with the operations (behaviors) that the driver actually selected in those traveling environments. By incorporating the environmental parameters into the driver model, the options can be determined without the procedure of separately detecting and classifying the traveling environment and inputting (storing) the classification result into the driver model. Specifically, by acquiring the differences between traveling environments such as those of FIGS. 23 and 24 as environmental parameters and inputting (storing) them directly into the driver model, "acceleration", "deceleration", and "lane change" become the options in the case of FIG. 23, while "acceleration" and "deceleration" become the options in the case of FIG. 24. A case in which such a driver model is constructed will be described below. The driver model described below may also be called a situation database.
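The situation-database idea above can be sketched as a similarity lookup: each record pairs an environment-parameter vector with the behavior the driver actually chose, and the candidate options for the current situation are read from the most similar past records. The parameter names and the squared-difference metric are illustrative assumptions, not part of the specification.

```python
# Hedged sketch of a situation database: behaviors are retrieved from the
# past records whose environmental parameters are closest to the current
# ones. Parameter keys and the distance metric are assumptions.

def nearest_behaviors(history, current, k=3):
    """history: list of (params: dict, behavior: str) records;
    current: dict of the same parameter keys.
    Returns the behaviors of the k most similar past situations."""
    def dist(params):
        return sum((params[key] - current[key]) ** 2 for key in current)
    ranked = sorted(history, key=lambda rec: dist(rec[0]))
    return [behavior for _, behavior in ranked[:k]]
```

With such a lookup, no separate environment-classification step is needed: the raw parameters (relative speeds, distances, and so on) select the options directly.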
Here, the traveling history used to construct the driver model in this modification will be described. FIG. 27 is a diagram showing an example of the traveling history. FIG. 27 shows a traveling history in which environmental parameters indicating the traveling environments in which the vehicle driven by driver x traveled in the past are associated with the operations (behaviors) that the driver actually selected in those traveling environments.
The environmental parameters (a) to (c) of the traveling history shown in FIG. 27 indicate the traveling environments at the time the behaviors of the vehicle were presented to the driver as shown in, for example, part (b) of FIG. 8, part (b) of FIG. 5, and part (b) of FIG. 7, respectively. The environmental parameters of the traveling history are obtained from sensing information and infrastructure information.
The sensing information is information detected by sensors, radar, or the like of the vehicle. The infrastructure information includes GPS information, map information, information acquired through road-to-vehicle communication, and the like.
For example, the environmental parameters of the traveling history shown in FIG. 27 include "information on the host vehicle"; "information on the preceding vehicle", indicating information on a vehicle traveling ahead in the lane in which the host vehicle a travels; "information on the side lane", indicating information on the lane beside the lane in which the host vehicle travels; "information on the merging lane", indicating information on a merging lane when there is one at the position where the host vehicle is traveling; and "position information", indicating the position of the host vehicle and its surroundings. Information on a following vehicle may also be included; in that case, the relative speed of the following vehicle with respect to the host vehicle, the head-to-head distance, the rate of change of the head-to-head distance, and the like may be used. Information on the presence of a vehicle may also be included.
For example, "information on the host vehicle" includes information on the speed Va of the host vehicle. "Information on the preceding vehicle" includes information on the relative speed Vba of the preceding vehicle with respect to the host vehicle, the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, and the rate of change RSb of the size of the preceding vehicle.
Here, the speed Va of the host vehicle is detected by a speed sensor of the host vehicle. The relative speed Vba and the inter-vehicle distance DRba are detected by a sensor, radar, or the like. The rate of change of the size is calculated by the relational expression RSb = -Vba / DRba.
"Information on the side lane" includes information on a side rear vehicle c traveling behind the host vehicle in the side lane, information on a side front vehicle d traveling ahead of the host vehicle in the side lane, and information on the remaining side-lane length DRda of the host vehicle.
The information on the side rear vehicle includes information on the relative speed Vca of the side rear vehicle with respect to the host vehicle, the head-to-head distance Dca between the side rear vehicle and the host vehicle, and the rate of change Rca of the head-to-head distance. The head-to-head distance Dca between the side rear vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the side rear vehicle, measured in the direction along the traveling direction of the host vehicle (and the side rear vehicle). The head-to-head distance may also be calculated from the inter-vehicle distance and the vehicle length, and the inter-vehicle distance may be used in place of the head-to-head distance.
Here, the relative speed Vca and the head-to-head distance Dca are detected by a sensor, radar, or the like. The rate of change Rca of the head-to-head distance is calculated by the relational expression Rca = Vca / Dca.
The information on the side front vehicle includes information on the relative speed Vda of the side front vehicle with respect to the host vehicle, the head-to-head distance Dda between the side front vehicle and the host vehicle, and the rate of change Rda of the head-to-head distance. The head-to-head distance Dda between the side front vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the side front vehicle, measured along the traveling direction of the host vehicle (and the side front vehicle).
The relative speed Vda and the head-to-head distance Dda are detected by a sensor, radar, or the like. The rate of change Rda of the head-to-head distance is calculated by the relational expression Rda = Vda / Dda.
The remaining side-lane length DRda of the host vehicle is a parameter indicating how high the possibility of a lane change to the side lane is. Specifically, when the distance between the front end (head) of the host vehicle and the rear end of the side front vehicle, measured in the direction along the traveling direction of the host vehicle (and the side front vehicle), is longer than the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, the remaining side-lane length DRda of the host vehicle is that distance between the front end (head) of the host vehicle and the rear end of the side front vehicle; when that distance is shorter than DRba, DRda is DRba. The remaining side-lane length DRda of the host vehicle is detected by a sensor, radar, or the like.
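The DRda rule above is a simple case split, transcribed directly below: DRda equals the gap from the host vehicle's front end to the side front vehicle's rear end when that gap exceeds DRba, and DRba otherwise.

```python
# Direct transcription of the remaining side-lane length rule: DRda is the
# larger of the gap to the side front vehicle's rear end and DRba.

def remaining_side_lane_length(gap_to_side_front_rear, drba):
    """gap_to_side_front_rear: host front end to side front vehicle rear end;
    drba: inter-vehicle distance to the preceding vehicle."""
    if gap_to_side_front_rear > drba:
        return gap_to_side_front_rear
    return drba
```

A larger DRda thus corresponds to more open space in the side lane and a higher likelihood that a lane change is feasible.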
"Information on the merging lane" includes information on the relative speed Vma of a merging vehicle with respect to the host vehicle, the head-to-head distance Dma between the merging vehicle and the host vehicle, and the rate of change Rma of the head-to-head distance. Here, the head-to-head distance Dma between the merging vehicle and the host vehicle is the distance between the front end (head) of the host vehicle and the front end (head) of the merging vehicle, measured in the direction along the traveling direction of the host vehicle (and the merging vehicle).
The relative speed Vma and the head-to-head distance Dma are detected by a sensor, radar, or the like. The rate of change Rma of the head-to-head distance is calculated by the relational expression Rma = Vma / Dma.
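The four change-rate expressions given in this section can be collected in one place. Each function body is exactly the relational expression from the text; the guard against a zero distance is an added safety assumption.

```python
# The change-rate relational expressions from the traveling-history section.
# RSb = -Vba / DRba; Rca = Vca / Dca; Rda = Vda / Dda; Rma = Vma / Dma.

def size_change_rate(vba, drba):
    """RSb = -Vba / DRba: rate of change of the preceding vehicle's size."""
    return -vba / drba if drba else 0.0   # zero-distance guard is an assumption

def head_distance_change_rate(v_rel, d_head):
    """Rca, Rda, and Rma all share the form (relative speed) / (head distance)."""
    return v_rel / d_head if d_head else 0.0
```

For example, a preceding vehicle closing at Vba = -10 (approaching) at DRba = 20 gives RSb = 0.5, i.e. its apparent size is growing.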
 図27に示す走行履歴の例では、上記で説明した速度、距離、及び変化率の数値が複数のレベルに分類され、分類されたレベルを示す数値が記憶されている。なお、速度、距離、及び変化率の数値は、レベルに分類されることなくそのまま記憶されてもよい。 In the example of the travel history shown in FIG. 27, the numerical values of the speed, distance, and change rate described above are classified into a plurality of levels, and numerical values indicating the classified levels are stored. Note that the numerical values of the speed, the distance, and the change rate may be stored as they are without being classified into levels.
 The position information includes items such as “position information of the host vehicle”, “number of travel lanes”, “travel lane of the host vehicle”, “distance to the start/end point of a merging section”, “distance to the start/end point of a branching section”, “distance to the start/end point of a construction section”, “distance to the start/end point of a lane-reduction section”, and “distance to a traffic accident occurrence point”. As examples of the position information, FIG. 27 shows “travel lane of the host vehicle” (the travel lane in FIG. 27) and “distance to the start/end point of a merging section”.
 For example, the “position information of the host vehicle” column stores numerical latitude/longitude information obtained from GPS. The “number of travel lanes” column stores the number of lanes of the road being traveled. The “travel lane of the host vehicle” column stores numerical information indicating the position of the lane being traveled. When the start/end point of a merging section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the value of the classified level is stored in the “distance to the start/end point of a merging section” column. When no start/end point of a merging section exists within the predetermined distance, “0” is stored in that column.
 When the start/end point of a branching section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the value of the classified level is stored in the “distance to the start/end point of a branching section” column. When no start/end point of a branching section exists within the predetermined distance, “0” is stored in that column. Likewise, when the start/end point of a construction section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the value of the classified level is stored in the “distance to the start/end point of a construction section” column. When no start/end point of a construction section exists within the predetermined distance, “0” is stored in that column.
 When the start/end point of a lane-reduction section exists within a predetermined distance, the distance to that start/end point is classified into a plurality of predetermined levels, and the value of the classified level is stored in the “distance to the start/end point of a lane-reduction section” column. When no start/end point of a lane-reduction section exists within the predetermined distance, “0” is stored in that column.
 When a traffic accident occurrence point exists within a predetermined distance, the distance to that point is classified into a plurality of predetermined levels, and the value of the classified level is stored in the “distance to a traffic accident occurrence point” column. When no traffic accident occurrence point exists within the predetermined distance, “0” is stored in that column.
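The level classification used by these columns can be sketched as follows: a distance within range maps to one of several predetermined levels, and 0 is stored when no such point exists within the predetermined distance. The threshold values below are illustrative assumptions; the patent does not fix them.

```python
# Illustrative level boundaries (upper bound of levels 1..4, in meters).
LEVEL_BOUNDS_M = (100.0, 300.0, 600.0, 1000.0)
MAX_RANGE_M = LEVEL_BOUNDS_M[-1]  # the "predetermined distance"

def distance_level(distance_m):
    """Return the stored level value, or 0 if the point is absent or out of range."""
    if distance_m is None or distance_m > MAX_RANGE_M:
        return 0
    for level, bound in enumerate(LEVEL_BOUNDS_M, start=1):
        if distance_m <= bound:
            return level
    return 0  # unreachable given the range check above

# e.g. a merging-section start point 250 m ahead is stored as level 2;
# "no point within range" is stored as 0.
```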
 Furthermore, the position information may include information indicating which of the lanes of the road being traveled is a merging lane, a branching lane, a construction lane, a reduced lane, or a lane where an accident has occurred.
 Note that the travel history shown in FIG. 27 is merely an example, and the present invention is not limited thereto. For example, when the side-lane information described above is information on the right side lane, the travel history may further include “left side lane information” for the opposite side.
 “Left side lane information” includes information on a left rear vehicle traveling behind the host vehicle in the left side lane, information on a left front vehicle traveling ahead of the host vehicle in the left side lane, and the remaining left-side-lane length DRda of the host vehicle.
 The left rear vehicle information includes the relative speed Vfa of the left rear vehicle with respect to the host vehicle, the inter-vehicle-head distance Dfa between the left rear vehicle and the host vehicle, and the rate of change Rfa of that distance. The inter-vehicle-head distance Dfa between the left rear vehicle and the host vehicle is the distance, measured along the traveling direction of the host vehicle (and the left rear vehicle), between the front end (head) of the host vehicle and the front end (head) of the left rear vehicle.
 Here, the relative speed Vfa and the inter-vehicle-head distance Dfa are detected by a sensor, radar, or the like. The rate of change Rfa of the inter-vehicle-head distance is calculated by the relational expression Rfa = Vfa / Dfa.
 Similarly, the left front vehicle information includes the relative speed Vga of the left front vehicle with respect to the host vehicle, the inter-vehicle-head distance Dga between the left front vehicle and the host vehicle, and the rate of change Rga of that distance. The inter-vehicle-head distance Dga between the left front vehicle and the host vehicle is the distance, measured along the traveling direction of the host vehicle (and the left front vehicle), between the front end (head) of the host vehicle and the front end (head) of the left front vehicle.
 Here, the relative speed Vga and the inter-vehicle-head distance Dga are detected by a sensor, radar, or the like. The rate of change Rga of the inter-vehicle-head distance is calculated by the relational expression Rga = Vga / Dga.
 Although the case of left-hand traffic has been described here, the same processing can be applied to right-hand traffic by swapping left and right.
 The travel history shown in FIG. 27 may also include “rear vehicle information” on a rear vehicle traveling behind the host vehicle in the travel lane.
 The rear vehicle information includes the relative speed Vea of the rear vehicle with respect to the host vehicle, the inter-vehicle-head distance Dea between the rear vehicle and the host vehicle, and the rate of change Rea of that distance. The inter-vehicle-head distance Dea between the rear vehicle and the host vehicle is the distance, measured along the traveling direction of the host vehicle (and the rear vehicle), between the front end (head) of the host vehicle and the front end (head) of the rear vehicle.
 Here, the relative speed Vea and the inter-vehicle-head distance Dea are detected by a sensor, radar, or the like. The rate of change Rea of the inter-vehicle-head distance is calculated by the relational expression Rea = Vea / Dea.
 When the inter-vehicle-head distance cannot be measured, for example because the other vehicle is hidden behind a moving body, a measurable inter-vehicle gap, or an approximate value obtained by adding a predetermined vehicle length to that gap, may be used in place of the inter-vehicle-head distance; alternatively, the value may be calculated by adding, to the inter-vehicle gap, the vehicle length of the recognized vehicle type. Moreover, regardless of whether the inter-vehicle-head distance can be measured, the same substitutes may be used: a measurable inter-vehicle gap, an approximate value obtained by adding a predetermined vehicle length to the gap, or the gap plus the vehicle length of the recognized vehicle type.
 The travel history may include various other information on the vehicle's travel environment. For example, it may include the size or type of the preceding vehicle, a side vehicle, or a merging vehicle, or their positions relative to the host vehicle. For example, the type of a vehicle approaching from behind may be recognized with a camera sensor, and when that vehicle is an emergency vehicle, information indicating so may be included. This makes it possible to notify that the information notification is for responding to an emergency vehicle. Alternatively, the travel history may include values indicating, in steps, the steering wheel, brake, and accelerator operation amounts, or passenger information, as described with reference to FIG. 22.
 As the driver's travel history, behaviors selected during automatic driving may be aggregated, or behaviors actually performed by the driver during manual driving may be aggregated. This enables the collection of travel histories according to the driving state, such as automatic driving or manual driving.
 In the example of FIG. 27, the environmental parameters included in the travel history indicate the travel environment at the time the vehicle behavior was presented to the driver, but they may instead indicate the travel environment at the time the driver selected the behavior. Alternatively, the travel history may include both: environmental parameters indicating the travel environment when the behavior was presented to the driver, and environmental parameters indicating the travel environment when the driver selected the behavior.
 Furthermore, when the vehicle control unit 7 generates the overhead views shown in (a) of FIG. 2, (a) of FIG. 5, (a) of FIG. 6, (a) of FIG. 7, (a) of FIG. 8, (a) of FIG. 9, and (a) of FIG. 10, or the display shown in (c) of FIG. 14, it may proceed as follows. That is, it may generate, as notification information, at least one of the following: information on the high-contribution environmental parameter that caused the first behavior and the second behavior to be selected, and information related to that environmental parameter (for example, an icon), and cause the notification unit 92 to give notice of the notification information, for example by showing it on the overhead view.
 In this case, for example, if the contribution of the inter-vehicle distance DRba between the preceding vehicle and the host vehicle, or of the rate of change RSb of the size of the preceding vehicle, is high, the vehicle control unit 7 may display, between the preceding vehicle and the host vehicle in the overhead view, a region with raised luminance or changed color, causing the notification unit 92 to give notice of the notification information.
 The vehicle control unit 7 may also display, as notification information, an icon in the region between the preceding vehicle and the host vehicle indicating that the contribution of the inter-vehicle distance DRba or the rate of change RSb is high. Furthermore, the vehicle control unit 7 may cause the notification unit 92 to draw, as notification information, a line segment connecting the preceding vehicle and the host vehicle on the overhead view, or to draw line segments connecting all surrounding vehicles and the host vehicle and emphasize the line segment connecting the preceding vehicle and the host vehicle.
 Instead of an overhead view, the vehicle control unit 7 may realize an AR (Augmented Reality) display by causing the notification unit 92 to display, as notification information in the viewpoint image seen by the driver, a region between the preceding vehicle and the host vehicle whose luminance is raised above that of the surrounding region or whose color differs from the surrounding region. The vehicle control unit 7 may also cause the notification unit 92 to AR-display, as notification information in the viewpoint image, an icon indicating the high-contribution environmental parameter in the region between the preceding vehicle and the host vehicle.
 Furthermore, the vehicle control unit 7 may AR-display, as notification information in the viewpoint image, a line segment connecting the preceding vehicle and the host vehicle, or AR-display line segments connecting all surrounding vehicles and the host vehicle and emphasize the line segment connecting the preceding vehicle and the host vehicle.
 The method of giving notice of a high-contribution environmental parameter, or of information related to it, is not limited to the above. For example, the vehicle control unit 7 may generate, as notification information, an image in which the preceding vehicle that is the subject of the high-contribution environmental parameter is highlighted, and cause the notification unit 92 to display it.
 The vehicle control unit 7 may also generate, as notification information in the overhead view or AR display, information indicating the direction of the preceding vehicle or other object that is the subject of the high-contribution environmental parameter, and display that information on or around the host vehicle.
 Also, for example, instead of directly giving notice of the information on the high-contribution environmental parameter or information related to it, the vehicle control unit 7 may make the subjects of low-contribution environmental parameters inconspicuous, for example by lowering the display luminance of such preceding vehicles, thereby generating as notification information the now relatively conspicuous information on the high-contribution environmental parameter or information related to it, and cause the notification unit 92 to display it.
 Next, construction of a driver model based on drivers' travel histories will be described. Driver models include a clustering type, constructed by clustering the travel histories of a plurality of drivers, and an individually adapted type, in which a driver model for a specific driver (for example, driver x) is constructed from a plurality of travel histories similar to the travel history of driver x.
 First, the clustering type will be described. In the clustering-type driver model construction method, drivers' travel histories such as the one shown in FIG. 27 are aggregated in advance for each driver. A driver model is then constructed by grouping a plurality of drivers whose travel histories are highly similar to one another, that is, a plurality of drivers with similar driving operation tendencies.
 The similarity between travel histories can be determined, for example, when the behaviors in the travel histories of driver a and driver b are quantified according to a predetermined rule, from the correlation value of vectors whose elements are the environmental parameter values and the behavior values. In this case, for example, when the correlation value calculated from the travel histories of driver a and driver b is higher than a predetermined value, the travel histories of driver a and driver b are placed in one group. The calculation of the similarity is not limited to this.
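The grouping step described above can be sketched as follows: each driver's travel history is flattened into a vector of environmental-parameter values and quantified behavior values, and two drivers are grouped when the correlation of their vectors exceeds a predetermined value. Using Pearson correlation and the threshold 0.8 are illustrative assumptions; the patent only requires "a correlation value higher than a predetermined value".

```python
from statistics import mean

def correlation(u, v):
    """Pearson correlation of two equal-length numeric vectors."""
    mu, mv = mean(u), mean(v)
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den if den else 0.0

def same_group(history_a, history_b, threshold=0.8):
    """Group driver a and driver b when their history vectors correlate strongly."""
    return correlation(history_a, history_b) > threshold

# e.g. two drivers with near-identical tendencies fall into one group,
# while drivers with opposite tendencies do not.
a = [1, 2, 1, 3, 2, 1]  # env-parameter levels + quantified behaviors
b = [1, 2, 1, 3, 2, 2]
```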
 Next, the individually adapted type will be described. As with the clustering type, the individually-adapted driver model construction method aggregates in advance the travel histories of a plurality of drivers as shown in FIG. 27. The point of difference from the clustering type is that a driver model is constructed per driver. For example, when a driver model is constructed for driver y, the travel history of driver y is compared with the travel histories of a plurality of other drivers, and the travel histories of the drivers with high similarity are extracted. The individually adapted driver model for driver y is then constructed from the extracted travel histories of those drivers.
 Note that the driver model (situation database) based on the travel history shown in FIG. 27 is not limited to the clustering type or the individually adapted type; for example, it may be configured to include the travel histories of all drivers.
 Here, the use of the constructed driver model will be described with an example. In the following example, a driver model that aggregates the travel histories of four drivers a to d is used for driver x. The driver model is constructed by the vehicle control unit 7.
 FIG. 28 is a diagram showing how the driver model is used in this modification. (a) of FIG. 28 shows the environmental parameters indicating the current travel environment of the vehicle driven by driver x. (b) of FIG. 28 is an example of a driver model for driver x.
 As shown in (a) of FIG. 28, the behavior (operation) for the environmental parameters indicating the current travel environment is blank. The vehicle control unit 7 acquires the environmental parameters at predetermined intervals and, with any one of them as a trigger, determines the next behavior from the driver model shown in (b) of FIG. 28.
 As the trigger, an environmental parameter indicating that a change in the vehicle's operation is required may be used, for example the distance to the start point of a merging section falling to a predetermined distance or less, or the relative speed with respect to the preceding vehicle falling to a predetermined value or less.
 The vehicle control unit 7 compares the environmental parameters shown in (a) of FIG. 28 with the environmental parameters of each travel history of the driver model shown in (b) of FIG. 28, and determines the behavior associated with the most similar environmental parameters to be the first behavior. In addition, some behaviors associated with other similar environmental parameters are determined to be second behaviors.
 Whether environmental parameters are similar can be determined from the correlation value of vectors whose elements are the environmental parameter values. For example, when the correlation value calculated from a vector whose elements are the environmental parameter values shown in (a) of FIG. 28 and a vector whose elements are the environmental parameter values shown in (b) of FIG. 28 is higher than a predetermined value, these environmental parameters are determined to be similar. The method of determining whether environmental parameters are similar is not limited to this.
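The judgment described above can be sketched as follows: the current environment vector is compared with the environment vector of each history entry; the behavior of the best-matching entry becomes the first behavior, and the behaviors of the remaining entries whose similarity exceeds the predetermined value become second behaviors. The correlation measure, the threshold 0.7, and the sample data are illustrative assumptions.

```python
from statistics import mean

def correlation(u, v):
    """Pearson correlation of two equal-length numeric vectors."""
    mu, mv = mean(u), mean(v)
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den if den else 0.0

def judge_behaviors(current_env, history, threshold=0.7):
    """history: list of (env_vector, behavior). Returns (first, second_behaviors)."""
    scored = sorted(history, key=lambda e: correlation(current_env, e[0]),
                    reverse=True)
    _, first = scored[0]
    seconds = [b for env, b in scored[1:]
               if correlation(current_env, env) > threshold and b != first]
    return first, seconds

# Hypothetical situation database: environment vectors paired with behaviors.
history = [([1, 2, 3, 1], "decelerate"),
           ([1, 2, 3, 2], "lane change"),
           ([3, 1, 1, 3], "accelerate")]
first, seconds = judge_behaviors([1, 2, 3, 1], history)
```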
 For example, the behavior is determined here based on the similarity of environmental parameters, but it is also possible to first create a group of highly similar environmental parameters, take statistics of the environmental parameters within that group, and determine the behavior from the statistical data.
 In this way, by constructing an individual driver's driver model in advance from the travel histories of a plurality of drivers, a behavior better suited to the driver can be notified. In order to register safer travel histories in the database, the storage unit 8 may store information indicating a criterion for safe driving, the vehicle control unit 7 may determine whether a travel history satisfies this criterion, and the vehicle control unit 7 may then register in the database travel histories that satisfy the criterion and not register those that do not.
 Furthermore, by associating the parameters indicating the travel environment with the behavior, the vehicle control unit 7 can accurately determine the next behavior without determining the specific travel environment, that is, without labeling the travel environment.
 The driver model (situation database) may be constructed from a travel history that associates a behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment at the time the behavior was presented. Alternatively, the driver model (situation database) may be constructed from a travel history that associates a behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment at the time the vehicle performed the behavior.
 When the environmental parameters indicate the travel environment at the time the vehicle performed the behavior selected by the driver, the following may be done. That is, environmental parameters indicating a future travel environment are predicted from the environmental parameters indicating the current travel environment, and, among the environmental parameters indicating the travel environment at the time the vehicle performed the driver-selected behavior, the behavior associated with the parameters most similar to the predicted parameters may be determined to be the first behavior, and some behaviors associated with other similar parameters may be determined to be second behaviors.
 The above prediction is performed, for example, by extrapolating the environmental parameters at a future time from the environmental parameters indicating the travel environment at the current time and at a time before the current time.
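The extrapolation mentioned above can be sketched as follows: from the value of an environmental parameter at a past time and at the current time, its value at a future time is extrapolated. Linear extrapolation per parameter is an illustrative assumption; the patent does not specify the method.

```python
def extrapolate(t_prev, x_prev, t_now, x_now, t_future):
    """Linearly extrapolate one environmental parameter to t_future."""
    slope = (x_now - x_prev) / (t_now - t_prev)
    return x_now + slope * (t_future - t_now)

# e.g. a headway to the preceding vehicle shrinking from 60 m to 50 m over 1 s
# is predicted to be 40 m one second later.
predicted = extrapolate(0.0, 60.0, 1.0, 50.0, 2.0)  # 40.0
```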
 Alternatively, the driver model (situation database) may be constructed from both of the following: a travel history that associates a behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment at the time the behavior was presented, and a travel history that associates a behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment at the time the vehicle performed the behavior.
 In this case, for example, both travel histories are stored in a format such as that shown in (b) of FIG. 28, and the vehicle control unit 7 determines the next behavior from them. Here, the vehicle control unit 7 may set a priority between the two and, for example, preferentially determine the next behavior from the travel history that associates the behavior selected by the driver during automatic driving with the environmental parameters indicating the travel environment at the time the vehicle performed the behavior.
 In the present invention, a server device such as a cloud server may be caused to execute functions similar to those executed by the vehicle control unit 7. In particular, since the storage unit 8 holds an enormous amount of data as travel histories accumulate, it may reside in a server device such as a cloud server rather than in the vehicle 1. Alternatively, the storage unit 8 may store an already constructed driver model, and the vehicle control unit 7 may determine the behavior by referring to the driver model stored in the storage unit 8.
 In a configuration in which the storage unit 8 is provided in a cloud server, it is desirable to provide a cache in case the storage unit 8 becomes inaccessible due to a drop in communication speed, a communication outage, or the like.
 FIG. 29 is a block diagram showing an example of cache arrangement. The vehicle control unit 7 stores the travel history in the storage unit 8 through the communication unit 291, and causes the cache 292 to hold part of the driver model (situation database) stored in the storage unit 8 through the communication unit 291.
 The vehicle control unit 7 accesses the driver model in the cache 292. Conceivable methods for creating the cache at this point include a method of limiting by the presence or absence of environmental parameters, a method using position information, and a method of processing the data. Each will be described below.
 First, the method of limiting by the presence or absence of environmental parameters will be described. Extracting a similar situation by comparing surrounding conditions is possible if there are sufficiently many travel environments (situations) in which only the same environmental parameters are present. Accordingly, the vehicle control unit 7 extracts, from the travel environments stored in the storage unit 8, the travel environments having only the same environmental parameters, sorts them, and holds them in the cache 292.
 Here, the vehicle control unit 7 updates the primary cache at the timing when the environmental parameters obtained from the detected situation change. By doing so, the vehicle control unit 7 can extract similar surrounding conditions even if the communication speed drops. The environmental parameters checked for change may be all of the environmental parameters listed above, or only some of them.
 さらに、この環境パラメータは刻一刻と変化するため、キャッシュ292内に一次キャッシュおよび二次キャッシュを用意しても良い。例えば、車両制御部7は、同じ環境パラメータを持つ走行環境を一次キャッシュに保持する。さらに、車両制御部7は、環境パラメータが一時キャッシュに保持された走行環境に一つ追加された状態にある走行環境、および、環境パラメータが一時キャッシュに保持された走行環境から一つ削減された状態にある走行環境の少なくとも一方を二次キャッシュに保持する。 Furthermore, since these environmental parameters change from moment to moment, a primary cache and a secondary cache may be prepared in the cache 292. For example, the vehicle control unit 7 holds driving environments having the same environmental parameters in the primary cache. Further, the vehicle control unit 7 holds in the secondary cache at least one of: driving environments in which one environmental parameter has been added to a driving environment held in the primary cache, and driving environments in which one environmental parameter has been removed from a driving environment held in the primary cache.
 このようにすることで、車両制御部7は、一時的な通信断が発生しても、キャッシュ292のデータのみで、似た状況を抽出することが可能になる。 In this way, the vehicle control unit 7 can extract a similar situation using only the data in the cache 292 even if a temporary communication interruption occurs.
 図30を使って、この場合についてさらに具体的に説明する。センサ62により、自車両301の周囲に側前方車両302のみが存在している周囲状況303が検出されたとき、車両制御部7は、側前方車両302のみが存在している走行環境(同一の環境パラメータのみ存在する走行環境)を、全ての走行環境(シチュエーション)が記憶された記憶部8から抽出し、一次キャッシュ304に格納させる。 This case will be described more concretely with reference to FIG. 30. When the sensor 62 detects a surrounding situation 303 in which only the side front vehicle 302 exists around the host vehicle 301, the vehicle control unit 7 extracts driving environments in which only the side front vehicle 302 exists (driving environments in which only the same environmental parameters exist) from the storage unit 8, in which all driving environments (situations) are stored, and stores them in the primary cache 304.
 さらに、車両制御部7は、側前方車両302以外の車が1台だけ追加された走行環境(同一の環境パラメータに1つの環境パラメータが追加された状態にある走行環境)もしくは、側前方車両302のいない走行環境(同一の環境パラメータから1つの環境パラメータが削減された状態にある走行環境)を記憶部8から抽出し、二次キャッシュ305に格納させる。 Furthermore, the vehicle control unit 7 extracts from the storage unit 8 driving environments in which exactly one vehicle other than the side front vehicle 302 has been added (driving environments in which one environmental parameter has been added to the same environmental parameters), or driving environments without the side front vehicle 302 (driving environments in which one environmental parameter has been removed from the same environmental parameters), and stores them in the secondary cache 305.
 そして、センサ62により検出された周囲状況303が変わったとき、車両制御部7は、変わった周囲状況303に対応する走行環境を二次キャッシュ305から一次キャッシュ304にコピーし、変わった周囲状況303に対応する走行環境に対し、環境パラメータが一つ追加された走行環境、及び、環境パラメータが一つ削減された走行環境を記憶部8から抽出し、二次キャッシュ305に格納することで、二次キャッシュ305を更新する。これにより、車両制御部7は、周囲状況の比較により似た周囲状況をスムーズに抽出することが可能になる。 Then, when the surrounding situation 303 detected by the sensor 62 changes, the vehicle control unit 7 copies the driving environment corresponding to the changed surrounding situation 303 from the secondary cache 305 to the primary cache 304, extracts from the storage unit 8 the driving environments in which one environmental parameter has been added to, and those in which one environmental parameter has been removed from, the driving environment corresponding to the changed surrounding situation 303, and stores them in the secondary cache 305, thereby updating the secondary cache 305. As a result, the vehicle control unit 7 can smoothly extract similar surrounding situations by comparing surrounding situations.
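 The two-level cache scheme above can be illustrated with a minimal sketch. All names, the parameter universe, and the data layout below are assumptions for illustration only; the patent does not specify an implementation. A situation is keyed here by the set of environmental parameters it contains; the primary cache holds situations with exactly the current parameter set, and the secondary cache holds situations whose parameter set differs by one added or one removed parameter.

```python
# Hypothetical sketch of the primary/secondary cache described above.
# Situations are keyed by frozensets of environmental parameter names.

def neighbor_keys(params):
    """Parameter sets one addition or one removal away from `params`.
    Additions are drawn from an assumed universe of known parameters."""
    universe = {"front vehicle", "side front vehicle",
                "merging lane", "rear vehicle"}
    added = [params | {p} for p in universe - params]
    removed = [params - {p} for p in params]
    return added + removed

def refresh_caches(storage, current_params):
    """storage: dict mapping frozenset(params) -> list of situations.
    Returns (primary cache, secondary cache) for the current parameters."""
    current = frozenset(current_params)
    primary = list(storage.get(current, []))
    secondary = []
    for key in neighbor_keys(current):
        secondary.extend(storage.get(key, []))
    return primary, secondary
```

When the detected parameter set changes, calling `refresh_caches` again with the new set corresponds to promoting the matching secondary-cache entries and refilling the secondary cache from storage.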
 次に、位置情報を用いる方法について説明する。環境パラメータに位置情報が含まれている場合、車両制御部7は、その位置情報により示される位置が自車位置を中心とする一定の範囲内に含まれる走行環境(シチュエーション)を記憶部8から抽出し、キャッシュ292に格納させることが可能となる。 Next, the method using position information will be described. When position information is included in the environmental parameters, the vehicle control unit 7 can extract from the storage unit 8 the driving environments (situations) whose positions, indicated by the position information, fall within a certain range centered on the own-vehicle position, and store them in the cache 292.
 この場合、車両制御部7は、走行環境に対応する位置情報により示される位置が上記一定の範囲から外れたときに、キャッシュ292の更新を行う。このようにすることで、車両制御部7は、長期間の通信断が発生しても位置が一定の範囲内であれば似た周囲状況を抽出することが可能になる。 In this case, the vehicle control unit 7 updates the cache 292 when the position indicated by the position information corresponding to the traveling environment is out of the certain range. By doing so, the vehicle control unit 7 can extract a similar ambient situation as long as the position is within a certain range even if communication is interrupted for a long time.
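 A minimal sketch of this position-based cache follows. The function names, the flat-plane distance, and the radius value are assumptions for illustration; the patent leaves the range definition unspecified.

```python
import math

RADIUS_M = 1000.0  # assumed cache radius in meters

def within_range(center, point, radius=RADIUS_M):
    """True if `point` lies within `radius` of `center` (flat-plane)."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return math.hypot(dx, dy) <= radius

def build_position_cache(storage, vehicle_pos):
    """storage: iterable of (position, situation) pairs.
    Keep only situations recorded within range of the vehicle."""
    return [(pos, s) for pos, s in storage
            if within_range(vehicle_pos, pos)]

def maybe_update_cache(cache_center, vehicle_pos, storage):
    """Rebuild the cache only when the vehicle leaves the cached range;
    return None while the existing cache is still valid."""
    if within_range(cache_center, vehicle_pos):
        return None
    return build_position_cache(storage, vehicle_pos)
```

Because the cache stays valid anywhere inside the radius, similar situations remain extractable during a prolonged communication outage, matching the behavior described above.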
 さらに、データを加工する方法について説明する。記憶部8には環境パラメータを含む操作履歴が蓄積されている。車両制御部7は、各々の環境パラメータを一定の範囲毎に分割し、多次元空間上でメッシュを作成する。そして、車両制御部7は、各々のメッシュに含まれる挙動をその種別ごとにカウントしたテーブルを作成する。 Furthermore, the method of processing data will be described. The storage unit 8 accumulates operation histories including environmental parameters. The vehicle control unit 7 divides each environmental parameter into fixed ranges, thereby creating a mesh in a multidimensional space. The vehicle control unit 7 then creates a table in which the behaviors contained in each mesh are counted by type.
 例えば、使用する環境パラメータを二つに限定して説明する。車両制御部7は、操作履歴に含まれる環境パラメータを図31の(a)のように平面状にマッピングし、これらの各々の軸を一定の範囲で分割することで、平面を複数のブロックに分ける。これをメッシュと呼ぶ。 For example, the description here is limited to two environmental parameters. The vehicle control unit 7 maps the environmental parameters included in the operation history onto a plane as shown in (a) of FIG. 31, and divides each axis into fixed ranges, thereby dividing the plane into a plurality of blocks. Each of these blocks is called a mesh.
 車両制御部7は、各々のメッシュの中に含まれる挙動の個数をその種別(例えば、加速、減速、車線変更、追い越しなどの種別)ごとにカウントする。図31の(b)に各メッシュの中に含まれる挙動の個数をその種別ごとにカウントしたテーブルを示す。 The vehicle control unit 7 counts the number of behaviors included in each mesh for each type (for example, types such as acceleration, deceleration, lane change, and overtaking). FIG. 31B shows a table in which the number of behaviors included in each mesh is counted for each type.
 車両制御部7は、この内容をキャッシュ292に保持する。そして、車両制御部7は、周囲状況の比較により似た周囲状況の抽出を行う際に、検出された環境パラメータがどのメッシュに位置するかを判別し、判別したメッシュの中に含まれる挙動のうち個数が最大である挙動を選択し、選択された挙動を報知する挙動に決定する。 The vehicle control unit 7 holds this content in the cache 292. Then, when extracting a similar surrounding situation by comparing surrounding situations, the vehicle control unit 7 determines in which mesh the detected environmental parameters are located, selects the behavior with the largest count among the behaviors contained in the determined mesh, and decides on the selected behavior as the behavior to be notified.
 例えば、車両制御部7は、検出された環境パラメータがメッシュの3番に位置すると判別したとき、3番のメッシュの中に含まれる挙動のうち最大個数を示す挙動(ここでは「加速」)の操作を報知する挙動に決定する。この方法であれば、キャッシュ292の更新タイミングはいつでもよく、キャッシュ292の容量も一定とすることができる。 For example, when the vehicle control unit 7 determines that the detected environmental parameters are located in mesh No. 3, it decides to notify the behavior with the largest count among the behaviors contained in mesh No. 3 (here, "acceleration"). With this method, the cache 292 may be updated at any time, and the capacity of the cache 292 can be kept constant.
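 The mesh-based table lookup above can be sketched as follows. The bin width, parameter values, and behavior labels are illustrative assumptions; only the quantize-count-select structure comes from the description above.

```python
from collections import defaultdict

BIN_WIDTH = 10.0  # assumed quantization step for both parameters

def mesh_id(param_a, param_b, width=BIN_WIDTH):
    """Map a pair of environmental parameters to a mesh cell."""
    return (int(param_a // width), int(param_b // width))

def build_mesh_table(operation_history):
    """operation_history: iterable of (param_a, param_b, behavior).
    Count behaviors by type within each mesh cell."""
    table = defaultdict(lambda: defaultdict(int))
    for a, b, behavior in operation_history:
        table[mesh_id(a, b)][behavior] += 1
    return table

def most_frequent_behavior(table, param_a, param_b):
    """Select the behavior with the largest count in the cell that
    contains the detected parameters; None if the cell is empty."""
    counts = table.get(mesh_id(param_a, param_b))
    if not counts:
        return None
    return max(counts.items(), key=lambda kv: kv[1])[0]

history = [
    (12.0, 3.0, "accelerate"),
    (14.0, 7.0, "accelerate"),
    (13.0, 5.0, "decelerate"),
    (31.0, 8.0, "lane change"),
]
table = build_mesh_table(history)
print(most_frequent_behavior(table, 15.0, 9.0))  # → accelerate
```

Because the table stores only per-cell counts rather than raw histories, its size is bounded by the number of cells, which reflects why the cache capacity can be kept constant.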
 これらの方法を一つもしくは複数を組み合わせることでキャッシュを作成する。ただし、上に挙げた方法は一例であり、キャッシュの作成方法はこの限りではない。 A cache is created by using one of these methods or by combining several of them. However, the methods described above are merely examples, and the method of creating the cache is not limited to these.
 このように、実施の形態4のドライバモデル拡張の例では、以下のようにすることとした。すなわち、車両制御部7が、過去の走行環境の情報を含む運転者の運転特性を示す特徴量の情報を取得し、記憶部8がその特徴量の情報を記憶し、車両の挙動を変更する必要があると判定された場合、車両制御部7が記憶部8に記憶された特徴量の情報の中から、新たに取得した走行環境の情報を含む運転者の運転特性を示す特徴量に類似する情報を決定し、決定された情報に対応する挙動を報知することとした。 As described above, the driver model extension example of the fourth embodiment operates as follows. That is, the vehicle control unit 7 acquires feature amount information indicating the driving characteristics of the driver, including information on past driving environments, and the storage unit 8 stores that feature amount information. When it is determined that the behavior of the vehicle needs to be changed, the vehicle control unit 7 determines, from among the feature amount information stored in the storage unit 8, information similar to the feature amount indicating the driving characteristics of the driver that includes the newly acquired driving environment information, and notifies the behavior corresponding to the determined information.
 また、実施の形態4のドライバモデル拡張の例では、過去の走行環境の情報を含む運転者の運転特性を示す特徴量の情報は、運転者に車両の挙動を提示した際の特徴量の情報、および、運転者が挙動の選択を行った際の特徴量の情報の少なくとも1つであることとした。 Further, in the driver model extension example of the fourth embodiment, the feature amount information indicating the driving characteristics of the driver, including information on past driving environments, is at least one of the feature amount information obtained when the behavior of the vehicle was presented to the driver and the feature amount information obtained when the driver selected a behavior.
 また、実施の形態4のドライバモデル拡張の例では、過去の走行環境の情報を含む運転者の運転特性を示す特徴量の情報が、運転者に車両の挙動を提示した際の特徴量の情報、および、運転者が挙動の選択を行った際の特徴量の情報の両方である場合、それら両方の特徴量の情報の中から、新たに取得した走行環境の情報を含む運転者の運転特性を示す特徴量に類似する情報を決定し、決定された情報に対応する挙動を報知することとした。 Further, in the driver model extension example of the fourth embodiment, when the feature amount information indicating the driving characteristics of the driver, including information on past driving environments, comprises both the feature amount information obtained when the behavior of the vehicle was presented to the driver and the feature amount information obtained when the driver selected a behavior, information similar to the feature amount indicating the driving characteristics of the driver that includes the newly acquired driving environment information is determined from both sets of feature amount information, and the behavior corresponding to the determined information is notified.
 また、実施の形態4のドライバモデル拡張の例では、以下のようにした。すなわち、過去の走行環境の情報を含む運転者の運転特性を示す特徴量の情報が、運転者に車両の挙動を提示した際の特徴量の情報、および、運転者が挙動の選択を行った際の特徴量の情報の両方である場合、運転者が挙動の選択を行った際の特徴量の情報の中から優先的に、新たに取得した走行環境の情報を含む運転者の運転特性を示す特徴量に類似する情報を決定し、決定された情報に対応する挙動を報知することとした。 In the driver model extension example of the fourth embodiment, the following is also performed. That is, when the feature amount information indicating the driving characteristics of the driver, including information on past driving environments, comprises both the feature amount information obtained when the behavior of the vehicle was presented to the driver and the feature amount information obtained when the driver selected a behavior, information similar to the feature amount indicating the driving characteristics of the driver that includes the newly acquired driving environment information is determined preferentially from the feature amount information obtained when the driver selected a behavior, and the behavior corresponding to the determined information is notified.
 また、実施の形態4のドライバモデル拡張の例では、過去の走行環境の情報を含む運転者の運転特性を示す特徴量の情報が、車両の自動運転時、および手動運転時のいずれか一方または両方の運転者の運転特性を示す特徴量の情報であることとした。 Further, in the driver model extension example of the fourth embodiment, the feature amount information indicating the driving characteristics of the driver, including information on past driving environments, is feature amount information indicating the driving characteristics of the driver during automatic driving of the vehicle, during manual driving, or during both.
 以上により、車両制御部7は、運転者の運転傾向により適したドライバモデルを構築でき、構築したドライバモデルに基づいて、運転者に対してより適切な自動運転を行うことができる。走行環境を示すパラメータと挙動とが対応づけられることにより、具体的な走行環境を判定する処理を要することなく、つまり、走行環境のラベリングを行うことなく、精度よく次の挙動を判定できる。 As described above, the vehicle control unit 7 can construct a driver model more suitable for the driving tendency of the driver, and can perform more appropriate automatic driving for the driver based on the constructed driver model. By associating the parameter indicating the driving environment with the behavior, it is possible to accurately determine the next behavior without requiring processing for determining a specific driving environment, that is, without labeling the driving environment.
 (実施の形態5)
 近年、自動車の自動運転に関する開発が進められている。自動運転について、NHTSA(National Highway Traffic Safety Administration)が2013年に定義した自動化レベルは、自動化なし(レベル0)、特定機能の自動化(レベル1)、複合機能の自動化(レベル2)、半自動運転(レベル3)、完全自動運転(レベル4)に分類される。レベル1は加速・減速・操舵の内、1つを自動的に行う運転支援システムであり、レベル2は加速・減速・操舵の内、2つ以上を調和して自動的に行う運転支援システムである。いずれの場合も運転者による運転操作の関与が残る。レベル4は加速・減速・操舵の全てを自動的に行う完全自動走行システムであり、運転者が運転操作に関与しない。レベル3は加速・減速・操舵の全てを自動的に行うが、必要に応じて運転者が運転操作を行う準完全自動走行システムである。
(Embodiment 5)
 In recent years, development related to automatic driving of automobiles has been advancing. The automation levels defined for automatic driving by the NHTSA (National Highway Traffic Safety Administration) in 2013 are classified into: no automation (level 0), automation of specific functions (level 1), automation of combined functions (level 2), semi-automatic driving (level 3), and fully automatic driving (level 4). Level 1 is a driving support system that automatically performs one of acceleration, deceleration, and steering; level 2 is a driving support system that automatically performs two or more of acceleration, deceleration, and steering in coordination. In both cases, the driver remains involved in the driving operation. Level 4 is a fully automatic driving system that automatically performs all of acceleration, deceleration, and steering, in which the driver is not involved in the driving operation. Level 3 is a quasi-fully automatic driving system that automatically performs all of acceleration, deceleration, and steering, but in which the driver performs driving operations when necessary.
 以下の実施の形態では主に、車両の自動運転に関する情報を車両の乗員(例えば運転者)との間でやり取りするためのHMI(Human Machine Interface)を制御する装置(以下「運転支援装置」とも呼ぶ。)を提案する。以下の説明における車両の「行動」は、実施の形態1~4の説明における車両の「挙動」に対応し、自動運転または手動運転において、車両の走行中または停止時の操舵や制動などの作動状態、もしくは自動運転制御に係る制御内容を含む。「行動」は、例えば、定速走行、加速、減速、一時停止、停止、車線変更、進路変更、右左折、駐車などである。 The following embodiments mainly propose a device (hereinafter also referred to as a "driving assistance device") that controls an HMI (Human Machine Interface) for exchanging information on the automatic driving of a vehicle with a vehicle occupant (for example, the driver). The "action" of the vehicle in the following description corresponds to the "behavior" of the vehicle in the description of the first to fourth embodiments, and includes, in automatic driving or manual driving, operating states such as steering and braking while the vehicle is traveling or stopped, as well as control contents related to automatic driving control. "Actions" include, for example, constant-speed traveling, acceleration, deceleration, temporary stopping, stopping, lane change, course change, right or left turn, and parking.
 特に、実施の形態5では、実施の形態4において説明した個別適応型のドライバモデルに関して、次の行動の推定精度をさらに向上させるための処理を説明する。実施の形態4は、運転者ごとの走行履歴を収集してから、各運転者の操作頻度分布を解析することによって、対象となる運転者の走行履歴に類似した他の運転者の走行履歴を選択し、選択した走行履歴をもとにドライバモデルを生成する。つまり、運転者ごとにグルーピングすることにより個人に適応したドライバモデルが生成されている。 In particular, the fifth embodiment describes processing for further improving the accuracy of estimating the next action with respect to the individually adaptive driver model described in the fourth embodiment. In the fourth embodiment, after collecting the travel history of each driver, the operation frequency distribution of each driver is analyzed to select travel histories of other drivers similar to the travel history of the target driver, and a driver model is generated based on the selected travel histories. That is, a driver model adapted to the individual is generated by grouping for each driver.
 一方、実施の形態5では、運転者の運転行動が、同乗者の有無や同乗者の状態によって変化する場合があることに着目する。例えば、同乗者がいなければ車線を変更するような状況でも、同乗者がいれば車線を変更せずに減速がなされる。これに対応するために、実施の形態5では、運転者と同乗者との組合せごとに操作履歴を収集し、対象となる組合せの走行履歴に類似した他の組合せの走行履歴を選択し、選択した走行履歴をもとにドライバモデルを生成する。つまり、実施の形態4において運転者ごとに実行していた処理を、運転者と同乗者との組合せごとに処理を実行することによって、処理の単位が細分化される。なお、行動推定の精度を向上させるためには、より多くの運転者の走行履歴が必要になり、処理の負荷が大きくなる。そのため、ここでは、クラウドサーバでの処理を前提にする。 On the other hand, the fifth embodiment focuses on the fact that the driving behavior of a driver may change depending on the presence or absence of passengers and on the passengers' states. For example, even in a situation where the driver would change lanes if no passenger were present, the driver decelerates without changing lanes when a passenger is present. To accommodate this, in the fifth embodiment, operation histories are collected for each combination of driver and passengers, travel histories of other combinations similar to the travel history of the target combination are selected, and a driver model is generated based on the selected travel histories. That is, the unit of processing is subdivided by executing, for each combination of driver and passengers, the processing that was executed for each driver in the fourth embodiment. Note that improving the accuracy of action estimation requires the travel histories of more drivers, which increases the processing load. For this reason, processing on a cloud server is assumed here.
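 The subdivision described here amounts to keying histories by a (driver, passenger set) pair instead of by driver alone, and then selecting similar combinations. The sketch below illustrates this; the frequency-vector representation, cosine similarity, and threshold are illustrative assumptions, not the patent's specified criterion.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse frequency vectors (dicts)."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def select_similar(histories, target_key, threshold=0.9):
    """histories: dict mapping (driver, frozenset(passengers)) ->
    {behavior: frequency}. Return the keys of combinations whose
    history resembles the target combination's (target included)."""
    target = histories[target_key]
    return [k for k, h in histories.items()
            if cosine(target, h) >= threshold]
```

A driver model for the target combination would then be built from the pooled histories of the returned keys, mirroring the per-driver procedure of the fourth embodiment at a finer granularity.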
 図32は、車両1000の構成を示すブロック図であり、自動運転に関する構成を示している。車両1000は、自動運転モードで走行可能であり、報知装置1002、入力装置1004、無線装置1008、運転操作部1010、検出部1020、自動運転制御装置1030、運転支援装置1040を備える。図32に示す各装置の間は、専用線あるいはCAN(Controller Area Network)等の有線通信で接続されてもよい。また、USB(Universal Serial Bus)、Ethernet(登録商標)、Wi-Fi(登録商標)、Bluetooth(登録商標)等の有線通信または無線通信で接続されてもよい。 FIG. 32 is a block diagram showing the configuration of the vehicle 1000, and shows the configuration related to automatic driving. The vehicle 1000 can travel in an automatic driving mode, and includes a notification device 1002, an input device 1004, a wireless device 1008, a driving operation unit 1010, a detection unit 1020, an automatic driving control device 1030, and a driving support device 1040. The devices shown in FIG. 32 may be interconnected by wired communication such as a dedicated line or CAN (Controller Area Network). They may also be connected by wired or wireless communication such as USB (Universal Serial Bus), Ethernet (registered trademark), Wi-Fi (registered trademark), or Bluetooth (registered trademark).
 車両1000は実施の形態1~4の車両1に対応する。報知装置1002は図1、図13の情報報知装置9に対応し、入力装置1004は図1の操作部51、図13の入力部102に対応し、検出部1020は図1、図13の検出部6に対応する。また自動運転制御装置1030と運転支援装置1040は、図1、図13の車両制御部7に対応する。以下、実施の形態1~4で説明済の構成の説明は適宜省略する。 Vehicle 1000 corresponds to vehicle 1 in the first to fourth embodiments. The notification device 1002 corresponds to the information notification device 9 in FIGS. 1 and 13, the input device 1004 corresponds to the operation unit 51 in FIG. 1 and the input unit 102 in FIG. 13, and the detection unit 1020 corresponds to the detection unit 6 in FIGS. 1 and 13. The automatic driving control device 1030 and the driving support device 1040 correspond to the vehicle control unit 7 in FIGS. 1 and 13. Hereinafter, description of configurations already described in the first to fourth embodiments is omitted as appropriate.
 報知装置1002は、車両1000の走行に関する情報を運転者に報知する。報知装置1002は、例えば、車内に設置されているカーナビゲーションシステム、ヘッドアップディスプレイ、センターディスプレイ、ステアリングホイール、ピラー、ダッシュボード、メータパネル周りなどに設置されているLED(Light Emitting Diode)などの発光体などのような情報を表示する表示部であってもよい。また、情報を音声に変換して運転者に報知するスピーカであってもよいし、あるいは、運転者が感知できる位置(例えば、運転者の座席、ステアリングホイールなど)に設けられる振動体であってもよい。また、報知装置1002は、これらの組み合わせであってもよい。入力装置1004は、乗員による操作入力を受け付けるユーザインタフェース装置である。例えば入力装置1004は、運転者が入力した自車の自動運転に関する情報を受け付ける。運転支援装置1040は、受け付けた情報を操作信号として運転支援装置1040に出力する。 The notification device 1002 notifies the driver of information related to the traveling of the vehicle 1000. The notification device 1002 may be, for example, a display unit that displays information, such as a car navigation system, head-up display, or center display installed in the vehicle, or a light emitter such as an LED (Light Emitting Diode) installed around the steering wheel, a pillar, the dashboard, or the meter panel. It may also be a speaker that converts information into sound and notifies the driver, or a vibrating body provided at a position the driver can sense (for example, the driver's seat or the steering wheel). The notification device 1002 may also be a combination of these. The input device 1004 is a user interface device that receives operation inputs from an occupant. For example, the input device 1004 receives information on the automatic driving of the host vehicle input by the driver, and outputs the received information to the driving support device 1040 as an operation signal.
 図33は、図32の車両1000の室内を模式的に示す。報知装置1002は、ヘッドアップディスプレイ(HUD)1002aであってもよく、センターディスプレイ1002bであってもよい。入力装置1004は、ステアリング1011に設けられた第1操作部1004aであってもよく、運転席と助手席との間に設けられた第2操作部1004bであってもよい。なお、報知装置1002と入力装置1004は一体化されてもよく、例えばタッチパネルディスプレイとして実装されてもよい。 FIG. 33 schematically shows the interior of the vehicle 1000 in FIG. The notification device 1002 may be a head-up display (HUD) 1002a or a center display 1002b. The input device 1004 may be the first operation unit 1004a provided on the steering 1011 or the second operation unit 1004b provided between the driver seat and the passenger seat. Note that the notification device 1002 and the input device 1004 may be integrated, and may be implemented as a touch panel display, for example.
 以下では言及しないが、図33に示すように、車両1000には自動運転に関する情報を音声にて乗員へ提示するスピーカ1006が設けられてもよい。この場合、運転支援装置1040は、自動運転に関する情報を示す画像を報知装置1002に表示させ、それとともに、またはそれに代えて、自動運転に関する情報を示す音声をスピーカ1006から出力させてもよい。 Although not mentioned below, as shown in FIG. 33, the vehicle 1000 may be provided with a speaker 1006 that presents information related to automatic driving to the occupant by voice. In this case, the driving support device 1040 may display an image indicating information related to automatic driving on the notification device 1002 and output a sound indicating information related to automatic driving from the speaker 1006 together with or instead of the information.
 図32に戻り、無線装置1008は、携帯電話通信システム、WMAN(Wireless Metropolitan Area Network)等に対応しており、車両1000外部の装置(図示せず)との無線通信を実行する。運転操作部1010は、ステアリング1011、ブレーキペダル1012、アクセルペダル1013、ウィンカスイッチ1014を備える。ステアリング1011は図1、図13のステアリングホイール5、ブレーキペダル1012は図1、図13のブレーキペダル2、アクセルペダル1013は図1、図13のアクセルペダル3、ウィンカスイッチ1014は図1、図13のウィンカレバー4に対応する。 Returning to FIG. 32, the wireless device 1008 supports mobile phone communication systems, WMAN (Wireless Metropolitan Area Network), and the like, and performs wireless communication with devices (not shown) outside the vehicle 1000. The driving operation unit 1010 includes a steering 1011, a brake pedal 1012, an accelerator pedal 1013, and a winker switch 1014. The steering 1011 corresponds to the steering wheel 5 in FIGS. 1 and 13, the brake pedal 1012 to the brake pedal 2, the accelerator pedal 1013 to the accelerator pedal 3, and the winker switch 1014 to the winker lever 4.
 ステアリング1011、ブレーキペダル1012、アクセルペダル1013、ウィンカスイッチ1014は、ステアリングECU、ブレーキECU、エンジンECUとモータECU及びウィンカコントローラにより電子制御が可能である。自動運転モードにおいて、ステアリングECU、ブレーキECU、エンジンECU、モータECUは、自動運転制御装置1030から供給される制御信号に応じて、アクチュエータを駆動する。またウィンカコントローラは、自動運転制御装置1030から供給される制御信号に応じてウィンカランプを点灯あるいは消灯する。 Steering 1011, brake pedal 1012, accelerator pedal 1013, and winker switch 1014 can be electronically controlled by a steering ECU, a brake ECU, an engine ECU, a motor ECU, and a winker controller. In the automatic operation mode, the steering ECU, the brake ECU, the engine ECU, and the motor ECU drive the actuator in accordance with a control signal supplied from the automatic operation control device 1030. The blinker controller turns on or off the blinker lamp according to a control signal supplied from the automatic operation control device 1030.
 検出部1020は、車両1000の周囲状況および走行状態を検出する。実施の形態1~4で一部既述したが、例えば検出部1020は、車両1000の速度、車両1000に対する先行車両の相対速度、車両1000と先行車両との距離、車両1000に対する側方車線の車両の相対速度、車両1000と側方車線の車両との距離、車両1000の位置情報を検出する。検出部1020は、検出した各種情報(以下、「検出情報」という)を自動運転制御装置1030、運転支援装置1040に出力する。なお、検出部1020の詳細は後述する。 Detecting unit 1020 detects the surrounding state and running state of vehicle 1000. As described in part in the first to fourth embodiments, for example, the detection unit 1020 detects the speed of the vehicle 1000, the relative speed of the preceding vehicle with respect to the vehicle 1000, the distance between the vehicle 1000 and the preceding vehicle, and the side lane with respect to the vehicle 1000. The relative speed of the vehicle, the distance between the vehicle 1000 and the vehicle in the side lane, and the position information of the vehicle 1000 are detected. The detection unit 1020 outputs various types of detected information (hereinafter referred to as “detection information”) to the automatic driving control device 1030 and the driving support device 1040. Details of the detection unit 1020 will be described later.
 自動運転制御装置1030は、自動運転制御機能を実装した自動運転コントローラであり、自動運転における車両1000の行動を決定する。自動運転制御装置1030は、制御部1031、記憶部1032、I/O部(入出力部)1033を備える。制御部1031の構成はハードウェア資源とソフトウェア資源の協働、又はハードウェア資源のみにより実現できる。ハードウェア資源としてプロセッサ、ROM、RAM、その他のLSIを利用でき、ソフトウェア資源としてオペレーティングシステム、アプリケーション、ファームウェア等のプログラムを利用できる。記憶部1032は、フラッシュメモリ等の不揮発性記録媒体を備える。I/O部1033は、各種の通信フォーマットに応じた通信制御を実行する。例えば、I/O部1033は、自動運転に関する情報を運転支援装置1040に出力するとともに、制御コマンドを運転支援装置1040から入力する。また、I/O部1033は、検出情報を検出部1020から入力する。 The automatic driving control device 1030 is an automatic driving controller that implements an automatic driving control function, and determines the behavior of the vehicle 1000 in automatic driving. The automatic operation control device 1030 includes a control unit 1031, a storage unit 1032, and an I / O unit (input / output unit) 1033. The configuration of the control unit 1031 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources. The storage unit 1032 includes a nonvolatile recording medium such as a flash memory. The I / O unit 1033 executes communication control according to various communication formats. For example, the I / O unit 1033 outputs information related to automatic driving to the driving support device 1040 and inputs a control command from the driving support device 1040. Further, the I / O unit 1033 inputs detection information from the detection unit 1020.
 制御部1031は、運転支援装置1040から入力した制御コマンド、検出部1020あるいは各種ECUから収集した各種情報を自動運転アルゴリズムに適用して、車両1000の進行方向等の自動制御対象を制御するための制御値を算出する。制御部1031は算出した制御値を、各制御対象のECU又はコントローラに伝達する。本実施の形態ではステアリングECU、ブレーキECU、エンジンECU、ウィンカコントローラに伝達する。なお電気自動車あるいはハイブリッドカーの場合、エンジンECUに代えて又は加えてモータECUに制御値を伝達する。 The control unit 1031 applies the control commands input from the driving support device 1040 and the various information collected from the detection unit 1020 or the various ECUs to an automatic driving algorithm, and calculates control values for controlling automatic control targets such as the traveling direction of the vehicle 1000. The control unit 1031 transmits the calculated control values to the ECU or controller of each control target. In this embodiment, they are transmitted to the steering ECU, the brake ECU, the engine ECU, and the winker controller. In the case of an electric vehicle or a hybrid car, the control values are transmitted to the motor ECU instead of, or in addition to, the engine ECU.
 運転支援装置1040は、車両1000と運転者との間のインタフェース機能を実行するHMIコントローラであり、制御部1041、記憶部1042、I/O部1043を備える。制御部1041は、HMI制御等の各種データ処理を実行する。制御部1041は、ハードウェア資源とソフトウェア資源の協働、またはハードウェア資源のみにより実現できる。ハードウェア資源としてプロセッサ、ROM、RAM、その他のLSIを利用でき、ソフトウェア資源としてオペレーティングシステム、アプリケーション、ファームウェア等のプログラムを利用できる。 The driving support device 1040 is an HMI controller that executes an interface function between the vehicle 1000 and the driver, and includes a control unit 1041, a storage unit 1042, and an I / O unit 1043. The control unit 1041 executes various data processing such as HMI control. The control unit 1041 can be realized by cooperation of hardware resources and software resources, or only by hardware resources. Processors, ROM, RAM, and other LSIs can be used as hardware resources, and programs such as an operating system, application, and firmware can be used as software resources.
 記憶部1042は、制御部1041により参照され、または更新されるデータを記憶する記憶領域である。例えばフラッシュメモリ等の不揮発の記録媒体により実現される。I/O部1043は、各種の通信フォーマットに応じた各種の通信制御を実行する。I/O部1043は、操作入力部1050、画像出力部1051、検出情報入力部1052、コマンドIF(Interface)1053、通信IF1056を備える。 The storage unit 1042 is a storage area that stores data that is referred to or updated by the control unit 1041. For example, it is realized by a non-volatile recording medium such as a flash memory. The I / O unit 1043 executes various communication controls according to various communication formats. The I / O unit 1043 includes an operation input unit 1050, an image output unit 1051, a detection information input unit 1052, a command IF (Interface) 1053, and a communication IF 1056.
 操作入力部1050は、入力装置1004に対してなされた運転者あるいは乗員もしくは車外にいるユーザの操作による操作信号を入力装置1004から受信し、制御部1041へ出力する。画像出力部1051は、制御部1041が生成した画像データを報知装置1002へ出力して表示させる。検出情報入力部1052は、検出部1020による検出処理の結果であり、車両1000の現在の周囲状況および走行状態を示す情報(以下「検出情報」と呼ぶ。)を検出部1020から受信し、制御部1041へ出力する。 The operation input unit 1050 receives from the input device 1004 operation signals generated by operations on the input device 1004 by the driver, an occupant, or a user outside the vehicle, and outputs them to the control unit 1041. The image output unit 1051 outputs the image data generated by the control unit 1041 to the notification device 1002 for display. The detection information input unit 1052 receives from the detection unit 1020 information indicating the current surrounding situation and traveling state of the vehicle 1000 (hereinafter referred to as "detection information"), which is the result of the detection processing by the detection unit 1020, and outputs it to the control unit 1041.
 コマンドIF1053は、自動運転制御装置1030とのインタフェース処理を実行し、行動情報入力部1054とコマンド出力部1055を含む。行動情報入力部1054は、自動運転制御装置1030から送信された車両1000の自動運転に関する情報を受信し、制御部1041へ出力する。コマンド出力部1055は、自動運転制御装置1030に対して自動運転の態様を指示する制御コマンドを、制御部1041から受け付けて自動運転制御装置1030へ送信する。 The command IF 1053 executes an interface process with the automatic driving control apparatus 1030, and includes a behavior information input unit 1054 and a command output unit 1055. The behavior information input unit 1054 receives information regarding the automatic driving of the vehicle 1000 transmitted from the automatic driving control device 1030 and outputs the information to the control unit 1041. The command output unit 1055 receives from the control unit 1041 a control command that instructs the automatic driving control device 1030 to specify the mode of automatic driving, and transmits the control command to the automatic driving control device 1030.
 通信IF1056は、無線装置1008とのインタフェース処理を実行する。通信IF1056は、制御部1041から出力されたデータを無線装置1008へ送信し、無線装置1008から車外の装置へ送信させる。また、通信IF1056は、無線装置1008により転送された、車外の装置からのデータを受信し、制御部1041へ出力する。 The communication IF 1056 executes interface processing with the wireless device 1008. The communication IF 1056 transmits the data output from the control unit 1041 to the wireless device 1008, and transmits the data from the wireless device 1008 to a device outside the vehicle. The communication IF 1056 receives data from a device outside the vehicle transferred by the wireless device 1008 and outputs the data to the control unit 1041.
 なお、ここでは、自動運転制御装置1030と運転支援装置1040は別個の装置として構成される。変形例として、図32の破線で示すように、自動運転制御装置1030と運転支援装置1040を1つのコントローラに統合してもよい。言い換えれば、1つの自動運転制御装置が、図32の自動運転制御装置1030と運転支援装置1040の両方の機能を備える構成としてもよい。 Here, the automatic driving control device 1030 and the driving support device 1040 are configured as separate devices. As a modified example, as shown by a broken line in FIG. 32, the automatic driving control device 1030 and the driving support device 1040 may be integrated into one controller. In other words, one automatic driving control device may be configured to have both functions of the automatic driving control device 1030 and the driving support device 1040 of FIG.
 図34は、検出部1020および検出情報入力部1052の詳細な構成を示すブロック図である。検出部1020は、第1検出部1060、第2検出部1062を含み、検出情報入力部1052は、第1入力部1070、第2入力部1072を含む。また、第1検出部1060は、位置情報取得部1021、センサ1022、速度情報取得部1023、地図情報取得部1024を含み、第2検出部1062は、運転者センシング部1064、同乗者センシング部1066を含む。 FIG. 34 is a block diagram showing a detailed configuration of the detection unit 1020 and the detection information input unit 1052. The detection unit 1020 includes a first detection unit 1060 and a second detection unit 1062, and the detection information input unit 1052 includes a first input unit 1070 and a second input unit 1072. The first detection unit 1060 includes a position information acquisition unit 1021, a sensor 1022, a speed information acquisition unit 1023, and a map information acquisition unit 1024. The second detection unit 1062 includes a driver sensing unit 1064 and a passenger sensing unit 1066. including.
 As described above, the first detection unit 1060 mainly detects the surrounding conditions and the traveling state of the vehicle 1000. The first detection unit 1060 outputs the detected information (hereinafter, "first detection information") to the first input unit 1070, which receives it. The second detection unit 1062, on the other hand, mainly detects information on the driver and any passengers riding in the vehicle 1000. The second detection unit 1062 outputs the detected information (hereinafter, "second detection information") to the second input unit 1072, which receives it. Note that the combination of the first detection information and the second detection information, or either one alone, corresponds to the detection information described above.
 The position information acquisition unit 1021 of the first detection unit 1060 acquires the current position of the vehicle 1000 from a GPS receiver. The sensor 1022 is a generic term for the various sensors that detect conditions outside the vehicle and the state of the vehicle 1000. Sensors mounted to detect conditions outside the vehicle include, for example, a camera, a millimeter-wave radar, a LIDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging), a temperature sensor, an atmospheric pressure sensor, a humidity sensor, and an illuminance sensor. The conditions outside the vehicle include the condition of the road on which the host vehicle travels (including lane information), the environment (including weather), the surroundings of the host vehicle, and other vehicles in nearby positions (such as other vehicles traveling in an adjacent lane). Any information outside the vehicle that the sensors can detect may be used. Sensors mounted to detect the state of the vehicle 1000 include, for example, an acceleration sensor, a gyro sensor, a geomagnetic sensor, and a tilt sensor.
 The speed information acquisition unit 1023 acquires the current speed of the vehicle 1000 from a vehicle speed sensor. The map information acquisition unit 1024 acquires map information around the current position of the vehicle 1000 from a map database. The map database may be recorded on a recording medium in the vehicle 1000, or may be downloaded from a map server via a network at the time of use.
 The driver sensing unit 1064 of the second detection unit 1062 authenticates the individual driver seated in the driver's seat of the vehicle 1000. For example, a camera capable of imaging the face of the driver seated in the driver's seat is installed in the vehicle, and the driver's face is imaged by this camera. The driver sensing unit 1064 also holds, in advance, information on the faces of drivers who may sit in the driver's seat of the vehicle 1000, for example face images or feature points of face images. The driver sensing unit 1064 identifies the individual driver seated in the driver's seat by comparing the image captured by the camera with the stored face information. Since a known technique may be used for this identification, its description is omitted here. Alternatively, a TOF (Time of Flight) sensor or a fingerprint sensor may be installed in the vehicle, and the driver sensing unit 1064 may identify the individual driver seated in the driver's seat based on information acquired by such a sensor. The driver sensing unit 1064 outputs the identified driver information as second detection information.
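The comparison between a captured image and the stored face information could be sketched as follows. This is a minimal illustration only: the patent defers the matching technique to known methods, and the feature vectors, driver names, and distance threshold here are hypothetical.

```python
import math

# Hypothetical pre-registered face feature vectors for drivers who may
# occupy the driver's seat (the matching technique itself is left open).
REGISTERED_FACES = {
    "A": [0.11, 0.52, 0.33],
    "B": [0.91, 0.08, 0.44],
}

def identify_driver(captured_features, threshold=0.1):
    """Return the name of the registered driver whose stored feature
    vector is closest to the captured one, or None if no registered
    vector lies within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, stored in REGISTERED_FACES.items():
        dist = math.dist(captured_features, stored)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```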
 The passenger sensing unit 1066 authenticates passengers seated in the front passenger seat and the rear seats of the vehicle 1000. For example, a seating sensor is installed in each seat, and the passenger sensing unit 1066 identifies the presence or absence of passengers based on information acquired by the seating sensors. For instance, it may determine that there is a passenger in the front passenger seat and no passenger in the rear seats. As with the driver sensing unit 1064, a camera capable of imaging the faces of passengers seated in the front passenger seat and the rear seats may be installed in the vehicle, and the passenger sensing unit 1066 may identify the presence or absence of passengers, and information about them, based on the captured images. Information about a passenger includes age/gender, personal identification, and the occupant's state (drowsiness, motion sickness). Since known techniques may also be used for such identification, its description is omitted here. Furthermore, a TOF sensor may be installed in the vehicle, and the passenger sensing unit 1066 may identify the presence or absence of passengers and information about them based on information acquired by the TOF sensor. The passenger sensing unit 1066 outputs the identified presence or absence of passengers as second detection information. When the passenger sensing unit 1066 has identified information about a passenger, it outputs that information as second detection information as well.
 FIG. 35 is a block diagram showing the detailed configuration of the control unit 1041. The control unit 1041 includes a detection unit 1100, a travel history generation unit 1102, a transmission unit 1104, an inquiry unit 1106, an acquisition unit 1108 for travel histories, a driver model generation unit 1110, a determination unit 1112, a confirmation unit 1114, a screen generation unit 1116, and an instruction unit 1118.
 The detection unit 1100 is connected to a door opening/closing sensor of the vehicle 1000 and to the second input unit 1072. The detection unit 1100 is notified by the opening/closing sensor of the timing at which a door is opened or closed. Since a known technique may be used for detecting the opening/closing timing, its description is omitted here. On receiving notification of the opening/closing timing, the detection unit 1100 receives the second detection information from the second input unit 1072. The detection unit 1100 may also receive the second detection information from the second input unit 1072 when a passenger's state changes. By receiving the second detection information, the detection unit 1100 detects the individual driver of the vehicle 1000 and the presence or absence of passengers; it may further detect information about the passengers of the vehicle 1000.
 The travel history generation unit 1102 is connected to the first input unit 1070, the detection unit 1100, and the instruction unit 1118. As will be described in detail later, when the instruction unit 1118 instructs the automatic driving control device 1030 to perform the next action, the instruction unit 1118 notifies the travel history generation unit 1102 of the instructed action. The next action here has been selected by the driver, and actions include, for example, "deceleration," "acceleration," "constant-speed travel," and "change to the right lane." On receiving the notification from the instruction unit 1118, the travel history generation unit 1102 receives the first detection information from the first input unit 1070 and the information from the detection unit 1100.
 The travel history generation unit 1102 derives environment parameters based on the various pieces of information included in the first detection information. As shown in FIG. 27, the environment parameters include, for example, "speed Va of the host vehicle," "relative speed Vba of the preceding vehicle with respect to the host vehicle," "inter-vehicle distance DRba between the preceding vehicle and the host vehicle," "rate of change RSb of the size of the preceding vehicle," and "relative speed Vca of the side-rear vehicle with respect to the host vehicle." They also include "head-to-head distance Dca between the side-rear vehicle and the host vehicle," "rate of change Rca of that head-to-head distance," "relative speed Vda of the side-front vehicle with respect to the host vehicle," "head-to-head distance Dda between the side-front vehicle and the host vehicle," "rate of change Rda of that head-to-head distance," and "remaining side-lane length DRda for the host vehicle." They further include "relative speed Vma of the merging vehicle with respect to the host vehicle," "head-to-head distance Dma between the merging vehicle and the host vehicle," "rate of change Rma of that head-to-head distance," and so on. Since these are as described above, their description is omitted here.
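As a data-structure sketch, a subset of these environment parameters might be held as follows. The field names mirror the symbols of FIG. 27; the particular selection of fields and the flattening helper are illustrative, not part of the patent.

```python
from dataclasses import dataclass, asdict

@dataclass
class EnvironmentParameters:
    """Illustrative subset of the environment parameters derived from
    the first detection information (symbols follow FIG. 27)."""
    Va: float    # speed of the host vehicle
    Vba: float   # relative speed of the preceding vehicle
    DRba: float  # inter-vehicle distance to the preceding vehicle
    RSb: float   # rate of change of the preceding vehicle's size
    Vca: float   # relative speed of the side-rear vehicle
    Dca: float   # head-to-head distance to the side-rear vehicle
    Rca: float   # rate of change of that head-to-head distance

def as_vector(params: EnvironmentParameters) -> list:
    """Flatten the parameters into a numeric vector, e.g. for the
    correlation computations described later."""
    return list(asdict(params).values())
```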
 Here, the travel history generated by the travel history generation unit 1102 is described for each type of information received from the detection unit 1100. As examples, five types of information are assumed: (1) the individual driver and the presence/absence of passengers; (2) the individual driver and the passengers' age/gender; (3) the individual driver and the individual passengers; (4) the individual driver, the presence/absence of passengers, and the passengers' state; and (5) the individual driver alone. The corresponding travel histories are shown in FIG. 36, FIG. 37, and FIG. 27. FIG. 36 shows the data structure of a travel history generated by the travel history generation unit 1102, and FIG. 37 shows another data structure of such a travel history.
 (1) Individual driver and presence/absence of passengers
 The travel history generation unit 1102 generates the travel history shown in FIG. 36(a). Specifically, at the timing when it receives the notification from the acquisition unit 1108, the travel history generation unit 1102 receives, as information from the detection unit 1100, the driver's name, the presence/absence of a front-seat passenger, and the number of rear-seat passengers. In FIG. 36(a), the driver's name is shown as "A" or "B"; a front-seat passenger is shown as "○" when present and "×" when absent; and the number of rear-seat passengers is shown as "0" or "1." The travel history generation unit 1102 also receives values such as "Va" as the travel history at that timing. The travel history generation unit 1102 then collects the received information and values together with the action indicated in the notification, for example "deceleration," and stores them in one row of FIG. 36(a). That is, the travel history generation unit 1102 generates a travel history that associates environment parameters indicating travel environments in which the vehicle 1000 traveled in the past with the actions the driver selected under those environment parameters. The travel history is generated for each combination of the driver and the presence/absence of passengers.
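One row of the FIG. 36(a) history, pairing the occupant combination and environment parameters with the selected action, might be recorded as in the following sketch. The field names are illustrative; the patent does not specify a storage format.

```python
def record(history, driver, front, rear, env, action):
    """Append one row associating the occupant combination and the
    environment parameters with the action the driver selected,
    as in one row of FIG. 36(a)."""
    history.append({
        "driver": driver,   # e.g. "A"
        "front": front,     # front-seat passenger present? (True = "○")
        "rear": rear,       # number of rear-seat passengers
        "env": env,         # environment parameters such as Va
        "action": action,   # e.g. "deceleration"
    })
    return history
```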
 (2) Individual driver and passengers' age/gender
 The travel history generation unit 1102 generates the travel history shown in FIG. 36(b). Specifically, at the timing when it receives the notification from the acquisition unit 1108, the travel history generation unit 1102 receives, as information from the detection unit 1100, the driver's name and the passengers' age/gender. In FIG. 36(b), the passengers' age/gender is shown as, for example, "30s, female" or "30s, female / boy." The former also indicates that there is one passenger, and the latter that there are two. Such age/gender data constitutes the passenger information described above. As in case (1), the travel history generation unit 1102 collects the driver's name, the passengers' age/gender, the travel history values, and the action indicated in the notification from the acquisition unit 1108, and stores them in one row of FIG. 36(b). That is, the travel history generation unit 1102 generates the travel history for each combination of the driver, the presence/absence of passengers detected in the past, and the passenger information.
 (3) Individual driver and individual passengers
 The travel history generation unit 1102 generates the travel history shown in FIG. 37(a). Specifically, at the timing when it receives the notification from the acquisition unit 1108, the travel history generation unit 1102 receives, as information from the detection unit 1100, the driver's name, the name of the front-seat passenger, and the names of the rear-seat passengers. In FIG. 37(a), the passengers' names are shown as "B," "C," and "D." Confirming the passengers' names also reveals the number of passengers. As in case (1), the travel history generation unit 1102 collects the driver's name, the front-seat passenger's name, the rear-seat passengers' names, the travel history values, and the action indicated in the notification from the acquisition unit 1108, and stores them in one row of FIG. 37(a). Here too, the travel history generation unit 1102 generates the travel history for each combination of the driver, the presence/absence of passengers detected in the past, and the passenger information.
 (4) Individual driver, presence/absence of passengers, and passenger's state
 The travel history generation unit 1102 generates the travel history shown in FIG. 37(b). Specifically, at the timing when it receives the notification from the acquisition unit 1108, the travel history generation unit 1102 receives, as information from the detection unit 1100, the driver's name, the presence/absence of a front-seat passenger, and that passenger's state. In FIG. 37(b), the passenger's state is shown as "normal," "sleeping," or "motion sick." Confirming the passengers' states also reveals the number of passengers. As in case (1), the travel history generation unit 1102 collects the driver's name, the presence/absence of a front-seat passenger, that passenger's state, the travel history values, and the action indicated in the notification from the acquisition unit 1108, and stores them in one row of FIG. 37(b). Here too, the travel history generation unit 1102 generates the travel history for each combination of the driver, the presence/absence of passengers detected in the past, and the passenger information.
 (5) Individual driver
 This corresponds to the fourth embodiment, and the travel history generation unit 1102 generates the travel history shown in FIG. 27. To do so, the travel history generation unit 1102 executes the processing of (1) to (4) with the passenger-related parts omitted. That is, the travel history generation unit 1102 generates, for each driver, a travel history that associates environment parameters indicating travel environments in which the vehicle 1000 traveled in the past with the actions the driver selected under those environment parameters.
 In (1) to (4), the travel history generation unit 1102 generates the travel history using the presence/absence of passengers, the passengers' age/gender, the passengers' names, the passengers' state, and so on; the travel history may also be generated by combining these arbitrarily. In the following, for clarity, the description proceeds on the assumption that the travel history of FIG. 36(a) has been generated, but the same processing applies when another travel history has been generated. Returning to FIG. 35, the travel history generation unit 1102 outputs the travel history to the transmission unit 1104 and the inquiry unit 1106.
 The transmission unit 1104 receives the travel history from the travel history generation unit 1102. When the travel history is input, the transmission unit 1104 notifies a cloud server (not shown) of the travel history update from the wireless device 1008 via the communication IF 1056. The cloud server is provided outside the vehicle 1000 in order to collect the travel histories generated by the driving support devices 1040 mounted on a plurality of vehicles 1000. That is, the cloud server collectively manages the travel histories generated by the plurality of driving support devices 1040; for convenience of explanation, the travel history stored in the cloud server is called the "comprehensive travel history." On receiving the travel history update notification, the cloud server transmits a travel history request to the wireless device 1008 so that the travel history will be sent. When the transmission unit 1104 receives the travel history request from the cloud server via the communication IF 1056, it assigns identification information (hereinafter, "ID") to each combination in the travel history of the driver's name and the presence/absence of passengers.
 FIGS. 38 and 39 are used here to explain this processing. FIG. 38 shows an outline of the processing in the transmission unit 1104. FIG. 38(a) shows the data structure of the travel history input to the transmission unit 1104, which is the same as FIG. 36(a). FIG. 38(b) shows the correspondence between the IDs and the combinations of the driver's name and the presence/absence of passengers. As illustrated, when the driver's name is "A" and there is no front-seat passenger and no rear-seat passenger, the ID "0001" is associated. When the driver's name is "A," a front-seat passenger is present, and there is one rear-seat passenger, the ID "0003" is associated. The IDs are defined so as not to overlap among the plurality of driving support devices 1040. Here, when a travel history update adds the case where the driver's name is "B" with no front-seat passenger and no rear-seat passenger, and no ID has yet been assigned to that combination, the transmission unit 1104 assigns the ID "0004" to it. In cases (2) to (4) above, the combination includes the passengers' age/gender and the like; in case (5), only the driver's name is included, not a combination.
 By using the relationship shown in FIG. 38(b), the transmission unit 1104 replaces the combinations shown in FIG. 38(a) with IDs. FIG. 39 shows another processing outline in the transmission unit 1104; as illustrated, the combinations have been replaced with IDs. By using such IDs, even the information about driver "A" is separated into three sets of information under IDs "0001" to "0003." The transmission unit 1104 outputs the travel history with combinations replaced by IDs (hereinafter, this is also called the "travel history") to the communication IF 1056. The communication IF 1056 causes the wireless device 1008 to transmit the travel history to the cloud server. At that time, only the updated portion of the travel history may be transmitted. The cloud server adds the received travel history to the comprehensive travel history.
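The substitution of occupant combinations with IDs before upload might look like the following sketch. The zero-padded IDs follow FIG. 38(b); a real implementation would also need a device-specific prefix or server-issued ID range to keep IDs unique across driving support devices, which is omitted here as an assumption.

```python
def assign_ids(history_rows, id_map):
    """Replace each (driver, front passenger, rear passengers) combination
    with its ID, assigning a fresh zero-padded ID the first time a
    combination appears (cf. FIG. 38(b) and FIG. 39)."""
    out = []
    for row in history_rows:
        combo = (row["driver"], row["front"], row["rear"])
        if combo not in id_map:
            id_map[combo] = "%04d" % (len(id_map) + 1)
        out.append({"id": id_map[combo],
                    "env": row["env"],
                    "action": row["action"]})
    return out
```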
 Returning to FIG. 35, the inquiry unit 1106 receives the travel history from the travel history generation unit 1102 and also receives information from the detection unit 1100. The information received here is the combination of the current driver's name and the current presence/absence of passengers. From the travel history generated by the travel history generation unit 1102, the inquiry unit 1106 extracts the travel history for the combination of the current driver's name and the current presence/absence of passengers. FIG. 40 shows an outline of the processing in the inquiry unit 1106. FIG. 40(a) shows the data structure of the travel history input to the inquiry unit 1106, which is the same as FIG. 36(a). Assume here that the current driver's name is "A," a front-seat passenger is currently present, and there is currently one rear-seat passenger. FIG. 40(b) shows the result of extracting the travel history for this current combination from the travel history shown in FIG. 40(a).
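The extraction of the travel history for the current occupant combination reduces to a simple filter, sketched below with illustrative row field names.

```python
def extract_inquiry_history(history, driver, front, rear):
    """Return the rows whose occupant combination matches the current
    driver and the current presence/absence of passengers
    (cf. FIG. 40(a) filtered down to FIG. 40(b))."""
    return [row for row in history
            if (row["driver"], row["front"], row["rear"]) == (driver, front, rear)]
```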
 Returning to FIG. 35, the inquiry unit 1106 transmits, via the communication IF 1056 and the wireless device 1008, an inquiry signal asking the cloud server to search the comprehensive travel history for travel histories similar to the extracted one. The inquiry signal includes the extracted travel history (hereinafter, the "inquiry travel history"). On receiving the inquiry signal, the cloud server obtains the inquiry travel history from it and searches the comprehensive travel history for similar travel histories. Specifically, the cloud server extracts one action from the inquiry travel history together with the environment parameters corresponding to that action; the extracted environment parameters are called the "first environment parameters." The cloud server then acquires, from the comprehensive travel history, a plurality of sets of environment parameters corresponding to the extracted action; each acquired set is called a "second environment parameter."
 The cloud server calculates the correlation value of the vectors whose elements are the values of the first environment parameters and the values of one set of second environment parameters. If the correlation value is greater than a threshold (hereinafter, the "in-server threshold"), the cloud server identifies the ID corresponding to that set of second environment parameters and acquires, from the comprehensive travel history, all environment parameters to which that ID is assigned. If the correlation value is less than or equal to the in-server threshold, the cloud server does not perform the acquisition. The cloud server executes this processing for each of the acquired sets of second environment parameters, and likewise for the other actions included in the inquiry travel history. As a result, the cloud server acquires one or more sets of environment parameters similar to the inquiry travel history; multiple IDs may be mixed among them. The cloud server collects the acquired environment parameters, together with the action corresponding to each set, as a "similar travel history." The similar travel history has a data structure like that of FIG. 39, for example.
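The server-side search can be sketched as follows. The patent does not fix the correlation measure or the value of the in-server threshold; cosine similarity and the threshold 0.95 are illustrative assumptions, and the row format is likewise hypothetical.

```python
import math

def cosine(u, v):
    """Illustrative correlation measure between two environment-parameter
    vectors (the patent does not specify the measure)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def similar_travel_history(inquiry_rows, total_rows, threshold=0.95):
    """For each action/environment pair in the inquiry travel history,
    find rows of the comprehensive travel history with the same action
    whose environment vector correlates above the in-server threshold,
    then return every row carrying any matched ID."""
    matched_ids = set()
    for q in inquiry_rows:
        for r in total_rows:
            if r["action"] == q["action"] and cosine(q["env"], r["env"]) > threshold:
                matched_ids.add(r["id"])
    return [r for r in total_rows if r["id"] in matched_ids]
```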
 The acquisition unit 1108 acquires the similar travel history from the cloud server via the wireless device 1008 and the communication IF 1056 as the response to the inquiry by the inquiry unit 1106. As described above, the similar travel history is a travel history similar to the one for the combination of the current driver and the current presence/absence of passengers. In cases (2) to (4) above, the combination includes the passengers' age/gender and the like; in case (5), only the driver's name is included, not a combination.
 The driver model generation unit 1110 receives the similar travel history from the acquisition unit 1108 and generates a driver model based on it. For example, the driver model generation unit 1110 generates the driver model by combining the inquiry travel history and the similar travel history. FIG. 41 shows the data structure of the driver model generated by the driver model generation unit 1110. As illustrated, the IDs, environment parameters, and actions included in the similar travel history are combined; the portion in which environment parameters and actions appear without an ID corresponds to the inquiry travel history. Note that the driver model need not include IDs.
 図35に戻る。ドライバモデル生成部1110は、これとは別に、問い合せ走行履歴と類似走行履歴とでの同一の行動において、各環境パラメータの数値を平均化することによって、ドライバモデルを生成してもよい。ドライバモデル生成部1110は、ドライバモデルを判定部1112に出力する。 Return to FIG. Alternatively, the driver model generation unit 1110 may generate a driver model by averaging the numerical values of the environmental parameters in the same action in the inquiry travel history and the similar travel history. The driver model generation unit 1110 outputs the driver model to the determination unit 1112.
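 The alternative generation method above — averaging each environmental parameter over identical actions in the inquiry travel history and the similar travel history — can be sketched as below. The record layout and values are assumptions for illustration only.

```python
def build_driver_model(inquiry_history, similar_history):
    """Group records from both histories by action, then average each
    environmental parameter column-wise within every group."""
    grouped = {}
    for rec in inquiry_history + similar_history:
        grouped.setdefault(rec["action"], []).append(rec["params"])
    model = {}
    for action, vectors in grouped.items():
        n = len(vectors)
        model[action] = [sum(col) / n for col in zip(*vectors)]
    return model

# Invented example data: one inquiry record, two similar-history records.
inquiry = [{"action": "decelerate", "params": [100.0, 2.0]}]
similar = [{"action": "decelerate", "params": [80.0, 4.0]},
           {"action": "accelerate", "params": [20.0, 1.0]}]
model = build_driver_model(inquiry, similar)
```

Each action in the resulting model is associated with a single averaged environmental-parameter vector, which is the form the determination unit then compares against the current environment.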
 The determination unit 1112 receives the driver model from the driver model generation unit 1110 and the first detection information from the first input unit 1070. Based on the various pieces of information included in the first detection information, the determination unit 1112 derives the current environmental parameters. The environmental parameters are as described above, so their description is omitted here. The determination unit 1112 calculates the correlation value between a vector whose elements are the environmental parameter values shown in one row of the driver model of FIG. 41 and a vector whose elements are the current environmental parameter values, and repeats this calculation while moving through the rows of the driver model. As a result, a correlation value is derived for each row of the driver model shown in FIG. 41.
 The determination unit 1112 selects the maximum of these correlation values and then selects the action shown in the row corresponding to the selected correlation value as an "action candidate". Selecting an action candidate corresponds to determining the next action. Alternatively, the determination unit 1112 may hold a preset threshold value and select, from the correlation values, the multiple values larger than the threshold value. In that case, the determination unit 1112 takes statistics of the actions shown in the selected rows and designates them, in descending order of frequency, the "first action candidate", the "second action candidate", ..., and the "Nth action candidate". An upper limit may be set on the number of action candidates. The determination unit 1112 outputs the one or more action candidates to the confirmation unit 1114.
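 The two determination modes above — taking the single row with the maximum correlation value, or ranking the actions of all rows above a threshold by frequency — can be sketched as follows. As before, cosine similarity is used as a stand-in for the unspecified correlation value, and the rows and parameter values are invented.

```python
import math
from collections import Counter

def correlation(v1, v2):
    """Cosine similarity, standing in for the correlation value."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    return dot / norm if norm else 0.0

def next_action(model_rows, current_params):
    """Action of the driver-model row with the maximum correlation value."""
    return max(model_rows,
               key=lambda row: correlation(row["params"], current_params))["action"]

def action_candidates(model_rows, current_params, threshold, limit=None):
    """Actions of all rows above the threshold, most frequent first
    (first action candidate, second action candidate, ...)."""
    hits = [row["action"] for row in model_rows
            if correlation(row["params"], current_params) > threshold]
    ranked = [action for action, _ in Counter(hits).most_common()]
    return ranked[:limit] if limit else ranked

# Invented driver-model rows: environmental parameters and associated action.
rows = [
    {"params": [100.0, 2.0], "action": "decelerate"},
    {"params": [98.0, 2.1], "action": "decelerate"},
    {"params": [0.5, 50.0], "action": "change lane right"},
]
```

The `limit` parameter models the optional upper limit on the number of action candidates mentioned above.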
 The confirmation unit 1114 is connected to the behavior information input unit 1054 and receives information related to automatic driving from the automatic driving control device 1030 via the behavior information input unit 1054. This information indicates the next action of the vehicle 1000. This next action (hereinafter, the "automatic action") is determined by the automatic driving algorithm in the automatic driving control device 1030, and may therefore not match the driver's sense. The confirmation unit 1114 also receives the one or more action candidates from the determination unit 1112, and outputs the automatic action and the action candidates to the screen generation unit 1116 so that the driver can select one of them.
 The screen generation unit 1116 receives the automatic action and the one or more action candidates from the confirmation unit 1114 and generates an image presenting them together. FIG. 42 shows a screen generated by the screen generation unit 1116. As illustrated, an action image 1200 is arranged at the center of the screen. The screen generation unit 1116 stores in advance the contents of multiple types of automatic action and an image corresponding to each, and generates the action image 1200 by selecting the image corresponding to the received automatic action. A first action candidate image 1202a and a second action candidate image 1202b are arranged on the right side of the screen; they are collectively referred to as action candidate images 1202. The first action candidate image 1202a is generated from the first action candidate and the second action candidate image 1202b from the second action candidate, in the same manner as the action image 1200. The screen generation unit 1116 outputs the generated screen image as image data to the image output unit 1051, and the image output unit 1051 outputs the image data to the notification device 1002 to display the screen containing the action candidate images 1202.
 The notification device 1002 displays the screen shown in FIG. 42. Using the input device 1004, the driver selects one of the action image 1200, the first action candidate image 1202a, and the second action candidate image 1202b. The operation input unit 1050 receives the selection result as an operation signal from the input device 1004 and outputs it to the control unit 1041. The confirmation unit 1114 receives the selection result from the operation input unit 1050: if the selection result is the first action candidate image 1202a, it confirms selection of the first action candidate; if the second action candidate image 1202b, selection of the second action candidate; and if the action image 1200, selection of the automatic action. The confirmation unit 1114 also confirms selection of the automatic action when no selection result is received within a fixed period after the automatic action and the one or more action candidates were output to the screen generation unit 1116. When an action candidate is selected, the confirmation unit 1114 outputs the selected action candidate to the instruction unit 1118.
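 The confirmation behavior above — mapping the driver's on-screen choice back to an action, with the automatic action adopted by default when no selection arrives within the waiting period — can be summarized in a small sketch. The selection identifiers and the timeout value are assumptions; a real in-vehicle implementation would be event-driven rather than argument-based as here.

```python
DEFAULT_TIMEOUT_S = 10.0  # assumed fixed waiting period

def confirm_selection(selection, automatic_action, candidates,
                      elapsed_s, timeout_s=DEFAULT_TIMEOUT_S):
    """Return the confirmed action for a (possibly absent) driver selection."""
    if selection is None or elapsed_s >= timeout_s:
        return automatic_action        # no choice in time: adopt automatic action
    if selection == "action_image":
        return automatic_action        # driver explicitly chose the automatic action
    # Candidate selections are assumed to arrive as "candidate_<index>".
    index = int(selection.split("_")[-1])
    return candidates[index]
```

Only a confirmed action candidate (not the automatic action) would then be forwarded to the instruction unit, as described in the next paragraph.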
 When notified of an action candidate by the confirmation unit 1114, the instruction unit 1118 instructs the automatic driving control device 1030, via the command output unit 1055, to perform the action corresponding to that candidate. Specifically, the instruction unit 1118 outputs the received action candidate to the command output unit 1055, and the command output unit 1055 outputs a control command corresponding to the action candidate to the automatic driving control device 1030. As a result, the automatic driving control device 1030 controls the automatic driving of the vehicle 1000 with the action candidate as the next action. Therefore, even when the automatic action indicates "decelerate", if the action candidate "change to right lane" is selected, the vehicle 1000 travels according to the next action "change to right lane". When the instruction unit 1118 instructs the automatic driving control device 1030 on the next action, it notifies the travel history generation unit 1102 of the instructed action.
 The operation of the driving support device 1040 configured as above will now be described. FIG. 43 is a flowchart illustrating the detection procedures performed by the second detection unit 1062. FIG. 43(a) is a flowchart showing the first detection procedure. When a door is opened or closed (Y in S1000), the second detection unit 1062 executes personal authentication of the driver, and personal authentication, age/gender detection, or seat-occupancy sensing of the passengers (S1002). The detection unit 1100 acquires and stores the driver/passenger information (S1004). When no door is opened or closed (N in S1000), steps S1002 and S1004 are skipped.
 FIG. 43(b) is a flowchart showing the second detection procedure. The second detection unit 1062 detects the passengers' state (normal/drowsy/carsick) (S1010). When a change in a passenger's state is detected (Y in S1012), the detection unit 1100 updates the passenger's state (S1014). When no state change is detected (N in S1012), step S1014 is skipped.
 FIG. 44 is a sequence diagram showing the registration procedure performed by the driving support device 1040. The driving support device 1040 generates a travel history (S1050) and transmits a travel-history update notification to the cloud server (S1052). The cloud server transmits a travel-history request to the driving support device 1040 (S1054). The driving support device 1040 replaces the IDs in the travel history (S1056) and transmits a travel-history registration (S1058). The cloud server stores the travel history (S1060) and transmits the travel-history registration result to the driving support device 1040 (S1062).
 FIG. 45 is a flowchart showing the transmission procedure performed by the transmission unit 1104. If the travel history has been updated (Y in S1100), the transmission unit 1104 acquires the updated portion of the travel history (S1102). If there is a combination of conditions for which no ID has been registered (Y in S1104), the transmission unit 1104 assigns a new ID (S1106); if there is no unregistered combination (N in S1104), step S1106 is skipped. The transmission unit 1104 then replaces the IDs (S1108). If the travel history has not been updated (N in S1100), steps S1102 through S1108 are skipped.
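 Steps S1104 through S1108 above can be sketched as follows: before transmission, each combination of conditions in the updated travel history is looked up in an ID table, a fresh ID is assigned when the combination has not been registered yet (S1106), and the identifying information in each record is replaced by that ID (S1108). The field names, ID format, and record layout are invented for illustration.

```python
class IdTable:
    """Registry mapping each combination of conditions to an ID."""
    def __init__(self):
        self._ids = {}

    def id_for(self, combination):
        """Return the registered ID, assigning a new one if absent (S1106)."""
        if combination not in self._ids:
            self._ids[combination] = f"ID{len(self._ids):04d}"
        return self._ids[combination]

def replace_ids(records, table):
    """S1108: replace each record's identifying combination with its ID."""
    out = []
    for rec in records:
        combo = (rec["driver"], rec["passengers"])
        out.append({"id": table.id_for(combo), "action": rec["action"]})
    return out

# Invented updated portion of a travel history (S1102).
table = IdTable()
update = [
    {"driver": "driver_a", "passengers": "none", "action": "decelerate"},
    {"driver": "driver_a", "passengers": "none", "action": "accelerate"},
    {"driver": "driver_a", "passengers": "one_adult", "action": "decelerate"},
]
anonymized = replace_ids(update, table)
```

Replacing names with per-combination IDs is what allows the cloud server to manage travel histories from many vehicles without receiving personal information directly, as noted later in this section.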
 FIG. 46 is a sequence diagram illustrating the driver model generation procedure performed by the driving support device 1040. The driving support device 1040 generates a travel history (S1150), extracts the inquiry travel history (S1152), and transmits an inquiry signal to the cloud server (S1154). The cloud server extracts a similar travel history (S1156) and transmits it to the driving support device 1040 (S1158). The driving support device 1040 generates a driver model (S1160).
 FIG. 47 is a flowchart showing the travel-history update procedure performed by the travel history generation unit 1102. The determination unit 1112 determines the next action (S1200). When the determined action is selected (Y in S1202), the travel history generation unit 1102 updates the travel history (S1204). When the determined action is not selected (N in S1202), the process ends.
 According to the present embodiment, the driver model is generated based on travel histories similar to the travel history for the current driver, so a driver model suited to the current driver can be generated. Further, since the next action is determined based on the driver model suited to the current driver and the current environmental parameters of the vehicle, determination accuracy can be improved. Moreover, since the driver model is generated based on travel histories similar to the travel history for the combination of the current driver and the presence or absence of current passengers, the accuracy of the driver model can be improved. When the driver model is generated based on travel histories similar to the travel history for the combination of the current driver, the presence or absence of current passengers, and information on the current passengers, the accuracy of the driver model can be improved further.
 In addition, since travel histories similar to the travel history for the current driver are acquired from the server, the search for similar travel histories can be delegated to the server, which reduces the processing load on the device. Since the travel history is transmitted to the server, travel histories generated by various driving support devices can be accumulated there, and this accumulation improves the accuracy of the similar-travel-history search. Since an ID identifying each combination is assigned to the travel history, management at the server is made easier. Finally, since an image showing the next action is displayed, the driver can be informed of the next action.
 Further, since the next action is determined based on a driver model generated from travel histories similar to the travel history for the current driver and on the current environmental parameters of the vehicle, the accuracy with which the automatic driving control device determines the next action can be improved. For the same reason, the accuracy with which the vehicle determines the next action can be improved.
 The embodiments according to the present invention have been described above in detail with reference to the drawings; the functions of the above-described devices and processing units can be realized by a computer program.
 A computer that realizes the functions described above by means of a program includes input devices such as a keyboard, mouse, and touch pad; output devices such as a display and speakers; a CPU (Central Processing Unit); a ROM (Read Only Memory); and a RAM (Random Access Memory). It further includes a storage device such as a hard disk drive or SSD (Solid State Drive); a reading device that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disk Read Only Memory) or USB (Universal Serial Bus) memory; and a network card that communicates via a network. These units are connected by a bus.
 The reading device reads the program from the recording medium on which it is recorded and stores it in the storage device. Alternatively, the network card communicates with a server device connected to the network and stores in the storage device a program, downloaded from the server device, for realizing the functions of each of the above devices.
 The CPU then copies the program stored in the storage device to the RAM and sequentially reads and executes the instructions included in the program from the RAM, thereby realizing the functions of each of the above devices.
 An outline of one aspect of the present invention is as follows. A driving support device according to one aspect of the present invention includes a travel history generation unit that generates, for each driver, a travel history associating environmental parameters indicating travel environments in which the vehicle has traveled in the past with the action the driver selected for those environmental parameters. The driving support device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history for the current driver; a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit; and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and environmental parameters indicating the current travel environment of the vehicle.
 According to this aspect, the next action is determined based on a driver model generated from travel histories similar to the travel history for the current driver and on the current environmental parameters of the vehicle, so determination accuracy can be improved.
 The driving support device may further include a detection unit that detects the presence or absence of passengers in the vehicle. The travel history generation unit may generate a travel history for each driver and, in addition, for each presence or absence of passengers detected in the past by the detection unit, and the acquisition unit may acquire a travel history similar to the travel history for the combination of the current driver and the presence or absence of current passengers detected by the detection unit. In this case, since the driver model is generated based on travel histories similar to the travel history for that combination, the accuracy of the driver model can be improved.
 The detection unit may also detect information related to the passengers of the vehicle. The travel history generation unit may then generate a travel history for each driver and, in addition, for each presence or absence of passengers detected in the past by the detection unit and each item of information related to the passengers, and the acquisition unit may acquire a travel history similar to the travel history for the combination of the current driver, the presence or absence of current passengers detected by the detection unit, and information related to the current passengers. In this case, since the driver model is generated based on travel histories similar to the travel history for that combination, the accuracy of the driver model can be improved.
 The driving support device may further include an inquiry unit that makes an inquiry to a server based on the travel history for the current driver among the travel histories generated by the travel history generation unit. The acquisition unit may acquire from the server, as the response to the inquiry by the inquiry unit, a travel history similar to the travel history for the current driver. In this case, since travel histories similar to the travel history for the current driver are acquired from the server, the processing load can be reduced.
 The driving support device may further include an inquiry unit that makes an inquiry to a server based on the travel history for the combination of the current driver and the presence or absence of current passengers among the travel histories generated by the travel history generation unit. The acquisition unit may acquire from the server, as the response to the inquiry by the inquiry unit, a travel history similar to the travel history for that combination. In this case, since travel histories similar to the travel history for the current driver are acquired from the server, the processing load can be reduced.
 The driving support device may further include an inquiry unit that makes an inquiry to a server based on the combination of the current driver, the presence or absence of current passengers, and information related to the current passengers among the travel histories generated by the travel history generation unit. The acquisition unit may acquire from the server, as the response to the inquiry by the inquiry unit, a travel history similar to the travel history for that combination. In this case, since travel histories similar to the travel history for the current driver are acquired from the server, the processing load can be reduced.
 The driving support device may further include a transmission unit that transmits the travel history generated by the travel history generation unit to the server. In this case, since the travel history is transmitted to the server, travel histories generated by various driving support devices can be accumulated in the server.
 The driving support device may further include a transmission unit that transmits the travel history generated by the travel history generation unit to the server, and the transmission unit may assign identification information for identifying each combination in the travel history. In this case, since identification information identifying each combination is assigned, management at the server is made easier.
 The driving support device may further include an image output unit that causes a notification device to display an image showing the next action determined by the determination unit. In this case, the driver can be informed of the next action.
 Another aspect of the present invention is an automatic driving control device. This device includes a travel history generation unit that generates, for each driver, a travel history associating environmental parameters indicating travel environments in which the vehicle has traveled in the past with the action the driver selected for those environmental parameters. The automatic driving control device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history for the current driver, and a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit. The automatic driving control device further includes a determination unit that determines the next action based on the driver model generated by the driver model generation unit and environmental parameters indicating the current travel environment of the vehicle, and an automatic driving control unit that controls the automatic driving of the vehicle based on the next action determined by the determination unit.
 According to this aspect, the next action is determined based on a driver model generated from travel histories similar to the travel history for the current driver and on the current environmental parameters of the vehicle, so determination accuracy can be improved.
 Still another aspect of the present invention is a vehicle. This vehicle includes a driving support device, and the driving support device includes a travel history generation unit that generates, for each driver, a travel history associating environmental parameters indicating travel environments in which the vehicle has traveled in the past with the action the driver selected for those environmental parameters. The driving support device further includes an acquisition unit that acquires, from among the travel histories generated by the travel history generation unit, a travel history similar to the travel history for the current driver; a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit; and a determination unit that determines the next action based on the driver model generated by the driver model generation unit and environmental parameters indicating the current travel environment of the vehicle.
 According to this aspect, the next action is determined based on a driver model generated from travel histories similar to the travel history for the current driver and on the current environmental parameters of the vehicle, so determination accuracy can be improved.
 Still another aspect of the present invention is a driving support method. This method includes the steps of: generating, for each driver, a travel history associating environmental parameters indicating travel environments in which the vehicle has traveled in the past with the action the driver selected for those environmental parameters; acquiring, from among the generated travel histories, a travel history similar to the travel history for the current driver; generating a driver model based on the acquired travel history; and determining the next action based on the generated driver model and environmental parameters indicating the current travel environment of the vehicle.
 実施の形態5において、総合走行履歴の管理、類似走行履歴の抽出はクラウドサーバにおいてなされている。しかしながらこれに限らず例えば、これらの処理は、運転支援装置1040においてなされてもよい。その場合、複数の車両1000のそれぞれに搭載された運転支援装置1040は、互いに走行履歴を交換することによって、運転支援装置1040ごとの総合走行履歴を生成する。本変形例によれば、クラウドサーバの設置を不要にできる。 In the fifth embodiment, the management of the total travel history and the extraction of the similar travel history are performed in the cloud server. However, the present invention is not limited thereto, and for example, these processes may be performed in the driving support device 1040. In that case, the driving assistance device 1040 mounted in each of the plurality of vehicles 1000 generates an overall traveling history for each driving assistance device 1040 by exchanging the traveling history with each other. According to this modification, the installation of a cloud server can be made unnecessary.
 The driving assistance method according to the present invention, and the driving assistance device, automatic driving control device, vehicle, and program using the method, are suitable for conveying information to a driver.
DESCRIPTION OF SYMBOLS
 1 Vehicle
 2, 1012 Brake pedal
 3, 1013 Accelerator pedal
 4 Turn signal lever
 5 Steering wheel
 6 Detection unit
 7 Vehicle control unit
 8 Storage unit
 9 Information notification device
 10 Touch panel
 29a, 29b, 29c, 29g, 39a, 39b, 39c, 39g, 59b, 79a~79g, 89a, 89b, 99a, 99b, 109a~109e, 121, 121a, 121b, 121c, 121d Display area
 51 Operation unit
 51a~51h Operation buttons
 59, 69, 79, 89, 99 Character information
 61 Position information acquisition unit
 62 Sensor
 63 Speed information acquisition unit
 64 Map information acquisition unit
 91 Information acquisition unit
 92 Notification unit
 102 Input unit
 111, 112, 113, 121’, 122’, 123, 131, 131’, 132, 133, 134, 134’, 135, 136, 137, 252, 253 Symbol
 1000 Vehicle
 1002 Notification device
 1004 Input device
 1004a First operation unit
 1004b Second operation unit
 1006 Speaker
 1008 Wireless device
 1010 Driving operation unit
 1020 Detection unit
 1021 Position information acquisition unit
 1022 Sensor
 1023 Speed information acquisition unit
 1024 Map information acquisition unit
 1030 Automatic driving control device
 1031 Control unit
 1032 Storage unit
 1033 I/O unit
 1040 Driving assistance device
 1041 Control unit
 1042 Storage unit
 1043 I/O unit
 1050 Operation input unit
 1051 Image output unit
 1052 Detection information input unit
 1053 Command IF
 1054 Action information input unit
 1055 Command output unit
 1056 Communication IF
 1060 First detection unit
 1062 Second detection unit
 1064 Driver sensing unit
 1066 Passenger sensing unit
 1070 First input unit
 1072 Second input unit
 1100 Detection unit
 1102 Travel history generation unit
 1104 Transmission unit
 1106 Inquiry unit
 1108 Acquisition unit
 1110 Driver model generation unit
 1112 Determination unit
 1114 Confirmation unit
 1116 Screen generation unit
 1118 Instruction unit

Claims (13)

  1.  A driving assistance device comprising:
     a travel history generation unit that generates, for each driver, a travel history in which environmental parameters indicating travel environments in which a vehicle has traveled in the past are associated with actions selected by the driver under those environmental parameters;
     an acquisition unit that acquires, from the travel histories generated by the travel history generation unit, a travel history similar to the travel history of a current driver;
     a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit; and
     a determination unit that determines a next action based on the driver model generated by the driver model generation unit and environmental parameters indicating a current travel environment of the vehicle.
  2.  The driving assistance device according to claim 1, further comprising a detection unit that detects the presence or absence of a passenger in the vehicle, wherein
     the travel history generation unit generates the travel history for each driver and, in addition, for each presence or absence of a passenger detected in the past by the detection unit, and
     the acquisition unit acquires a travel history similar to the travel history for the combination of the current driver and the presence or absence of a current passenger detected by the detection unit.
  3.  The driving assistance device according to claim 2, wherein
     the detection unit also detects information on the passenger in the vehicle,
     the travel history generation unit generates the travel history for each driver and, in addition, for each presence or absence of a passenger detected in the past by the detection unit and for each item of information on the passenger, and
     the acquisition unit acquires a travel history similar to the travel history for the combination of the current driver, the presence or absence of the current passenger detected by the detection unit, and the information on the current passenger.
  4.  The driving assistance device according to claim 1, further comprising an inquiry unit that makes an inquiry to a server based on the travel history of the current driver among the travel histories generated by the travel history generation unit, wherein
     the acquisition unit acquires, from the server, as a response to the inquiry by the inquiry unit, a travel history similar to the travel history of the current driver.
  5.  The driving assistance device according to claim 2, further comprising an inquiry unit that makes an inquiry to a server based on the travel history for the combination of the current driver and the presence or absence of the current passenger among the travel histories generated by the travel history generation unit, wherein
     the acquisition unit acquires, from the server, as a response to the inquiry by the inquiry unit, a travel history similar to the travel history for the combination of the current driver and the presence or absence of the current passenger.
  6.  The driving assistance device according to claim 3, further comprising an inquiry unit that makes an inquiry to a server based on the combination of the current driver, the presence or absence of the current passenger, and the information on the current passenger among the travel histories generated by the travel history generation unit, wherein
     the acquisition unit acquires, from the server, as a response to the inquiry by the inquiry unit, a travel history similar to the travel history for the combination of the current driver, the presence or absence of the current passenger, and the information on the current passenger.
  7.  The driving assistance device according to any one of claims 1 to 3, further comprising a transmission unit that transmits the travel history generated by the travel history generation unit to a server.
  8.  The driving assistance device according to claim 2 or 3, further comprising a transmission unit that transmits the travel history generated by the travel history generation unit to a server, wherein
     the transmission unit attaches identification information for identifying the combination in the travel history.
  9.  The driving assistance device according to any one of claims 1 to 8, further comprising an image output unit that causes a notification device to display an image showing the next action determined by the determination unit.
  10.  An automatic driving control device comprising:
     a travel history generation unit that generates, for each driver, a travel history in which environmental parameters indicating travel environments in which a vehicle has traveled in the past are associated with actions selected by the driver under those environmental parameters;
     an acquisition unit that acquires, from the travel histories generated by the travel history generation unit, a travel history similar to the travel history of a current driver;
     a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit;
     a determination unit that determines a next action based on the driver model generated by the driver model generation unit and environmental parameters indicating a current travel environment of the vehicle; and
     an automatic driving control unit that controls automatic driving of the vehicle based on the next action determined by the determination unit.
  11.  A vehicle comprising a driving assistance device, wherein the driving assistance device includes:
     a travel history generation unit that generates, for each driver, a travel history in which environmental parameters indicating travel environments in which the vehicle has traveled in the past are associated with actions selected by the driver under those environmental parameters;
     an acquisition unit that acquires, from the travel histories generated by the travel history generation unit, a travel history similar to the travel history of a current driver;
     a driver model generation unit that generates a driver model based on the travel history acquired by the acquisition unit; and
     a determination unit that determines a next action based on the driver model generated by the driver model generation unit and environmental parameters indicating a current travel environment of the vehicle.
  12.  A driving assistance method comprising:
     generating, for each driver, a travel history in which environmental parameters indicating travel environments in which a vehicle has traveled in the past are associated with actions selected by the driver under those environmental parameters;
     acquiring, from the generated travel histories, a travel history similar to the travel history of a current driver;
     generating a driver model based on the acquired travel history; and
     determining a next action based on the generated driver model and environmental parameters indicating a current travel environment of the vehicle.
  13.  A program for causing a computer to execute:
     generating, for each driver, a travel history in which environmental parameters indicating travel environments in which a vehicle has traveled in the past are associated with actions selected by the driver under those environmental parameters;
     acquiring, from the generated travel histories, a travel history similar to the travel history of a current driver;
     generating a driver model based on the acquired travel history; and
     determining a next action based on the generated driver model and environmental parameters indicating a current travel environment of the vehicle.
PCT/JP2016/002048 2015-04-21 2016-04-15 Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program WO2016170763A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/564,702 US10919540B2 (en) 2015-04-21 2016-04-15 Driving assistance method, and driving assistance device, driving control device, vehicle, and recording medium using said method
EP20177508.7A EP3738854A1 (en) 2015-04-21 2016-04-15 Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program
CN201680021986.8A CN107531252B (en) 2015-04-21 2016-04-15 Driving support method, and driving support device, automatic driving control device, vehicle, and storage medium using the driving support method
EP16782787.2A EP3269609B1 (en) 2015-04-21 2016-04-15 Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2015-087069 2015-04-21
JP2015087069 2015-04-21
JP2015099474 2015-05-14
JP2015-099474 2015-05-14
JP2015-119139 2015-06-12
JP2015119139 2015-06-12
JP2015-252667 2015-12-24
JP2015252667A JP6761967B2 (en) 2015-04-21 2015-12-24 Driving support method and driving support device, automatic driving control device, vehicle, program using it

Publications (1)

Publication Number Publication Date
WO2016170763A1 true WO2016170763A1 (en) 2016-10-27

Family

ID=57143812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/002048 WO2016170763A1 (en) 2015-04-21 2016-04-15 Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program

Country Status (1)

Country Link
WO (1) WO2016170763A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107139921A (en) * 2017-04-05 2017-09-08 吉利汽车研究院(宁波)有限公司 A kind of steering collision-proof method and system for vehicle
CN109131340A (en) * 2017-06-15 2019-01-04 株式会社日立制作所 Active vehicle adjusting performance based on driving behavior
CN109747621A (en) * 2017-11-02 2019-05-14 本田技研工业株式会社 Controller of vehicle
CN110325422A (en) * 2017-02-23 2019-10-11 松下知识产权经营株式会社 Information processing system, information processing method, program and recording medium
CN110446645A (en) * 2017-04-07 2019-11-12 日立汽车系统株式会社 Controller of vehicle
CN111201554A (en) * 2017-10-17 2020-05-26 本田技研工业株式会社 Travel model generation system, vehicle in travel model generation system, processing method, and program
JP2020086801A (en) * 2018-11-22 2020-06-04 三菱電機株式会社 Automatic driving control device and automatic driving control method
WO2021012528A1 (en) * 2019-07-25 2021-01-28 平安科技(深圳)有限公司 Driving safety assistance method and apparatus, vehicle, and readable storage medium
CN112513950A (en) * 2018-08-27 2021-03-16 日立汽车系统株式会社 Update system and electronic control device
WO2022107442A1 (en) * 2020-11-20 2022-05-27 株式会社デンソー Hmi control device and drive control device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007122579A (en) * 2005-10-31 2007-05-17 Equos Research Co Ltd Vehicle controller
JP2012113631A (en) * 2010-11-26 2012-06-14 Toyota Motor Corp Driving support system and driving support management center
JP2013069020A (en) * 2011-09-21 2013-04-18 Nissan Motor Co Ltd Eco-driving support device
JP2015081057A (en) * 2013-10-24 2015-04-27 日産自動車株式会社 Display device for vehicle


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3269609A4 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110325422A (en) * 2017-02-23 2019-10-11 松下知识产权经营株式会社 Information processing system, information processing method, program and recording medium
CN107139921B (en) * 2017-04-05 2019-10-29 吉利汽车研究院(宁波)有限公司 A kind of steering collision-proof method and system for vehicle
CN107139921A (en) * 2017-04-05 2017-09-08 吉利汽车研究院(宁波)有限公司 A kind of steering collision-proof method and system for vehicle
CN110446645B (en) * 2017-04-07 2022-09-20 日立安斯泰莫株式会社 Vehicle control device
CN110446645A (en) * 2017-04-07 2019-11-12 日立汽车系统株式会社 Controller of vehicle
CN109131340B (en) * 2017-06-15 2021-07-16 株式会社日立制作所 Active vehicle performance adjustment based on driver behavior
CN109131340A (en) * 2017-06-15 2019-01-04 株式会社日立制作所 Active vehicle adjusting performance based on driving behavior
CN111201554A (en) * 2017-10-17 2020-05-26 本田技研工业株式会社 Travel model generation system, vehicle in travel model generation system, processing method, and program
CN109747621A (en) * 2017-11-02 2019-05-14 本田技研工业株式会社 Controller of vehicle
CN112513950A (en) * 2018-08-27 2021-03-16 日立汽车系统株式会社 Update system and electronic control device
JP2020086801A (en) * 2018-11-22 2020-06-04 三菱電機株式会社 Automatic driving control device and automatic driving control method
WO2021012528A1 (en) * 2019-07-25 2021-01-28 平安科技(深圳)有限公司 Driving safety assistance method and apparatus, vehicle, and readable storage medium
WO2022107442A1 (en) * 2020-11-20 2022-05-27 株式会社デンソー Hmi control device and drive control device

Similar Documents

Publication Publication Date Title
JP6761967B2 (en) Driving support method and driving support device, automatic driving control device, vehicle, program using it
JP6447929B2 (en) Information processing system, information processing method, and program
WO2018079392A1 (en) Information processing system, information processing method, and program
JP6895634B2 (en) Information processing systems, information processing methods, and programs
JP6074553B1 (en) Information processing system, information processing method, and program
WO2016170763A1 (en) Driving assistance method, driving assistance device using same, automatic driving control device, vehicle, and driving assistance program
WO2016170764A1 (en) Driving assistance method and driving assistance device, driving control device, vehicle, and driving assistance program using such method
WO2016170773A1 (en) Driving assistance method, and driving assistance device, automatic driving control device, vehicle, and driving assistance program using said method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16782787

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15564702

Country of ref document: US

REEP Request for entry into the european phase

Ref document number: 2016782787

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE