WO2018029789A1 - Control Method and Control Device for an Autonomous Driving Vehicle - Google Patents
Control method and control device for an autonomous driving vehicle
- Publication number
- WO2018029789A1 (PCT application PCT/JP2016/073471)
- Authority
- WO
- WIPO (PCT)
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K31/00—Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/01552—Passenger detection systems detecting position of specific human body parts, e.g. face, eyes or hands
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
- B60W30/14—Adaptive cruise control
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
- B60W40/09—Driving style or behaviour
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
- B60W50/082—Selecting or switching between different modes of propelling
- B60W50/085—Changing the parameters of the control units, e.g. changing limit values, working points by control input
- B60W50/10—Interpretation of driver requests or demands
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0053—Handover processes from vehicle to occupant
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0872—Driver physiology
- B60W2050/0062—Adapting control system settings
- B60W2050/007—Switching between manual and automatic parameter input, and vice versa
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2520/10—Longitudinal speed
- B60W2540/045—Occupant permissions
- B60W2540/18—Steering angle
- B60W2540/21—Voice
- B60W2540/22—Psychological state; Stress level or workload
- B60W2540/225—Direction of gaze
- B60W2540/229—Attention level, e.g. attentive to driving, reading or sleeping
- B60W2540/30—Driving style
- B60W2552/05—Type of road, e.g. motorways, local streets, paved or unpaved roads
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
- B60W2554/4023—Type large-size vehicles, e.g. trucks
- B60W2554/4026—Cycles
- B60W2554/4029—Pedestrians
- B60W2554/801—Lateral distance
- B60W2554/802—Longitudinal distance
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
Definitions
- the present invention relates to a control method and a control device for an autonomous driving vehicle.
- Patent Document 1 discloses a technique for detecting biological information of a vehicle occupant and assisting a driving operation according to the biological information.
- Since the conventional example disclosed in Patent Document 1 does not examine the occupant's degree of interest in the driving situation, automatic driving control according to the occupant's degree of interest cannot be executed. For this reason, there was a problem that automatic driving control could not appropriately reflect the occupant's intention.
- The present invention has been made to solve this conventional problem, and an object thereof is to provide a control method and a control device for an autonomous driving vehicle capable of automatic driving control that appropriately reflects the occupant's intention.
- the degree of interest of an occupant with respect to the traveling state of an autonomous driving vehicle is detected, and the vehicle is controlled based on driving characteristics corresponding to the degree of interest.
- FIG. 1 is a block diagram showing a configuration of a control device for an autonomous driving vehicle according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing the configuration of the eyeball state detection unit according to the embodiment of the present invention.
- FIG. 3 is a block diagram showing the configuration of the image processing unit according to the embodiment of the present invention.
- FIG. 4 is an explanatory diagram showing the occupant's eyeball, the center of the pupil included in the eyeball, and the center of the reflected light.
- FIG. 5A is an explanatory diagram illustrating a situation in front of the host vehicle and movement in the sight line direction of the occupant.
- FIG. 5B is an explanatory diagram showing gaze and non-gaze on an occupant's object.
- FIG. 6 is an explanatory diagram illustrating a procedure for extracting a blink parameter by photographing the occupant's face.
- FIG. 7 is a graph showing changes in the occupant's degree of eye opening over time.
- FIG. 8 is an explanatory diagram showing related switches and non-related switches mounted on the vehicle.
- FIG. 9 is a block diagram illustrating a detailed configuration of the conversation determination unit.
- FIG. 10 is a block diagram illustrating a detailed configuration of the host vehicle state detection unit.
- FIG. 11 is a block diagram illustrating a detailed configuration of the surrounding state detection unit.
- FIG. 12 is an explanatory diagram showing three methods for detecting feature points.
- FIG. 13 is an explanatory diagram showing the flow of learning driving behavior for each detected feature point.
- FIG. 14 is an explanatory diagram showing classification of travel situations.
- FIG. 15 is an explanatory diagram showing an example of disassembling data of other vehicles into meaningful items.
- FIG. 16 is an explanatory diagram illustrating an example of manual driving characteristics learned by the manual driving characteristic learning unit.
- FIG. 17A is an explanatory diagram illustrating a traveling state when the host vehicle cruises without adopting the automatic driving characteristic.
- FIG. 17B is an explanatory diagram illustrating a traveling situation when the host vehicle travels using the automatic driving characteristics.
- FIG. 18A is an explanatory diagram illustrating a traveling situation when the host vehicle follows a preceding vehicle without adopting the automatic driving characteristics.
- FIG. 18B is an explanatory diagram illustrating a traveling situation when the host vehicle follows a preceding vehicle using the automatic driving characteristics.
- FIG. 19A is an explanatory diagram illustrating a traveling state when the host vehicle passes through an intersection without adopting the automatic driving characteristic.
- FIG. 19B is an explanatory diagram illustrating a traveling situation when the host vehicle adopts automatic driving characteristics and passes through an intersection.
- FIG. 20A is an explanatory diagram illustrating a traveling situation when the host vehicle temporarily stops at an intersection and then turns right.
- FIG. 20B is an explanatory diagram illustrating a traveling situation when the host vehicle turns right without stopping at the intersection.
- FIG. 21 is a graph showing the frequencies with which the vehicle turns right at the intersection without stopping and turns right after stopping.
- FIG. 22 is an explanatory diagram showing a probability distribution of manual driving characteristics.
- FIG. 23 is a flowchart showing a processing procedure of the control device for an autonomous driving vehicle according to the embodiment of the present invention.
- FIG. 24 is a flowchart illustrating processing for determining whether the degree of interest is higher than a reference value based on the movement of the eyeball.
- FIG. 25 is a flowchart illustrating processing for determining whether the degree of interest is higher than a reference value based on the frequency of switches operated by the occupant.
- FIG. 1 is a block diagram showing a configuration of a control device for an autonomous driving vehicle according to an embodiment of the present invention.
- The control device for an autonomous driving vehicle includes an interest level detection unit 1, a traveling state detection unit 14, a personally adapted driving characteristic determination unit 4, an automatic driving characteristic setting unit 8, and a driving characteristic switching unit 11.
- Each function shown in the present embodiment can be implemented by one or a plurality of processing circuits.
- the processing circuit includes a processing device including an electric circuit. Processing devices also include devices such as application specific integrated circuits (ASICs) and conventional circuit components arranged to perform the functions described in the embodiments.
- the degree-of-interest detection unit 1 detects the degree of interest of an occupant (for example, a driver) of the own vehicle with respect to the current traveling state of the own vehicle, and determines whether the degree of interest is higher than a reference value.
- The interest level detection unit 1 includes an eyeball state detection unit 2 that detects movement of the occupant's eyeballs, a switch operation detection unit 3 that detects the operation frequency of various switches mounted on the vehicle, and a conversation determination unit 16 that analyzes the occupant's conversation.
- The detection results of the eyeball state detection unit 2, the switch operation detection unit 3, and the conversation determination unit 16 are output to the interest level determination unit 12 of the driving characteristic switching unit 11.
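The flow just described, three detection results feeding the interest level determination unit 12, which decides whether the occupant's degree of interest exceeds a reference value, can be sketched as follows. The weighting of the three signals, the capping of the switch term, and the reference value of 50 are illustrative assumptions for this sketch; the patent does not specify them.

```python
from dataclasses import dataclass

@dataclass
class DetectionResults:
    peripheral_gaze_degree: float  # [%] share of time gazing at the surroundings
    driving_switch_rate: float     # driving-related switch operations per minute
    driving_conversation: bool     # conversation concerns the driving situation

def degree_of_interest(d: DetectionResults) -> float:
    """Illustrative weighted score in [0, 100]; all weights are assumptions."""
    score = 0.6 * d.peripheral_gaze_degree
    score += 3.0 * min(d.driving_switch_rate, 10.0)  # cap the switch contribution
    if d.driving_conversation:
        score += 10.0
    return min(score, 100.0)

def interest_above_reference(d: DetectionResults, reference: float = 50.0) -> bool:
    """Decision output of the determination unit: is interest above the reference?"""
    return degree_of_interest(d) > reference
```

An occupant who watches the surroundings, operates driving-related switches, and talks about the driving situation would score above the reference, while a distracted occupant would not.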
- FIG. 2 is a block diagram illustrating a configuration of the eyeball state detection unit 2.
- The eyeball state detection unit 2 includes an infrared illumination 21 that emits infrared light toward the occupant's eyeball 18, an infrared camera 20 that captures the infrared light reflected by the pupil 19 of the eyeball 18, and an illumination / camera controller 23 that controls the infrared illumination 21 and the infrared camera 20.
- The eyeball state detection unit 2 also includes an image processing unit 22 that acquires an exterior image captured by the vehicle exterior camera 17, which photographs the outside of the vehicle (for example, the area ahead of the vehicle), and executes processing such as occupant gaze analysis and blink analysis based on the exterior image and the image captured by the infrared camera 20. The eyeball state detection unit 2 then detects the occupant's line-of-sight direction based on the movement of the occupant's eyeball 18.
- FIG. 3 is a block diagram showing the configuration of the eyeball state detection unit 2 in detail.
- The infrared camera 20 includes a lens 25, a visible light blocking filter 26, a shutter / aperture 27, and an infrared image sensor 28. Under the control of the illumination / camera controller 23, the shutter / aperture 27 and the infrared image sensor 28 capture the infrared light reflected from the eyeball 18.
- the image processing unit 22 includes a digital filter 29, an image processing GPU (Graphics Processing Unit) 30, and a parameter extraction unit 31.
- the digital filter 29 filters the image captured by the infrared camera 20 and the image captured by the vehicle exterior camera 17.
- The image processing GPU 30 performs various image processing, such as analysis of the occupant's line-of-sight direction and analysis of the occupant's blinking, based on the image captured by the infrared camera 20 and the image captured by the vehicle exterior camera 17.
- Based on the occupant's line-of-sight direction obtained by the image processing performed by the image processing GPU 30 and the exterior image photographed by the vehicle exterior camera 17, the parameter extraction unit 31 extracts a "peripheral gaze parameter" indicating whether or not the occupant is gazing at the surroundings of the vehicle, and a "blink parameter" indicating whether or not the occupant is blinking. The parameter extraction unit 31 then outputs the extracted peripheral gaze parameter and blink parameter to the interest level determination unit 12 illustrated in FIG. 1.
- FIG. 4 is an explanatory diagram showing the occupant's eyeball 18, the center r1 of the pupil 19 included in the eyeball 18, and the center r2 of the reflected light.
- the infrared illumination 21 shown in FIG. 2 irradiates the occupant's eyeball 18 with an infrared beam.
- the reflected light of the infrared beam and the center of the pupil 19 are detected by the infrared camera 20. Further, an output vector R1 from the center r1 of the pupil 19 to the center r2 of the reflected light is calculated.
- the positional relationship between the infrared camera 20 and the reflected light is calculated based on the position of the reflected light of the infrared beam. Further, the positional relationship between the infrared camera 20 and the center of the pupil 19 is obtained based on the positional relationship between the infrared camera 20 and the reflected light and the output vector R1. As a result, it is possible to recognize the sight line direction of the occupant, that is, which position around the vehicle the occupant is looking at.
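The vector computation above can be sketched briefly. This is a minimal illustration in 2D image coordinates, assuming pixel positions for the pupil center r1 and the reflection center r2; the function names and the angle mapping are illustrative additions, and a real system would calibrate the gaze estimate per occupant.

```python
import math

def output_vector_r1(pupil_center, reflection_center):
    """Output vector R1 from the pupil center r1 to the center r2 of the
    corneal reflection, in image coordinates (pixels)."""
    (x1, y1), (x2, y2) = pupil_center, reflection_center
    return (x2 - x1, y2 - y1)

def vector_angle_deg(vec):
    """Direction of the vector measured from the positive x axis; a
    stand-in for the sight line direction after per-occupant calibration."""
    return math.degrees(math.atan2(vec[1], vec[0]))
```

In practice the positional relationship between the camera and the reflected light supplies the reference frame in which this vector is interpreted.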
- FIG. 5A is an explanatory diagram showing a front image of the host vehicle
- FIG. 5B is an explanatory diagram showing gaze and non-gaze on an occupant's object.
- when the sight line of the occupant is directed at a forward object such as another vehicle, this is defined as "visual recognition" (gaze). For example, when the line of sight of the occupant faces the preceding vehicle e1 illustrated in FIG. 5A, it is determined that the occupant is viewing the preceding vehicle e1. When the line of sight of the occupant faces the vehicle e2 on the roadside, it is determined that the occupant is viewing the vehicle e2.
- the peripheral gaze degree F1 [%], which is the ratio of the gaze time within a certain time, is defined by the following equation (1).
- F1 = Ta / (Ta + Tb) * 100 (1)
- where Ta represents the gaze time on the object within a certain time, and Tb represents the non-gaze time within the same time.
- a high degree of gazing at the surroundings allows it to be presumed that the occupant is highly interested in the driving situation.
- the peripheral gaze degree F1 calculated by the above equation (1) is output to the interest level determination unit 12 shown in FIG. 1 as a peripheral gaze parameter.
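Equation (1) can be computed directly. A minimal sketch, assuming gaze and non-gaze times in seconds; the threshold comparison against F1th mirrors the determination described later, and the helper names are illustrative.

```python
def peripheral_gaze_degree(gaze_time_ta, non_gaze_time_tb):
    """Peripheral gaze degree F1 [%] per equation (1):
    F1 = Ta / (Ta + Tb) * 100."""
    total = gaze_time_ta + non_gaze_time_tb
    if total <= 0:
        raise ValueError("observation window must have positive length")
    return gaze_time_ta / total * 100.0

def interest_above_reference(f1, f1_threshold):
    """Interest is judged higher than the reference value when F1
    exceeds the first threshold value F1th."""
    return f1 > f1_threshold
```

For example, 30 seconds of gaze within a 40-second window gives F1 = 75%.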
- next, the procedure for detecting the blink parameter, which indicates whether or not the occupant is blinking, will be described with reference to FIG. 6.
- in step h1 of FIG. 6, the occupant's face image 71 is captured by the infrared camera 20.
- in step h2, a face area 72 is extracted from the face image 71 captured in step h1.
- in step h3, an image 73 obtained by extracting feature points from the face area 72 is acquired.
- in step h4, an image 74 indicating the posture of the face is acquired from the facial feature points.
- in step h5, the eye-open portion and the eye-closed portion are recognized from the image of the occupant's eyes. Based on these, an eye opening degree indicating the ratio of the eye opening to the full opening can be obtained.
- in step h6, a blink parameter is detected. Since the image processing shown in steps h1 to h5 is a well-known technique, detailed description thereof is omitted.
- FIG. 7 is a graph showing changes in the occupant's eye opening degree over time. As shown by the curve Q1, the occupant's eye opening degree changes periodically. The time interval between the peaks of the eye opening degree is set as the blink interval T1. When the eye opening degree is 20% or less, the eye is defined as closed, and the eye closing time T2 is calculated. As the maximum eye opening degree, a numerical value calibrated in advance for each occupant is used.
- the opening / closing behavior feature amount PE indicating the ratio of the eye closing time to the blink interval of the eyeball 18 is calculated by the following equation (2).
- PE = (T2 / T1) * 100 [%] (2)
- in addition, the eye closing speed X1 and the eye opening degree X2 [%] at the time of eye opening are calculated.
- a high opening / closing behavior feature amount PE means that the occupant's eyes are closed for a large fraction of the time, and it can be said that the degree of interest in the driving situation is low.
- conversely, when the opening / closing behavior feature amount PE is lower than a preset threshold value (second threshold value PEth), it can be said that the degree of interest in the driving situation is high.
- a high eye closing speed X1 or a high eye opening degree X2 at the time of eye opening means that the occupant's degree of interest in the driving situation is high, because the degree of gazing around the host vehicle is high.
- the eye closing speed X1, the eye opening degree X2 at the time of eye opening, and the opening / closing behavior feature amount PE calculated by the equation (2) are output as blink parameters to the interest level determination unit 12 shown in FIG.
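Equation (2) and the accompanying quantities can be sketched as follows. The PE formula follows the text exactly; the exact definition of the eye closing speed X1 is not given, so the [%/s] form below is an assumption for illustration only.

```python
def opening_closing_feature_pe(eye_close_time_t2, blink_interval_t1):
    """Opening/closing behavior feature amount PE [%] per equation (2):
    PE = (T2 / T1) * 100, the fraction of each blink interval spent
    with the eyes closed."""
    return eye_close_time_t2 / blink_interval_t1 * 100.0

def eye_closing_speed_x1(eye_open_degree_drop_pct, closing_time_s):
    """Assumed form of the eye closing speed X1: how quickly the eye
    opening degree falls while the eye closes [%/s]. Not specified in
    the text; illustrative only."""
    return eye_open_degree_drop_pct / closing_time_s
```

For example, a 1-second eye closure within a 4-second blink interval gives PE = 25%.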
- the switch operation detection unit 3 detects operations of the various switches mounted on the vehicle, and outputs the detection data to the interest level determination unit 12 shown in FIG. 1.
- Various switches mounted on the vehicle are classified into switches related to vehicle travel (hereinafter referred to as “related switches”) and switches not related to vehicle travel (hereinafter referred to as “unrelated switches”).
- the related switches are, for example, a speed setting switch, an inter-vehicle distance setting switch, a lane change switch, and the like.
- the unrelated switch is, for example, a window opening / closing switch, an audio operation switch, a navigation operation switch, a seat position adjustment switch, an illumination switch, or the like.
- the degree-of-interest determination unit 12 shown in FIG. 1 determines that the occupant's degree of interest in the driving situation is high when the operation frequency of the related switches (the number of operations within a certain time) is high. Conversely, when the operation frequency of the related switches is low, it determines that the occupant's degree of interest in the traveling state is low. Further, when the operation frequency of the unrelated switches is high, it determines that the occupant's degree of interest in the traveling situation is low, and when that frequency is low, it determines that the occupant's degree of interest in the traveling state is high.
- the conversation determination unit 16 includes a microphone 42 that detects voice, a speaker 43, an information presentation unit 44 that presents various types of information to the occupant, and an analysis unit 45 that analyzes the occupant's conversation.
- the conversation determination unit 16 recognizes the occupant's voice using the occupant's voice data registered in advance in order to identify the occupant's voice and other voices.
- the conversation includes a conversation between the occupant and the vehicle in addition to a conversation between the occupant and the other occupants.
- the interest level may be detected by analyzing the voice such as the conversation speed of the passenger and the volume of the voice.
- the information presentation unit 44 can initiate various conversations (daily conversation or quizzes) with the occupant through the speaker 43. For example, questions such as "What is the speed limit on this road?" and "What color is the preceding vehicle?" are asked.
- the microphone 42 detects the utterance (voice) of the occupant, and the analysis unit 45 recognizes and analyzes the utterance (voice) of the occupant with respect to the conversation.
- the interest level determination unit 12 estimates the amount of consciousness of the occupant analyzed by the conversation determination unit 16, and determines that the degree of interest is high when the amount of consciousness is large.
- the traveling state detection unit 14 includes a host vehicle state detection unit 6 that detects a traveling state of the host vehicle, and a surrounding state detection unit 9 that detects a surrounding state of the host vehicle.
- the host vehicle state detection unit 6 acquires vehicle speed data detected by the vehicle speed sensor 32, acceleration data detected by the acceleration sensor 33, and steering angle data detected by the steering angle sensor 34. Based on these data, the traveling state of the host vehicle is detected.
- the data detected by the own vehicle state detection unit 6 is output to the manual driving characteristic learning unit 7 shown in FIG.
- the surrounding state detection unit 9 includes a vehicle interval detection unit 35, a non-vehicle detection unit 36, a surrounding vehicle type detection unit 37, a lane detection unit 38, a road type detection unit 39, and a traffic information detection unit 40.
- the vehicle interval detection unit 35 detects the inter-vehicle distance between the host vehicle and surrounding vehicles.
- the non-vehicle detection unit 36 detects an object other than a vehicle such as a pedestrian or a bicycle existing around the host vehicle based on an image captured by a camera that captures the surroundings.
- the surrounding vehicle type detection unit 37 detects a vehicle existing around the host vehicle from an image captured by the camera, and detects the type of the detected vehicle. For example, passenger cars, trucks, buses, motorcycles, etc. are detected.
- the lane detector 38 detects a road lane from an image taken by a camera.
- the road type detection unit 39 detects the road type from information obtained from the navigation device.
- the traffic information detection unit 40 detects traffic information from information obtained from the navigation device.
- the various types of information described above may be detected by inter-vehicle communication, road-to-vehicle communication, or by other sensors such as sonar.
- the data detected by the ambient condition detection unit 9 is output to the automatic driving characteristic setting unit 8 shown in FIG.
- the personal compatible driving characteristic determination unit 4 includes a manual driving characteristic learning unit 7 that learns driving characteristics of the occupant during manual driving of the host vehicle, and a manual driving characteristic database 5 that stores manual driving characteristics.
- the manual driving characteristic learning unit 7 acquires various driving characteristics when the occupant performs manual driving, and accumulates them in the manual driving characteristic database 5.
- This driving characteristic is a driving characteristic that suits the occupant's preference, and is adopted when it is determined that the degree of interest of the occupant with respect to the traveling state of the host vehicle is higher than the reference value, as will be described later. Details will be described below.
- the manual driving characteristic learning unit 7 detects the driving characteristic of the occupant from each data (data acquired by each sensor shown in FIG. 10) indicating the driving condition detected by the own vehicle condition detecting unit 6.
- driving characteristics observed when the occupant performs manual driving include the lane change timing, the merging point and merging speed when entering an expressway, the inter-vehicle distance, the average cruising speed, acceleration / deceleration, brake timing, steering angular speed, the travel position in the lane (left side, right side), the right-turn timing at intersections, and the like. The driving behavior at each detected feature point is then learned.
- FIG. 12 is an explanatory diagram showing three learning methods.
- in learning method "1", learning is performed by human analysis.
- in learning method "2", a hypothesis based on human knowledge and experience is set, and learning is then performed by machine learning.
- in learning method "3", learning is performed fully automatically by machine learning. In the present embodiment, learning method "2" is adopted as an example.
- FIG. 13 is an explanatory diagram showing a flow of learning features from the data detected by the traveling state detection unit 14.
- in step a1, data on the driving situation and surroundings of the vehicle is collected from the traveling state detection unit 14. After the data is collected, the necessary attribute data is extracted in step a2.
- the data collected by the traveling state detection unit 14 is not necessarily related to driving behavior, and if data irrelevant to driving behavior is used as learning material, the learning result may be adversely affected. For this reason, only the necessary data (attribute data) is extracted in step a2.
- in step a3, elements that adversely affect learning, such as noise included in the attribute data extracted in step a2, are removed, and the attribute data is corrected.
- in step a4, the attribute data is classified into meaningful items (parameters).
- FIG. 15 shows an example of classifying data of other vehicles into meaningful items.
- each of these data is reclassified to acquire various items such as "the number of preceding vehicles", "the number of following vehicles", and "the distance to the preceding vehicle".
- in step a5, the parameters generated by the preprocessing are used as input for machine learning, and machine learning is executed.
- as a machine learning algorithm, for example, SOM (Self-Organizing Map), SVC (Support Vector Classification), SGD (Stochastic Gradient Descent), logistic regression, or the like can be used.
- as the learning output, the type of road on which the vehicle is traveling is produced. As shown in FIG. 14, road types are classified into, for example, b1 to b8. Specifically, when driving on a highway, the type is "b1. Highway"; when driving on a general road with two lanes in each direction, it is "b2. Main road"; when traveling on other general roads, it is "b3. Non-trunk road"; and when traveling through an intersection on a general road, it is "b4. Intersection". Further, when traveling on a general road or highway with no preceding vehicle present, the type is "b5. Cruise traveling"; when a preceding vehicle is present, it is "b6. Following traveling". If the vehicle restarts after stopping at an intersection on a general road, it is classified as "b7. Crossing the intersection", and if it turns right at such an intersection, as "b8. Turning right at the intersection". This classification method is not limited to the above contents, and the number of classification items may be increased or decreased.
- for example, a junction point of an expressway, a branch point of an expressway, a right-turn lane on a main road, and the like may be added.
- in step a6, the road type specified by learning and the driving characteristics for that road type are stored in the manual driving characteristic database 5.
- the classification items in steps a1 to a5 in FIG. 13 are set manually, and in step a6, situation parameters are automatically generated through machine learning.
- FIG. 16 is an explanatory diagram showing an example of manual driving characteristics learned by the manual driving characteristic learning unit 7.
- the host vehicle V1 travels at 60 km / h on the left lane of a two-lane road, and two vehicles are in the right lane.
- the other vehicles V2 and V3 are driving at a speed of 80 km / h.
- as input parameters, the type of travel situation, the positional relationship with other vehicles ahead and behind, road information (speed limit), and current vehicle travel information (for example, travel speed) are acquired.
- the cruise traveling shown in the present embodiment is defined as a situation in which the inter-vehicle time to the preceding vehicle (a value obtained by dividing the inter-vehicle distance by the traveling speed) continues for 30 seconds or more.
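The inter-vehicle time used in this definition is simply the distance to the preceding vehicle divided by the traveling speed. A minimal sketch with assumed SI units:

```python
def inter_vehicle_time(gap_m, speed_mps):
    """Inter-vehicle time [s]: inter-vehicle distance divided by the
    traveling speed, as defined in the text."""
    if speed_mps <= 0:
        raise ValueError("traveling speed must be positive")
    return gap_m / speed_mps

# e.g. a 50 m gap at 25 m/s (90 km/h) corresponds to a 2-second headway
```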
- the automatic driving characteristic setting unit 8 shown in FIG. 1 will be described.
- the automatic driving characteristic setting unit 8 sets driving characteristics to be selected when the degree of interest in the driving situation of the occupant is low. Hereinafter, this will be described in detail with reference to FIGS.
- FIG. 17A and FIG. 17B are explanatory diagrams illustrating an example of determining the automatic driving characteristics when the host vehicle is cruising by automatic driving.
- road information such as the type of travel condition (in this case, cruise travel), the positional relationship with other vehicles traveling in front of and behind the host vehicle, and the speed limit is acquired.
- control is performed so that the traveling speed of the host vehicle matches the speed of other vehicles traveling in the vicinity within a range not exceeding the speed limit.
- by doing so, the flow of surrounding traffic is not disturbed, and traffic congestion can be mitigated.
- FIG. 18A and FIG. 18B are explanatory diagrams illustrating an example in which automatic driving characteristics are determined in the case of following traveling that follows a preceding vehicle that travels ahead of the host vehicle.
- the follow-up traveling shown in the present embodiment is defined as a situation in which the inter-vehicle time to the preceding vehicle is less than 2 seconds and this state continues for 30 seconds or more.
- the type of driving situation (following driving in this case) and the positional relationship with other vehicles that drive in front of and behind the host vehicle are acquired.
- as an output, the inter-vehicle time is shortened within a range in which a rear-end collision with the preceding vehicle can be avoided. That is, whereas the inter-vehicle time is 4 [sec] as shown in FIG. 18A, it is shortened to 2 [sec] as shown in FIG. 18B.
- FIG. 19A and FIG. 19B are explanatory diagrams illustrating an example of determining the automatic driving characteristics when the host vehicle starts at an intersection. It is assumed that the host vehicle stops at the intersection while the traffic light is red, and starts when it switches to green. As input parameters, the type of driving situation (in this case, starting at an intersection), the positional relationship between the host vehicle and the preceding vehicle, and traffic signal information are acquired.
- the number of vehicles passing through the intersection during the green lighting time can be increased by adjusting the acceleration and the start timing for the preceding vehicle within a range that does not collide with the preceding vehicle when starting. That is, when the control at the time of starting is not executed, the interval between the vehicles increases as shown in FIG. 19A, and the number of passing vehicles is small. Specifically, three units z1, z2, and z3 pass.
- the number of vehicles passing through the intersection is four, z4, z5, z6, and z7, and the number of passing vehicles is increased. Can be made.
- FIG. 20A, FIG. 20B, and FIG. 21 are explanatory diagrams illustrating an example of determining whether to stop or turn right when the host vehicle V1 makes a right turn at an intersection.
- the time required for the oncoming vehicle V3 to reach the intersection is defined as the arrival time s1, and the time required for the host vehicle to reach the intersection as the arrival time s2. The difference between the arrival times, s1 − s2, is set as the gap time Δs.
- the gap time Δs is calculated, and when the gap time Δs is longer than a preset threshold time (for example, 6 seconds), the host vehicle V1 turns right at the intersection without stopping. When the gap time Δs is equal to or shorter than the threshold time, the host vehicle V1 temporarily stops at the intersection.
- FIG. 21 is a graph showing the cumulative frequency of the gap time Δs when the vehicle temporarily stops before turning right and when it turns right without stopping.
- the curve q1 shows the relationship between the gap time Δs and the frequency when stopping at an intersection; the shorter the gap time Δs, the more vehicles stop.
- the curve q2 shows the relationship between the gap time Δs and the frequency when turning right without stopping at the intersection; the longer the gap time Δs, the more vehicles turn right without stopping.
- the intersection of the curves q1 and q2 is set as the threshold time.
- in this example, the threshold time is 6 seconds. That is, when the gap time Δs is longer than 6 seconds, the host vehicle V1 is controlled to turn right at the intersection without stopping, and when the gap time Δs is 6 seconds or less, the host vehicle V1 is controlled to stop at the intersection. By doing so, a smooth right turn is possible at the intersection, and traffic congestion at the intersection can be mitigated.
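The right-turn decision described above reduces to comparing the gap time Δs = s1 − s2 with the 6-second threshold read off the crossing point of curves q1 and q2. A minimal sketch (the function name and return labels are illustrative):

```python
def right_turn_decision(arrival_s1, arrival_s2, threshold_s=6.0):
    """Turn right without stopping only when the gap time
    delta_s = s1 - s2 exceeds the threshold time; otherwise stop."""
    gap_time = arrival_s1 - arrival_s2
    if gap_time > threshold_s:
        return "turn right without stopping"
    return "stop temporarily"
```

Note that a gap time exactly equal to the threshold results in a stop, matching the "6 seconds or less" condition in the text.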
- the driving characteristic switching unit 11 includes an interest level determination unit 12 and a driving characteristic setting unit 13.
- the interest level determination unit 12 determines the level of interest of the occupant in the traveling state based on the “peripheral gaze parameter” and the “blink parameter” output from the eyeball state detection unit 2 described above. Specifically, when the peripheral gaze degree F1 shown in the above equation (1) is higher than a preset first threshold value F1th, it is determined that the degree of interest in the driving situation is higher than the reference value.
- when the opening / closing behavior feature amount PE shown in equation (2) is lower than the preset second threshold value PEth, when the eye opening degree X2 at the time of eye opening is higher than the preset third threshold value X2th, or when the eye closing speed X1 is higher than the preset fourth threshold value X1th, it is determined that the degree of interest in the driving situation is higher than the reference value.
- the interest level determination unit 12 also determines the occupant's degree of interest in the traveling state according to the operation states of the related switches and unrelated switches output from the switch operation detection unit 3. Specifically, when the operation frequency of a related switch is higher than a preset fifth threshold value, it is determined that the degree of interest in the driving situation is higher than the reference value. Further, when the operation frequency of an unrelated switch is higher than a preset sixth threshold value, it is determined that the degree of interest in the driving situation is lower than the reference value. Furthermore, as described above, the amount of consciousness of the driver analyzed by the conversation determination unit 16 is estimated, and when the amount of consciousness is higher than a preset seventh threshold value, it is determined that the degree of interest in the driving situation is higher than the reference value.
- the driving characteristic setting unit 13 determines the control content of the automatic driving control based on the interest level determined by the interest level determination unit 12. Specifically, when the occupant's degree of interest in the driving situation is higher than the reference value, automatic driving is performed so as to conform to the driving characteristics of the occupant. For example, the vehicle speed and the inter-vehicle distance are controlled so as to match the occupant's characteristics. That is, when the driver's degree of interest in the current driving is higher than the reference value, the uncomfortable feeling given to the driver can be suppressed by driving with driving characteristics that the driver (occupant) prefers as much as possible. Therefore, the driving characteristic data recorded when the driver performed manual driving is extracted from the manual driving characteristic database 5, and automatic driving is performed so as to match the driving characteristics of the driver.
- in the manual driving characteristic database 5, for example, as shown in FIG. 22, three manual driving characteristics u1, u2, and u3 are stored. The current driving characteristic u0 of the host vehicle during manual driving is then acquired. Specifically, the type of driving situation, the positional relationship with other vehicles ahead and behind, road information (speed limit), and current vehicle driving information (for example, driving speed) are acquired as input parameters. Then, significant parameters are calculated using a machine learning algorithm, and the current driving characteristic is acquired.
- the manual driving characteristic u2 that is most similar to the current driving characteristic u0 of the host vehicle is selected from the manual driving characteristics u1 to u3.
- the driving characteristic setting unit 13 selects the manual driving characteristic u2 and outputs a control command.
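Selecting the stored characteristic most similar to u0 can be sketched as a nearest-neighbor lookup. The feature vector layout (cruising speed [km/h], inter-vehicle time [s]) and the Euclidean metric are assumptions for illustration; the text does not specify the similarity measure.

```python
def select_most_similar(current, stored):
    """Return the (name, features) pair in `stored` whose feature
    vector is closest to `current` under Euclidean distance."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(stored, key=lambda item: distance(current, item[1]))

u0 = (62.0, 2.1)   # current driving characteristic (assumed features)
candidates = [("u1", (45.0, 3.0)), ("u2", (60.0, 2.0)), ("u3", (80.0, 1.2))]
best = select_most_similar(u0, candidates)   # u2 is nearest here
```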
- FIG. 23 shows the overall processing procedure
- FIGS. 24 and 25 show the detailed processing of S13 in FIG.
- in step S11 of FIG. 23, the driving characteristic setting unit 13 determines the traveling state of the host vehicle.
- vehicle speed data, acceleration data, steer angle data, and the like detected by the own vehicle state detection unit 6 are used, as shown in FIG. 10.
- the current traveling state can also be determined based on vehicle speed, acceleration, and steer angle information acquired by CAN (Controller Area Network) and sensor information such as a radar and a camera.
- in step S12, the driving characteristic setting unit 13 determines the occupant's degree of interest in the traveling situation.
- the degree of interest is determined based on the movement of the occupant's eyeball, the frequency of the switch operation, the content of the conversation, and the like. Further, in step S13, it is determined whether or not the interest level is higher than a reference value.
- FIG. 24 is a flowchart illustrating processing for determining the degree of interest based on the eyeball information.
- the interest level determination unit 12 acquires the “peripheral gaze parameter” from the eyeball state detection unit 2 in step S31, and acquires the “blink parameter” in step S32.
- the degree-of-interest determination unit 12 determines whether or not the peripheral gaze degree F1 is higher than the first threshold value F1th based on the peripheral gaze parameters. When it is determined that the peripheral gaze degree F1 is higher than the first threshold value F1th (YES in step S33), in step S37, it is determined that the degree of interest of the occupant in the traveling state is higher than the reference value.
- in step S34, it is determined whether or not the opening / closing behavior feature amount PE is lower than the second threshold value PEth.
- if the opening / closing behavior feature amount PE is lower than the second threshold value PEth (YES in step S34), the process of step S37 is executed. On the other hand, when the opening / closing behavior feature amount PE is higher than the second threshold value PEth (NO in step S34), in step S35 the interest level determination unit 12 determines, based on the occupant's blink information, whether the eye opening degree at the time of eye opening is high. In this process, it is determined whether or not the eye opening degree X2 at the time of eye opening is higher than the third threshold value X2th.
- if the eye opening degree X2 at the time of eye opening is higher than the third threshold value X2th (YES in step S35), the process proceeds to step S37.
- in step S36, the interest level determination unit 12 determines, based on the blink parameter, whether or not the eye closing speed X1 is higher than the fourth threshold value X1th.
- if it is determined that the eye closing speed X1 is higher than the fourth threshold value X1th (YES in step S36), it is determined in step S37 that the degree of interest is higher than the reference value. On the other hand, when it is determined that the eye closing speed X1 is lower than the fourth threshold value X1th (NO in step S36), it is determined in step S38 that the degree of interest is lower than the reference value. Then, based on the determination results of steps S37 and S38, the process of step S13 in FIG. 23 is performed.
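The decision flow of steps S33 to S38 can be summarized as a cascade of threshold checks, any one of which is enough to conclude that interest exceeds the reference value. The threshold values in the test below are illustrative placeholders, not values from the text.

```python
def interest_from_eye_parameters(f1, pe, x2, x1,
                                 f1_th, pe_th, x2_th, x1_th):
    """Steps S33-S36: any passing check leads to S37 (interest above
    the reference value); if all fail, S38 (below the reference)."""
    if f1 > f1_th:      # S33: peripheral gaze degree F1 high
        return True
    if pe < pe_th:      # S34: opening/closing feature PE low
        return True
    if x2 > x2_th:      # S35: eye opening degree X2 high
        return True
    if x1 > x1_th:      # S36: eye closing speed X1 high
        return True
    return False        # S38: interest below the reference value
```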
- in step S51, the interest level determination unit 12 acquires frequency information of the various switch operations output from the switch operation detection unit 3.
- in step S52, the interest level determination unit 12 determines whether or not the operation frequency of the related switch is higher than the fifth threshold value.
- the related switches are, for example, a speed setting switch, an inter-vehicle distance setting switch, a lane change switch, and the like.
- if the operation frequency of the related switch is high (YES in step S52), in step S54 the interest level determination unit 12 determines that the occupant's degree of interest in the traveling situation is higher than the reference value.
- otherwise (NO in step S52), in step S53 it is determined whether or not the operation frequency of the unrelated switch is higher than the sixth threshold value.
- the unrelated switch is, for example, a window opening / closing switch, an audio operation switch, a navigation operation switch, a seat position adjustment switch, or the like. If the operation frequency of the unrelated switch is higher than the sixth threshold value (YES in step S53), in step S55 the interest level determination unit 12 determines that the occupant's degree of interest in the driving situation is lower than the reference value.
- if the operation frequency of the unrelated switch is lower than the sixth threshold value (NO in step S53), the frequency of switch operation is low or no operation is being performed, so the determination is suspended and the process returns to step S52.
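The switch-operation flow of steps S51 to S55 can be sketched the same way; when neither frequency exceeds its threshold, the determination is suspended (None here) and observation continues. Function and label names are illustrative.

```python
def interest_from_switch_frequency(related_freq, unrelated_freq,
                                   fifth_th, sixth_th):
    """S52/S54: frequent related-switch use -> interest above the
    reference value. S53/S55: frequent unrelated-switch use ->
    interest below it. Otherwise the decision is withheld."""
    if related_freq > fifth_th:
        return "above reference"
    if unrelated_freq > sixth_th:
        return "below reference"
    return None  # determination suspended; return to step S52
```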
- through the above processing, the determination shown in step S13 of FIG. 23, that is, whether or not the occupant's degree of interest in the traveling state is higher than the reference value, is made. Note that, as described above, it is also possible to determine the level of interest in driving based on the status of the occupant's conversation.
- in step S14, the driving characteristic setting unit 13 sets driving characteristics (vehicle speed, inter-vehicle distance, etc.) that match the manual driving characteristics of the occupant.
- examples of situations in which the occupant's degree of interest in the driving situation is high include when the host vehicle is not following the flow of surrounding vehicles, when it cannot cope with the road situation, or when an object requiring attention, such as a truck, luxury car, obstacle, or road structure, is approaching. In such driving situations, by automatically setting driving characteristics such as vehicle speed, inter-vehicle distance (front and rear, left and right), and lane changes to characteristics that suit the occupant's preference, driving that suppresses any uncomfortable feeling given to the occupant is possible.
- in step S15, the driving characteristic setting unit 13 sets the vehicle speed and the inter-vehicle distance according to the surrounding situation.
- in step S16, either the driving characteristic set in step S14 or the driving characteristic set in step S15 is finalized.
- in step S17, the driving characteristic setting unit 13 determines whether or not the automatic driving section has ended; if not, the process returns to step S11. If the section has ended, automatic driving is terminated and the vehicle is switched to manual driving.
- The occupant's degree of interest is detected, and the host vehicle is controlled based on driving characteristics corresponding to that degree of interest.
- Since the occupant's intention can be reflected in the driving characteristics of the vehicle, automatic driving that appropriately reflects the occupant's intention can be performed.
- When the degree of interest is low, driving characteristics suited to the surrounding conditions are set and automatic driving control is executed, so the traveling speed and inter-vehicle distance match those of the surrounding vehicles; traffic congestion can thus be mitigated without disturbing the flow of traffic.
- When the degree of interest is high, driving characteristics based on the occupant's manual driving characteristics are set, enabling automatic driving control that does not feel unnatural to the occupant.
- The occupant's degree of interest in vehicle travel is detected based on the eyeball movement detected by the eyeball state detection unit 2. Specifically, when the peripheral gaze degree F1 is higher than the first threshold value F1th, the degree of interest is determined to be higher than the reference value, so the degree of interest can be determined with high accuracy. Likewise, since the degree of interest is determined to be higher than the reference value when the ratio of eye-closing time to blink interval (opening/closing behavior feature amount PE) is lower than the second threshold value PEth, the degree of interest can be determined with high accuracy.
- When the eye-opening degree X2 at the time of eye opening is higher than the third threshold value X2th, the degree of interest is determined to be higher than the reference value, so the degree of interest can be determined with high accuracy.
- When the eye-closing speed X1 is higher than the fourth threshold value X1th, the degree of interest is determined to be higher than the reference value, so the degree of interest can be determined with high accuracy.
- Since the degree of interest is determined to be higher than the reference value when the operation frequency of the related switches is higher than the fifth threshold value, the degree of interest can be determined with simple processing. Likewise, since the degree of interest is determined to be lower than the reference value when the operation frequency of the unrelated switches is higher than the sixth threshold value, the degree of interest can be determined with simple processing.
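The eye-movement and switch-operation criteria above are all simple threshold comparisons and can be combined as in the sketch below. The threshold symbols follow the description (F1th, PEth, X2th, X1th, and the fifth and sixth frequency thresholds), but every numeric default here is a placeholder assumption, not a value from the patent.

```python
def interest_above_reference(F1, PE, X2, X1, related_freq, unrelated_freq,
                             F1th=60.0, PEth=10.0, X2th=80.0, X1th=200.0,
                             fifth_th=3, sixth_th=3):
    """Return True if any 'high interest' criterion fires, False if the
    'low interest' criterion fires, or None to suspend the determination
    (cf. the NO branch of step S53)."""
    if (F1 > F1th              # peripheral gaze degree above first threshold
            or PE < PEth       # eye-closing ratio below second threshold
            or X2 > X2th       # eye-opening degree above third threshold
            or X1 > X1th       # eye-closing speed above fourth threshold
            or related_freq > fifth_th):    # related switches used often
        return True
    if unrelated_freq > sixth_th:           # unrelated switches used often
        return False
    return None
```

Each comparison mirrors one of the determinations listed above; a real implementation would weight or sequence them rather than simply OR-ing.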
- The degree of interest is also determined from the occupant's conversation. Specifically, the amount of consciousness of the occupant analyzed by the conversation determination unit 16 is estimated, and when the amount of consciousness is large, the degree of interest is determined to be higher than the reference value. Accordingly, the degree of interest can be determined with high accuracy based on the content of the occupant's conversation.
- These determinations may also be combined to determine the interest level.
- For example, the interest level determination by switch operation may be performed first, and the interest level determination based on the content of the conversation may be executed only when the interest level is determined to be lower than the reference value.
- In that case, the amount of consciousness of the occupant analyzed by the conversation determination unit 16 is estimated, and when the amount of consciousness is large, it can be determined that the degree of interest is high and the process of step S13 can be executed.
- In the embodiment described above, the driver was described as an example of the occupant.
- However, the present invention is not limited to the driver; it is also possible to execute automatic driving control using the driving characteristics of passengers other than the driver.
- In the modified example, when an occupant of the host vehicle is gazing at another vehicle as the object, it is determined that the degree of interest in that other vehicle is higher than the reference value. The driving characteristics of the other vehicle are then extracted, and the host vehicle is controlled based on those characteristics. That is, when the occupant's degree of interest is higher than the reference value, the driving characteristics of the object in which the occupant is interested are detected, and the automatic driving vehicle is controlled based on them.
- The other vehicle being watched by the occupant is identified, and its driving characteristics, such as its traveling speed and its distance to its preceding vehicle, are detected. The host vehicle is then controlled to match the driving characteristics of that other vehicle. As a result, when there is another vehicle the occupant regards as a model and the occupant is gazing at it, the host vehicle is controlled to match that vehicle's driving characteristics, making it possible to control the host vehicle with driving characteristics that suit the occupant's preference.
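The behavior described here, adopting the driving characteristics of a gazed-at vehicle, can be sketched as follows. The data shapes and field names are assumptions introduced purely for illustration.

```python
def adopt_gazed_vehicle_characteristics(host, gazed_vehicle, interest_high):
    """When the occupant's interest in another vehicle exceeds the reference,
    return a copy of the host vehicle's targets updated to match that
    vehicle's observed behavior; otherwise return the targets unchanged."""
    if interest_high:
        # dict(mapping, **overrides) builds an updated copy, leaving the
        # original host targets untouched.
        host = dict(host,
                    target_speed=gazed_vehicle["speed"],
                    target_gap=gazed_vehicle["gap_to_preceding"])
    return host

host = {"target_speed": 80.0, "target_gap": 50.0}
other = {"speed": 92.0, "gap_to_preceding": 35.0}
updated = adopt_gazed_vehicle_characteristics(host, other, interest_high=True)
```

When the interest determination is negative the original targets pass through unchanged, which matches the fallback behavior of the main embodiment.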
- The object that the occupant is watching is not limited to other automobiles; the host vehicle may be controlled with reference to any moving object, such as a motorcycle, a bicycle, or a pedestrian.
- Sensing may also be performed along the direction in which the line of sight extends to identify a stationary object and its characteristics.
- In the embodiment described above, when the degree of interest is high, automatic driving is performed based on the driving characteristics at the time of manual driving.
- Alternatively, the measured degree of interest may be used to adjust the driving characteristics of automatic driving. For example, when the degree of interest is higher than the reference value, the driving characteristics may be set between the driving characteristics during manual driving and the driving characteristics according to the surrounding conditions, weighted more toward the manual driving characteristics as the degree of interest increases. The degree of interest itself may also be graded: for example, even when it is above the reference value, the greater the movement of the occupant's line of sight, the higher the degree of interest may be judged to be.
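The graded adjustment suggested above amounts to interpolating a driving parameter between the surroundings-based value and the manual-driving value as the degree of interest rises. A minimal sketch, assuming (as an illustration only) that the degree of interest has been normalized to the range [0, 1] with the reference value at 0:

```python
def blend_characteristics(interest, manual_value, surroundings_value):
    """Linearly interpolate a driving parameter (e.g. a target speed)
    between the surroundings-based value (interest = 0.0) and the
    occupant's manual-driving value (interest = 1.0). The normalization
    of 'interest' is an assumption of this sketch."""
    interest = max(0.0, min(1.0, interest))  # clamp to [0, 1]
    return surroundings_value + interest * (manual_value - surroundings_value)
```

Linear blending is only one choice; any monotone weighting toward the manual characteristics as interest grows would realize the same idea.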
Abstract
Description
Each function described in this embodiment may be implemented by one or more processing circuits. A processing circuit includes a processing device that contains an electric circuit. A processing device also includes devices such as application-specific integrated circuits (ASICs) and conventional circuit components arranged to execute the functions described in the embodiment.
The interest level detection unit 1 detects the degree of interest of an occupant of the host vehicle (for example, the driver) in the current traveling situation of the host vehicle, and determines whether the degree of interest is higher than a reference value. The interest level detection unit 1 includes an eyeball state detection unit 2 that detects the movement of the occupant's eyeballs, a switch operation detection unit 3 that detects the operation frequency of various switches mounted on the vehicle, and a conversation determination unit 16 that analyzes the occupant's conversation. The detection results of the eyeball state detection unit 2, the switch operation detection unit 3, and the conversation determination unit 16 are output to the interest level determination unit 12 of the driving characteristic switching unit 11.
FIG. 2 is a block diagram showing the configuration of the eyeball state detection unit 2. As shown in FIG. 2, the eyeball state detection unit 2 includes an infrared illumination 21 that irradiates infrared light toward the occupant's eyeball 18, an infrared camera 20 that captures the infrared light reflected by the pupil 19 of the eyeball 18, and an illumination/camera controller 23 that controls the infrared illumination 21 and the infrared camera 20.
The digital filter 29 filters the images captured by the infrared camera 20 and the images captured by the exterior camera 17.
F1=Ta/(Ta+Tb)*100 …(1)
In equation (1), Ta is the gaze time on the object within a fixed period, and Tb is the non-gaze time within the same period.
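Equation (1) can be computed directly from the accumulated gaze and non-gaze times. A minimal sketch; the function name and the error handling are assumptions of this illustration:

```python
def peripheral_gaze_degree(gaze_time, non_gaze_time):
    """F1 = Ta / (Ta + Tb) * 100, equation (1): the percentage of a fixed
    observation window spent gazing at the object (Ta) rather than not (Tb).
    Both times are in the same unit (e.g. seconds)."""
    total = gaze_time + non_gaze_time
    if total == 0:
        raise ValueError("observation window must be non-zero")
    return gaze_time / total * 100.0
```

F1 is then compared against the first threshold value F1th when determining the degree of interest.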
PE=(T2/T1)*100[%] …(2)
In equation (2), T1 is the blink interval and T2 is the eye-closing time within that interval. In addition, the elapsed time from the moment of maximum eye opening (for example, t1) to the moment of eye closing (for example, t2) is measured, and the eye-closing speed X1 [%/sec] is computed. Further, the eye-opening degree X2 [%] at the time of eye opening is computed.
The eye-closing speed X1, the eye-opening degree X2 at the time of eye opening, and the opening/closing behavior feature amount PE computed by equation (2) are then output as blink parameters to the interest level determination unit 12 shown in FIG. 1.
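The three blink parameters can be illustrated with a small helper. The inputs below (timestamps of maximum opening and of closing, the measured eye-closing time, the blink interval, and the maximum opening degree) are an assumed measurement format, not the patent's; treating the opening degree at eye opening as the measured maximum is also a simplification of this sketch.

```python
def blink_parameters(t1, t2, closed_time, blink_interval, max_opening):
    """Compute the blink parameters of the description:
    PE: equation (2), PE = (T2 / T1) * 100, the eye-closing time T2 as a
        percentage of the blink interval T1;
    X1: eye-closing speed [%/sec], the drop from the maximum opening degree
        to fully closed over the time from t1 (max opening) to t2 (closed);
    X2: eye-opening degree [%] at the time of eye opening (simplified here
        to the measured maximum opening degree)."""
    PE = closed_time / blink_interval * 100.0
    X1 = max_opening / (t2 - t1)
    X2 = max_opening
    return PE, X1, X2
```

All three values are then compared against their respective thresholds (PEth, X1th, X2th) by the interest level determination unit.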
Next, the switch operation detection unit 3 will be described. The switch operation detection unit 3 detects operations of various switches mounted on the vehicle and outputs the detection data to the interest level determination unit 12 shown in FIG. 1. The various switches mounted on the vehicle are classified into switches related to vehicle travel (hereinafter, "related switches") and switches unrelated to vehicle travel (hereinafter, "unrelated switches").
Next, the conversation determination unit 16 will be described. As shown in FIG. 9, the conversation determination unit 16 includes a microphone 42 that detects voice, a speaker 43, an information presentation unit 44 that presents various kinds of information to the occupant, and an analysis unit 45 that analyzes the occupant's conversation. To distinguish the occupant's voice from other voices, the conversation determination unit 16 recognizes the occupant's voice using voice data of the occupant registered in advance. Conversation includes not only conversation between the occupant and other occupants but also conversation between the occupant and the vehicle. For the occupant's conversation, the degree of interest may be detected by analyzing the voice itself, such as the occupant's speech rate and voice volume. For example, when the occupant's speech rate is fast, it may be judged that the occupant is concentrating on the conversation rather than on driving, and the degree of interest may be determined to be low. Also, for example, when the occupant's voice is quiet, there is a high possibility that the occupant is talking to himself or herself; treating this as a state of not concentrating on conversation, the degree of interest may be determined to be high. For conversation between the occupant and the vehicle, the information presentation unit 44 can pose various kinds of conversation (daily conversation or quizzes) to the occupant through the speaker 43. For example, questions such as "What is the speed limit of this road?" or "What color is the preceding vehicle?" are posed. The microphone 42 then detects the occupant's utterance (voice), and the analysis unit 45 recognizes and analyzes the occupant's utterance (voice) in response to this conversation.
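The voice-based criteria described above (fast speech suggesting concentration on conversation rather than driving, quiet speech suggesting self-talk) can be sketched as a simple rule. The units and threshold values below are assumptions introduced for illustration only.

```python
def interest_from_speech(speech_rate, volume,
                         fast_rate=6.0, quiet_volume=45.0):
    """speech_rate in syllables/sec, volume in dB (assumed units).
    Fast conversation: occupant is focused on talking, interest judged low.
    Quiet speech: likely self-talk, interest judged high.
    Otherwise the speech alone is inconclusive (None)."""
    if speech_rate > fast_rate:
        return "low"
    if volume < quiet_volume:
        return "high"
    return None
```

An inconclusive result would be combined with the eyeball and switch-operation determinations described elsewhere in the embodiment.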
Next, the traveling situation detection unit 14 shown in FIG. 1 will be described. The traveling situation detection unit 14 includes a host vehicle status detection unit 6 that detects the traveling status of the host vehicle, and a surrounding situation detection unit 9 that detects the situation around the host vehicle.
As shown in FIG. 11, the surrounding situation detection unit 9 includes a vehicle distance detection unit 35, a non-vehicle detection unit 36, a surrounding vehicle type detection unit 37, a lane detection unit 38, a road type detection unit 39, and a traffic information detection unit 40.
The surrounding vehicle type detection unit 37 detects vehicles around the host vehicle from images captured by the camera, and detects the type of each detected vehicle, for example, passenger cars, trucks, buses, and motorcycles. The lane detection unit 38 detects the lanes of the road from the images captured by the camera.
Next, the individually adapted driving characteristic determination unit 4 shown in FIG. 1 will be described. The individually adapted driving characteristic determination unit 4 includes a manual driving characteristic learning unit 7 that learns the occupant's driving characteristics during manual driving of the host vehicle, and a manual driving characteristic database 5 that stores the manual driving characteristics.
For this traveling situation, the type of traveling situation, the positional relationship with other vehicles ahead and behind, road information (the speed limit), and the current traveling information of the vehicle (for example, the traveling speed) are acquired by the method described above.
Next, the automatic driving characteristic setting unit 8 shown in FIG. 1 will be described. As described later, the automatic driving characteristic setting unit 8 sets the driving characteristics selected when the occupant's degree of interest in the traveling situation is low. This will be described in detail below with reference to FIGS. 17 to 20.
By setting the gap time Δs in this way, the decision to turn right can be made accurately, so congestion at intersections can be mitigated.
Next, the driving characteristic switching unit 11 shown in FIG. 1 will be described. The driving characteristic switching unit 11 includes the interest level determination unit 12 and the driving characteristic setting unit 13.
Further, as described above, the occupant's amount of consciousness toward driving analyzed by the conversation determination unit 16 is estimated, and when the amount of consciousness is higher than a preset seventh threshold value, the degree of interest in the traveling situation is determined to be higher than the reference value.
Next, the processing procedure of the control device for the automatic driving vehicle according to this embodiment will be described with reference to the flowcharts shown in FIGS. 23, 24, and 25. FIG. 23 shows the overall processing procedure, and FIGS. 24 and 25 show the detailed processing of S13 in FIG. 23.
In step S33, the interest level determination unit 12 determines, based on the peripheral gaze parameters, whether the peripheral gaze degree F1 is higher than the first threshold value F1th.
When it is determined that the peripheral gaze degree F1 is higher than the first threshold value F1th (YES in step S33), it is determined in step S37 that the occupant's degree of interest in the traveling situation is higher than the reference value.
First, in step S51, the interest level determination unit 12 acquires frequency information on the various switch operations output from the switch operation detection unit 3.
When the operation frequency of the unrelated switches is higher than the sixth threshold value (YES in step S53), the interest level determination unit 12 determines in step S55 that the occupant's degree of interest in the traveling situation is lower than the reference value.
In step S16, either the driving characteristic set in the process of step S14 or the driving characteristic set in the process of step S15 is selected.
Further, when the ratio of eye-closing time to blink interval (opening/closing behavior feature amount PE) is lower than the second threshold value PEth, the degree of interest is determined to be higher than the reference value, so the degree of interest can be determined with high accuracy.
Further, when the eye-opening degree X2 at the time of eye opening is higher than the third threshold value X2th, the degree of interest is determined to be higher than the reference value, so the degree of interest can be determined with high accuracy.
Also, when the eye-closing speed X1 is higher than the fourth threshold value X1th, the degree of interest is determined to be higher than the reference value, so the degree of interest can be determined with high accuracy.
Also, when the operation frequency of the unrelated switches is higher than the sixth threshold value, the degree of interest is determined to be lower than the reference value, so the degree of interest can be determined with simple processing.
In the embodiment described above, the driver was described as an example of the occupant, but the present invention is not limited to the driver; automatic driving control can also be executed using the driving characteristics of passengers other than the driver.
Next, a modified example of the above embodiment will be described. In the embodiment described above, when the occupant's degree of interest in the traveling situation is determined to be higher than the reference value, driving characteristics suited to the occupant's preference are extracted from the driving characteristics stored in the manual driving characteristic database 5 and automatic driving is performed with them.
Next, modification 2 of the embodiment will be described. In modification 2, when an occupant of the host vehicle is gazing at a stationary object around the host vehicle as the object, the degree of interest is judged according to the characteristics of the stationary object, and the host vehicle is controlled according to the degree of interest. For example, when the occupant is gazing at a road sign as the stationary object, the degree of interest is judged to be high. Conversely, when the occupant is gazing at scenery such as mountains or the sky as the stationary object, the degree of interest is judged to be low. As a method of detecting which stationary object outside the vehicle the occupant is gazing at, signs and geographic data may be used to identify the stationary object lying along the extension of the line of sight and its characteristics. Alternatively, sensing may be performed in the direction in which the line of sight extends to identify the stationary object and its characteristics.
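Modification 2 maps the category of the gazed-at stationary object to an interest judgment. A minimal lookup sketch; the category labels are assumptions introduced for this illustration:

```python
def interest_from_stationary_object(category):
    """Road infrastructure (e.g. a road sign) implies interest in driving;
    scenery (mountains, sky) implies the occupant's interest lies elsewhere.
    Unknown categories suspend the determination (None)."""
    high_interest = {"road_sign", "traffic_light", "road_marking"}
    low_interest = {"mountain", "sky", "scenery"}
    if category in high_interest:
        return "high"
    if category in low_interest:
        return "low"
    return None
```

The category itself would come from matching the line-of-sight direction against sign and geographic data, or from sensing along the line of sight, as the text describes.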
2 Eyeball state detection unit
3 Switch operation detection unit
4 Individually adapted driving characteristic determination unit
5 Manual driving characteristic database
6 Host vehicle status detection unit
7 Manual driving characteristic learning unit
8 Automatic driving characteristic setting unit
9 Surrounding situation detection unit
11 Driving characteristic switching unit
12 Interest level determination unit
13 Driving characteristic setting unit
14 Traveling situation detection unit
16 Conversation determination unit
17 Exterior camera
18 Eyeball
19 Pupil
20 Infrared camera
21 Infrared illumination
22 Image processing unit
23 Illumination/camera controller
25 Lens
26 Visible light blocking filter
28 Infrared image sensor
29 Digital filter
30 Image processing GPU
31 Parameter extraction unit
32 Vehicle speed sensor
33 Acceleration sensor
34 Steering angle sensor
35 Vehicle distance detection unit
36 Non-vehicle detection unit
37 Surrounding vehicle type detection unit
38 Lane detection unit
39 Road type detection unit
40 Traffic information detection unit
42 Microphone
43 Speaker
44 Information presentation unit
45 Analysis unit
Claims (13)
- A control method of an automatic driving vehicle for controlling the automatic driving vehicle, the method comprising:
detecting a degree of interest of an occupant in a traveling situation of the automatic driving vehicle; and
controlling the automatic driving vehicle based on driving characteristics corresponding to the degree of interest. - The control method of an automatic driving vehicle according to claim 1, further comprising:
detecting surrounding conditions of the automatic driving vehicle; and
controlling the automatic driving vehicle based on driving characteristics corresponding to the surrounding conditions when the degree of interest is lower than a reference value. - The control method of an automatic driving vehicle according to claim 1 or 2, further comprising:
learning driving characteristics of the occupant during manual driving; and
controlling the automatic driving vehicle based on the driving characteristics during manual driving when the degree of interest is higher than a reference value. - The control method of an automatic driving vehicle according to any one of claims 1 to 3, further comprising:
detecting, when the degree of interest is higher than a reference value, characteristics of an object in which the occupant shows interest; and
controlling the automatic driving vehicle based on the characteristics of the object. - The control method of an automatic driving vehicle according to any one of claims 1 to 4,
wherein movement of the occupant's eyeballs is detected, and the degree of interest is determined to be higher than a reference value when a peripheral gaze degree is higher than a first threshold value. - The control method of an automatic driving vehicle according to any one of claims 1 to 5,
wherein movement of the occupant's eyeballs is detected, and the degree of interest is determined to be higher than a reference value when a ratio of eye-closing time to blink interval is lower than a second threshold value. - The control method of an automatic driving vehicle according to any one of claims 1 to 6,
wherein movement of the occupant's eyeballs is detected, and the degree of interest is determined to be higher than a reference value when a degree of eye opening at the time of eye opening is higher than a third threshold value. - The control method of an automatic driving vehicle according to any one of claims 1 to 7,
wherein movement of the occupant's eyeballs is detected, and the degree of interest is determined to be higher than a reference value when an eye-closing speed is higher than a fourth threshold value. - The control method of an automatic driving vehicle according to any one of claims 1 to 8, further comprising:
detecting switch operations by the occupant; and
determining that the degree of interest is higher than a reference value when an operation frequency of switches related to travel of the automatic driving vehicle is higher than a fifth threshold value. - The control method of an automatic driving vehicle according to any one of claims 1 to 9, further comprising:
detecting switch operations by the occupant; and
determining that the degree of interest is lower than a reference value when an operation frequency of switches unrelated to travel of the automatic driving vehicle is higher than a sixth threshold value. - The control method of an automatic driving vehicle according to any one of claims 1 to 10, further comprising:
detecting conversation of the occupant, and determining whether the degree of interest is higher than a reference value based on content of the conversation. - The control method of an automatic driving vehicle according to any one of claims 1 to 11,
wherein the occupant is a driver. - A control device of an automatic driving vehicle for controlling the automatic driving vehicle,
the control device being configured to detect a degree of interest of an occupant in a traveling situation of the automatic driving vehicle and to set driving characteristics of the automatic driving vehicle based on driving characteristics corresponding to the degree of interest.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MX2019001528A MX2019001528A (es) | 2016-08-09 | 2016-08-09 | Metodo de control y dispositivo de control de vehiculo de conduccion automatica. |
CN201680088216.5A CN109562763B (zh) | 2016-08-09 | 2016-08-09 | 自动驾驶车辆的控制方法及控制装置 |
KR1020197006601A KR102101867B1 (ko) | 2016-08-09 | 2016-08-09 | 자동 운전 차량의 제어 방법 및 제어 장치 |
JP2018533346A JP6690715B2 (ja) | 2016-08-09 | 2016-08-09 | 自動運転車両の制御方法及び制御装置 |
CA3033463A CA3033463A1 (en) | 2016-08-09 | 2016-08-09 | Control method and control device of automatic driving vehicle |
EP16912670.3A EP3498558B1 (en) | 2016-08-09 | 2016-08-09 | Control method and control device of automatic driving vehicle |
RU2019104753A RU2704053C1 (ru) | 2016-08-09 | 2016-08-09 | Способ управления (варианты) и устройство управления для автоматически управляемого транспортного средства |
US16/324,011 US10671071B2 (en) | 2016-08-09 | 2016-08-09 | Control method and control device of automatic driving vehicle |
BR112019002648-7A BR112019002648B1 (pt) | 2016-08-09 | 2016-08-09 | Método de controle e dispositivo de controle de veículo de condução automática |
PCT/JP2016/073471 WO2018029789A1 (ja) | 2016-08-09 | 2016-08-09 | 自動運転車両の制御方法及び制御装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/073471 WO2018029789A1 (ja) | 2016-08-09 | 2016-08-09 | 自動運転車両の制御方法及び制御装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018029789A1 true WO2018029789A1 (ja) | 2018-02-15 |
Family
ID=61161846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/073471 WO2018029789A1 (ja) | 2016-08-09 | 2016-08-09 | 自動運転車両の制御方法及び制御装置 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10671071B2 (ja) |
EP (1) | EP3498558B1 (ja) |
JP (1) | JP6690715B2 (ja) |
KR (1) | KR102101867B1 (ja) |
CN (1) | CN109562763B (ja) |
BR (1) | BR112019002648B1 (ja) |
CA (1) | CA3033463A1 (ja) |
MX (1) | MX2019001528A (ja) |
RU (1) | RU2704053C1 (ja) |
WO (1) | WO2018029789A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110171420A (zh) * | 2018-02-16 | 2019-08-27 | 本田技研工业株式会社 | 车辆控制装置 |
JP2020121619A (ja) * | 2019-01-30 | 2020-08-13 | 三菱電機株式会社 | 車両制御装置および車両制御方法 |
EP3767621A4 (en) * | 2018-03-14 | 2022-03-30 | Clarion Co., Ltd. | ON-BOARD DEVICE, MOVEMENT STATE ESTIMATION METHOD, SERVER DEVICE, INFORMATION PROCESSING METHOD AND MOVEMENT STATE ESTIMATION SYSTEM |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107207013B (zh) | 2014-12-12 | 2020-01-21 | 索尼公司 | 自动驾驶控制设备以及自动驾驶控制方法和程序 |
JP6686869B2 (ja) * | 2016-12-22 | 2020-04-22 | 株式会社デンソー | 運転交代制御装置、及び運転交代制御方法 |
DE102017211931B4 (de) * | 2017-07-12 | 2022-12-29 | Volkswagen Aktiengesellschaft | Verfahren zum Anpassen zumindest eines Betriebsparameters eines Kraftfahrzeugs, System zum Anpassen zumindest eines Betriebsparameters eines Kraftfahrzeugs und Kraftfahrzeug |
EP3732085A4 (en) * | 2017-12-27 | 2021-08-11 | Bayerische Motoren Werke Aktiengesellschaft | VEHICLE LINE CHANGE PREDICTION |
GB2587135B (en) * | 2018-03-23 | 2022-09-14 | Jaguar Land Rover Ltd | Vehicle controller and control method |
KR20210034843A (ko) * | 2019-09-23 | 2021-03-31 | 삼성전자주식회사 | 차량의 제어 장치 및 방법 |
US20210101625A1 (en) * | 2019-10-08 | 2021-04-08 | Motional Ad Llc | Navigating multi-way stop intersections with an autonomous vehicle |
CN111361567B (zh) * | 2020-02-25 | 2022-02-15 | 南京领行科技股份有限公司 | 一种车辆驾驶中应急处理的方法及设备 |
US11535253B2 (en) * | 2020-09-18 | 2022-12-27 | GM Global Technology Operations LLC | Lane change maneuver intention detection systems and methods |
JP7125969B2 (ja) * | 2020-10-28 | 2022-08-25 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
JP7363757B2 (ja) * | 2020-12-22 | 2023-10-18 | トヨタ自動車株式会社 | 自動運転装置及び自動運転方法 |
CN116034066B (zh) * | 2020-12-28 | 2024-05-14 | 本田技研工业株式会社 | 车辆控制装置、车辆控制方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09249104A (ja) * | 1996-03-18 | 1997-09-22 | Nissan Motor Co Ltd | 車両用自動ブレーキ |
JP2008174092A (ja) * | 2007-01-18 | 2008-07-31 | Aisin Seiki Co Ltd | 速度制御装置 |
JP2010111260A (ja) * | 2008-11-06 | 2010-05-20 | Honda Motor Co Ltd | 車速制御装置 |
WO2010134396A1 (ja) * | 2009-05-21 | 2010-11-25 | 日産自動車株式会社 | 運転支援装置、及び運転支援方法 |
JP2015089801A (ja) * | 2013-11-07 | 2015-05-11 | 株式会社デンソー | 運転制御装置 |
WO2015151243A1 (ja) * | 2014-04-02 | 2015-10-08 | 日産自動車株式会社 | 車両用情報呈示装置 |
Family Cites Families (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06150199A (ja) * | 1992-11-13 | 1994-05-31 | Mitsubishi Electric Corp | 車両予防安全装置 |
US20030006888A1 (en) * | 2001-07-06 | 2003-01-09 | Burchette Robert L. | Vehicle-mounted mirror-integrated radar system |
BRPI0411056A (pt) | 2003-06-06 | 2007-04-17 | Volvo Technology Corp | método e disposição para controlar subsistemas veiculares baseados na atividade interpretativa do condutor |
JP4797588B2 (ja) * | 2005-11-17 | 2011-10-19 | アイシン精機株式会社 | 車両周辺表示装置 |
JP4716371B2 (ja) * | 2006-05-19 | 2011-07-06 | 富士通株式会社 | 移動体用運転支援装置 |
JP4946374B2 (ja) | 2006-11-13 | 2012-06-06 | トヨタ自動車株式会社 | 自動運転車両 |
US8068968B2 (en) * | 2007-02-06 | 2011-11-29 | Denso Corporation | Vehicle travel control system |
JP4333797B2 (ja) * | 2007-02-06 | 2009-09-16 | 株式会社デンソー | 車両用制御装置 |
JP2008197916A (ja) | 2007-02-13 | 2008-08-28 | Toyota Motor Corp | 車両用居眠り運転防止装置 |
US8301343B2 (en) * | 2007-05-02 | 2012-10-30 | Toyota Jidosha Kabushiki Kaisha | Vehicle behavior control device |
RU2413632C2 (ru) * | 2009-04-10 | 2011-03-10 | Учреждение Российской академии наук Центр информационных технологий в проектировании РАН | Способ предупреждения засыпания водителя транспортного средства |
US8552850B2 (en) * | 2010-02-17 | 2013-10-08 | Honeywell International Inc. | Near-to-eye tracking for adaptive operation |
JP5488105B2 (ja) * | 2010-03-26 | 2014-05-14 | マツダ株式会社 | 車両の運転支援装置 |
JP2011204053A (ja) * | 2010-03-26 | 2011-10-13 | Mazda Motor Corp | 車両の運転支援装置 |
US9298985B2 (en) * | 2011-05-16 | 2016-03-29 | Wesley W. O. Krueger | Physiological biosensor system and method for controlling a vehicle or powered equipment |
KR20120058230A (ko) * | 2010-11-29 | 2012-06-07 | 한국전자통신연구원 | 이동체 안전운행 장치 및 방법 |
DE102012214852B4 (de) * | 2012-08-21 | 2024-01-18 | Robert Bosch Gmbh | Verfahren und Vorrichtung zum Selektieren von Objekten in einem Umfeld eines Fahrzeugs |
JP5942761B2 (ja) | 2012-10-03 | 2016-06-29 | トヨタ自動車株式会社 | 運転支援装置および運転支援方法 |
US9751534B2 (en) * | 2013-03-15 | 2017-09-05 | Honda Motor Co., Ltd. | System and method for responding to driver state |
JP5999032B2 (ja) * | 2013-06-14 | 2016-09-28 | 株式会社デンソー | 車載表示装置およびプログラム |
US20150054951A1 (en) * | 2013-08-22 | 2015-02-26 | Empire Technology Development, Llc | Influence of line of sight for driver safety |
US9090260B2 (en) * | 2013-12-04 | 2015-07-28 | Mobileye Vision Technologies Ltd. | Image-based velocity control for a turning vehicle |
KR20150076627A (ko) * | 2013-12-27 | 2015-07-07 | 한국전자통신연구원 | 차량 운전 학습 시스템 및 방법 |
US9340207B2 (en) | 2014-01-16 | 2016-05-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Lateral maneuver planner for automated driving system |
KR101555444B1 (ko) * | 2014-07-10 | 2015-10-06 | 현대모비스 주식회사 | 차량탑재 상황감지 장치 및 그 방법 |
JP6135618B2 (ja) * | 2014-08-08 | 2017-05-31 | トヨタ自動車株式会社 | 車両制御装置 |
JP2016084092A (ja) | 2014-10-28 | 2016-05-19 | 富士重工業株式会社 | 車両の走行制御装置 |
GB2532457B (en) * | 2014-11-19 | 2018-04-18 | Jaguar Land Rover Ltd | Dynamic control apparatus and related method |
KR102263725B1 (ko) * | 2014-11-21 | 2021-06-11 | 현대모비스 주식회사 | 주행 정보 제공 장치 및 주행 정보 제공 방법 |
DE102014225383A1 (de) * | 2014-12-10 | 2016-06-16 | Robert Bosch Gmbh | Verfahren zum Betreiben eines Kraftfahrzeugs, Kraftfahrzeug |
JP6323318B2 (ja) * | 2014-12-12 | 2018-05-16 | ソニー株式会社 | 車両制御装置および車両制御方法、並びにプログラム |
CN107207013B (zh) * | 2014-12-12 | 2020-01-21 | 索尼公司 | 自动驾驶控制设备以及自动驾驶控制方法和程序 |
US10705521B2 (en) * | 2014-12-30 | 2020-07-07 | Visteon Global Technologies, Inc. | Autonomous driving interface |
KR101659034B1 (ko) * | 2015-01-20 | 2016-09-23 | 엘지전자 주식회사 | 차량의 주행 모드 전환 장치 및 그 방법 |
US9666079B2 (en) * | 2015-08-20 | 2017-05-30 | Harman International Industries, Incorporated | Systems and methods for driver assistance |
US10234859B2 (en) * | 2015-08-20 | 2019-03-19 | Harman International Industries, Incorporated | Systems and methods for driver assistance |
DE102015220237A1 (de) * | 2015-10-16 | 2017-04-20 | Zf Friedrichshafen Ag | Fahrzeugsystem und Verfahren zur Aktivierung einer Selbstfahreinheit zum autonomen Fahren |
EP3372466B1 (en) * | 2015-11-04 | 2020-06-10 | Nissan Motor Co., Ltd. | Autonomous vehicle operating apparatus and autonomous vehicle operating method |
JP6369487B2 (ja) * | 2016-02-23 | 2018-08-08 | トヨタ自動車株式会社 | 表示装置 |
JP6460008B2 (ja) * | 2016-02-25 | 2019-01-30 | トヨタ自動車株式会社 | 自動運転装置 |
US20170277182A1 (en) * | 2016-03-24 | 2017-09-28 | Magna Electronics Inc. | Control system for selective autonomous vehicle control |
US20170285641A1 (en) * | 2016-04-01 | 2017-10-05 | GM Global Technology Operations LLC | Systems and processes for selecting contextual modes for use with autonomous, semi-autonomous, and manual-driving vehicle operations |
US10317900B2 (en) * | 2016-05-13 | 2019-06-11 | GM Global Technology Operations LLC | Controlling autonomous-vehicle functions and output based on occupant position and attention |
US9956963B2 (en) * | 2016-06-08 | 2018-05-01 | GM Global Technology Operations LLC | Apparatus for assessing, predicting, and responding to driver fatigue and drowsiness levels |
KR20170141484A (ko) * | 2016-06-15 | 2017-12-26 | 엘지전자 주식회사 | 차량용 제어장치 및 그것의 제어방법 |
JP6778872B2 (ja) * | 2016-06-28 | 2020-11-04 | パナソニックIpマネジメント株式会社 | 運転支援装置及び運転支援方法 |
JP6737057B2 (ja) * | 2016-08-10 | 2020-08-05 | 富士通株式会社 | 視線検出装置、視線検出方法、及び視線検出プログラム |
DE102017208971A1 (de) * | 2017-05-29 | 2018-11-29 | Volkswagen Aktiengesellschaft | Verfahren und vorrichtung zur unterstützung eines in einem fahrzeug befindlichen fahrzeuginsassen |
EP3474253B1 (en) * | 2017-10-20 | 2024-04-10 | Honda Research Institute Europe GmbH | Gaze-guided communication for assistance in mobility |
US10836403B2 (en) * | 2017-12-04 | 2020-11-17 | Lear Corporation | Distractedness sensing system |
US10710590B2 (en) * | 2017-12-19 | 2020-07-14 | PlusAI Corp | Method and system for risk based driving mode switching in hybrid driving |
2016
- 2016-08-09 RU RU2019104753A patent/RU2704053C1/ru active
- 2016-08-09 EP EP16912670.3A patent/EP3498558B1/en active Active
- 2016-08-09 BR BR112019002648-7A patent/BR112019002648B1/pt active IP Right Grant
- 2016-08-09 US US16/324,011 patent/US10671071B2/en active Active
- 2016-08-09 WO PCT/JP2016/073471 patent/WO2018029789A1/ja unknown
- 2016-08-09 JP JP2018533346A patent/JP6690715B2/ja active Active
- 2016-08-09 KR KR1020197006601A patent/KR102101867B1/ko active IP Right Grant
- 2016-08-09 CN CN201680088216.5A patent/CN109562763B/zh active Active
- 2016-08-09 CA CA3033463A patent/CA3033463A1/en not_active Abandoned
- 2016-08-09 MX MX2019001528A patent/MX2019001528A/es unknown
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09249104A (ja) * | 1996-03-18 | 1997-09-22 | Nissan Motor Co Ltd | 車両用自動ブレーキ |
JP2008174092A (ja) * | 2007-01-18 | 2008-07-31 | Aisin Seiki Co Ltd | 速度制御装置 |
JP2010111260A (ja) * | 2008-11-06 | 2010-05-20 | Honda Motor Co Ltd | 車速制御装置 |
WO2010134396A1 (ja) * | 2009-05-21 | 2010-11-25 | 日産自動車株式会社 | 運転支援装置、及び運転支援方法 |
JP2015089801A (ja) * | 2013-11-07 | 2015-05-11 | 株式会社デンソー | 運転制御装置 |
WO2015151243A1 (ja) * | 2014-04-02 | 2015-10-08 | 日産自動車株式会社 | 車両用情報呈示装置 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110171420A (zh) * | 2018-02-16 | 2019-08-27 | 本田技研工业株式会社 | 车辆控制装置 |
JP2019144691A (ja) * | 2018-02-16 | 2019-08-29 | 本田技研工業株式会社 | 車両制御装置 |
EP3767621A4 (en) * | 2018-03-14 | 2022-03-30 | Clarion Co., Ltd. | ON-BOARD DEVICE, MOVEMENT STATE ESTIMATION METHOD, SERVER DEVICE, INFORMATION PROCESSING METHOD AND MOVEMENT STATE ESTIMATION SYSTEM |
US11498576B2 (en) | 2018-03-14 | 2022-11-15 | Clarion Co., Ltd. | Onboard device, traveling state estimation method, server device, information processing method, and traveling state estimation system |
JP2020121619A (ja) * | 2019-01-30 | 2020-08-13 | 三菱電機株式会社 | 車両制御装置および車両制御方法 |
Also Published As
Publication number | Publication date |
---|---|
US10671071B2 (en) | 2020-06-02 |
CA3033463A1 (en) | 2018-02-15 |
EP3498558A4 (en) | 2019-12-18 |
KR102101867B1 (ko) | 2020-04-20 |
KR20190034331A (ko) | 2019-04-01 |
BR112019002648B1 (pt) | 2023-01-17 |
JPWO2018029789A1 (ja) | 2019-07-11 |
MX2019001528A (es) | 2019-07-04 |
US20190171211A1 (en) | 2019-06-06 |
BR112019002648A2 (pt) | 2019-05-28 |
EP3498558B1 (en) | 2021-10-20 |
EP3498558A1 (en) | 2019-06-19 |
CN109562763B (zh) | 2021-05-04 |
JP6690715B2 (ja) | 2020-04-28 |
CN109562763A (zh) | 2019-04-02 |
RU2704053C1 (ru) | 2019-10-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018029789A1 (ja) | 自動運転車両の制御方法及び制御装置 | |
CN109562758B (zh) | 自动驾驶车辆的控制方法及控制装置 | |
US10800428B2 (en) | Vehicle driving assistance method and vehicle | |
US9802622B2 (en) | Driver assistance apparatus and vehicle including the same | |
CN111295699B (zh) | 辅助方法以及利用该辅助方法的辅助系统、辅助装置 | |
US10713501B2 (en) | Focus system to enhance vehicle vision performance | |
US10994612B2 (en) | Agent system, agent control method, and storage medium | |
CA3033745A1 (en) | Vehicle control apparatus, vehicle control method, and movable object | |
US20190318746A1 (en) | Speech recognition device and speech recognition method | |
KR20180130433A (ko) | 차량 내에 위치하는 차량 탑승자를 지원하는 방법 및 장치 | |
CA3033738A1 (en) | Driving assistant apparatus, driving assistant method, moving object, and program | |
KR102331882B1 (ko) | 음성 인식 기반의 차량 제어 방법 및 장치 | |
JP6631569B2 (ja) | 運転状態判定装置、運転状態判定方法及び運転状態判定のためのプログラム | |
JP2017129973A (ja) | 運転支援装置および運転支援方法 | |
JP2019180075A (ja) | 運転支援システム、画像処理装置及び画像処理方法 | |
JP2017199205A (ja) | 車両の制御装置 | |
JP6701913B2 (ja) | 車両の制御装置 | |
US20200023863A1 (en) | Concentration degree determination device, concentration degree determination method, and program for determining concentration degree | |
JP6471707B2 (ja) | 運転教示装置 | |
CN112506353A (zh) | 车辆交互系统、方法、存储介质和车辆 | |
WO2018168046A1 (ja) | 集中度判定装置、集中度判定方法及び集中度判定のためのプログラム | |
US20220084518A1 (en) | Information Processing Device And Information Processing Method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16912670 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2018533346 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 3033463 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20197006601 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2016912670 Country of ref document: EP Effective date: 20190311 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112019002648 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112019002648 Country of ref document: BR Kind code of ref document: A2 Effective date: 20190208 |