CN111469849B - Vehicle control device and vehicle - Google Patents

Vehicle control device and vehicle

Info

Publication number
CN111469849B
CN111469849B (application CN202010073925.4A)
Authority
CN
China
Prior art keywords
vehicle
occupant
vehicle control
control device
driver
Prior art date
Legal status
Active
Application number
CN202010073925.4A
Other languages
Chinese (zh)
Other versions
CN111469849A (en)
Inventor
广濑峰史
池田雅也
八嶋淳平
石坂贤太郎
渡边崇
幸加木彻
八代胜也
西田大树
高田雄太
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of CN111469849A publication Critical patent/CN111469849A/en
Application granted granted Critical
Publication of CN111469849B publication Critical patent/CN111469849B/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18: Propelling the vehicle
    • B60W30/182: Selecting between different operative modes, e.g. comfort and performance modes
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00: Input parameters relating to occupants

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The invention provides a vehicle control device and a vehicle. The vehicle control device (10) causes the vehicle (12) to travel automatically and has a detection device (84) capable of detecting a gripping operation by an occupant on a fixed structure fixed at a position above the occupant's seat surface; when the gripping operation is performed while the vehicle (12) is in a specific travel state, the control of the vehicle (12) is changed. Accordingly, the system can respond not only to the driver but to all occupants: when a sense of unease felt by the driver or a passenger toward the vehicle's operation during automatic driving is detected, the automatic driving control can be changed, improving the convenience and commodity value of the automatic driving control. Moreover, because detection is performed through a fixed structure that can steady the easily shaken upper body of the driver or a passenger, the sense of unease toward the vehicle's operation can be grasped more reliably.

Description

Vehicle control device and vehicle
Technical Field
The present invention relates to a vehicle control device and a vehicle.
Background
The technical problem addressed by Japanese Patent Laid-Open No. 2016-215793 is to provide an automatic driving control device that can suppress switching from automatic driving to manual driving when the driver feels uneasy, thereby further improving the convenience of automatic driving control.
To solve this problem, the automatic driving control device described in Japanese Patent Laid-Open No. 2016-215793 includes: a detection unit that detects an operation performed by the driver; a determination unit that determines whether the driver's operation detected by the detection unit during automatic driving satisfies a 1 st operation condition or a 2 nd operation condition, the 1 st operation condition being a condition under which the driver is regarded as feeling uneasy about the automatic driving, and the 2 nd operation condition being a condition under which the driver is regarded as feeling less unease about the automatic driving than under the 1 st operation condition; and a notification unit that notifies the driver of the control plan of the automatic driving when the determination unit determines that the driver's operation satisfies the 1 st operation condition, and that switches from automatic driving to manual driving when the determination unit determines that the driver's operation satisfies the 2 nd operation condition.
Disclosure of Invention
However, the technique described in Japanese Patent Laid-Open No. 2016-215793 considers only the driver and therefore has difficulty accommodating multiple passengers. That is, when not only the driver but all occupants must be addressed, there is a problem in grasping the sense of unease that the driver or passengers feel toward the vehicle's operation.
An object of the present invention is to provide a vehicle control device and a vehicle capable of realizing the following (1) and (2).
(1) They can respond not only to the driver but to all passengers; when a sense of unease felt by the driver or a passenger toward the vehicle's operation during automatic driving is detected, the automatic driving control can be changed, improving the convenience and commodity value of the automatic driving control.
(2) Detection is performed through a fixed structure that can steady the easily shaken upper body of the driver or a passenger, so the sense of unease toward the vehicle's operation can be grasped more reliably.
A vehicle control device according to one aspect of the present invention automatically drives a vehicle and includes a detection device capable of detecting a gripping operation, by an occupant, on a fixed structure fixed above the occupant's seat surface in the vehicle; when the gripping operation is performed and the vehicle is in a specific driving state, the control of the vehicle is changed.
A vehicle according to another aspect of the present invention includes the vehicle control device described above.
According to the present invention, a vehicle control device and a vehicle that can realize the following (1) and (2) can be provided.
(1) They can respond not only to the driver but to all passengers; when a sense of unease felt by the driver or a passenger toward the vehicle's operation during automatic driving is detected, the automatic driving control can be changed, improving the convenience and commodity value of the automatic driving control.
(2) Detection is performed through a fixed structure that can steady the easily shaken upper body of the driver or a passenger, so the sense of unease toward the vehicle's operation can be grasped more reliably.
The above objects, features and advantages should be easily understood by the following description of the embodiments with reference to the attached drawings.
Drawings
Fig. 1 is a plan view schematically showing the interior of a vehicle according to the present embodiment.
Fig. 2 is a side view schematically showing the interior of the vehicle of the present embodiment.
Fig. 3 is a block diagram showing a vehicle control device according to the present embodiment.
Fig. 4A is table 1 showing details of control change items and change durations for each specific scene, fig. 4B is table 2 showing priorities of riding positions when lane change is performed in the rightward direction, and fig. 4C is table 3 showing priorities of riding positions when deceleration is performed with respect to a preceding vehicle.
Fig. 5 is a flowchart showing an example of the 1 st processing operation of the vehicle control device according to the present embodiment.
Fig. 6 is a flowchart showing an example of the 2 nd processing operation of the vehicle control device according to the present embodiment.
Fig. 7 is a flowchart showing an example of the 3 rd processing operation of the vehicle control device according to the present embodiment.
Fig. 8 is a flowchart showing an example of the 4 th processing operation of the vehicle control device according to the present embodiment.
Detailed Description
Hereinafter, a vehicle control device and a vehicle according to the present invention will be described in detail with reference to the accompanying drawings, by taking a preferred embodiment as an example.
The vehicle control device 10 and the vehicle 12 according to the present embodiment will be described with reference to the drawings.
First, as shown in fig. 1, various fixed structures are provided in the vehicle (host vehicle) 12. The fixed structures include, for example, four armrests (grab rails) 14, four door handles 16, door-side armrests 18, four seat belts 20, two seat handles 22, two headrests 24, four seats 26, and the like. Each seat belt 20 has a flexible belt and is fixed to a seat frame by a retractor (not shown) and a buckle, and is therefore one of the fixed structures. As shown in fig. 2, the armrests 14, the door handles 16, the door-side armrests 18, the seat handles 22, the headrests 24, and the like are fixed above the seat surface of the occupant.
As shown in fig. 3, the armrest 14, the door handle 16, the door-side armrest 18, the seat belt 20, the seat handle 22, the headrest 24, and the like are respectively provided with a 1 st sensor 30a, a 2 nd sensor 30b, a 3 rd sensor 30c, a 4 th sensor 30d, a 5 th sensor 30e, a 6 th sensor 30f, and the like, for detecting gripping or contact by an occupant, and the seat 26 is provided with a 7 th sensor 30g, an 8 th sensor 30h, and the like, for detecting the seating of an occupant, for example via the backrest. Examples of these sensors include sensors that detect pressure (such as a sheet-like pressure sensor), sensors that detect load, and sensors that detect perspiration.
In addition, an in-vehicle camera (in-vehicle monitoring camera or the like) or an electromagnetic wave sensor or the like that captures the expression of the occupant may be provided in the vehicle.
As shown in fig. 3, the vehicle 12 further includes an external sensor 40, a vehicle body behavior sensor 42, a vehicle operation sensor 44, a communication unit 46, and an HMI (human-machine interface) 48. The vehicle 12 further has a drive device 50, a brake device 52, a steering device 54, a navigation device 56, and a positioning portion 58. The vehicle 12 further includes components other than these components, but the description thereof is omitted here.
The external sensor 40 acquires external information, that is, information on the surroundings of the vehicle 12. The external sensor 40 has a plurality of cameras, not shown, and a plurality of radars, not shown. The external sensor 40 also has a plurality of LiDAR units (Light Detection and Ranging / Laser Imaging Detection and Ranging), not shown.
The information acquired by the camera described above, that is, camera information is supplied from the camera to the vehicle control device 10. As camera information, imaging information and the like can be given. The camera information and the radar information are combined to form external information.
The radar transmits a transmission wave to the outside of the vehicle 12, and receives a reflected wave reflected by the object to be detected among the transmitted transmission waves. Examples of the transmission wave include electromagnetic waves. Examples of the electromagnetic wave include millimeter waves. Examples of the detection object include other vehicles including a preceding vehicle. The radar generates radar information (reflected wave signal) from reflected waves and the like. The radar supplies the generated radar information to the vehicle control device 10. Further, the radar is not limited to millimeter wave radar. For example, a laser radar, an ultrasonic sensor, or the like may be used as the radar.
The LiDAR continuously emits laser light in all directions around the vehicle 12, measures the three-dimensional positions of reflection points from the reflected waves of the emitted laser light, and outputs information about those three-dimensional positions, that is, three-dimensional information. The LiDAR supplies this three-dimensional information, that is, LiDAR information, to the vehicle control device 10.
The vehicle body behavior sensor 42 acquires information about the behavior of the vehicle 12, that is, vehicle body behavior information. The vehicle body behavior sensor 42 includes a vehicle speed sensor, not shown, a wheel rotation speed sensor, not shown, an acceleration sensor, not shown, and a yaw rate sensor, not shown. The vehicle speed sensor detects the speed of the vehicle 12, i.e., the vehicle speed. In addition, the vehicle speed sensor also detects the traveling direction of the vehicle 12. The wheel rotation speed sensor detects the rotation speed of a wheel, not shown, that is, the wheel speed. The acceleration sensor detects acceleration of the vehicle 12. The acceleration includes a front-rear acceleration, a lateral acceleration, and an up-down acceleration. Further, only acceleration in a part of the directions may be detected by the acceleration sensor. The yaw rate sensor detects a yaw rate of the vehicle 12.
The vehicle operation sensor 44 acquires information about the driving operation of the user (driver), that is, driving operation information. The vehicle operation sensor 44 includes an accelerator pedal sensor, a brake pedal sensor, a steering angle sensor, and a steering torque sensor. The accelerator pedal sensor detects the operation amount of an accelerator pedal, not shown. The brake pedal sensor detects the operation amount of a brake pedal, not shown. The steering angle sensor detects the steering angle of a steering wheel, not shown. The steering torque sensor detects the torque applied to the steering wheel.
The communication unit 46 performs wireless communication with an external device, not shown. The external device may include, for example, an external server, not shown. The communication unit 46 may be permanently installed in the vehicle 12 or may be detachable from the vehicle 12. Examples of a communication unit 46 that is detachable from the vehicle 12 include a mobile phone and a smartphone.
The HMI48 accepts operation inputs from the user (occupant) and provides various information to the user in a visual, audible, or tactile manner. The HMI48 includes, for example, an autopilot switch (driving assist switch) 60, a display not shown, a contact sensor not shown, an in-vehicle camera 62 (including the above-described in-vehicle monitoring camera and the like), a speaker not shown, and an operation tool not shown.
The autopilot switch 60 is a switch for a user to instruct the start and stop of autopilot. The autopilot switch 60 includes a start switch, not shown, and a stop switch, not shown. The start switch outputs a start signal to the vehicle control device 10 according to the operation of the user. The stop switch outputs a stop signal to the vehicle control device 10 according to the operation of the user.
The display includes, for example, a liquid crystal panel, an organic EL panel, and the like. Here, the case where the display is a touch panel will be described as an example, but the present invention is not limited to this.
The contact sensor detects whether the user (driver) is touching the steering wheel. The signal output from the contact sensor is supplied to the vehicle control device 10, which can determine, based on that signal, whether the user is touching the steering wheel.
The in-vehicle camera 62 photographs the interior of the vehicle 12, i.e., the cabin interior. The in-vehicle camera 62 may be provided on, for example, a dashboard not shown, or may be provided on a ceiling not shown. The in-vehicle camera 62 may be provided to capture only the driver, or may be provided to capture each passenger. The in-vehicle camera 62 outputs image information, which is information obtained by capturing the interior of the vehicle cabin, to the vehicle control device 10.
The speaker is a means for providing various information to a user in a voice manner. The vehicle control device 10 outputs various notifications, alarms, and the like using a speaker.
The driving device (driving force control system) 50 has a drive ECU, not shown, and a drive source, not shown. The drive ECU controls the driving force (torque) of the vehicle 12 by controlling the drive source. Examples of the drive source include an engine and a drive motor. The drive ECU can control the driving force by controlling the drive source in accordance with the user's operation of the accelerator pedal. The drive ECU can also control the driving force by controlling the drive source in accordance with a command supplied from the vehicle control device 10. The driving force of the drive source is transmitted to the wheels, not shown, via a transmission, not shown, or the like.
The brake device (braking force control system) 52 includes a brake ECU (not shown) and a brake mechanism (not shown). The brake mechanism operates the brake member by a brake motor, a hydraulic mechanism, or the like. The brake ECU controls the brake mechanism according to the operation of the brake pedal by the user, whereby the braking force can be controlled. The brake ECU controls the brake mechanism according to a command supplied from the vehicle control device 10, thereby enabling control of the braking force.
The steering device (steering system) 54 includes a steering ECU, not shown, that is, an EPS (Electric Power Steering) ECU, and a steering motor, not shown. The steering ECU controls the steering motor in response to the user's operation of the steering wheel, thereby controlling the direction of the wheels (steered wheels). The steering ECU also controls the steering motor in accordance with a command supplied from the vehicle control device 10, thereby controlling the direction of the wheels. Further, steering may be performed by changing the torque distribution or braking force distribution to the left and right wheels.
The navigation device 56 has a GNSS (Global Navigation Satellite System) sensor, not shown. The navigation device 56 further includes a calculation unit, not shown, and a storage unit, not shown. The GNSS sensor detects the current position of the vehicle 12. The calculation unit reads map information corresponding to the current position detected by the GNSS sensor from the map database stored in the storage unit, and uses the map information to determine a target path from the current position to the destination. The destination is entered by the user via the HMI48. As described above, the display is a touch panel, and the destination is input by the user operating the touch panel. The navigation device 56 outputs the determined target path to the vehicle control device 10. The vehicle control device 10 supplies the target path to the HMI48, and the HMI48 displays the target path on the display.
The positioning unit 58 has a GNSS not shown. The positioning unit 58 includes an IMU (Inertial Measurement Unit: inertial measurement unit), not shown, and a map database (map DB), not shown. The positioning portion 58 suitably uses the information acquired by the GNSS, the information acquired by the IMU, and the map information stored in the map database to determine the position of the vehicle 12. The positioning unit 58 can supply the vehicle control device 10 with vehicle position information, which is information indicating the position of the vehicle 12. The positioning unit 58 can supply map information to the vehicle control device 10.
The vehicle control device 10 further includes a calculation unit 70, a storage unit 72, and an input/output unit 74. The arithmetic unit 70 is responsible for controlling the entire vehicle control device 10. The arithmetic unit 70 is constituted by CPU (Central Processing Unit), for example. The arithmetic unit 70 controls each unit according to a program stored in the storage unit 72, and thereby executes vehicle control. The input/output unit 74 acquires and inputs signals, data, and the like from the outside of the vehicle control device 10 to the arithmetic unit 70 and the storage unit 72, and outputs signals and data output from the arithmetic unit 70 and the storage unit 72 to the outside.
The computing unit 70 further includes an occupant detection unit 80, a specific running state detection unit 82, an occupant specific motion detection unit 84, a priority setting unit 86, a vehicle control changing unit 88, and a detection target exclusion specific unit 90.
The occupant detection unit 80 detects which seat 26 the occupant sits on by the 7 th sensor 30g, the 8 th sensor 30h, the in-vehicle camera 62, and the like.
The specific running state detection unit 82 detects whether or not the vehicle 12 is in a preset specific running state, that is, whether or not a specific scene is encountered. A specific scene is a scene in which the operation of the vehicle 12 is in a transitional state; examples include an acceleration/deceleration state, a turning state, a lane change state, the timing of gripping or releasing the steering wheel, a pressure change, the timing of gripping the steering wheel when entering a curve, the positional relationship with surrounding vehicles, a speed difference, and the like. These states are detected based on map information, the current position of the vehicle 12, information such as the positions of other vehicles in the surrounding area obtained by the communication unit 46, and drive signals from the drive device 50, the brake device 52, the steering device 54, and the like. Specific scenes include, for example, as shown in table 1 of fig. 4A, acceleration/deceleration matched to a preceding vehicle, a curve, a lane change, a split-flow, and acceleration/deceleration immediately after the start of automatic driving.
The occupant-specific motion detection unit 84 detects the gripping of the fixed structure by the occupant and the contact with the fixed structure. By this detection, it is possible to detect whether the occupant feels uneasy.
In this case, gripping of or contact with the fixed structure may be detected for all occupants. Alternatively, when a plurality of occupants are detected, the priority setting unit 86 may select the occupant at a riding position given a high priority in advance, or at a riding position given a high priority according to the specific scene, and gripping of or contact with the fixed structure may be detected for that occupant.
Here, the riding position given a high priority in advance is the seat behind the driver's riding position. Since the seat behind the driver is the safest, when even the occupant seated in that rear seat grips or contacts the fixed structure and feels uneasy, it can be determined that all the occupants feel uneasy.
The riding position given a high priority according to the specific scene means, for example, that if a lane change is performed in the rightward direction, the priority from high to low is the right rear seat, the left rear seat, the right front seat, and the left front seat, as shown in table 2 of fig. 4B.
Similarly, if acceleration/deceleration is performed with respect to a preceding vehicle, the priority from high to low is the left front seat, the right front seat, the left rear seat, and the right rear seat, as shown in table 3 of fig. 4C.
In this way, the priority of the occupant can be determined for each specific scene, whereby detection of the uncomfortable occupant can be performed in a short time.
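As an illustration of the scene-dependent priority tables of tables 2 and 3 (figs. 4B and 4C), the selection of which occupants to check first could be sketched as follows. This is only an illustrative sketch: the scene keys, seat names, and function are hypothetical and not part of the patent.

```python
# Illustrative sketch of scene-dependent seat priorities (Figs. 4B/4C).
# Seats are checked in priority order so an uneasy occupant is found quickly.
SEAT_PRIORITY = {
    # Lane change to the right (Table 2, Fig. 4B):
    "lane_change_right": ["right_rear", "left_rear", "right_front", "left_front"],
    # Acceleration/deceleration with respect to a preceding vehicle (Table 3, Fig. 4C):
    "decelerate_for_preceding": ["left_front", "right_front", "left_rear", "right_rear"],
}

def occupants_in_priority_order(scene, occupied_seats):
    """Return the occupied seats ordered by the scene's priority table."""
    order = SEAT_PRIORITY.get(scene, [])
    return [seat for seat in order if seat in occupied_seats]
```

For example, during a rightward lane change with occupants in the left front and right rear seats, the right rear seat would be checked first.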
Gripping of or contact with the fixed structure may be detected using a camera image, a pressure sensor, a perspiration sensor, an electromagnetic wave sensor, or the like. The health state of an occupant may also be estimated from a camera image of the occupant's face, for example by measuring the occupant's pulse rate and comparing it with a standard pulse rate. Of course, the accuracy of estimating each occupant's health state may be improved by acquiring each occupant's medical information (medical history, etc.) via the communication unit 46.
From the camera image, the occupant is judged to feel uneasy when, for example, the occupant is detected looking down and trembling, staring fixedly at one point, looking around in a panic, or checking the speedometer. With a pressure sensor, a load sensor, or the like, the occupant is judged to feel uneasy when a pressure or load equal to or greater than a predetermined value is applied to the sensor. Alternatively, a correlation coefficient may be computed between the sensor waveform from these sensors and reference waveforms measured in advance (waveforms representative of a person feeling uneasy), the occupant being judged to feel uneasy as the correlation coefficient approaches 1. Various other methods can also be used.
A perspiration sensor may be used in combination to improve the accuracy of detecting unease. The emotion of the occupant may also be recognized from data detected by an electromagnetic wave sensor (see http://newswitch.jp/p/6219, "device for knowing the true emotion of a person from a distance").
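The correlation-coefficient comparison against pre-measured "uneasy" reference waveforms could be sketched as below. The 0.8 decision threshold and all function names are assumptions for illustration; the patent only states that unease is judged as the coefficient approaches 1.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length waveforms."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0  # a flat waveform carries no correlation information
    return cov / (sx * sy)

def seems_uneasy(sensor_waveform, reference_waveforms, threshold=0.8):
    """True if the grip-pressure/load waveform closely matches any reference
    'uneasy' waveform. The 0.8 threshold is an assumed placeholder."""
    return any(pearson(sensor_waveform, ref) >= threshold
               for ref in reference_waveforms)
```

In practice several reference waveforms would be stored, one per representative uneasy reaction, and the sensor waveform compared against each.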
On the other hand, the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of scenes set in advance.
As changes of the vehicle control, as shown in table 1 of fig. 4A, for example, the inter-vehicle distance is enlarged during acceleration/deceleration matched to a preceding vehicle, and the lateral movement speed during steering and the vehicle speed are reduced during a curve, a lane change, and a split-flow. In acceleration/deceleration immediately after the start of automatic driving, for example, suppression control of the acceleration gain is applied.
When acceleration/deceleration matched to a preceding vehicle is being performed, the duration of the vehicle control change may be, for example, the period of following the preceding vehicle; in a curve, a lane change, and a split-flow, it may be, for example, the period required for the vehicle's lateral movement. In acceleration/deceleration immediately after the start of automatic driving, the duration of the change may extend, for example, until the automatic driving is completed (for example, 30 minutes or 40 km). When the change duration has elapsed, the vehicle control device 10 returns the changed vehicle control to its original state.
Of course, the duration of the vehicle control change may instead be a predetermined travel distance. For example, when acceleration/deceleration matched to a preceding vehicle is being performed, the change of the vehicle control may be maintained until the distance travelled while following the preceding vehicle reaches a predetermined distance; in a curve, a lane change, and a split-flow, the change may be maintained until the lateral movement of the vehicle reaches a predetermined distance. The vehicle control device 10 may restore the changed vehicle control to its original state when the vehicle has travelled the predetermined distance.
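The time-based and distance-based change durations described above could be modelled as in this sketch; the class and field names are hypothetical, and a change is reverted once whichever limit is configured has been reached.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlChange:
    """One change of vehicle control, reverted when its duration expires."""
    scene: str
    max_seconds: Optional[float] = None  # e.g. period of following the preceding vehicle
    max_metres: Optional[float] = None   # e.g. lateral-movement distance for a lane change

    def expired(self, elapsed_s: float, travelled_m: float) -> bool:
        """True once the time limit or the distance limit is reached."""
        if self.max_seconds is not None and elapsed_s >= self.max_seconds:
            return True
        if self.max_metres is not None and travelled_m >= self.max_metres:
            return True
        return False
```

When `expired` returns True, the controller would restore the changed vehicle control to its original state, as the text describes.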
In addition, if the health state of the occupant detected by the occupant specific motion detection unit 84 deteriorates, the vehicle control change unit 88 increases the degree of change of the vehicle control. In this case, the degree of change is increased according to the difference between the pulse rate of the occupant whose health state has deteriorated and the standard pulse rate. Besides the pulse rate, the heart rate, the respiration rate, heartbeat fluctuation, and the like can be used to detect the health state from the camera image.
The detection target exclusion specifying unit 90 specifies occupants to be excluded from the detection targets. Specifically, an occupant who has been gripping or contacting the fixed structure for a predetermined time or longer, or over a predetermined distance or longer, while the vehicle 12 is not in the specific running state is excluded from the detection targets. This shortens the time needed to detect an uneasy occupant and improves the commodity value of the autonomous vehicle.
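A minimal sketch of this exclusion rule follows, assuming a hypothetical 60 s threshold (the patent specifies only "a predetermined time or longer"); the function and argument names are illustrative.

```python
def update_exclusions(grip_seconds_by_seat, in_specific_state, excluded,
                      threshold_s=60.0):
    """Add to `excluded` any seat whose occupant has gripped or contacted a
    fixed structure for threshold_s or longer while the vehicle was NOT in
    a specific running state (unit 90's rule). Such occupants are treated
    as habitual grippers, not uneasy ones. threshold_s is an assumed
    placeholder for the patent's 'predetermined time'."""
    if not in_specific_state:
        for seat, secs in grip_seconds_by_seat.items():
            if secs >= threshold_s:
                excluded.add(seat)
    return excluded
```

Gripping during a specific scene never causes exclusion, since that is exactly the behaviour the unease detection is looking for.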
Next, some processing operations of the vehicle control device 10 according to the present embodiment will be described with reference to fig. 5 to 8.
[ processing action 1 ]
The 1 st processing operation is a basic processing operation of the vehicle control device 10, and first, in step S1 of fig. 5, the occupant detection unit 80 detects which seat 26 the occupant sits on by a pressure sensor of the seat 26, an in-vehicle camera, or the like.
In step S2, the specific traveling state detection unit 82 detects whether or not the host vehicle 12 is in a preset specific traveling state, that is, whether or not a specific scene is encountered.
If the host vehicle 12 encounters a specific scene, the flow advances to step S3, where the occupant specific motion detection unit 84 detects a specific motion of the occupant, that is, a gripping of the fixed structure or a contact with the fixed structure.
When the above-described specific operation of the occupant is detected, the flow proceeds to step S4, and the vehicle control changing unit 88 changes the vehicle control in accordance with the plurality of preset scenes, as exemplified above (see table 1 in fig. 4A).
When the process in step S4 ends, or when no specific action of the occupant is detected in step S3, or when it is determined in step S2 that the host vehicle 12 has not encountered a specific scene, the process in step S1 and the subsequent steps are repeated after a predetermined period of time has elapsed.
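The flow of steps S1 to S4 above can be sketched as follows, with callback stand-ins for the occupant detection unit 80, the specific running state detection unit 82, the occupant specific motion detection unit 84, and the vehicle control changing unit 88; all names are illustrative, not the patent's implementation.

```python
def processing_operation_1(detect_seated, in_specific_scene,
                           grips_fixed_structure, change_vehicle_control):
    """One pass of the loop in Fig. 5. Returns the seats whose occupants'
    specific actions triggered a vehicle control change."""
    triggered = []
    occupants = detect_seated()                 # step S1: which seats are occupied
    if in_specific_scene():                     # step S2: specific scene encountered?
        for seat in occupants:
            if grips_fixed_structure(seat):     # step S3: grip/contact detected
                change_vehicle_control(seat)    # step S4: change vehicle control
                triggered.append(seat)
    return triggered
    # the caller repeats from step S1 after a predetermined period
```

If no specific scene is encountered, or no specific action is detected, the pass ends without changing the control, matching the flowchart's return path.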
[2nd processing operation]
The 2nd processing operation adds the priority setting processing to the 1st processing operation. First, in step S101 of fig. 6, the priority setting unit 86 sets a high-priority riding position, either one set in advance or one set in accordance with the specific scene as described above.
In step S102, the occupant detection unit 80 detects whether or not an occupant is seated at the high-priority riding position set by the priority setting unit 86, using a pressure sensor of the seat 26, an in-vehicle camera, or the like.
In step S103, the specific traveling state detection unit 82 detects whether or not the host vehicle 12 is in a preset specific traveling state, that is, whether or not a specific scene is encountered.
If the host vehicle 12 encounters a specific scene, the flow proceeds to step S104, where the occupant specific motion detection unit 84 detects the specific motion, namely gripping of or contact with the fixed structure, of the occupant at the high-priority riding position.
When the above-described specific actions of the occupant are detected, the flow proceeds to step S105, and the vehicle control changing unit 88 changes the vehicle control in accordance with the predetermined plurality of scenes as described above (see table 1 in fig. 4A).
When the process in step S105 ends, or when no specific action of the occupant is detected in step S104, or when it is determined in step S103 that the host vehicle 12 has not encountered a specific scene, the process in step S101 and the subsequent steps are repeated after a predetermined period of time has elapsed.
[3rd processing operation]
The 3rd processing operation adds the detection target exclusion processing to the 1st processing operation. First, in step S201 of fig. 7, the occupant detection unit 80 detects which seat 26 each occupant sits in, using a pressure sensor of the seat 26, an in-vehicle camera, or the like.
In step S202, the specific traveling state detection unit 82 detects whether or not the host vehicle 12 is in a preset specific traveling state, that is, whether or not a specific scene is encountered.
If the host vehicle 12 does not encounter a specific scene, the detection target exclusion specifying unit 90 determines in step S203 whether or not any occupant to be excluded from the detection targets is present. If so, the flow advances to step S204, where that occupant is specified as excluded from the detection targets.
On the other hand, if the host vehicle 12 encounters a specific scene in step S202, the flow proceeds to step S205, where the occupant specific motion detection unit 84 detects the specific motion, namely gripping of or contact with the fixed structure, for the occupants other than those excluded from the detection targets.
When the above-described specific operation of the occupant is detected, the flow proceeds to step S206, and the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of preset scenes as exemplified above (see table 1 in fig. 4A).
When the process in step S206 ends, or when no specific operation by the occupant is detected in step S205, or when the occupant excluded from the detection target is specified in step S204, or when it is determined in step S203 that the occupant excluded from the detection target is not present, the process in step S201 and the subsequent steps are repeated after a predetermined period of time has elapsed.
[4th processing operation]
The 4th processing operation adds the priority setting processing to the 3rd processing operation. First, in step S301 of fig. 8, the priority setting unit 86 sets a high-priority riding position, either one set in advance or one set in accordance with the specific scene as described above.
In step S302, the occupant detection unit 80 detects whether or not the occupant is seated at the riding position having the high priority set by the priority setting unit 86, by a pressure sensor of the seat 26, an in-vehicle camera, or the like.
In step S303, the specific traveling state detection unit 82 detects whether or not the host vehicle 12 is in a preset specific traveling state, that is, whether or not a specific scene is encountered.
If the host vehicle 12 does not encounter a specific scene, the detection target exclusion specifying unit 90 determines in step S304 whether or not any occupant to be excluded from the detection targets is present. If so, the flow advances to step S305, where that occupant is specified as excluded from the detection targets.
On the other hand, if the host vehicle 12 encounters a specific scene in step S303, the flow proceeds to step S306, where the occupant specific motion detection unit 84 detects the specific motion, namely gripping of or contact with the fixed structure, for the occupants other than those excluded from the detection targets.
When the above-described specific operation of the occupant is detected, the flow proceeds to step S307, and the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of preset scenes as exemplified above (see table 1 in fig. 4A).
When the process in step S307 is completed, or when no specific action of the occupant is detected in step S306, or when an occupant excluded from the detection target is specified in step S305, or when it is determined in step S304 that there is no occupant excluded from the detection target, the process in step S301 and the subsequent steps are repeated after a predetermined period of time has elapsed.
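The 4th processing operation combines the priority condition and the exclusion condition; the decision made in one cycle (steps S301 to S307) can be sketched as follows. The seat names and the data representation are illustrative assumptions, not part of the disclosure.

```python
def decide_control_change(high_priority_seat, seated, in_specific_scene, excluded, gripping):
    """One decision cycle of steps S301-S307 in fig. 8 (illustrative sketch).

    high_priority_seat: riding position set by the priority setting unit 86 (S301)
    seated:             set of occupied riding positions (S302)
    in_specific_scene:  whether a specific scene is encountered (S303)
    excluded:           riding positions excluded from the detection targets
    gripping:           riding positions whose occupants grip/contact a fixed structure
    Returns True when the vehicle control should be changed (S307).
    """
    if high_priority_seat not in seated:   # S302: high-priority seat is unoccupied
        return False
    if not in_specific_scene:              # S304/S305 would update the exclusions here
        return False
    # S306: consider only a non-excluded occupant at the high-priority position
    return high_priority_seat not in excluded and high_priority_seat in gripping
```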
Summary of the embodiments
The following describes the invention that can be grasped from the above embodiment.
The present embodiment is a vehicle control device 10 for a vehicle 12 that drives automatically. The device includes a detection device (1st sensor 30a to 8th sensor 30h, etc.) capable of detecting a gripping operation by an occupant on a fixed structure in the vehicle 12, the fixed structure being fixed at a position above the occupant's seat surface. When the gripping operation is performed and the vehicle 12 is in a specific running state, control of the vehicle 12 is changed.
Accordingly, the vehicle control is changed according to the state of the occupant's holding operation (gripping or the like) on the fixed structure in a specific scene. For example, while the vehicle 12 is running by automatic driving, if an occupant unintentionally grips the fixed structure, the occupant is considered to feel uneasy about the running state, and by changing the vehicle control the occupant can be kept from feeling uneasy as much as possible. This applies to all occupants, including the driver.
By changing (reflecting in) the vehicle control only in specific scenes, erroneous learning from a person who habitually grips the structure at ordinary times can be prevented, while a person who grips out of actual alarm can be detected with high accuracy and reflected in control. Further, since the fixed structure is fixed in the vehicle above the occupant's seat surface, where the driver or occupant can steady the easily swaying upper body, the sense of unease about the vehicle's operation can be grasped more reliably.
In the present embodiment, the detection device can detect the holding operation of the occupant with respect to the plurality of fixing structures fixed to the vehicle 12.
Examples include fixed structures (for example, the armrest 14 and the inner handle of a door (door handle 16)) with which the driver or a passenger can steady the easily swaying upper body; by using these, the sense of unease about the vehicle's operation felt by all occupants, including the driver, can be grasped more reliably.
In the present embodiment, the specific running state defines a plurality of running states, and the change parameter of the vehicle control is changed in accordance with each running state.
For example, a plurality of specific running states are defined, and the vehicle control is changed according to the state of the occupant's gripping operation (gripping or the like) on the fixed structure in whichever of these running states the vehicle 12 is in.
That is, among the change parameters of the vehicle control set in correspondence with the plurality of specific running states, the parameter corresponding to the specific running state in which the occupant performed the gripping operation is changed. Accordingly, appropriate control reflection can be performed, and the occupant can be kept from feeling uneasy as much as possible.
In the present embodiment, the specific running state defines a plurality of running states, and the changing period of the vehicle control is changed in accordance with each running state.
That is, the period during which the change of the vehicle control remains in effect can be set for each specific running state, so appropriate control reflection can be performed and the occupant can be kept from feeling uneasy as much as possible. Further, the control reflection is applied only for the necessary period.
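A scene-indexed table of change parameters and change periods, as described above, might be represented as follows. The scene names, parameter names, and numeric values are purely illustrative assumptions; the patent does not disclose concrete values.

```python
# Hypothetical per-scene table: which parameter is changed, by what factor,
# and for how long the change remains in effect. All values are illustrative.
SCENE_CONTROL_CHANGES = {
    "lane_change": {"parameter": "lateral_accel_limit", "scale": 0.8, "period_s": 60},
    "merging":     {"parameter": "headway_time",        "scale": 1.2, "period_s": 120},
    "sharp_curve": {"parameter": "target_speed",        "scale": 0.9, "period_s": 30},
}

def change_for_scene(scene, base_value):
    """Return the changed parameter value and the period the change applies for."""
    entry = SCENE_CONTROL_CHANGES[scene]
    return base_value * entry["scale"], entry["period_s"]
```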
In the present embodiment, an in-vehicle camera detects the movement of the occupant and also detects the occupant's health state, and when the health state has deteriorated, the amount of change in the vehicle control is increased.
Other parameters (camera image, health state, etc.) are considered together with the state of the occupant's holding operation (gripping or the like) on the fixed structure. For example, if the health state has deteriorated, the change in the control amount is increased further, so that the occupant concerned can be kept from feeling uneasy as much as possible.
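One way to realize the increased change amount is to scale the control change by the detected health state; the scoring scheme, the 0.5 threshold, and the 1.5x factor below are illustrative assumptions.

```python
def control_change_amount(base_change, health_score, threshold=0.5, factor=1.5):
    """Scale the vehicle-control change amount by the occupant's health state
    estimated from the in-vehicle camera (62). health_score is assumed to lie
    in [0, 1], lower meaning more deteriorated; threshold/factor are assumed."""
    if health_score < threshold:  # deteriorated health state detected
        return base_change * factor
    return base_change
```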
In the present embodiment, when a plurality of occupants are detected, the priority is determined according to the riding position of each occupant, and the vehicle control is changed according to the detected condition of the occupant having the higher priority.
By giving each occupant a priority according to the riding position, the determination can be made based on the occupant with the highest priority. Accordingly, the vehicle control can be changed promptly, and the occupant subject to the vehicle control can be kept from feeling uneasy as much as possible.
In the present embodiment, when a plurality of occupants are detected, the riding position with high priority is determined according to the specific traveling state, and the vehicle control is changed according to the detection condition of the occupant with the riding position with high priority.
That is, the priority given to an occupant according to the riding position is changed according to the specific running state. Accordingly, for each driving scene, the vehicle control can be changed based on the occupant at the riding position with the highest priority.
In the present embodiment, after the vehicle control has been changed, the change content of the vehicle control is restored when at least one of a predetermined travel distance and a predetermined travel time is satisfied.
That is, the change content of the vehicle control is restored to the original state once the vehicle has traveled a predetermined distance, or a predetermined time has elapsed, after the change. By then the occupant has become accustomed to the changed automatic-driving control, so returning to the pre-change vehicle control is effective for improving driving efficiency.
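The restoration condition (at least one of a predetermined distance or time being satisfied) can be sketched as a simple predicate; the default thresholds below are illustrative assumptions, not disclosed values.

```python
def should_restore(distance_since_change_m, time_since_change_s,
                   restore_distance_m=5000.0, restore_time_s=600.0):
    """Return True when the changed vehicle control should be restored,
    i.e. once at least one of the predetermined travel distance or travel
    time since the change is satisfied (default thresholds are assumed)."""
    return (distance_since_change_m >= restore_distance_m
            or time_since_change_s >= restore_time_s)
```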
In the present embodiment, the occupant who has continued to hold the fixed structure for a predetermined time or more in a traveling state other than the specific traveling state is excluded from the detection object.
A person who grips the armrest 14 or the like even in running states other than the specific running state has a gripping habit, and does not feel strong unease when gripping. By excluding such a person from the detection targets, the time required to detect an occupant who feels uneasy can be shortened, improving the commercial value of the autonomous vehicle.
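The exclusion criterion can be sketched as updating a set of excluded riding positions; the 30-second habit threshold below is an illustrative assumption.

```python
def update_exclusions(excluded, seat, grip_duration_s, in_specific_scene,
                      habit_threshold_s=30.0):
    """Exclude from the detection targets an occupant who keeps holding a
    fixed structure for a predetermined time outside any specific running
    state (a gripping habit rather than unease). Threshold is assumed."""
    if not in_specific_scene and grip_duration_s >= habit_threshold_s:
        excluded.add(seat)
    return excluded
```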
The vehicle 12 has the vehicle control device 10 as described above.
Modification example
In addition to changing the vehicle control in accordance with the plurality of predetermined scenes, the vehicle control changing unit 88 may recommend changes to the inter-vehicle distance and the travel mode, or may bias the vehicle control toward the safer side in the initial stage of automatic driving (for example, when the occupant first boards).
Further, once automatic driving has continued for a predetermined duration or its travel distance has reached a predetermined distance, the vehicle control change based on the occupant's specific motion in a specific scene as described above may be phased out gradually, returning the vehicle to travel in the normal automatic driving mode, or a rest may be recommended.

Claims (7)

1. A vehicle control device (10) for automatically driving a vehicle (12), characterized in that,
comprises a detection device (84) capable of detecting a holding operation, by an occupant of the rear seat behind the driver's riding position, on a fixed structure fixed at a position above a seat surface of the occupant,
when the detection device (84) detects the holding operation of the occupant of the rear seat behind the driver's riding position and the vehicle (12) is in a specific running state, control of the vehicle (12) is changed, but when the occupant's holding of the fixed structure in a running state other than the specific running state continues for a predetermined time or longer, the occupant is excluded from the detection targets.
2. The vehicle control device (10) according to claim 1, characterized in that,
the detection device (84) is capable of detecting the gripping actions of the occupant on a plurality of the fixed structures fixed to the vehicle (12).
3. The vehicle control device (10) according to claim 1, characterized in that,
defining a plurality of travel states as the specific travel states,
and changing the changing parameters of the vehicle control according to the driving states.
4. The vehicle control device (10) according to claim 1, characterized in that,
defining a plurality of travel states as the specific travel states,
and changing a change period of the vehicle control in accordance with each of the running states.
5. The vehicle control device (10) according to claim 1, characterized in that,
has an onboard camera (62) for detecting the motion of the occupant,
detecting the health state of the passenger by using the vehicle-mounted camera (62),
when the state of health is deteriorated, the vehicle control change amount is increased.
6. The vehicle control device (10) according to claim 1, characterized in that,
and restoring the change content of the vehicle control when at least one of the predetermined travel distance and the predetermined travel time is satisfied after the change of the vehicle control.
7. A vehicle (12) having a vehicle control device (10) is characterized in that,
comprises a detection device (84) capable of detecting a holding operation, by an occupant of the rear seat behind the driver's riding position, on a fixed structure fixed at a position above a seat surface of the occupant,
the vehicle control device (10) changes control of the vehicle (12) when the occupant of the rear seat behind the driver's riding position performs the holding operation and the vehicle (12) is in a specific running state, but excludes the occupant from the detection targets when the occupant's holding of the fixed structure continues for a predetermined time or longer in a running state other than the specific running state.
CN202010073925.4A 2019-01-24 2020-01-22 Vehicle control device and vehicle Active CN111469849B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-010375 2019-01-24
JP2019010375A JP7157671B2 (en) 2019-01-24 2019-01-24 Vehicle control device and vehicle

Publications (2)

Publication Number Publication Date
CN111469849A CN111469849A (en) 2020-07-31
CN111469849B true CN111469849B (en) 2023-05-09

Family

ID=71747063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010073925.4A Active CN111469849B (en) 2019-01-24 2020-01-22 Vehicle control device and vehicle

Country Status (2)

Country Link
JP (1) JP7157671B2 (en)
CN (1) CN111469849B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7294294B2 (en) * 2020-10-06 2023-06-20 トヨタ自動車株式会社 Handrail disinfection device for vehicles with handrails

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10163678A1 (en) * 2001-12-21 2003-07-10 Joachim Lauffer Determination of the grip coefficient of road surfaces, e.g. for road building, by use of any motor vehicle equipped with movement and or acceleration measurement equipment
JP4199788B2 (en) * 2006-08-23 2008-12-17 トヨタ自動車株式会社 Headrest control device and control method
JP4946374B2 (en) 2006-11-13 2012-06-06 トヨタ自動車株式会社 Self-driving vehicle
CN101105097B (en) * 2007-08-23 2011-01-19 上海交通大学 Platform shield door intelligent control method
US9688271B2 (en) 2015-03-11 2017-06-27 Elwha Llc Occupant based vehicle control
JP6421714B2 (en) 2015-07-13 2018-11-14 株式会社デンソー Vehicle control device
JP6607164B2 (en) 2016-10-11 2019-11-20 株式会社デンソー Vehicle safe driving system
JP6834578B2 (en) 2017-02-22 2021-02-24 トヨタ自動車株式会社 Vehicle health examination device
JP6848533B2 (en) 2017-03-02 2021-03-24 株式会社デンソー Vehicle driving control system

Also Published As

Publication number Publication date
JP7157671B2 (en) 2022-10-20
JP2020117091A (en) 2020-08-06
CN111469849A (en) 2020-07-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant