CN111469849A - Vehicle control device and vehicle

Publication number: CN111469849A (application CN202010073925.4A)
Authority: CN (China)
Prior art keywords: vehicle, vehicle control, occupant, specific, state
Legal status: Granted
Application number: CN202010073925.4A
Language: Chinese (zh)
Other versions: CN111469849B
Inventors: 广濑峰史, 池田雅也, 八嶋淳平, 石坂贤太郎, 渡边崇, 幸加木彻, 八代胜也, 西田大树, 高田雄太
Assignee (current and original): Honda Motor Co Ltd
Application filed by Honda Motor Co Ltd
Publication of CN111469849A; application granted and published as CN111469849B
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18: Propelling the vehicle
    • B60W30/182: Selecting between different operative modes, e.g. comfort and performance modes
    • B60W2540/00: Input parameters relating to occupants

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The invention provides a vehicle control device and a vehicle. A vehicle control device (10) causes a vehicle (12) to travel automatically and includes a detection device (84) capable of detecting an occupant's gripping of a fixed structure fixed above the occupant's seat surface; when the gripping operation is performed while the vehicle (12) is in a specific traveling state, the control of the vehicle (12) is changed. Accordingly, for all occupants and not only the driver, the automated driving control can be changed when it is predicted or detected that the driver or a passenger feels uneasy about the vehicle's behavior during automated driving, so that the convenience and commercial value of the automated driving control can be improved.

Description

Vehicle control device and vehicle
Technical Field
The invention relates to a vehicle control device and a vehicle.
Background
Japanese Patent Laid-Open Publication No. 2016-215793 addresses the discomfort a driver may feel toward vehicle behavior during automated driving.
In order to solve this problem, the automatic driving control device described in Japanese Patent Laid-Open Publication No. 2016-215793 includes: a detection unit that detects a motion performed by the driver; a determination unit that determines whether the driver's motion detected by the detection unit during autonomous driving satisfies a 1st operation condition or a 2nd operation condition, the 1st operation condition being a condition regarded as indicating that the driver feels uncomfortable with autonomous driving, and the 2nd operation condition being a condition indicating that the driver feels less uncomfortable with autonomous driving than under the 1st operation condition; and a notification unit that notifies the driver of the control plan of autonomous driving when the determination unit determines that the driver's motion satisfies the 1st operation condition, and prompts the driver to switch from autonomous driving to manual driving when the determination unit determines that the driver's motion satisfies the 2nd operation condition.
Disclosure of Invention
However, the technique disclosed in Japanese Patent Laid-Open Publication No. 2016-215793 considers only the driver, and therefore has difficulty handling multiple occupants. That is, when not only the driver but all occupants must be supported, it cannot address an occupant's unease about the vehicle's behavior.
An object of the present invention is to provide a vehicle control device and a vehicle that can realize the following (1) and (2).
(1) Covering not only the driver but all occupants, the automated driving control can be changed when it is predicted or detected that the driver or a passenger feels uneasy about the vehicle's behavior during automated driving, thereby improving the convenience and commercial value of the automated driving control.
(2) Detection is performed on a fixed structure that an occupant grips to steady an upper body prone to swaying, so the occupant's unease about the vehicle's behavior can be grasped more reliably.
A vehicle control device according to one aspect of the present invention is a vehicle control device that causes a vehicle to travel automatically, and includes a detection device capable of detecting an occupant's gripping of a fixed structure fixed above the occupant's seat surface in the vehicle; the device changes the control of the vehicle when the gripping operation is performed while the vehicle is in a specific traveling state.
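The claimed behavior can be summarized as a two-part condition. A minimal sketch follows; the parameter names and the concrete adjustments (1.2x inter-vehicle distance, 0.8x lateral speed) are illustrative assumptions, not values from the patent.

```python
def updated_control(base: dict, grip_detected: bool, in_specific_state: bool) -> dict:
    """Change vehicle control only when a grip on an above-seat fixture
    coincides with a specific travel state; otherwise leave it unchanged."""
    if not (grip_detected and in_specific_state):
        return dict(base)  # outside the claimed condition: no change
    changed = dict(base)
    changed["inter_vehicle_distance_m"] = base["inter_vehicle_distance_m"] * 1.2
    changed["lateral_speed_mps"] = base["lateral_speed_mps"] * 0.8
    return changed
```

A grip alone (e.g. a passenger habitually holding a grab rail) or a specific scene alone does not trigger a change; both must coincide.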
A vehicle according to another aspect of the present invention includes the vehicle control device described above.
According to the present invention, a vehicle control device and a vehicle that can realize the following (1) and (2) can be provided.
(1) Covering not only the driver but all occupants, the automated driving control can be changed when it is predicted or detected that the driver or a passenger feels uneasy about the vehicle's behavior during automated driving, thereby improving the convenience and commercial value of the automated driving control.
(2) Detection is performed on a fixed structure that an occupant grips to steady an upper body prone to swaying, so the occupant's unease about the vehicle's behavior can be grasped more reliably.
The above objects, features and advantages will be readily understood from the following description of the embodiments with reference to the accompanying drawings.
Drawings
Fig. 1 is a plan view schematically showing the interior of a vehicle according to the present embodiment.
Fig. 2 is a side view schematically showing the interior of the vehicle according to the present embodiment.
Fig. 3 is a block diagram showing a vehicle control device according to the present embodiment.
Fig. 4A is Table 1 showing details of control change items and change durations for each specific scene; Fig. 4B is Table 2 showing the priorities of riding positions when a lane change is made to the right; and Fig. 4C is Table 3 showing the priorities of riding positions when accelerating and decelerating with respect to a preceding vehicle.
Fig. 5 is a flowchart showing an example of the 1 st processing operation of the vehicle control device according to the present embodiment.
Fig. 6 is a flowchart showing an example of the 2 nd processing operation of the vehicle control device according to the present embodiment.
Fig. 7 is a flowchart showing an example of the 3 rd processing operation of the vehicle control device according to the present embodiment.
Fig. 8 is a flowchart showing an example of the 4 th processing operation of the vehicle control device according to the present embodiment.
Detailed Description
Hereinafter, the vehicle control device 10 and the vehicle 12 according to the present embodiment will be described in detail with reference to the accompanying drawings.
First, as shown in Fig. 1, various fixed structures are mounted in the vehicle (host vehicle) 12. Examples of the fixed structures include four armrests (grab rails) 14, four door handles (door grips) 16, a door-side armrest (door armrest) 18, four seat belts 20, two seat handles (seat grips) 22, two headrests 24, and four seats 26. Each seat belt 20 has a flexible belt and is fixed to the seat frame by a retractor and a buckle (neither shown), and is therefore also one of the fixed structures. As shown in Fig. 2, the armrests 14, door handles 16, door-side armrest 18, seat handles 22, headrests 24, and the like are fixed above the seat surface of the occupant.
As shown in Fig. 3, the armrests 14, door handles 16, door-side armrest 18, seat belts 20, seat handles 22, headrests 24, and the like are respectively provided with a 1st sensor 30a, a 2nd sensor 30b, a 3rd sensor 30c, a 4th sensor 30d, a 5th sensor 30e, a 6th sensor 30f, and the like, which detect gripping by or contact with an occupant; the seat 26 is provided with a 7th sensor 30g that detects the seating of an occupant, an 8th sensor 30h that detects leaning against the backrest, and the like. Examples of these sensors include a sensor that detects pressure (e.g., a sheet-like pressure sensor), a sensor that detects load, and a sensor that detects perspiration.
In addition, an in-vehicle camera (such as a cabin monitoring camera) that captures the occupant's expression, an electromagnetic wave sensor, or the like may be provided in the vehicle.
As shown in fig. 3, the vehicle 12 further includes an external sensor 40, a vehicle body behavior sensor 42, a vehicle operation sensor 44, a communication unit 46, and an HMI (human machine interface) 48. The vehicle 12 also has a drive device 50, a brake device 52, a steering device 54, a navigation device 56, and a positioning portion 58. The vehicle 12 also has components other than these components, but the description thereof is omitted here.
The external sensor 40 acquires external information, i.e., information on the surroundings of the vehicle 12. The external sensor 40 has a plurality of cameras and a plurality of radars, not shown. The external sensor 40 also has a plurality of LiDAR units, not shown (Light Detection and Ranging / Laser Imaging Detection and Ranging).
The information acquired by the cameras, i.e., the camera information, is supplied from the cameras to the vehicle control device 10. The camera information, together with the radar information and the LiDAR information, constitutes the external information.
The radar transmits a transmission wave to the outside of the vehicle 12 and receives a reflected wave reflected from a detected object in the transmitted transmission wave. Examples of the transmission wave include an electromagnetic wave. Examples of the electromagnetic wave include a millimeter wave. Examples of the detection object include other vehicles including a preceding vehicle. The radar generates radar information (reflected wave signal) from the reflected wave or the like. The radar supplies the generated radar information to the vehicle control device 10. Further, the radar is not limited to the millimeter wave radar. For example, a laser radar, an ultrasonic sensor, or the like may be used as the radar.
The LiDAR continuously emits laser light in all directions around the vehicle 12, measures the three-dimensional position of each reflection point from the reflected waves of the emitted laser light, and outputs information on those three-dimensional positions, i.e., three-dimensional information. The LiDAR supplies this three-dimensional information, i.e., the LiDAR information, to the vehicle control device 10.
The vehicle body behavior sensor 42 acquires vehicle body behavior information that is information relating to the behavior of the vehicle 12. The vehicle body behavior sensors 42 include a vehicle speed sensor, a wheel rotation speed sensor, an acceleration sensor, and a yaw rate sensor, which are not shown. The vehicle speed sensor detects the speed of the vehicle 12, i.e., the vehicle speed. In addition, the vehicle speed sensor also detects the traveling direction of the vehicle 12. The wheel speed sensor detects a wheel speed, which is a rotational speed of a wheel not shown. The acceleration sensor detects the acceleration of the vehicle 12. The acceleration includes a front-rear acceleration, a lateral acceleration, and an up-down acceleration. Further, only a part of the directional accelerations may be detected by the acceleration sensor. The yaw rate sensor detects a yaw rate of the vehicle 12.
The vehicle operation sensor 44 acquires driving operation information, which is information related to a driving operation of a user (driver). The vehicle operation sensors 44 include an accelerator pedal sensor, a brake pedal sensor, a steering angle sensor, and a steering torque sensor, which are not shown. The accelerator pedal sensor detects an operation amount of an accelerator pedal, not shown. The brake pedal sensor detects an operation amount of a brake pedal, not shown. The steering angle sensor detects a steering angle of a steering wheel, not shown. The steering torque sensor detects a torque applied to the steering wheel.
The communication unit 46 performs wireless communication with an external device, not shown. The external device may include, for example, an external server, not shown. The communication unit 46 may be fixed to the vehicle 12 or may be attachable to and detachable from the vehicle 12. Examples of the communication unit 46 that is attachable to and detachable from the vehicle 12 include a mobile phone and a smartphone.
The HMI48 accepts operation input by a user (occupant), and provides the user with various information in a visual, audible, or tactile manner. The HMI48 includes, for example, an automatic driving switch (driving assistance switch) 60, a display (not shown), a contact sensor (not shown), an in-vehicle camera 62 (including the above-described in-vehicle monitoring camera and the like), a speaker (not shown), and an operation member (not shown).
The automatic driving switch 60 is a switch for a user to instruct the start and stop of automatic driving. The automatic drive switch 60 includes a start switch not shown and a stop switch not shown. The start switch outputs a start signal to the vehicle control device 10 in accordance with an operation by a user. The stop switch outputs a stop signal to the vehicle control device 10 according to an operation by a user.
The display includes, for example, a liquid crystal panel, an organic EL panel, or the like. Here, a case where the display is a touch panel is exemplified, but the display is not limited thereto.
The contact sensor is a sensor for detecting whether the user (driver) is touching the steering wheel. The signal output from the contact sensor is supplied to the vehicle control device 10. The vehicle control device 10 can determine whether the user is touching the steering wheel based on the signal supplied from the contact sensor.
The in-vehicle camera 62 photographs the interior of the vehicle 12, i.e., the cabin interior. The in-vehicle camera 62 may be provided on, for example, an instrument panel not shown, or may be provided on a ceiling not shown. In addition, the in-vehicle camera 62 may be provided to photograph only the driver, or may be provided to photograph each passenger. The in-vehicle camera 62 outputs image information, which is information obtained by imaging the inside of the vehicle compartment, to the vehicle control device 10.
The speaker is a means for providing various information to the user in a voice manner. The vehicle control device 10 outputs various notifications, alarms, and the like using a speaker.
The driving device (driving force control system) 50 includes a drive ECU (not shown) and a drive source (not shown). The drive ECU controls the driving force (torque) of the vehicle 12 by controlling the drive source. Examples of the drive source include an engine and a drive motor. The drive ECU can control the driving force by controlling the drive source in accordance with the user's operation of the accelerator pedal, and can also control the driving force by controlling the drive source in accordance with a command supplied from the vehicle control device 10. The driving force of the drive source is transmitted to wheels, not shown, via a transmission, not shown, and the like.
The brake device (braking force control system) 52 includes a brake ECU (not shown) and a brake mechanism (not shown). The brake mechanism operates the brake member by a brake motor, a hydraulic mechanism, or the like. The brake ECU controls the brake mechanism in accordance with an operation of a brake pedal by a user, whereby the braking force can be controlled. The brake ECU can control the braking force by controlling the brake mechanism in accordance with a command supplied from the vehicle control device 10.
The steering device (steering system) 54 includes an EPS (Electric Power Steering) ECU, which is a steering ECU not shown, and a steering motor not shown. The steering ECU controls the direction of the wheels (steered wheels) by controlling the steering motor in accordance with the user's operation of the steering wheel, and also controls the direction of the wheels by controlling the steering motor in accordance with a command supplied from the vehicle control device 10. Steering may also be performed by changing the torque distribution or the braking force distribution to the left and right wheels.
The navigation device 56 includes a GNSS (Global Navigation Satellite System) sensor, not shown. The navigation device 56 further includes a computing unit and a storage unit, neither shown. The GNSS sensor detects the current position of the vehicle 12. The computing unit reads map information corresponding to the current position detected by the GNSS sensor from the map database stored in the storage unit, and determines a target route from the current position to the destination using the map information. The destination is input by the user via the HMI 48; as mentioned above, the display is a touch panel, and the destination is input by the user operating it. The navigation device 56 outputs the determined target route to the vehicle control device 10. The vehicle control device 10 supplies the target route to the HMI 48, and the HMI 48 displays it on the display.
The positioning unit 58 has a GNSS sensor, not shown. The positioning unit 58 also includes an IMU (Inertial Measurement Unit), not shown, and a map database (map DB), not shown. The positioning unit 58 determines the position of the vehicle 12 using, as appropriate, the information acquired by the GNSS sensor, the information acquired by the IMU, and the map information stored in the map database. The positioning unit 58 can supply vehicle position information, which indicates the position of the vehicle 12, to the vehicle control device 10, and can also supply the map information to the vehicle control device 10.
The vehicle control device 10 includes a calculation unit 70, a storage unit 72, and an input/output unit 74. The calculation unit 70 is responsible for controlling the entire vehicle control device 10 and is constituted by, for example, a CPU (Central Processing Unit). The calculation unit 70 controls each unit based on a program stored in the storage unit 72, and executes the vehicle control based on that program. The input/output unit 74 acquires signals, data, and the like from outside the vehicle control device 10, inputs them to the calculation unit 70 and the storage unit 72, and outputs signals, data, and the like from the calculation unit 70 and the storage unit 72 to the outside.
The computing unit 70 includes an occupant detection unit 80, a specific travel state detection unit 82, an occupant specific motion detection unit 84, a priority setting unit 86, a vehicle control change unit 88, and a detection target exclusion determination unit 90.
The occupant detection unit 80 detects which seat 26 the occupant sits on by the 7 th sensor 30g, the 8 th sensor 30h, the in-vehicle camera 62, and the like.
The specific travel state detection unit 82 detects whether the vehicle 12 is in a preset specific travel state, that is, whether a specific scene is encountered. A specific scene is a scene in which the motion of the vehicle 12 becomes transient; examples include an acceleration/deceleration state, a turning state, a lane-change state, the timing of gripping or releasing the steering wheel, a pressure change, the timing of gripping the steering wheel when entering a curve, the positional relationship with surrounding vehicles, a speed difference, and the like. These states are detected based on the map information, the current position of the vehicle 12, information such as the positions of other vehicles in the surrounding area obtained by the communication unit 46, drive signals from the driving device 50, the brake device 52, and the steering device 54, and the like. As shown in Table 1 of Fig. 4A, examples of the specific scene include acceleration and deceleration following a preceding vehicle, curves, lane changes, diversions and merges, and acceleration and deceleration immediately after the start of automated driving.
The occupant-specific motion detection unit 84 detects an occupant's gripping of or contact with a fixed structure. From this detection, it can be determined whether the occupant feels uneasy.
In this case, gripping of or contact with a fixed structure may be detected for all occupants. Alternatively, when a plurality of occupants are detected, the priority setting unit 86 may select the occupant at a riding position with a high preset priority, or at a riding position given high priority according to the specific scene, and gripping of or contact with a fixed structure may be detected for that occupant.
Here, the riding position with a high preset priority is the seat behind the driver's riding position. Since the seat behind the driver is the safest riding position, when the occupant seated there shows unease by gripping or contacting a fixed structure, it can be determined that all occupants feel uneasy.
A riding position given high priority according to the specific scene means, for example, that for a lane change to the right, as shown in Table 2 of Fig. 4B, the priority is set to the right rear seat, left rear seat, right front seat, and left front seat, in descending order.
Similarly, when accelerating or decelerating with respect to a preceding vehicle, the priority is set to the left front seat, right front seat, left rear seat, and right rear seat, in descending order, as shown in Table 3 of Fig. 4C.
In this way, since the occupant priority can be determined for each specific scene, an occupant who feels uneasy can be detected in a short time.
The above-described gripping of or contact with a fixed structure may be detected using a camera image, a pressure sensor, a perspiration sensor, an electromagnetic wave sensor, or the like. The occupant's state of health may be estimated by, for example, measuring the occupant's pulse rate from a camera image of the occupant's face and comparing it with a standard pulse rate. Of course, the communication unit 46 may acquire medical information (medical history, etc.) of each occupant to improve the accuracy of estimating each occupant's state of health.
In a camera image, the occupant is recognized as feeling uneasy when, for example, it is detected that the occupant is looking down and trembling, staring fixedly at one point, glancing around hurriedly, watching the speedometer, or the like. With a pressure sensor, a load sensor, or the like, the occupant is recognized as feeling uneasy when a pressure or load equal to or greater than a certain level is applied to the sensor. Alternatively, correlation coefficients between the sensor waveforms from these sensors and reference waveforms measured in advance (a plurality of waveforms representative of a person feeling uneasy) are obtained, and it is determined that the closer a correlation coefficient is to 1, the more uneasy the occupant feels. Various other methods are also possible.
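The correlation-based criterion can be sketched directly: compute the Pearson correlation between the live sensor waveform and each pre-measured reference waveform, and flag unease when the best match approaches 1. The threshold value of 0.8 is an illustrative assumption; the patent only states "closer to 1".

```python
import math

def pearson(xs: list, ys: list) -> float:
    """Pearson correlation coefficient of two equal-length waveforms."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def seems_uneasy(sensor_wave: list, reference_waves: list, threshold: float = 0.8) -> bool:
    """True when the sensor waveform closely matches any 'uneasy' reference."""
    return max(pearson(sensor_wave, ref) for ref in reference_waves) >= threshold
```

A perfectly proportional grip-pressure trace would correlate at 1.0 with its reference and be flagged; an inverted trace correlates at -1.0 and would not.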
A perspiration sensor may be combined to improve the accuracy of detecting whether unease is felt. Further, the occupant's emotion can be recognized from data detected by an electromagnetic wave sensor (see http://news.jp/p/6219, "a device for knowing the real emotion of a person from a distance").
On the other hand, the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of predetermined scenes.
As shown in Table 1 of Fig. 4A, the vehicle-control change items include, for example, increasing the inter-vehicle distance when accelerating and decelerating to follow a preceding vehicle, and reducing the lateral movement speed and the vehicle speed during steering in curves, lane changes, and diversions/merges. For acceleration and deceleration immediately after the start of automated driving, an example is suppression control of the acceleration gain.
When accelerating and decelerating to follow a preceding vehicle, the duration of the vehicle-control change may be, for example, the time required to follow the preceding vehicle; in curves, lane changes, and diversions/merges, it may be, for example, the time required for the lateral movement of the vehicle. For acceleration and deceleration immediately after the start of automated driving, the duration may extend, for example, to the completion (end) of automated driving (e.g., 30 minutes or 40 km). Once the change duration has elapsed, the vehicle control device 10 restores the changed vehicle control to its original state.
Of course, the vehicle-control change may instead be maintained for a predetermined distance. For example, when accelerating and decelerating to follow a preceding vehicle, the change is maintained until the vehicle has followed the preceding vehicle for a predetermined distance; in curves, lane changes, and diversions/merges, the change is maintained until the lateral movement of the vehicle reaches a predetermined distance. The vehicle control device 10 may restore the changed vehicle control to its original state once the vehicle has traveled the predetermined distance.
Further, the vehicle control changing unit 88 increases the degree of the vehicle-control change if the state of health of an occupant detected by the occupant-specific motion detection unit 84 deteriorates. In this case, the degree of change may be increased according to the difference between the pulse rate of the occupant whose health has deteriorated and the standard pulse rate. Indicators for detecting the state of health from a camera image include the pulse rate, the respiration rate, heart-rate variability, and the like.
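Scaling the degree of change by the pulse-rate difference might look as follows. The patent describes this only qualitatively; the standard pulse of 70 bpm, gain constant, and cap are all assumed values for illustration.

```python
def change_gain(pulse_bpm: float, standard_bpm: float = 70.0,
                base: float = 1.0, k: float = 0.02, cap: float = 2.0) -> float:
    """Scale the vehicle-control change: the further the occupant's pulse
    exceeds the standard pulse, the larger the change, up to a cap."""
    excess = max(0.0, pulse_bpm - standard_bpm)
    return min(base + k * excess, cap)
```

The gain could multiply, for example, the inter-vehicle-distance increase or the lateral-speed reduction chosen from Table 1.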
The detection target exclusion specifying unit 90 specifies occupants to be excluded from the detection targets. Specifically, an occupant who continues to grip or contact a fixed structure for a predetermined time or longer, or over a predetermined distance or longer, while the vehicle 12 is not in a specific traveling state is excluded from the detection targets. This shortens the time required to detect an occupant who feels uneasy, and improves the commercial value of the autonomous vehicle.
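The exclusion rule filters out occupants who hold a fixture habitually rather than out of unease. A minimal sketch, assuming the caller accumulates per-occupant continuous grip time measured outside specific travel states; the 60-second threshold is an illustrative assumption for the "predetermined time".

```python
def excluded_occupants(continuous_grip_s: dict, min_hold_s: float = 60.0) -> set:
    """Occupants whose continuous grip/contact outside specific travel
    states reached the predetermined time are excluded from detection."""
    return {occ for occ, held in continuous_grip_s.items() if held >= min_hold_s}
```

An analogous check over accumulated distance would cover the "predetermined distance or longer" variant.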
Next, some processing operations of the vehicle control device 10 according to the present embodiment will be described with reference to fig. 5 to 8.
[1st processing operation]
The 1st processing operation is the basic processing operation of the vehicle control device 10. First, in step S1 of Fig. 5, the occupant detection unit 80 detects which seat 26 each occupant sits on, by the pressure sensor of the seat 26, the in-vehicle camera, or the like.
In step S2, the specific traveling state detection unit 82 detects whether the host vehicle 12 is in a specific traveling state set in advance, that is, whether a specific scene is encountered.
If the host vehicle 12 encounters a specific scene, the process proceeds to step S3, and the occupant-specific motion detection unit 84 detects a specific motion of the occupant, that is, a grip on or contact with the fixed structure.
When the above-described specific operation of the occupant is detected, the process proceeds to step S4, and the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of predetermined scenes as exemplified above (see table 1 in fig. 4A).
When the process at step S4 is completed, when the occupant's specific motion is not detected at step S3, or when it is determined at step S2 that the host vehicle 12 has not encountered a specific scene, the process from step S1 onward is repeated after a predetermined period has elapsed.
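One pass of steps S1 to S4 above can be sketched as a single function that the outer loop calls each cycle. The component interfaces (method names, return types) are illustrative assumptions mirroring the units of Fig. 3.

```python
def processing_operation_1(occupant_det, scene_det, motion_det, changer) -> bool:
    """One cycle of the 1st processing operation (Fig. 5).
    Returns True when the vehicle control was changed this cycle."""
    seats = occupant_det.detect_occupied_seats()        # S1: who sits where
    if not scene_det.in_specific_scene():               # S2: specific scene?
        return False
    if not motion_det.grip_or_contact_detected(seats):  # S3: grip/contact?
        return False
    changer.change_vehicle_control()                    # S4: change control
    return True
```

The 2nd and 3rd processing operations described below differ only by inserting the priority-setting step before S1 and the exclusion check on the no-scene branch.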
[2nd processing operation]
The 2nd processing operation adds the priority setting process to the 1st processing operation. First, in step S101 of Fig. 6, as described above, the priority setting unit 86 selects the occupant at the riding position with the preset high priority, or at the riding position given high priority according to the specific scene.
In step S102, the occupant detection unit 80 detects whether or not the occupant is seated at the riding position with the higher priority set by the priority setting unit 86, by a pressure sensor of the seat 26, an in-vehicle camera, or the like.
In step S103, the specific travel state detection unit 82 detects whether the host vehicle 12 is in a specific travel state set in advance, that is, whether a specific scene is encountered.
If the host vehicle 12 encounters a specific scene, the process proceeds to step S104, and the occupant-specific-motion detection unit 84 detects a specific motion of the occupant with respect to the riding position with a high priority, that is, a grip on the fixed structure or a contact with the fixed structure.
When the above-described specific operation of the occupant is detected, the process proceeds to step S105, and the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of predetermined scenes as exemplified above (see table 1 in fig. 4A).
When the process at step S105 is completed, when the occupant's specific motion is not detected at step S104, or when it is determined at step S103 that the host vehicle 12 has not encountered the specific scene, the process from step S101 onward is repeated after a predetermined period of time has elapsed.
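The priority selection of steps S101/S102 can be sketched as below. The priority tables are illustrative assumptions (the embodiment does not specify concrete values), and the scene names are hypothetical.

```python
# Hypothetical sketch of priority setting (fig. 6, steps S101/S102).
# Higher number = higher priority; all values are assumptions.

DEFAULT_PRIORITY = {"driver_seat": 3, "passenger_seat": 2, "rear_seat": 1}

# A specific scene may override the default ordering, e.g. a leftward merge
# might raise the priority of the passenger-side seat (assumed example).
SCENE_PRIORITY = {
    "merge_left": {"passenger_seat": 3, "driver_seat": 2, "rear_seat": 1},
}

def high_priority_position(scene, occupied_positions):
    """Pick the occupied riding position with the highest priority."""
    table = SCENE_PRIORITY.get(scene, DEFAULT_PRIORITY)
    occupied = [p for p in occupied_positions if p in table]
    if not occupied:
        return None
    return max(occupied, key=table.__getitem__)
```

The 2nd processing operation would then apply steps S103 to S105 only to the occupant returned by this selection.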
[ 3rd processing operation ]
The 3rd processing operation adds the detection-target exclusion process to the 1st processing operation. First, in step S201 of fig. 7, the occupant detection unit 80 detects which seat 26 the occupant is seated in, using a pressure sensor in the seat 26, an in-vehicle camera, or the like.
In step S202, the specific travel state detection unit 82 detects whether the host vehicle 12 is in a specific travel state set in advance, that is, whether a specific scene is encountered.
When the host vehicle 12 has not encountered the specific scene, the process proceeds to step S203, and the detection-target exclusion specifying unit 90 determines whether any of the occupants should be excluded from the detection targets. If so, the process proceeds to step S204, where that occupant is specified as excluded from the detection targets.
On the other hand, in step S202, if the host vehicle 12 encounters a specific scene, the process proceeds to step S205, and the occupant-specific motion detection unit 84 detects a specific motion, that is, a grip on or contact with the fixed structure, for the occupants other than the occupant excluded from the detection target.
When the above-described specific operation of the occupant is detected, the process proceeds to step S206, and the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of predetermined scenes as exemplified above (see table 1 in fig. 4A).
When the process of step S206 is completed, or when the specific action by the occupant is not detected in step S205, or when the occupant excluded from the detection target is specified in step S204, or when it is determined that there is no occupant excluded from the detection target in step S203, the process of step S201 and the subsequent steps is repeated after a predetermined period of time has elapsed.
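The exclusion logic of steps S203 to S205 can be sketched as follows. This is a minimal illustration under stated assumptions: the threshold value and the log format are invented for the example, and the embodiment equally allows a predetermined distance instead of a time.

```python
# Hypothetical sketch of detection-target exclusion (fig. 7, S203-S205):
# an occupant who keeps holding the fixed structure outside any specific
# scene is assumed to have a gripping habit and is excluded.

HABIT_TIME_THRESHOLD_S = 60.0  # assumed "predetermined time"

def occupants_to_exclude(hold_log):
    """hold_log maps occupant id -> seconds of continuous holding observed
    while the vehicle was NOT in a specific traveling state (S203/S204)."""
    return {occ for occ, held_s in hold_log.items()
            if held_s >= HABIT_TIME_THRESHOLD_S}

def detect_specific_motion(gripping_now, excluded):
    """S205: count a grip as a specific motion only for non-excluded occupants."""
    return [occ for occ in gripping_now if occ not in excluded]
```

Step S206 would then change the vehicle control only when `detect_specific_motion` returns a non-empty list.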
[ 4th processing operation ]
The 4th processing operation adds the priority setting process to the 3rd processing operation. First, in step S301 of fig. 8, as described above, the priority setting unit 86 gives a high priority to the occupant at the preset high-priority riding position, or at the riding position set to a high priority according to the specific scene.
In step S302, the occupant detection unit 80 detects whether or not the occupant is seated at the riding position with the higher priority set by the priority setting unit 86, by a pressure sensor of the seat 26, an in-vehicle camera, or the like.
In step S303, the specific traveling state detection unit 82 detects whether the host vehicle 12 is in a specific traveling state set in advance, that is, whether a specific scene is encountered.
When the host vehicle 12 does not encounter the specific scene, the process proceeds to step S304, and the detection target exclusion specifying unit 90 determines whether or not there is an occupant excluded from the detection targets among the occupants. If so, the process proceeds to step S305, where the occupant excluded from the detection target is specified.
On the other hand, if the host vehicle 12 encounters a specific scene in step S303, the process proceeds to step S306, and the occupant-specific motion detection unit 84 detects a specific motion, that is, a grip on or contact with the fixed structure, for the occupants other than the occupant excluded from the detection target.
When the above-described specific operation of the occupant is detected, the process proceeds to step S307, and the vehicle control changing unit 88 changes the vehicle control in accordance with a plurality of predetermined scenes as exemplified above (see table 1 in fig. 4A).
When the process of step S307 is completed, when the occupant's specific motion is not detected in step S306, when the occupant excluded from the detection target is specified in step S305, or when it is determined in step S304 that there is no occupant to be excluded from the detection target, the process from step S301 onward is repeated after a predetermined period of time has elapsed.
[ summary of the embodiments ]
The invention that can be grasped from the above-described embodiments is described below.
The present embodiment is a vehicle control device 10 that drives a vehicle 12 automatically. It includes a detection device (e.g., the 1st to 8th sensors 30a to 30h) capable of detecting a grasping operation by an occupant on a fixed structure in the vehicle 12, the fixed structure being fixed above the seat surface of the occupant, and the device changes the control of the vehicle 12 when the grasping operation is performed while the vehicle 12 is in a specific traveling state.
Accordingly, the vehicle control is changed, for example, according to the state of the grasping operation (gripping or the like) in which the occupant holds the fixed structure in a specific scene. For example, when the occupant unintentionally grasps the fixed structure while the vehicle 12 is traveling by autonomous driving, the occupant is considered to feel uneasy about the driving state, and by changing the vehicle control, the occupant's unease can be reduced as much as possible. This applies to all occupants, including the driver.
By changing (reflecting in) the vehicle control only in a specific scene, erroneous learning from a person who habitually holds the structure can be prevented, and a person who grips it out of genuine fear can be detected with high accuracy and reflected in the control. Further, since the fixed structure is fixed to the vehicle at a position above the seat surface of the occupant, detecting the grasping of a structure that can steady the upper body of the driver or a passenger, which is prone to shaking, makes it possible to better grasp their unease about the vehicle operation.
In the present embodiment, the detection device can detect the grasping operation of the occupant with respect to the plurality of fixed structures fixed to the vehicle 12.
Examples of the fixed structure include structures (for example, the armrest 14 and an inside door handle (e.g., the door handle 16)) that can steady the upper body of the driver or a passenger, which is prone to shaking. By detecting the grasping of these structures, the sense of unease of all occupants, including the driver, about the vehicle operation can be better grasped.
In the present embodiment, the specific travel state defines a plurality of travel states, and the change parameter of the vehicle control is changed in accordance with each travel state.
That is, a plurality of travel states are defined as the specific travel states, and the vehicle control is changed in accordance with the occupant's grasping action (gripping or the like) on the fixed structure in whichever of those travel states the vehicle 12 is in.
That is, of the change parameters of the vehicle control set in accordance with the plurality of specific traveling states, the change parameter corresponding to the specific traveling state in which the occupant grips (grips or the like) the fixed structure is changed. Accordingly, appropriate control reflection can be performed, and the occupant can be prevented from being disturbed as much as possible.
In the present embodiment, the specific travel state defines a plurality of travel states, and the change period of the vehicle control is changed in accordance with each travel state.
That is, the period in which the change of the vehicle control is reflected can be changed for each specific traveling state, and thus appropriate control reflection can be performed. This makes it possible to prevent the occupant from being disturbed as much as possible. Moreover, appropriate control reflection can be performed only for a necessary period.
In the present embodiment, the movement of the occupant is detected by an in-vehicle camera, the state of health of the occupant is also detected by the in-vehicle camera, and when the state of health deteriorates, the amount of change in the vehicle control is increased.
Other parameters (camera image, health status, etc.) are taken into account in the situation of the holding action (gripping, etc.) of the occupant holding the fixed structure. For example, when the health condition deteriorates, the change of the control amount is further increased, so that the subject occupant can be prevented from being disturbed as much as possible.
In the present embodiment, when a plurality of occupants are detected, the priority is determined according to the riding position of each occupant, and the vehicle control is changed according to the detection state of the occupant having a high priority.
By giving priority to the occupant according to the riding position, it is possible to make a determination from the occupant with a high priority. Accordingly, the vehicle control can be changed at an early stage, and thus the subject occupant can be prevented from being disturbed as much as possible.
In the present embodiment, when a plurality of occupants are detected, the riding position with a high priority is determined according to the specific traveling state, and the vehicle control is changed according to the detection state of the occupant at the riding position with a high priority.
That is, the priority given according to the riding position of the occupant is changed according to the specific traveling state. Accordingly, it is possible to determine that the vehicle control is changed according to the occupant at the riding position with a high priority for each traveling scene.
In the present embodiment, after the vehicle control has been changed, when at least one of a predetermined travel distance and a predetermined travel time has been reached, the change to the vehicle control is restored.
That is, the changed vehicle control is returned to its original state once the vehicle has traveled the predetermined distance, or the predetermined time has elapsed, after the change. Because the occupant becomes accustomed to the changed autonomous-driving control as that distance is traveled or that time elapses, returning to the pre-change vehicle control is effective for improving driving efficiency.
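The restore condition described above can be sketched as a simple predicate. The threshold values here are illustrative assumptions; the embodiment only specifies that a predetermined distance and a predetermined time exist.

```python
# Hypothetical sketch of the restore condition: the control change is undone
# once either a predetermined distance has been traveled or a predetermined
# time has elapsed since the change. Both thresholds are assumed values.

RESTORE_DISTANCE_M = 5000.0   # assumed "predetermined travel distance"
RESTORE_TIME_S = 600.0        # assumed "predetermined travel time"

def should_restore(distance_since_change_m, time_since_change_s):
    """True when at least one of the two restore conditions is met."""
    return (distance_since_change_m >= RESTORE_DISTANCE_M
            or time_since_change_s >= RESTORE_TIME_S)
```

Meeting either condition alone suffices, matching the "at least one of" wording of the embodiment.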
In the present embodiment, the occupant who has continued the holding state of the fixed structure for a predetermined time or a predetermined distance or more in the traveling state other than the specific traveling state is excluded from the detection target.
A person who holds the armrest 14 or the like even in a traveling state other than the specific traveling state has a gripping habit and is not necessarily gripping out of strong unease. Excluding such an occupant from the detection targets therefore shortens the time required to detect occupants who actually feel uneasy, and improves the commercial value of the autonomous vehicle.
The vehicle 12 has the vehicle control device 10 as described above.
[ modified examples ]
In addition to changing the vehicle control according to the plurality of preset scenes, the vehicle control changing unit 88 may recommend changing the inter-vehicle distance or the traveling pattern, or may correct the vehicle control to the safer side in the initial stage of automated driving (for example, immediately after boarding).
Further, when the duration of automated driving has exceeded a predetermined time, or the travel distance of automated driving has reached a predetermined distance, the vehicle control may be returned to travel in the normal automated driving mode, or a rest may be recommended, instead of changing the vehicle control according to the occupant's specific action in the specific scene as described above.

Claims (10)

1. A vehicle control device (10) that drives a vehicle (12) automatically,
comprising a detection device (84) capable of detecting a grasping action of an occupant on a fixed structure fixed at a position above the seat surface of the occupant,
wherein, when the detection device (84) detects the grasping action of the occupant and the vehicle (12) is in a specific traveling state, the control of the vehicle (12) is changed.
2. The vehicle control apparatus (10) according to claim 1,
the detection device (84) is capable of detecting a gripping action of an occupant on a plurality of the fixed structures fixed to the vehicle (12).
3. The vehicle control apparatus (10) according to claim 1,
a plurality of running states are defined as the specific running state,
changing a change parameter of the vehicle control in accordance with each of the running states.
4. The vehicle control apparatus (10) according to claim 1,
a plurality of running states are defined as the specific running state,
the change period of the vehicle control is changed in accordance with each of the running states.
5. The vehicle control apparatus (10) according to claim 1,
has an in-vehicle camera (62) for detecting the movement of the occupant,
detecting a health state of the occupant using the in-vehicle camera (62),
when the state of health deteriorates, the vehicle control change amount is increased.
6. The vehicle control apparatus (10) according to claim 1,
when a plurality of the occupants are detected, a priority is determined according to the riding position of each of the occupants, and the vehicle control is changed based on the detection condition of the occupant having the high priority.
7. The vehicle control apparatus (10) according to claim 1,
when a plurality of the occupants are detected, a riding position with a high priority is determined according to the specific traveling state, and the vehicle control is changed based on a detection situation of the occupant at the riding position with the high priority.
8. The vehicle control apparatus (10) according to claim 1,
and restoring the changed content of the vehicle control to the original state when at least one of a predetermined travel distance and a predetermined travel time is met after the change of the vehicle control.
9. The vehicle control apparatus (10) according to claim 1,
an occupant who has continued to hold the fixed structure for a predetermined time or a predetermined distance or more in a traveling state other than the specific traveling state is excluded from the detection target.
10. A vehicle (12) having a vehicle control device (10),
comprising a detection device (84) capable of detecting a grasping action of an occupant on a fixed structure fixed at a position above the seat surface of the occupant,
wherein, when the grasping action is performed and the vehicle (12) is in a specific traveling state, the vehicle control device (10) changes the control of the vehicle (12).
CN202010073925.4A 2019-01-24 2020-01-22 Vehicle control device and vehicle Active CN111469849B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-010375 2019-01-24
JP2019010375A JP7157671B2 (en) 2019-01-24 2019-01-24 Vehicle control device and vehicle

Publications (2)

Publication Number Publication Date
CN111469849A true CN111469849A (en) 2020-07-31
CN111469849B CN111469849B (en) 2023-05-09


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114379438A (en) * 2020-10-06 2022-04-22 丰田自动车株式会社 Handrail sanitizing system for a vehicle with a handrail

Citations (5)

Publication number Priority date Publication date Assignee Title
DE10163678A1 (en) * 2001-12-21 2003-07-10 Joachim Lauffer Determination of the grip coefficient of road surfaces, e.g. for road building, by use of any motor vehicle equipped with movement and or acceleration measurement equipment
CN101105097A (en) * 2007-08-23 2008-01-16 上海交通大学 Platform shield door intelligent control method
CN101130349A (en) * 2006-08-23 2008-02-27 丰田自动车株式会社 Headrest control apparatus and method
JP2017021651A (en) * 2015-07-13 2017-01-26 株式会社デンソー Vehicle control device
CN107531236A (en) * 2015-03-11 2018-01-02 埃尔瓦有限公司 Wagon control based on occupant

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP4946374B2 (en) 2006-11-13 2012-06-06 トヨタ自動車株式会社 Self-driving vehicle
JP6607164B2 (en) 2016-10-11 2019-11-20 株式会社デンソー Vehicle safe driving system
JP6834578B2 (en) 2017-02-22 2021-02-24 トヨタ自動車株式会社 Vehicle health examination device
JP6848533B2 (en) 2017-03-02 2021-03-24 株式会社デンソー Vehicle driving control system



Also Published As

Publication number Publication date
JP7157671B2 (en) 2022-10-20
CN111469849B (en) 2023-05-09
JP2020117091A (en) 2020-08-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant