WO2019069516A1 - State detection device, state detection system, and state detection program - Google Patents
State detection device, state detection system, and state detection program
- Publication number
- WO2019069516A1 (PCT/JP2018/025031)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- driver
- detection
- state
- leg
- driving
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- the disclosure of this specification relates to a state detection device, a state detection system, and a state detection program for detecting a driver's state in a vehicle provided with an autonomous driving function.
- Patent Document 1 discloses a vehicle control device capable of performing a driving operation on behalf of a driver.
- the vehicle control device performs, as a handover, a driving change from the state of automatic driving to manual driving by the driver. Further, the vehicle control device detects whether or not the driver can return to manual driving, for example, by a driver monitor camera provided on the top surface of the steering column, a touch sensor provided on the steering wheel, or the like.
- An object of the present disclosure is to provide a state detection device, a state detection system, and a state detection program that can determine that the driver cannot respond to the shift request when the posture of the leg is inappropriate for driving operation.
- a state detection device is a state detection device that detects the state of a driver in a vehicle having an automatic driving function capable of performing a driving operation on behalf of the driver, and includes: an information acquisition unit that acquires detection information in which the position of the driver's leg is detected; and a state determination unit that determines, based on the state of the driver's leg indicated by the detection information, whether the driver can cope with a driving change based on a change request from the automatic driving function.
- a state detection program is a state detection program for detecting the state of a driver in a vehicle having an automatic driving function capable of performing a driving operation on behalf of the driver, and includes commands that cause at least one processing unit to function as: an information acquisition unit that acquires detection information in which the position of the driver's leg is detected; and a state determination unit that determines, based on the state of the driver's leg indicated by the detection information, whether the driver can cope with a driving change based on a change request from the automatic driving function.
- a state detection system includes the above-described state detection device and a position detection sensor that outputs, toward the information acquisition unit, a detection signal in which the position of the leg is detected. The position detection sensor has a transmission part that transmits a detection wave toward a detection space, and a detection part that detects the detection wave reflected by an object located in the detection space.
- the position of the driver's leg is acquired as detection information. Then, whether the driver can cope with the driving change based on the change request is determined based on the state of the driver's leg. According to the above, it is possible to determine that the driver cannot respond to the change request from the automatic driving function when the driver takes a leg posture that is inappropriate for driving operation.
- FIG. 1 is a block diagram showing an overview of a configuration related to automatic driving mounted on a vehicle
- FIG. 2 is a time chart showing the process of handover in time series
- FIG. 3 is a diagram showing the details of the arrangement of the leg sensor and the position detection of the leg as viewed from the right side of the vehicle
- FIG. 4 is a diagram showing the details of the arrangement of the leg sensor and the position detection of the leg as viewed from the ceiling side of the vehicle
- FIG. 5 is a view showing a list of features of the "driving posture", the "appropriate posture", and the "inappropriate posture" determined by the state determination unit
- FIG. 6 is a diagram showing the concept of actuation in an automatic operation mode implemented by the coordination of the HCU and the presentation device
- FIG. 7 is a flowchart showing the details of the state detection process performed in the automatic operation mode
- FIG. 8 is a diagram showing the details of the arrangement of the leg sensor and the position detection of the leg according to the second embodiment
- FIG. 9 is a diagram showing the details of the arrangement of the leg sensor of the second embodiment and the position detection of the leg
- FIG. 10 is a block diagram showing the configuration of the state detection system of the third embodiment
- FIG. 11 is a diagram showing the details of the arrangement of the leg sensor and the position detection of the leg according to the third embodiment
- FIG. 12 is a diagram showing the details of the arrangement of the leg sensor and the position detection of the leg according to the third embodiment
- FIG. 13 is a diagram showing the details of the arrangement of the leg sensor and the position detection of the leg according to the fourth embodiment
- FIG. 14 is a diagram showing the details of the arrangement of the leg sensor of the fifth embodiment and the position detection of the leg
- FIG. 15 is a diagram showing the details of the arrangement of the leg sensor and the position detection of the leg according to the fifth embodiment
- FIG. 16 is a view showing a list of contents sorted based on the state of the hip joint and the knee joint of the right leg with regard to the division of the “appropriate posture” and the “inappropriate posture”.
- HCU 30: Human Machine Interface Control Unit
- the HCU 30 is mounted on the vehicle A together with electronic control units such as a vehicle control ECU (Electronic Control Unit) 80 and an automatic driving ECU 50.
- the HCU 30, the vehicle control ECU 80, and the autonomous driving ECU 50 are directly or indirectly electrically connected to each other, and can communicate with each other.
- the vehicle A has an automatic driving function by the operation of the vehicle control ECU 80 and the automatic driving ECU 50.
- the vehicle control ECU 80 is electrically connected directly or indirectly to the on-vehicle actuator group 91 mounted on the vehicle A.
- the vehicle control ECU 80 integrally controls the operation of the in-vehicle actuator group 91 to control the behavior of the vehicle A.
- the on-vehicle actuator group 91 includes, for example, a throttle actuator of an electronically controlled throttle, an injector, a brake actuator, and a motor generator for driving and regeneration.
- the vehicle control ECU 80 mainly includes a computer having a processing unit, a RAM, a memory device, an input / output interface, and the like.
- the vehicle control ECU 80 constructs an actuator control unit 81 as a functional block related to vehicle control by causing the processing unit to execute a vehicle control program stored in the memory device.
- the actuator control unit 81 generates a control signal output toward the on-vehicle actuator group 91 based on at least one of the operation information based on the driver's driving operation and the autonomous traveling information acquired from the automatic driving ECU 50.
- the autonomous driving ECU 50 is electrically connected directly or indirectly to the GNSS receiver 71, the map database 72, the autonomous sensor group 73, and the like as a configuration for acquiring information necessary for autonomous traveling.
- a GNSS (Global Navigation Satellite System) receiver 71 can receive positioning signals transmitted from a plurality of satellites. The GNSS receiver 71 sequentially outputs the received positioning signal toward the autonomous driving ECU 50 as information for specifying the current position of the vehicle A.
- the map database 72 is a storage device storing a large number of map data.
- the map data includes structural information such as curvature, slope, and section length of each road, and non-temporary traffic control information such as speed limit and one-way traffic.
- the map database 72 provides map data of the vicinity of the current position of the vehicle A and the traveling direction to the autonomous driving ECU 50 based on the request from the autonomous driving ECU 50.
- the autonomous sensor group 73 detects moving objects such as pedestrians and other vehicles, and stationary objects such as falling objects on the road, traffic signals, guard rails, curbs, road signs, road markings, and division lines.
- the autonomous sensor group 73 includes, for example, a camera unit, a lidar, a millimeter wave radar, and the like.
- the autonomous sensor group 73 sequentially outputs, to the autonomous driving ECU 50, object information on the detected moving object and stationary object respectively.
- the autonomous driving ECU 50 mainly includes a computer having a processing unit 61, a RAM 62, a memory device 63, and an input / output interface.
- the processing unit 61 includes at least one of a central processing unit (CPU), a graphics processing unit (GPU), and a field-programmable gate array (FPGA).
- the processing unit 61 may be provided with a dedicated processor specialized for learning and inference of artificial intelligence (AI).
- the autonomous driving ECU 50 performs the acceleration / deceleration control and the steering control of the vehicle A in cooperation with the vehicle control ECU 80, thereby exhibiting an autonomous driving function capable of performing the driving operation of the vehicle A on behalf of the driver.
- the autonomous driving ECU 50 switches between a plurality of control modes in which the subject in control of the driving operation differs.
- the plurality of control modes include an automatic driving mode in which the vehicle A is autonomously traveled by the automatic driving function, and a manual driving mode in which the vehicle A travels by driving operation of the driver (see FIG. 2).
- the control mode includes a change request mode that is executed at the time of transition from the automatic driving mode to the manual driving mode, and a minimum risk maneuver (MRM) mode for automatically evacuating the vehicle A.
- the autonomous driving ECU 50 can execute the autonomous driving program stored in the memory device 63 by the processing unit 61.
- the autonomous driving program includes a program for causing the vehicle A to travel autonomously, a program for controlling driving shift, and the like.
- a self-vehicle position specifying unit 51, an environment recognition unit 52, a plan generation unit 53, an autonomous traveling control unit 54, and the like are constructed in the automatic driving ECU 50.
- the vehicle position specifying unit 51 specifies the current position of the vehicle A based on the positioning signal received by the GNSS receiver 71.
- the vehicle position specifying unit 51 can also identify the detailed current position of the vehicle A by collating the image of the front area acquired from the camera unit of the autonomous sensor group 73 with the detailed map data acquired from the map database 72.
- the environment recognition unit 52 combines the position information specified by the vehicle position specifying unit 51, the map data acquired from the map database 72, the object information acquired from the autonomous sensor group 73, and the like, to recognize the traveling environment of the vehicle A.
- the environment recognition unit 52 recognizes the shape and movement state of an object around the vehicle A based on the integration result of each object information, particularly within the detection range of each autonomous sensor.
- the environment recognition unit 52 generates a virtual space in which the actual traveling environment is reproduced in three dimensions by combining the information of the recognized surrounding objects with the position information and the map data.
- based on the traveling environment recognized by the environment recognition unit 52, the plan generation unit 53 generates a traveling plan for causing the vehicle A to travel autonomously by the automatic driving function.
- as the travel plan, a long-to-medium-term travel plan and a short-term travel plan are generated.
- in the long-to-medium-term travel plan, a route for directing the vehicle A to a destination set by the driver is defined.
- in the short-term travel plan, a planned travel path for realizing travel according to the long-to-medium-term travel plan is defined. Specifically, execution of lane tracking, steering for lane changes, acceleration and deceleration for speed adjustment, sudden braking for collision avoidance, and the like is determined based on the short-term travel plan.
- driving changes include a handover, in which control is deliberately handed over from the system to the driver, and an override, in which the control right is acquired at the driver's own discretion in a highly urgent situation.
- the plan generation unit 53 predicts the occurrence of a situation in which the autonomous traveling can not be continued in the automatic driving mode based on map data in the traveling direction, information acquired from the autonomous sensor group 73, and the like.
- the plan generation unit 53 formulates a handover schedule (hereinafter, "authority transfer plan") when it becomes difficult to formulate a travel plan and the automatic driving cannot be continued.
- in the authority transfer plan, the off timing of the automatic driving (time t3 in FIG. 2) is defined.
- in addition, the execution timing (time t1 in FIG. 2) of a shift request (hereinafter, "TOR: Take Over Request"), which requests the driving change from the automatic driving function to the driver, is set by back-calculation from the off timing.
- the execution timing of TOR is set to a time earlier than the off timing of the automatic driving by a predetermined allowance time Ta (for example, 4 seconds; see FIG. 2), and is notified to the HCU 30.
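As a concrete illustration of the back-calculation described above, the following Python sketch derives the TOR execution timing t1 from the off timing t3 and the allowance time Ta. The function name and the numeric default are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch (not part of the disclosure): back-calculating the
# TOR execution timing t1 from the automatic-driving off timing t3.

ALLOWANCE_TIME_TA_S = 4.0  # allowance time Ta, e.g. 4 seconds (see FIG. 2)

def tor_execution_time(off_timing_t3_s: float,
                       allowance_ta_s: float = ALLOWANCE_TIME_TA_S) -> float:
    """Return the time t1 at which the shift request (TOR) is issued,
    i.e. the off timing t3 minus the allowance time Ta."""
    return off_timing_t3_s - allowance_ta_s
```

For example, with an off timing of 100 s and the 4-second allowance, the TOR would be issued at 96 s.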
- the plan generation unit 53 controls the handover process in the shift request mode in cooperation with the HCU 30 based on the formulated authority transfer plan.
- in the automatic driving mode, the autonomous traveling control unit 54 generates autonomous traveling information that instructs acceleration, deceleration, and steering based on the planned traveling route formulated by the plan generation unit 53.
- the autonomous traveling control unit 54 sequentially outputs the generated autonomous traveling information to the vehicle control ECU 80.
- the autonomous traveling control unit 54 cooperates with the actuator control unit 81 to cause the vehicle A to autonomously travel along the planned traveling route.
- the HCU 30 is an electronic control unit provided with a function to control information presentation to the driver in an integrated manner and a function to detect the driver's state.
- the HCU 30 is electrically connected directly or indirectly to the presentation device 20, the DSM (Driver Status Monitor) 11, the leg sensor 12, and the like.
- the presentation device 20 presents various information related to the vehicle A to the occupants of the vehicle A including the driver based on the presentation control signal output by the HCU 30.
- the presentation device 20 includes, for example, a display device that presents information by display, a speaker 21 that presents information by a notification sound and a message voice, and a tactile stimulation device 22 that presents information by vibration.
- the tactile stimulation device 22 is provided, for example, on the seat surface 111 (see FIG. 3) or the backrest of the driver's seat 110.
- the DSM 11 is configured of a near infrared light source and a near infrared camera, a control unit that controls these, and the like.
- the DSM 11 is disposed, for example, on the top surface of the steering column in a posture in which the near infrared camera is directed to the driver's seat side.
- the DSM 11 captures the head of the driver who has been irradiated with near-infrared light by the near-infrared light source with a near-infrared camera, and monitors the driver's condition by analyzing the captured face image.
- the DSM 11 sequentially outputs the detection information of the driver obtained by the analysis of the face image to the HCU 30.
- the leg sensor 12 shown in FIGS. 1, 3 and 4 is, for example, an optical sensor, and is an active type object detection sensor that detects an object by light irradiation.
- the leg sensor 12 is installed on the driver's side of the center console 117.
- the leg sensor 12 is installed at a height position lower than the seating surface 111 and closer to the seating surface 111 of the driver's seat 110 than the floor surface 113 in the vehicle A.
- the leg sensor 12 can individually detect both legs of the driver in the detection space 120.
- the detection space 120 is predefined below the steering wheel 115 and in front of the driver's seat 110.
- the shape and material of the structures, such as doors, that bound the detection space 120 in the vehicle compartment differ for each vehicle type. Therefore, it is preferable to define the detection space 120 for each vehicle type in consideration of the shape and material of the surrounding structures.
- the leg sensor 12 has a light projecting unit 13 and a light receiving unit 14.
- the light projecting unit 13 is a light source that emits near infrared light.
- the light projecting unit 13 transmits pulsed near infrared rays as detection waves toward the detection space 120 based on the control of the HCU 30.
- the near infrared rays are spot-irradiated from the light projecting unit 13 along the lateral direction of the vehicle A, for example.
- the light projecting unit 13 repeats irradiation of near infrared rays at predetermined time intervals.
- the directivity (irradiation angle) of the near infrared light is set to a width that exceeds the thickness of a leg but does not cause the pedals and the like to be erroneously detected.
- the light receiving unit 14 is a light receiving element that converts an optical signal into an electrical signal.
- the light receiving unit 14 detects near infrared light reflected by an object located in the detection space 120.
- the light receiving unit 14 receives, in order, the near infrared light reflected by the left leg Ll near the center console 117 and then the near infrared light reflected by the right leg Lr farther from the center console 117.
- the light receiving unit 14 converts the received light into an electrical signal, and sequentially transmits the signal to the HCU 30 as a detection signal in which the position of the leg is detected.
- the HCU 30 mainly includes a computer having a processing unit 41, a RAM 42, a memory device 43, and an input / output interface.
- the processing unit 41 is configured to include at least one of a central processing unit (CPU), a graphics processing unit (GPU), and a field-programmable gate array (FPGA).
- the processing unit 41 may be provided with a dedicated processor specialized for learning and inference of artificial intelligence (AI).
- the HCU 30 constitutes a state detection system 10 together with the DSM 11 and the leg sensor 12 and the like.
- the state detection system 10 controls the information presentation by the presentation device 20 in the automatic driving mode, and maintains the driver's posture in an appropriate state.
- the HCU 30 causes the processing unit 41 to execute the state detection program stored in the memory device 43, and constructs a plurality of functional blocks related to the state detection and attitude maintenance of the driver. Specifically, in the HCU 30, an information acquisition unit 31, a state determination unit 32, a notification control unit 33, and the like are constructed.
- the information acquisition unit 31 acquires detection information of the driver detected in the DSM 11.
- the information acquisition unit 31 controls the operation of the leg sensor 12 and acquires detection information obtained by detecting the positions of the left and right legs of the driver. Specifically, based on the detection signal of the leg sensor 12, the information acquisition unit 31 measures the difference from the time when the near infrared light is irradiated to the time when the reflected near infrared light is received. Then, the information acquisition unit 31 calculates the positions of the left and right legs from the difference in time, and acquires it as detection information indicating the positions of both legs of the driver.
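The time-of-flight calculation performed by the information acquisition unit 31 can be sketched as follows. This is a minimal illustration of the principle; the function names, and the mapping of the first and second returns to the left and right legs, follow the description above but are assumptions made for this sketch.

```python
# Illustrative sketch: estimating leg positions from the time difference
# between emission and reception of the near infrared pulse, as the
# information acquisition unit 31 is described as doing.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def leg_distance_m(emit_time_s: float, receive_time_s: float) -> float:
    """Distance from the leg sensor 12 to the reflecting leg: half the
    round-trip distance travelled by the pulse."""
    round_trip_s = receive_time_s - emit_time_s
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

def both_leg_distances_m(emit_time_s: float,
                         receive_times_s: list[float]) -> tuple[float, float]:
    """Map the two returns to (left, right): the left leg Ll, nearer the
    center console 117, reflects the pulse before the right leg Lr."""
    first, second = sorted(receive_times_s)[:2]
    return (leg_distance_m(emit_time_s, first),
            leg_distance_m(emit_time_s, second))
```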
- the information acquisition unit 31 also acquires status information indicating the control mode and the like of the automatic driving ECU 50. For example, when the plan generation unit 53 formulates an authority transfer plan based on the predicted occurrence of the system limit of the automatic driving, the information acquisition unit 31 acquires a command instructing the HCU 30 to execute TOR.
- the state determination unit 32 determines, based on the state of the driver's upper body and legs indicated by the detection information, whether the driver can cope with the driving change based on the TOR within the allowance time Ta (see FIG. 2). If there is no problem with the posture of the upper body, the state determination unit 32 classifies the driver's posture into one of "driving posture", "appropriate posture", and "inappropriate posture" based on the state of the driver's legs (see FIG. 5).
- the "driving posture" is the posture of the driver in the state of manual driving. If the portion of the right leg Lr beyond the ankle (hereinafter, "right toe") touches the accelerator pedal or the brake pedal, the state determination unit 32 determines that the driver is in the "driving posture".
- the "appropriate attitude” is an attitude that can be returned to the driving attitude within the allowance time Ta (see FIG. 2).
- the “inappropriate attitude” is an attitude that can not return to the driving attitude within the allowance time Ta.
- when the driver's posture is classified as the "inappropriate posture", the state determination unit 32 determines that the driver cannot cope with the driving change within the allowance time Ta.
- the state determination unit 32 distinguishes the "appropriate posture" from the "inappropriate posture" mainly based on the estimated position of the driver's right toe.
- if the right toe is below the seating surface 111, the state determination unit 32 determines that the driver is in the "appropriate posture". Even if the portion of the driver's left leg Ll beyond the ankle (hereinafter, "left toe") is not detected below the seating surface 111, the state determination unit 32 determines the "appropriate posture", in which the driver can cope with the driving change, as long as the right toe is below the seating surface 111.
- on the other hand, when the right toe is not detected below the seating surface 111, the state determination unit 32 determines that the driver is in the "inappropriate posture".
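The classification described above can be sketched as a small Python function. The field names and the reduction of the leg state to two booleans are assumptions made for this sketch; the actual determination uses the estimated toe positions.

```python
# Illustrative sketch of the posture classification by the state
# determination unit 32, reduced to the right-toe conditions described
# above. Field names are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class LegState:
    right_toe_on_pedal: bool     # right toe touching the accelerator or brake pedal
    right_toe_below_seat: bool   # right toe detected below the seating surface 111

def classify_posture(leg: LegState) -> str:
    """Return one of the three posture classes described in the text."""
    if leg.right_toe_on_pedal:
        return "driving posture"
    if leg.right_toe_below_seat:
        return "appropriate posture"   # can return to driving within Ta
    return "inappropriate posture"     # cannot return to driving within Ta
```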
- the notification control unit 33 generates a presentation control signal to be output to the presentation device 20 based on the information acquired by the information acquisition unit 31.
- the notification control unit 33 controls the information presentation by display, sound, vibration, and the like by the output of the presentation control signal directed to the presentation device 20.
- the notification control unit 33 instructs the driver to take over the driving operation at time t1 (see FIG. 2) set in the authority transfer plan.
- the requested information presentation is performed using the presentation device 20.
- the driver who has noticed the notification of TOR starts driving operation while correcting the posture after confirming the surrounding situation.
- the state determination unit 32 determines that the driver is in the "driving posture" based on the detection information of the DSM 11 and the leg sensor 12.
- the automatic driving ECU 50 that has acquired the determination result of the state determining unit 32 determines that the driving operation is possible at time t2 (see FIG. 2), and switches the control mode from the shift request mode to the manual driving mode.
- the automatic driving ECU 50 switches the control mode from the shift request mode to the MRM mode at time t3 (see FIG. 2). As a result, the vehicle A stops in a searched evacuation area by evacuation traveling.
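The mode transitions around the shift request described above can be sketched as follows; the names and the simple time comparison are assumptions introduced for this illustration, and the actual decision is made by the automatic driving ECU 50.

```python
# Illustrative sketch of the control-mode decision in the shift request
# mode: manual driving once the "driving posture" is confirmed (time t2),
# MRM once the off timing t3 passes without it.

def next_control_mode(now_s: float, off_timing_t3_s: float,
                      in_driving_posture: bool) -> str:
    if in_driving_posture:
        return "manual driving mode"      # driving operation judged possible
    if now_s >= off_timing_t3_s:
        return "MRM mode"                 # evacuate the vehicle automatically
    return "shift request mode"           # keep requesting the driving change
```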
- in order to prevent the transition to the MRM mode, when the state determination unit 32 determines the "inappropriate posture" in the automatic driving mode, the notification control unit 33 continuously performs, using the presentation device 20, information presentation for returning the driver's state to the "appropriate posture" (see FIG. 6).
- for example, when the driver has taken an improper sitting position, the notification control unit 33 guides the driver to the "appropriate posture" state by information presentation using the speaker 21 and the tactile stimulation device 22.
- in addition, the notification control unit 33 warns, by information presentation using the speaker 21 and the tactile stimulation device 22, that the driver is in the "inappropriate posture", and prompts a return to the "appropriate posture". In this way, the driver's state is maintained prior to TOR so that the driving change can be handled.
- the state detection process shown in FIG. 7 is started by the HCU 30 based on switching from the manual operation mode to the automatic operation mode.
- S101: detection information from the DSM 11 and the leg sensor 12 is acquired, and the process proceeds to S102.
- S102: it is determined whether or not the driver's state is the "appropriate posture" based on the detection information acquired in the immediately preceding S101. If it is determined in S102 that the driver's state is the "appropriate posture", the process proceeds to S106. On the other hand, if it is determined in S102 that the driver's state is the "inappropriate posture", the process proceeds to S103.
- S103: based on the state of the driver detected in S102, guidance or a warning prompting the return to the "appropriate posture" is performed, and the process proceeds to S104.
- S104 and S105: as in S101 and S102, it is determined based on the detection information whether the driver has returned to the "appropriate posture". If it is determined in S105 that the "appropriate posture" has been restored, the process returns to S101 and the state detection process is continued. On the other hand, while the driver's posture has not returned to the "appropriate posture", the information presentation prompting correction of the sitting posture is continued by repeating the processing of S103 to S105.
- In S106, which is reached when it is determined in S102 that the driver is in the "appropriate posture", it is determined whether a TOR notification instruction based on the system's limit prediction has been acquired from the autonomous driving ECU 50. If it is determined in S106 that the start of the TOR has been instructed, the process proceeds to the processing for implementing the TOR. On the other hand, if the automatic driving mode is to be continued, the process returns to S101 and the state detection process is continued.
- When the start of the TOR is instructed, the HCU 30 notifies the driver of the TOR. If the driver cannot return to the driving posture, after the TOR notification the HCU 30 provides the autonomous driving ECU 50 with a determination result indicating that a return to the "driving posture" is not possible. Based on this determination result, the autonomous driving ECU 50 starts the transition to the MRM mode before the margin time Ta elapses (see FIG. 2).
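The S101 to S106 flow can be sketched as a polling loop. The Python below is illustrative only, not part of the disclosure; the `FakeHcu` class and its method names (`acquire_detection_info`, `is_appropriate_posture`, `tor_requested`, `prompt_posture_return`) are hypothetical stand-ins for the HCU 30 and its sensor and ECU interfaces.

```python
class FakeHcu:
    """Hypothetical stand-in for the HCU 30, driven by scripted sensor readings."""

    def __init__(self, postures, tor_at):
        self.postures = iter(postures)  # scripted True (appropriate) / False readings
        self.tor_at = tor_at            # poll count at which a TOR notice arrives
        self.polls = 0
        self.prompts = 0

    def acquire_detection_info(self):   # stands in for S101 / S104
        self.polls += 1
        return next(self.postures)

    def is_appropriate_posture(self, info):  # stands in for S102 / S105
        return info

    def tor_requested(self):            # stands in for S106
        return self.polls >= self.tor_at

    def prompt_posture_return(self, info):   # stands in for S103
        self.prompts += 1


def state_detection_loop(hcu):
    """Sketch of the FIG. 7 state detection process (S101-S106)."""
    while True:
        info = hcu.acquire_detection_info()      # S101: DSM 11 + leg sensor 12
        if hcu.is_appropriate_posture(info):     # S102: "appropriate posture"?
            if hcu.tor_requested():              # S106: TOR instructed by the ECU?
                return "start_TOR"
            continue                             # continue automatic driving mode
        while not hcu.is_appropriate_posture(info):
            hcu.prompt_posture_return(info)      # S103: guidance / warning
            info = hcu.acquire_detection_info()  # S104, judged again in S105
        # posture restored -> back to S101
```

Running the loop against one scripted scenario (one bad posture, then a TOR) returns `"start_TOR"` after a single guidance prompt.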
- As described above, in the first embodiment, the position of the driver's legs is acquired as detection information, and whether the driver can cope with the driving changeover based on the TOR is determined from the state of the driver's legs. Accordingly, it can be determined that the driver cannot cope with the TOR when a leg posture unsuitable for driving operation is taken.
- In addition, in the first embodiment, the state determination unit 32 determines the "appropriate posture" when at least the right toe is below the seating surface 111, because in that case the driver can start operating the accelerator pedal or the brake pedal within the margin time Ta after the TOR occurs. Therefore, by determining whether the driver can cope with the TOR based on the position of the right toe, execution of a driving changeover for a driver who cannot cope with the TOR can be avoided while the calculation load of the posture determination is reduced.
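The right-toe criterion reduces to a single comparison. A minimal sketch, assuming heights are measured upward from the floor surface 113; the function name and millimetre units are invented for illustration:

```python
def can_cope_with_tor(right_toe_height_mm, seating_surface_height_mm):
    """'Appropriate posture' if the right toe is below the seating surface 111,
    i.e. in a position from which the accelerator or brake pedal can be
    reached within the margin time Ta after the TOR."""
    return right_toe_height_mm < seating_surface_height_mm
```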
- Furthermore, the leg sensor 12 of the first embodiment detects the driver's legs in the detection space 120 defined below the steering wheel 115 and in front of the driver's seat 110.
- By limiting detection to this space, the state determination unit 32 can determine whether the state of the driver's legs allows the driver to cope with the TOR. Therefore, complication of the configuration of the leg sensor 12 can be avoided while the accuracy of the driver state determination is secured.
- The detection space 120 described above is a very dark space because ambient light is blocked by the instrument panel, the steering column, and the like. Therefore, if an active optical sensor that emits near-infrared light is used as the leg sensor 12, the information acquisition unit 31 can acquire the detection information without being affected by the brightness around the vehicle A.
- In addition, in the state detection system 10 of the first embodiment, no camera is used to detect the state of the legs. Therefore, the development work of creating a determiner for the driver's posture by matching patterns of postures taken by the driver against image data captured at that time becomes unnecessary. Furthermore, by not using a camera to detect the state of the legs, a highly marketable state detection system 10 can be realized in consideration of the driver's privacy.
- In addition, the leg sensor 12 of the first embodiment is installed at a height position closer to the seating surface 111 than to the floor surface 113.
- The portions of the legs farther from the knees and closer to the floor surface 113 tend to be scattered forward, rearward, left, and right along the floor surface 113, whereas at heights close to the seating surface 111, the position of the legs (knees) does not change significantly. Therefore, if the leg sensor 12 is disposed at a height close to the seating surface 111, detection leakage hardly occurs even if the irradiation range (directivity) of the near-infrared light (detection wave) is narrowed. Consequently, the leg sensor 12 can be simplified while the detection accuracy of the leg position is secured.
- In the first embodiment, the leg sensor 12 corresponds to a "position detection sensor", the light emitting unit 13 corresponds to a "transmission unit", the light receiving unit 14 corresponds to a "detection unit", and the HCU 30 corresponds to a "state detection device".
- the second embodiment shown in FIGS. 8 and 9 is a modification of the first embodiment.
- the leg sensor 212 of the second embodiment is an optical sensor as in the first embodiment, and is installed at a position different from that of the first embodiment.
- The leg sensor 212 is housed in the front portion of the seat cushion of the driver's seat 110, at a height position closer to the seating surface 111 than to the floor surface 113.
- The light projection unit 213 of the leg sensor 212 emits near-infrared light toward the detection space 120 defined in front of it.
- the light projection unit 213 can scan the light irradiation direction at least in the horizontal direction.
- the light projection unit 213 repeats the irradiation of the near-infrared light at predetermined time intervals while turning the light irradiation direction in the horizontal direction.
- The directivity (irradiation angle) of the near-infrared light is narrowed to about the thickness of a leg. For example, when the near-infrared light is emitted in the left direction, the near-infrared light reflected by the left leg Ll is detected by the light receiving unit 214. Similarly, when the near-infrared light is emitted in the right direction, the near-infrared light reflected by the right leg Lr is detected by the light receiving unit 214.
- the information acquisition unit 31 (see FIG. 1) individually detects the positions of the driver's legs in the detection space 120 based on the detection signal received from the leg sensor 212. Specifically, the information acquisition unit 31 estimates the positions of both legs based on the irradiation direction of the near infrared light by the light projection unit 213 and the difference time from the irradiation time to the light reception time.
- Of the reflected light, light detected after a predetermined time ts has elapsed from the irradiation time of the near-infrared light is filtered out as light reflected by structures of the vehicle A outside the detection space.
- The information acquisition unit 31 detects the presence and position of each of the legs Ll and Lr based on the reflected light received before the predetermined time ts elapses from the irradiation time.
- The structures of the vehicle A, typified by the doors, have a shape and material defined for each vehicle type. Therefore, the light reception time and light reception intensity of the light they reflect are known in advance. Accordingly, the light reception time and light reception intensity corresponding to the structures around the detection space 120 may be set in advance, and the legs may thereby be distinguished from the structures.
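The scan-and-filter scheme described for the leg sensor 212 can be sketched as follows. The echo representation, the coordinate convention, and the function name are assumptions; only the ideas stated above (a horizontal scan angle, time-of-flight ranging, and time-gating at ts to reject reflections from vehicle structures) are taken from the text.

```python
import math

C_MM_PER_NS = 299.792  # speed of light in millimetres per nanosecond


def estimate_leg_positions(echoes, ts_ns):
    """Estimate leg positions from a horizontal near-infrared scan.

    `echoes` is a list of (angle_rad, roundtrip_ns) pairs, one per emission
    direction of the light projection unit 213.  Echoes arriving after the
    predetermined time ts are treated as reflections from vehicle structures
    outside the detection space 120 and are filtered out.
    """
    positions = []
    for angle_rad, roundtrip_ns in echoes:
        if roundtrip_ns > ts_ns:                   # reflected by a door, console, ...
            continue
        dist_mm = roundtrip_ns * C_MM_PER_NS / 2.0  # one-way distance from roundtrip
        x = dist_mm * math.sin(angle_rad)           # lateral offset from the sensor
        y = dist_mm * math.cos(angle_rad)           # forward distance from the sensor
        positions.append((x, y))
    return positions
```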
- In the second embodiment described so far, the same effects as in the first embodiment can be obtained, and it can be determined that the driver cannot cope with the TOR when a leg posture unsuitable for driving operation is taken.
- In addition, based on the detection signal, the information acquisition unit 31 can grasp the positions of both legs even when the two legs are aligned in the front-rear direction of the vehicle A.
- In the second embodiment, the leg sensor 212 corresponds to a "position detection sensor", the light projection unit 213 corresponds to a "transmission unit", and the light receiving unit 214 corresponds to a "detection unit".
- the third embodiment shown in FIGS. 10 to 12 is another modification of the first embodiment.
- the state detection system 310 of the third embodiment includes the HCU 30 and the DSM 11 substantially the same as the first embodiment.
- the state detection system 310 has the leg sensor 12 substantially the same as the first embodiment and the leg sensor 212 substantially the same as the second embodiment as a configuration for detecting the driver's legs.
- the state detection system 310 detects the driver's legs in the detection space 120 from different directions using the two leg sensors 12 and 212.
- the two leg sensors 12 and 212 are both provided at a height closer to the seating surface 111 than the floor surface 113.
- the leg sensor 12 disposed on the center console 117 is installed at a position slightly higher than the leg sensor 212 disposed on the driver's seat 110 in the height direction of the vehicle A (see FIG. 11).
- the information acquisition unit 31 integrates detection signals output from the two leg sensors 12 and 212, and estimates the positions of the legs Ll and Lr in the detection space 120.
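One simple way to integrate the detection signals of the two sensors is to average matched per-leg position estimates. This is only a sketch of the integration idea; the embodiment does not specify the fusion rule, so the per-leg matching and plain averaging used here are assumptions.

```python
def fuse_leg_estimates(est_console, est_seat):
    """Fuse per-leg (x, y) position estimates from the console-side sensor 12
    and the seat-side sensor 212.  A leg seen by both sensors gets the average
    of the two estimates; a leg seen by only one keeps that estimate."""
    fused = {}
    for leg in set(est_console) | set(est_seat):
        a, b = est_console.get(leg), est_seat.get(leg)
        if a and b:
            fused[leg] = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
        else:
            fused[leg] = a or b
    return fused
```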
- In the third embodiment as well, the same effects as in the first embodiment can be obtained, and it can be determined that the driver cannot cope with the TOR when a leg posture unsuitable for driving operation is taken.
- In addition, by integrating the detection information of the two leg sensors 12 and 212, the HCU 30 can more accurately grasp the state of the driver's two legs and can appropriately determine whether the driving changeover is possible.
- both of the leg sensors 12 and 212 correspond to the "position detection sensor".
- the fourth embodiment shown in FIG. 13 is still another modification of the first embodiment.
- two leg sensors 412t and 412b are provided on the center console 117.
- Each of the leg sensors 412t and 412b has substantially the same configuration as the leg sensor 12 (see FIG. 1) of the first embodiment.
- the leg sensor 412t is provided at a position higher than the leg sensor 412b in the height direction of the vehicle.
- the leg sensor 412t is disposed rearward of the leg sensor 412b in the front-rear direction of the vehicle.
- The information acquisition unit 31 can detect, from the detection signals of the two leg sensors 412t and 412b, for example, a state where the legs are crossed at the knee. Specifically, when the legs are crossed at the knee, the upper leg sensor 412t outputs a detection signal in which both the right leg Lr and the left leg Ll are detected, whereas the lower leg sensor 412b outputs a detection signal in which only the right leg Lr is detected. The information acquisition unit 31 can estimate from the combination of these two detection signals that the driver's legs are crossed.
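The combination logic for the two detection signals can be sketched as a small lookup. The label strings and the treatment of combinations not mentioned in the text are illustrative assumptions.

```python
def infer_leg_state(upper_detected, lower_detected):
    """Infer the leg state from the detection-signal combination of the upper
    sensor 412t and the lower sensor 412b (sets of detected leg labels)."""
    if upper_detected == {"Lr", "Ll"} and len(lower_detected) == 1:
        # both legs cross the upper beam, but one is lifted above the lower beam
        return "legs crossed at the knee"
    if lower_detected == {"Lr", "Ll"}:
        return "both legs lowered"
    return "unknown"
```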
- In the fourth embodiment as well, the same effects as in the first embodiment can be obtained, and it can be determined that the driver cannot cope with the TOR when a leg posture unsuitable for driving operation is taken.
- In addition, by using the detection information of the upper and lower leg sensors 412t and 412b, the HCU 30 can appropriately grasp leg states from which a return to the driving posture is difficult, and can avoid carrying out a driving changeover for a driver who cannot cope with the TOR.
- both of the leg sensors 412t and 412b correspond to the "position detection sensor".
- the fifth embodiment shown in FIGS. 14 and 15 is still another modification of the first embodiment.
- a floor sensor 15 is provided instead of the leg sensor 12 (see FIG. 1) of the first embodiment.
- the floor sensor 15 is a sensor that detects both legs Ll and Lr of the driver from the detection space 120 below the steering wheel 115 and in front of the driver's seat 110, similarly to the leg sensor 12.
- the floor sensor 15 is formed in a sheet shape and is provided on the back side of the floor surface 113.
- The floor sensor 15 is formed to have substantially the same size as the bottom surface of the detection space 120, so as to extend over the entire bottom surface of the detection space 120.
- the floor sensor 15 has a plurality of pressure detection units arranged in a two-dimensional manner.
- the pressure detection unit is provided, for example, one by one in each of the areas (see the dashed-dotted line in FIG. 15) in which the floor sensor 15 is partitioned in a grid shape.
- Each pressure detection unit detects a load due to the foot sole being placed on the floor surface 113.
- the floor sensor 15 sequentially outputs the pressure acting on each area as a detection signal to the HCU 30 (see FIG. 1).
- The information acquisition unit 31 acquires, as detection information, the pressure (load) acting on each area of the floor sensor 15. Based on this detection information, the state determination unit 32 (see FIG. 1) determines the "appropriate posture" when the footprints of both legs Ll and Lr are detected in the detection space 120, for example when a pressure equal to or greater than a threshold acts on at least two areas. On the other hand, when only one footprint can be detected, such as when a pressure equal to or greater than the threshold acts on only one area, or when not even one footprint can be detected because no area receives a pressure equal to or greater than the threshold, the state determination unit 32 determines the "inappropriate posture".
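The footprint-counting rule can be sketched directly. The grid-of-pressures representation and the threshold handling follow the description above, while the function name and return labels are illustrative.

```python
def determine_posture(grid, threshold):
    """Count grid areas of the floor sensor 15 whose load meets the threshold.
    Two or more footprints -> 'appropriate posture'; zero or one -> 'inappropriate'."""
    footprints = sum(1 for row in grid for pressure in row if pressure >= threshold)
    return "appropriate" if footprints >= 2 else "inappropriate"
```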
- the sixth embodiment of the present disclosure is a modification of the first embodiment shown in FIGS. 1 and 2.
- In the sixth embodiment, the margin time Ta set in the authority transfer plan is variable. For example, when the traveling environment of the vehicle A is good (such as fine weather) and the detection distance of the autonomous sensor group 73 is sufficiently long, the autonomous driving ECU 50 can grasp distant situations and can therefore issue the TOR earlier relative to the end timing of automatic driving. Specifically, it becomes possible to formulate an authority transfer plan that secures a margin time Ta of about 10 seconds.
- On the other hand, in a poor traveling environment, the margin time Ta set in the authority transfer plan is shorter than in a good traveling environment, for example about 4 seconds.
- In the sixth embodiment, the state determination unit 32 varies the tolerance criteria (see FIG. 5) separating the "appropriate posture" from the "inappropriate posture" according to the length of the margin time Ta. That is, the tolerance criteria are relaxed as the margin time Ta becomes longer, and conversely become stricter as the margin time Ta becomes shorter.
- For example, driver postures such as sitting cross-legged on the seat, sitting sideways on the seat, and crossing the legs at the knee are determined to be "appropriate postures" in a traveling environment where a sufficient margin time Ta (about 10 seconds) can be secured, but are determined to be "inappropriate postures" in a traveling environment where the margin time Ta is short (about 4 seconds).
- In contrast, a posture in which the right toe is hooked on the steering wheel 115 (see FIG. 3), the driver's-side door, or the like is determined to be an "inappropriate posture" regardless of the margin time Ta.
- Furthermore, the state determination unit 32 makes the tolerance criteria stricter as the remaining time until the end timing of automatic driving becomes shorter after the TOR. For example, at the timing when the remaining time until the end of automatic driving is about 4 seconds, the state determination unit 32 determines that postures such as sitting cross-legged on the seat and crossing the legs at the knee are no longer "appropriate postures" but "inappropriate postures".
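The variable tolerance criteria can be sketched as a posture whitelist that depends on the margin time. The posture labels and the 6-second boundary separating the relaxed and strict criteria are assumptions for illustration; the text gives only the roughly 10-second and 4-second examples.

```python
# Relaxed criteria (long margin time Ta): more postures count as "appropriate".
RELAXED = {"driving posture", "cross-legged", "sideways", "legs crossed at knee"}
# Strict criteria (short margin time Ta): only the driving posture is accepted.
STRICT = {"driving posture"}


def allowed_postures(margin_time_s, boundary_s=6.0):
    """Tolerance criteria as a function of the margin time Ta.
    The 6 s boundary is an assumed illustration, not from the disclosure."""
    return RELAXED if margin_time_s >= boundary_s else STRICT


def is_appropriate(posture, margin_time_s):
    # Postures absent from both sets (e.g. a toe hooked on the steering wheel)
    # are "inappropriate" regardless of the margin time Ta.
    return posture in allowed_postures(margin_time_s)
```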
- In the sixth embodiment described so far, the same effects as in the first embodiment can be obtained, and when a leg posture unsuitable for driving operation is taken, the determination result that the driver cannot cope with the TOR can be obtained.
- In addition, in the sixth embodiment, the tolerance criteria separating the "appropriate posture" and the "inappropriate posture" are changed according to the length of the margin time Ta that can be secured for the changeover request. With such control, more driver postures are permitted in the automatic driving mode, so unnecessary guidance for posture correction is reduced. As a result, an automatic driving function that is convenient for users such as the driver can be provided.
- In a first modification of the sixth embodiment, the margin time set in the authority transfer plan is longer than in the first embodiment and the like, and is set to a fixed length (for example, 10 seconds). Therefore, in the first modification, the tolerance criteria are not changed in the automatic driving mode. Such a setting is possible, for example, if the detection distance and accuracy of the autonomous sensor group are sufficient. Accordingly, in the automatic driving mode, postures such as sitting cross-legged, sitting sideways, and crossing the legs at the knee are permitted.
- In the first modification, the state determination unit makes the tolerance criteria stricter as the remaining time until the end timing of automatic driving becomes shorter after the TOR. Specifically, at the timing when the remaining time until the end of automatic driving falls below a predetermined threshold (for example, 4 seconds), the state determination unit determines that postures such as sitting cross-legged, sitting sideways, and crossing the legs at the knee are not the "appropriate posture" but the "inappropriate posture". As a result, a driver taking such a posture is warned at an appropriate timing.
- In a further modification, the state determination unit may change the tolerance criteria at an appropriate timing, such as after the TOR.
- The specific value of the margin time may be changed as appropriate according to the laws and guidelines of the region where the vehicle is used, the speed limit of the road on which the vehicle travels, and the like.
- In the above embodiments, an optical sensor, a floor sensor, and the like are employed as the configuration for detecting the position of the legs.
- In addition, a line sensor, an ultrasonic sensor, or the like can be employed.
- a vibrating unit that oscillates an ultrasonic wave as a detection wave corresponds to a "transmission unit”
- a receiving unit that receives an ultrasonic wave reflected by a leg corresponds to a "detection unit”.
- various active and passive object detection sensors can be employed as sensors for detecting the position of the leg.
- the above-described method of distinguishing between a leg and a structure is also applicable to a form using an ultrasonic sensor or the like.
- The sensors provided in the state detection system can detect the postures in the dotted range shown in the figure.
- In the figure, the posture of the driver's right leg, which is directly linked to pedal operation, is defined by the hip joint angle θh and the knee joint angle θb, and the driver's postures when the joint angles θh and θb are varied are shown in a list.
- The dotted range is the range of postures in which no change of the body axis is required to return to the correct driving posture.
- For example, a posture in which the legs are crossed below the knees is classified as an "appropriate posture" because no change of the body axis is required to return to the driving posture.
- On the other hand, if the right leg is stretched out and hooked on the right door, the window, or the steering wheel 115, a change of the body axis is required to return to the driving posture. Likewise, when the legs are crossed at the knee, a change of the body axis is required. These postures are therefore classified as "inappropriate postures".
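The θh/θb classification can be sketched as a box test on the two joint angles. The numeric angle ranges below are invented placeholders, since the text defines the dotted range only graphically.

```python
def needs_body_axis_change(theta_h_deg, theta_b_deg,
                           h_range=(70.0, 120.0), b_range=(60.0, 120.0)):
    """Classify the right-leg posture from the hip joint angle θh and the knee
    joint angle θb.  Postures inside the dotted range (modelled here as an
    assumed box of angles) can return to the driving posture without moving
    the body axis."""
    h_ok = h_range[0] <= theta_h_deg <= h_range[1]
    b_ok = b_range[0] <= theta_b_deg <= b_range[1]
    return not (h_ok and b_ok)


def classify_right_leg(theta_h_deg, theta_b_deg):
    if needs_body_axis_change(theta_h_deg, theta_b_deg):
        return "inappropriate posture"
    return "appropriate posture"
```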
- In the above embodiments, the "appropriate posture" and the "inappropriate posture" are determined from the state of the right leg based on the detection by the leg sensor.
- If the detection information of an in-vehicle camera provided in front of and above the driver is combined with the detection information of the leg sensor, for example, a posture in which the right leg is thrown forward as described above can also be identified. Therefore, a state detection system that detects the driver's state by combining the detection results for the upper body and the legs can carry out highly accurate alerting.
- the range of the detection space by the position detection sensor may be changed as appropriate.
- For example, the state of the driver's hips on the seat may be detected.
- the position of the position detection sensor can also be changed in accordance with the setting range of the detection space.
- the position detection sensor may be installed on the side surface of the center cluster, the lower surface of the steering column, and in the vicinity of the pedal.
- the configuration for monitoring the driver's upper body is not limited to the DSM.
- the state detection system may include, for example, an electrostatic sensor or a pressure sensor that detects the grip of the steering wheel 115 (see FIG. 3), or the above-described in-vehicle camera or the like, together with or in place of the DSM.
- configurations such as DSM may be omitted from the state detection system.
- In the above embodiments, the autonomous driving ECU enables so-called "eyes-off" automatic driving, in which the system takes charge of monitoring the traveling environment on behalf of the driver.
- The driver state detection method described above is suitable for detecting the driver's posture during a period in which level 3 automatic driving is performed.
- However, the state detection method according to the present disclosure is also applicable to level 2 automatic driving, in which the driving environment is monitored by the driver.
- The state detection device may also be implemented by a configuration different from the HCU. For example, the processing unit of an electronic control unit in which the autonomous driving ECU and the HCU are integrally configured may implement the state detection method described above.
- a plurality of electronic control devices including the HCU, the autonomous driving ECU, and the vehicle control ECU may process the state detection program according to the present disclosure in a distributed manner.
- A non-transitory tangible storage medium such as a flash memory or a hard disk can be employed as the memory device of the HCU or the like that stores the state detection program.
- Furthermore, the storage medium storing the state detection program is not limited to a storage medium provided in the on-vehicle electronic control unit; it may be an optical disc serving as a copy source for that storage medium, a hard disk drive of a general-purpose computer, or the like.
- The processing described in this disclosure is divided into a plurality of sections, each of which is expressed as, for example, S101.
- Each section can be divided into a plurality of sub-sections, while a plurality of sections can be combined into one section.
- Each section configured in this way can be referred to as a circuit, a device, a module, or a means.
- Each section can be realized as a software section combined with a hardware unit (e.g., a computer), or as a hardware section (e.g., an integrated circuit or hardwired logic), with or without including the functions of the associated device.
- The hardware section can also be configured inside a microcomputer.
Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2017194618A (granted as JP6766791B2) | 2017-10-04 | 2017-10-04 | State detection device, state detection system, and state detection program
JP2017-194618 | 2017-10-04 | |
Publications (1)

Publication Number | Publication Date
---|---
WO2019069516A1 | 2019-04-11
Family ID: 65995118
Family Applications (1)

Application Number | Title
---|---
PCT/JP2018/025031 (WO2019069516A1) | State detection device, state detection system, and state detection program
Country Status (2)

Country | Link
---|---
JP | JP6766791B2
WO | WO2019069516A1
Cited By (3)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN113173181A | 2020-01-27 | 2021-07-27 | Toyota Motor Corporation | Automatic driving device
CN114750747A | 2022-03-23 | 2022-07-15 | Dongfeng Motor Group Co., Ltd. | Automatic parking recognition method and system adaptable to different human-machine parameters
CN114845920A | 2019-12-23 | 2022-08-02 | Mercedes-Benz Group AG | Vehicle operation method
Families Citing this family (1)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
JP7384185B2 | 2020-03-23 | 2023-11-21 | Denso Corporation | Information presentation control device
Citations (5)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
US20030085810A1 | 2001-10-01 | 2003-05-08 | Wilfried Bullinger | Method for sensing the readiness of a driver to brake
JP2009541888A | 2006-11-28 | 2009-11-26 | Robert Bosch GmbH | Driver assistance system with a presence monitoring unit
WO2017072953A1 | 2015-10-30 | 2017-05-04 | Mitsubishi Electric Corporation | Vehicle information display control device and method for displaying automatic driving information
WO2017110914A1 | 2015-12-25 | 2017-06-29 | Denso Corporation | Vehicle control device
JP6264494B1 | 2017-03-14 | 2018-01-24 | Omron Corporation | Driver monitoring device, driver monitoring method, learning device, and learning method
- 2017-10-04: Japanese application JP2017194618A filed; granted as patent JP6766791B2 (status: active)
- 2018-07-02: PCT application PCT/JP2018/025031 filed as WO2019069516A1 (status: application filing)
Also Published As

Publication number | Publication date
---|---
JP6766791B2 | 2020-10-14
JP2019064539A | 2019-04-25
Legal Events

Code | Title | Description
---|---|---
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 18865062; Country of ref document: EP; Kind code of ref document: A1
NENP | Non-entry into the national phase | Ref country code: DE
122 | EP: PCT application non-entry in European phase | Ref document number: 18865062; Country of ref document: EP; Kind code of ref document: A1