WO2022018781A1 - Vehicle control device, program, and vehicle control method - Google Patents

Info

Publication number
WO2022018781A1
Authority
WO
WIPO (PCT)
Prior art keywords
driver
unit
passenger
specified
state
Prior art date
Application number
PCT/JP2020/027990
Other languages
French (fr)
Japanese (ja)
Inventor
Tenryu Misawa (天龍 三沢)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2022538492A priority Critical patent/JP7361929B2/en
Priority to PCT/JP2020/027990 priority patent/WO2022018781A1/en
Publication of WO2022018781A1 publication Critical patent/WO2022018781A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system

Definitions

  • This disclosure relates to a vehicle control device, a program, and a vehicle control method.
  • Patent Document 1 describes a driving support device that determines whether or not the passenger is nervous when the direction of the line of sight of the passenger of the vehicle is outside the vehicle or in front of the vehicle. This driving support device supports the driver's driving operation based on the degree of tension of the passenger.
  • One or more aspects of the present disclosure are intended to enable more accurate determination of the driver's state by using the behavior of the passenger when determining the driver's state.
  • the vehicle control device according to one aspect of the present disclosure includes: a driver consciousness level specifying unit that specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle; a passenger behavior specifying unit that specifies a passenger behavior, which is the behavior of a passenger other than the driver in the vehicle; a driver state specifying unit that specifies a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior; and a vehicle control unit that controls the vehicle according to the driver state.
  • the vehicle control device according to another aspect of the present disclosure includes: a driver state storage unit that stores driver state-related information in which a combination of a driver consciousness level, which is the level of consciousness of the driver of a vehicle, and a passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, is associated with a driver state, which is the state of the driver; a driver consciousness level specifying unit that specifies the driver consciousness level from the driver; a passenger behavior specifying unit that specifies the passenger behavior from the passenger; a driver state specifying unit that, when the amount of data stored in the driver state storage unit is less than a predetermined threshold value, specifies the driver state from the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit, and stores that combination and the specified driver state in the driver state storage unit in association with each other as the driver state-related information, and otherwise specifies, from the driver state-related information, the driver state associated with the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit; and a vehicle control unit that controls the vehicle according to the driver state specified by the driver state specifying unit.
  • the program according to one aspect of the present disclosure causes a computer to function as: a driver consciousness level specifying unit that specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle; a passenger behavior specifying unit that specifies a passenger behavior, which is the behavior of a passenger other than the driver in the vehicle; a driver state specifying unit that specifies a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior; and a vehicle control unit that controls the vehicle according to the driver state.
  • the program according to another aspect of the present disclosure causes a computer to function as: a driver state storage unit that stores driver state-related information in which a combination of a driver consciousness level, which is the level of consciousness of the driver of a vehicle, and a passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, is associated with a driver state, which is the state of the driver; a driver consciousness level specifying unit that specifies the driver consciousness level from the driver; a passenger behavior specifying unit that specifies the passenger behavior from the passenger; a driver state specifying unit that, when the amount of data stored in the driver state storage unit is less than a predetermined threshold value, specifies the driver state from the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit, and stores that combination and the specified driver state in the driver state storage unit in association with each other as the driver state-related information, and otherwise specifies, from the driver state-related information, the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior; and a vehicle control unit that controls the vehicle according to the driver state specified by the driver state specifying unit.
  • the vehicle control method according to one aspect of the present disclosure specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle, specifies a passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, specifies a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior, and controls the vehicle according to the driver state.
  • the vehicle control method according to another aspect of the present disclosure specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle, specifies a passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, specifies a driver state, which is the state of the driver, from the combination of the specified driver consciousness level and the specified passenger behavior, and stores driver state-related information that associates the combination of the specified driver consciousness level and the specified passenger behavior with the specified driver state. When the amount of data of the stored driver state-related information becomes equal to or greater than a predetermined threshold value, the method of specifying the driver state is switched to a method of specifying, from the driver state-related information, the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior.
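The threshold-based switching described above can be sketched as follows. This is a hypothetical Python illustration: the class name `DriverStateStore`, the threshold value of 100, and the dictionary-based storage are all assumptions, not details taken from the disclosure.

```python
# Sketch of the method-switching logic: below a data threshold the driver
# state is computed by rule and the (consciousness level, passenger behavior)
# -> state pair is recorded as driver state-related information; at or above
# the threshold it is instead looked up from the recorded pairs.
THRESHOLD = 100  # assumed value; the disclosure only says "predetermined"

class DriverStateStore:
    def __init__(self):
        # (awareness_level, passenger_behavior) -> driver_state
        self.records = {}

    def specify(self, awareness_level, passenger_behavior, rule_based_specifier):
        key = (awareness_level, passenger_behavior)
        if len(self.records) < THRESHOLD:
            # Method 1: rule-based determination; store the association.
            state = rule_based_specifier(awareness_level, passenger_behavior)
            self.records[key] = state
            return state
        # Method 2: look up the state associated with this combination.
        return self.records.get(key)
```

The switch is one-way in this sketch: once enough associations are accumulated, lookups replace rule evaluation entirely.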
  • the behavior of the passenger can also be used when determining the state of the driver, so that the state of the driver can be determined more accurately.
  • FIGS. (A) and (B) are block diagrams showing a hardware configuration example.
  • FIG. is a state chart showing an example of the process of controlling the vehicle from the results specified by the driver consciousness level specifying unit and the passenger behavior specifying unit in the first embodiment.
  • FIG. is a state chart showing an example of the processing in the passenger behavior specifying unit.
  • FIG. is a state chart showing an example of the processing of the vehicle control unit in the first embodiment.
  • FIG. is a schematic diagram showing an example of the inside of the cockpit of a vehicle equipped with the vehicle control device.
  • FIG. is a block diagram schematically showing the configuration of a vehicle control device according to the second embodiment.
  • FIG. is a block diagram schematically showing the configuration of the vehicle control unit in the second embodiment.
  • FIG. is a state chart showing an example of the process of controlling the vehicle from the results specified by the driver consciousness level specifying unit and the passenger behavior specifying unit in the second embodiment.
  • FIG. is a state chart showing an example of the processing of the vehicle control unit in the second embodiment.
  • FIG. 1 is a block diagram schematically showing the configuration of the vehicle control device 100 according to the first embodiment.
  • the vehicle control device 100 includes a passenger behavior specifying unit 110, a driver consciousness level specifying unit 160, a driver state specifying unit 170, and a vehicle control unit 180.
  • the passenger behavior specifying unit 110 identifies the passenger behavior, which is the behavior of a passenger other than the driver in the vehicle. For example, the passenger behavior specifying unit 110 identifies the passenger behavior from information about the passenger, such as line-of-sight information, voice information, or biological information of the passenger, and gives the specified passenger behavior to the driver state specifying unit 170. Here, it is desirable that the passenger behavior is a behavior performed by the passenger while observing the driver.
  • FIG. 2 is a block diagram schematically showing the configuration of the passenger behavior specifying unit 110 in the first embodiment.
  • the passenger behavior specifying unit 110 includes a passenger monitoring unit 120, a load analysis unit 140, a line-of-sight analysis unit 141, a voice analysis unit 142, and a specific unit 146.
  • the passenger monitoring unit 120 detects the state of the passenger.
  • the passenger monitoring unit 120 includes a load detection unit 121, a load determination unit 122, a face image acquisition unit 123, a line-of-sight detection unit 124, a line-of-sight determination unit 125, a heart rate detection unit 126, and a heart rate determination unit 127.
  • a sweating amount detection unit 128, a sweating amount determination unit 129, a voice detection unit 130, and a voice determination unit 131 are provided.
  • the load detection unit 121 detects the value of the load on the passenger seat from the signal detected by the load sensor (not shown) installed in the passenger seat of the vehicle. Then, the load detection unit 121 generates load detection data indicating the value of the load, and gives the load detection data to the load determination unit 122.
  • the load determination unit 122 gives the load detection data given by the load detection unit 121 to the load analysis unit 140. Further, the load determination unit 122 gives the load information generated by the load analysis unit 140 to the specific unit 146, as will be described later.
  • the load analysis unit 140 detects the presence or absence of a passenger by analyzing the load detection data given by the load detection unit 121. Further, when there is a passenger, the load analysis unit 140 analyzes the load detection data to identify the position of the passenger in the vehicle, the direction in which the passenger applies the load, and the magnitude of the load applied by the passenger, and generates load information indicating the identified position, direction, and magnitude. The load information is given to the load determination unit 122, and from the load determination unit 122 to the specific unit 146.
  • When a passenger boards, the load sensor installed in the passenger's seat reacts, so the load analysis unit 140 can detect that the passenger has boarded. Further, when the passenger shifts his or her center of gravity toward the driver's seat or toward the window, the load analysis unit 140 can detect, from the load status detected by the load sensor, in which direction the passenger has shifted the center of gravity.
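As an illustration of the load analysis described above, the following sketch assumes four load cells under the passenger seat, an arbitrary 10 kg boarding threshold, and a 1.2 left/right asymmetry ratio; none of these values or the sensor layout comes from the disclosure.

```python
# Illustrative load analysis: detect boarding from total seat load, and infer
# the direction of the passenger's center-of-gravity shift from the left/right
# load imbalance (driver's seat assumed to be on the left).
BOARD_THRESHOLD_KG = 10.0  # assumed minimum load indicating a seated passenger

def analyze_load(front_left, front_right, rear_left, rear_right):
    total = front_left + front_right + rear_left + rear_right
    if total < BOARD_THRESHOLD_KG:
        return {"boarded": False}
    left = front_left + rear_left
    right = front_right + rear_right
    if left > right * 1.2:
        direction = "driver_side"
    elif right > left * 1.2:
        direction = "window_side"
    else:
        direction = "centered"
    return {"boarded": True, "direction": direction, "magnitude": total}
```

The returned dictionary corresponds to the load information (position omitted for brevity) that the load analysis unit 140 passes on via the load determination unit 122.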
  • the face image acquisition unit 123 identifies the passenger's face from image data showing an image of the inside of the vehicle, and acquires face image data, which is data of a face image showing the identified face.
  • the face image data is given to the line-of-sight detection unit 124.
  • the line-of-sight detection unit 124 detects the line-of-sight of the passenger from the face image indicated by the face image data given by the face image acquisition unit 123.
  • the line-of-sight detection unit 124 generates line-of-sight data indicating the detected line-of-sight, and gives the line-of-sight data to the line-of-sight determination unit 125.
  • the line-of-sight determination unit 125 gives the line-of-sight data given by the line-of-sight detection unit 124 to the line-of-sight analysis unit 141. Then, the line-of-sight determination unit 125 gives the line-of-sight information generated by the line-of-sight analysis unit 141 to the specific unit 146, as will be described later.
  • the line-of-sight analysis unit 141 identifies the direction the passenger is looking at by analyzing the line-of-sight data given by the line-of-sight detection unit 124. Then, the line-of-sight analysis unit 141 generates line-of-sight information indicating the direction the passenger is looking at, and gives the line-of-sight information to the line-of-sight determination unit 125.
  • the heart rate detection unit 126 detects the heart rate of the passenger and generates heart rate information indicating the detected heart rate.
  • the heart rate information is given to the heart rate determination unit 127.
  • the heart rate determination unit 127 gives heart rate information to the specific unit 146.
  • the sweating amount detection unit 128 detects the sweating amount of the passenger and generates sweating amount information indicating the detected sweating amount.
  • the sweating amount information is given to the sweating amount determination unit 129.
  • the sweating amount determination unit 129 gives the sweating amount information to the specific unit 146.
  • the voice detection unit 130 detects the voice emitted by the passenger to generate voice data indicating the detected voice.
  • the voice data is given to the voice determination unit 131.
  • the voice determination unit 131 gives voice data to the voice analysis unit 142. Further, the voice determination unit 131 gives the voice information generated by the voice analysis unit 142 to the specific unit 146, as will be described later.
  • the voice analysis unit 142 identifies the content of the voice emitted by the passenger by analyzing the voice data.
  • the voice analysis unit 142 includes a language analysis unit 143, a dictionary storage unit 144, and a dictionary reference unit 145.
  • the language analysis unit 143 divides the voice indicated by the voice data into morphemes by performing morphological analysis. Then, the language analysis unit 143 gives the divided morphemes to the dictionary reference unit 145.
  • the dictionary storage unit 144 stores dictionary data indicating the meaning of each morpheme.
  • the dictionary reference unit 145 identifies the meaning of each morpheme given by the language analysis unit 143 by referring to the dictionary data stored in the dictionary storage unit 144, and gives the meaning to the language analysis unit 143.
  • the language analysis unit 143 identifies the content of the voice emitted by the passenger from the meaning of each morpheme, and generates voice information indicating the specified content.
  • the voice information is given to the voice determination unit 131, and the voice determination unit 131 gives the voice information to the specific unit 146.
  • the voice emitted by the passenger includes non-verbal voice such as screaming or sobbing.
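The voice analysis flow above (language analysis unit 143, dictionary storage unit 144, and handling of non-verbal voice) might be sketched as below. The whitespace split is a stand-in for real morphological analysis, and the dictionary entries, markers, and category names are illustrative assumptions.

```python
# Simplified voice analysis: split the utterance into "morphemes", look each
# one up in a dictionary, and classify the utterance; non-verbal voice such
# as screaming or sobbing is handled as its own category.
DICTIONARY = {
    "help": "request_for_assistance",
    "wake": "alerting_word",
    "up": "alerting_word",
}
NON_VERBAL = {"<scream>", "<sob>"}  # assumed markers for non-verbal voice

def analyze_voice(utterance):
    if utterance in NON_VERBAL:
        return "non_verbal:" + utterance.strip("<>")
    morphemes = utterance.lower().split()  # stand-in morphological analysis
    meanings = [DICTIONARY.get(m, "unknown") for m in morphemes]
    if "request_for_assistance" in meanings or "alerting_word" in meanings:
        return "specific_word"
    if morphemes:
        return "arbitrary_word"
    return "not_speaking"
```

The four return categories mirror the voice-based passenger behaviors listed later (specific word, arbitrary word, screaming, not speaking).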
  • the load detection unit 121, the load determination unit 122, and the load analysis unit 140 constitute a load identification unit 111 that specifies the direction and magnitude of the load in the passenger's seat.
  • the face image acquisition unit 123, the line-of-sight detection unit 124, the line-of-sight determination unit 125, and the line-of-sight analysis unit 141 constitute a line-of-sight identification unit 112 that specifies the direction of the passenger's line of sight.
  • the heart rate detection unit 126 and the heart rate determination unit 127 configure a heart rate identification unit 113 for specifying the heart rate of the passenger.
  • the sweating amount detecting unit 128 and the sweating amount determining unit 129 constitute a sweating amount specifying unit 114 that specifies the sweating amount of the passenger. It is assumed that the heart rate specifying unit 113 and the sweating amount specifying unit 114 constitute a physical quantity specifying unit that specifies the physical quantity of the passenger's living body.
  • the heart rate and the sweating amount are specified as physical quantities related to the living body of the passenger, but the first embodiment is not limited to such an example. For example, blood pressure, respiratory rate, etc. may be specified.
  • the physical quantity specifying unit provides the specific unit 146 with biological information indicating the physical quantity of the passenger's living body.
  • the voice detection unit 130, the voice determination unit 131, and the voice analysis unit 142 constitute a voice identification unit 115 for specifying the content of the voice emitted by the passenger.
  • the passenger behavior specifying unit 110 does not have to include all of the load specifying unit 111, the line-of-sight specifying unit 112, the heart rate specifying unit 113, the sweating amount specifying unit 114, and the voice specifying unit 115; it only needs to include at least one of them.
  • the identification unit 146 identifies the passenger behavior, which is the behavior of the passenger, based on the information given from the passenger monitoring unit 120. For example, the identification unit 146 identifies the passenger behavior as shown in FIG. 3 based on the information given from the passenger monitoring unit 120.
  • FIG. 3 is a schematic diagram showing an example of a passenger behavior result table for specifying passenger behavior.
  • the passenger behavior result table 150 shown in FIG. 3 includes a corresponding column 150a, a PID (Passenger Identity) column 150b, and a passenger behavior column 150c.
  • the passenger behavior column 150c stores the passenger behavior.
  • the PID column 150b stores the PID as the passenger behavior identification information which is the identification information for identifying the passenger behavior.
  • the corresponding column 150a stores whether or not it corresponds to the passenger behavior according to the information given from the passenger monitoring unit 120.
  • based on the load information, the specific unit 146 specifies that "the center of gravity of the passenger is tilted toward the driver", "the center of gravity of the passenger is tilted toward the window", "the passenger is standing up from the seat", or "the passenger is shaking the driver". Specifically, when none of "the center of gravity of the passenger is tilted toward the driver", "the passenger is standing up from the seat", and "the passenger is shaking the driver" applies, the specific unit 146 may determine that the center of gravity of the passenger is tilted toward the window.
  • the identification unit 146 specifies that "the line of sight of the passenger is facing forward” or “the line of sight of the passenger is facing the driver” based on the line-of-sight information. Specifically, the specific unit 146 may determine that "the passenger's line of sight is facing forward” when it does not correspond to "the passenger's line of sight is facing the driver".
  • the specifying unit 146 specifies that "the passenger's heart rate is high”, “the passenger's heart rate is normal”, or "the passenger's heart rate is low” based on the heart rate information. For example, the specific unit 146 may make this determination using a predetermined threshold value.
  • the specific unit 146 specifies that "the passenger's sweating amount is large”, “the passenger's sweating amount is normal”, or "the passenger's sweating amount is small” based on the sweating amount information. For example, the specific unit 146 may make this determination using a predetermined threshold value.
  • based on the voice information, the specific unit 146 specifies that "the passenger is uttering an arbitrary word", "the passenger is uttering a specific word", "the passenger is screaming", or "the passenger is not speaking". Specifically, when none of "the passenger is uttering a specific word", "the passenger is screaming", and "the passenger is not speaking" applies, the specific unit 146 may determine that "the passenger is uttering an arbitrary word".
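The rules above can be summarized in a small sketch of the specific unit 146. The behavior labels and the heart-rate and sweating thresholds here are illustrative assumptions, not values from the disclosure.

```python
# Illustrative mapping from sensor-derived information to passenger-behavior
# entries like those in the passenger behavior result table (FIG. 3).
def specify_passenger_behaviors(load_info, gaze, heart_rate, sweat):
    behaviors = set()
    # Load-based behaviors
    if load_info.get("direction") == "driver_side":
        behaviors.add("center_of_gravity_toward_driver")
    elif load_info.get("direction") == "window_side":
        behaviors.add("center_of_gravity_toward_window")
    if load_info.get("standing"):
        behaviors.add("standing_up_from_seat")
    # Line-of-sight: default to "forward" unless facing the driver
    behaviors.add("gaze_toward_driver" if gaze == "driver" else "gaze_forward")
    # Biological quantities, compared against assumed thresholds
    if heart_rate > 100:   # assumed threshold for "high"
        behaviors.add("heart_rate_high")
    if sweat > 0.5:        # assumed threshold for "large"
        behaviors.add("sweating_large")
    return behaviors
```

Each returned label corresponds to one row of the passenger behavior result table, so the set plays the role of the "corresponding" column 150a.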
  • the specific unit 146 provides the driver state specific unit 170 with passenger behavior information indicating the passenger behavior specified as described above.
  • various data such as load, line of sight, heart rate, sweating amount, and voice are adopted, but other data related to the passenger, such as pupil diameter or number of blinks, are used. It may be used.
  • the specific unit 146 acquires information from all of the load specific unit 111, the line of sight specific unit 112, the heart rate specific unit 113, the sweating amount specific unit 114, and the voice specific unit 115, and specifies the passenger behavior.
  • However, the first embodiment is not limited to such an example; it suffices if the passenger behavior can be specified from the result specified by at least one of the load specifying unit 111, the line-of-sight specifying unit 112, the heart rate specifying unit 113, the sweating amount specifying unit 114, and the voice specifying unit 115.
  • the driver consciousness level specifying unit 160 specifies the driver consciousness level, which is the level of the driver's consciousness.
  • For example, similarly to the passenger behavior specifying unit 110, the driver consciousness level specifying unit 160 identifies the driver consciousness level shown in the table of FIG. 4 based on at least one of the driver's image, voice, heart rate, and sweating amount. The identified driver consciousness level is given to the driver state specifying unit 170.
  • FIG. 4 is a schematic diagram showing an example of a driver consciousness level result table for specifying a driver consciousness level.
  • the driver awareness level result table 151 shown in FIG. 4 includes a corresponding column 151a, a DID (Driver IDentification) column 151b, and a driver awareness level column 151c.
  • the driver awareness level column 151c stores the driver awareness level.
  • the DID column 151b stores the DID as the driver consciousness level identification information for identifying the driver consciousness level.
  • the corresponding column 151a stores whether or not it corresponds to the driver consciousness level determined by the driver consciousness level specifying unit 160.
  • the driver state specifying unit 170 identifies the driver state, which is the state of the driver, based on the combination of the passenger behavior given by the passenger behavior specifying unit 110 and the driver consciousness level given by the driver consciousness level specifying unit 160. When no passenger is in the vehicle, the driver state specifying unit 170 may specify the driver state from the driver consciousness level alone. For example, the driver state specifying unit 170 identifies the driver state shown in the table of FIG. 5. The identified driver state is given to the vehicle control unit 180.
  • FIG. 5 is a schematic diagram showing an example of a driver state result table for specifying a driver state.
  • the driver status result table 152 shown in FIG. 5 includes a corresponding column 152a, a JCS (Japan Coma Scale) column 152b, and a driver status column 152c.
  • the JCS column 152b stores the consciousness level number specified by JCS indicating the driver status.
  • the driver status column 152c stores the driver status result corresponding to the awareness level number.
  • the corresponding column 152a stores whether or not the driver state corresponds to the driver state determined by the driver state specifying unit 170.
  • In the driver state result table 152, each driver state is associated with a combination of one or more passenger behaviors specified in the passenger behavior result table shown in FIG. 3 and one or more driver consciousness levels specified in the driver consciousness level result table shown in FIG. 4. For example, when the passenger behavior corresponds to two or more items among P01, P03, P04, P05, P07, P09, P11, P12, and P15 in the passenger behavior result table shown in FIG. 3, and the driver consciousness level corresponds to two items among D02, D03, D05, D06, D08, D09, D10, D11, D13, D14, D15, and D16, the driver state specifying unit 170 determines that the driver is unconscious.
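The example rule above might be expressed as follows; the interpretation that "two" of the listed DID items means at least two is an assumption, as are the set names. The PID and DID codes come from the tables of FIGS. 3 and 4.

```python
# Sketch of the example combination rule: if two or more of the listed
# passenger behaviors apply AND two or more of the listed driver consciousness
# levels apply, the driver is judged unconscious.
ALERTING_PIDS = {"P01", "P03", "P04", "P05", "P07", "P09", "P11", "P12", "P15"}
IMPAIRED_DIDS = {"D02", "D03", "D05", "D06", "D08", "D09", "D10",
                 "D11", "D13", "D14", "D15", "D16"}

def is_driver_unconscious(passenger_pids, driver_dids):
    pid_hits = len(set(passenger_pids) & ALERTING_PIDS)
    did_hits = len(set(driver_dids) & IMPAIRED_DIDS)
    return pid_hits >= 2 and did_hits >= 2
```

Other driver states in the table would be covered by analogous rules over different PID/DID combinations.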
  • Here, JCS is adopted as the driver state indicator, but the present embodiment is not limited to such an example; the GCS (Glasgow Coma Scale) or other consciousness level indicators may be adopted. That is, the driver state specifying unit 170 outputs one of the driver states shown in FIG. 5 from the combination of the passenger behavior and the driver consciousness level.
  • the vehicle control unit 180 controls the vehicle according to the driver state given by the driver state specifying unit 170.
  • For example, the vehicle control unit 180 controls the vehicle so that the vehicle travels safely.
  • the vehicle control unit 180 controls the vehicle when the driver state specified by the driver state specifying unit 170 indicates that the driver cannot normally drive the vehicle.
  • the vehicle control unit 180 stops the vehicle in a safe place.
  • FIG. 6 is a block diagram schematically showing the configuration of the vehicle control unit 180 according to the first embodiment.
  • the vehicle control unit 180 includes a road condition acquisition unit 181, a stop position determination unit 182, and a stop unit 183.
  • the road condition acquisition unit 181 acquires the road condition indicating the road condition.
  • the road condition acquisition unit 181 acquires the road conditions around the vehicle from a front camera, a side camera, a sonar, a millimeter wave radar, a LiDAR (Light Detection and Ranging), and the like installed in the vehicle, and gives the road conditions to the stop position determination unit 182.
  • the stop position determination unit 182 determines a stop position at which the vehicle can be safely stopped according to the road condition given by the road condition acquisition unit 181. Then, the stop position determination unit 182 gives the stop position information indicating the determined stop position to the stop unit 183.
  • the stop unit 183 stops the vehicle at the stop position indicated by the stop position information given by the stop position determination unit 182.
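The safe-stop flow of the vehicle control unit 180 (road condition acquisition, stop position determination, stopping) can be sketched as below. The shoulder/clear candidate criteria and the data shapes are illustrative assumptions, not details from the disclosure.

```python
# Illustrative safe-stop flow: filter candidate positions from the road
# conditions, pick one, and hand it to a stop callback (stop unit 183).
def choose_stop_position(road_conditions):
    # road_conditions: dicts like {"spot": ..., "shoulder": bool, "clear": bool}
    candidates = [c for c in road_conditions if c["shoulder"] and c["clear"]]
    return candidates[0]["spot"] if candidates else None

def safe_stop(road_conditions, stop_at):
    spot = choose_stop_position(road_conditions)
    if spot is not None:
        stop_at(spot)  # stop the vehicle at the chosen position
        return spot
    return None  # no safe position found in the current road conditions
```

In the device, `choose_stop_position` plays the role of the stop position determination unit 182 and `stop_at` that of the stop unit 183.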
  • A part or all of the vehicle control device 100 described above can be configured by, for example, a memory 10 and a processor 11 such as a CPU (Central Processing Unit) that executes a program stored in the memory 10, as shown in FIG. 7(A). Such a program may be provided through a network, or may be recorded and provided on a recording medium; that is, such a program may be provided, for example, as a program product. In such a case, the vehicle control device 100 can be realized by, for example, a computer.
  • A part or all of the vehicle control device 100 can also be configured by, for example, a processing circuit 12 such as a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array), as shown in FIG. 7(B). As described above, the vehicle control device 100 can be realized by a processing circuit.
  • FIG. 8 is a state chart diagram showing an example of a process of controlling a vehicle from the specific results of the driver consciousness level specifying unit 160 and the passenger behavior specifying unit 110 in the first embodiment.
  • the load analysis unit 140 of the passenger behavior specifying unit 110 determines whether or not the passenger has boarded based on the value of the load detected by the load detection unit 121 (S10).
  • driver consciousness level specifying unit 160 grasps the driver's behavior, specifies the driver consciousness level, and gives the specified driver consciousness level to the driver state specifying unit 170 (S11).
  • if there is a passenger in step S12, the passenger behavior specifying unit 110 grasps the behavior of the passenger, specifies the passenger behavior, and gives the passenger behavior information indicating the specified passenger behavior to the driver state specifying unit 170 (S13).
  • if the driver is on board but no passenger is on board, it is determined in step S12 that there is no passenger; in that case, the passenger behavior specifying unit 110 performs no processing, and the processing continues from the step following step S11. Here, it is assumed that the processing in the passenger behavior specifying unit 110 and the processing in the driver consciousness level specifying unit 160 are performed at the same time.
  • if there is a passenger, the driver state specifying unit 170 specifies the driver state according to the driver consciousness level given by the driver consciousness level specifying unit 160 and the passenger behavior given by the passenger behavior specifying unit 110 (S15). Then, the driver state specifying unit 170 provides the vehicle control unit 180 with the driver state information indicating the specified driver state.
  • if there is no passenger, the driver state specifying unit 170 specifies the driver state from the driver consciousness level given by the driver consciousness level specifying unit 160 (S16). Then, the driver state specifying unit 170 provides the vehicle control unit 180 with the driver state information indicating the specified driver state.
  • after step S15 or step S16, the vehicle control unit 180 controls the vehicle according to the driver state (S17).
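The branching described above (steps S10 to S17) can be sketched as follows. This is an illustrative sketch only: the state names, the refinement rule, and the control decisions are assumptions, not values taken from the embodiment.

```python
# Hypothetical sketch of the FIG. 8 flow. All names and rules are
# illustrative assumptions, not taken from the patent.

def specify_driver_state(consciousness_level, passenger_behavior=None):
    """Combine the driver consciousness level with the passenger
    behavior (if a passenger is on board) into a driver state."""
    if passenger_behavior is None:
        # No passenger on board: fall back to the consciousness level alone (S16).
        return consciousness_level
    # Passenger on board: refine the state with the observed behavior (S15).
    if consciousness_level == "drowsy" and passenger_behavior == "calling_out":
        return "state in which the driver does not awaken even if stimulated"
    return consciousness_level

def control_vehicle(driver_state):
    # S17: continue driving only while the driver is fully conscious;
    # otherwise fall back to a safe stop.
    return "continue" if driver_state == "clear consciousness" else "safe_stop"
```

In this sketch the passenger branch only refines the state; when no passenger is detected in S12, the consciousness level is used as the driver state directly, matching the two branches S15/S16 above.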
  • FIG. 9 is a state chart diagram showing an example of processing in the passenger behavior specifying unit 110.
  • FIG. 9 shows the details of the process in step S13 of FIG.
  • the face image acquisition unit 123 acquires a passenger's face image from an image captured by a camera, which is an image pickup unit (not shown) (S20).
  • the line-of-sight detection unit 124 detects the line-of-sight of the passenger from the acquired face image, generates line-of-sight data indicating the detected line-of-sight, and gives the line-of-sight data to the line-of-sight determination unit 125 (S21).
  • the line-of-sight determination unit 125 gives line-of-sight data to the line-of-sight analysis unit 141.
  • the line-of-sight analysis unit 141 identifies the direction the passenger is looking at by analyzing the line-of-sight data (S22). Then, the line-of-sight analysis unit 141 generates line-of-sight information indicating the specified direction, and gives the line-of-sight information to the line-of-sight determination unit 125. The line-of-sight determination unit 125 gives the line-of-sight information to the specific unit 146.
  • the heart rate detection unit 126 detects the heart rate of the passenger and gives the heart rate information indicating the detected heart rate to the heart rate determination unit 127 (S23).
  • the heart rate determination unit 127 gives the heart rate information to the specific unit 146.
  • the sweating amount detection unit 128 detects the sweating amount of the passenger and gives the sweating amount information indicating the detected sweating amount to the sweating amount determination unit 129 (S24).
  • the sweating amount determination unit 129 gives the sweating amount information to the specific unit 146.
  • the voice detection unit 130 detects the voice of the passenger and generates voice data indicating the detected voice (S25).
  • the voice detection unit 130 gives the generated voice data to the voice determination unit 131.
  • the voice determination unit 131 gives the voice data to the language analysis unit 143.
  • the language analysis unit 143 identifies the content of the voice emitted by the passenger by analyzing the voice data (S26). Specifically, the language analysis unit 143 divides the voice indicated by the voice data into morphemes by performing morphological analysis. Then, the language analysis unit 143 gives the divided morphemes to the dictionary reference unit 145.
  • the dictionary reference unit 145 identifies the meaning of each morpheme given by the language analysis unit 143 by referring to the dictionary data stored in the dictionary storage unit 144, and gives the meaning to the language analysis unit 143.
  • the language analysis unit 143 identifies the content of the voice emitted by the passenger from the meaning of each morpheme, and generates voice information indicating the specified content. Then, the language analysis unit 143 gives the voice information to the voice determination unit 131, and the voice determination unit 131 gives the voice information to the specific unit 146.
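The language analysis step (S26) can be sketched as follows. A toy whitespace split stands in for morphological analysis, and the dictionary data is invented for illustration; a real system would use an actual morphological analyzer and the dictionary data stored in the dictionary storage unit 144.

```python
# Toy sketch of the language analysis unit 143 and dictionary reference
# unit 145. The dictionary entries and category strings are invented.

DICTIONARY = {
    "wake": "urging the driver to awaken",
    "up": "direction/completion marker",
    "hey": "calling out",
}

def analyze_utterance(voice_text):
    # Stand-in for morphological analysis: split the utterance into "morphemes".
    morphemes = voice_text.lower().split()
    # Dictionary reference: look up the meaning of each morpheme.
    meanings = [DICTIONARY.get(m, "unknown") for m in morphemes]
    # Identify the content of the utterance from the morpheme meanings.
    if "calling out" in meanings or "urging the driver to awaken" in meanings:
        return "passenger is calling out to the driver"
    return "unrelated conversation"
```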
  • the specific unit 146 identifies the passenger behavior, which is the behavior of the passenger, based on the information given from the passenger monitoring unit 120 (S27).
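A minimal sketch of how the specific unit 146 might combine the information from the passenger monitoring unit 120 into a passenger behavior (S27); the thresholds and category names below are assumptions for illustration only.

```python
# Hedged sketch of the specific unit 146. The numeric thresholds and the
# behavior labels are invented for illustration.

def identify_passenger_behavior(gaze_direction, heart_rate, sweat_rate, utterance):
    looking_at_driver = gaze_direction == "driver"
    # Illustrative agitation thresholds (bpm and relative sweating amount).
    agitated = heart_rate > 100 or sweat_rate > 0.5
    calling = utterance == "passenger is calling out to the driver"
    if looking_at_driver and agitated and calling:
        return "trying to wake the driver"
    if looking_at_driver and agitated:
        return "worried about the driver"
    return "normal"
```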
  • FIG. 10 is a state chart diagram showing an example of processing of the vehicle control unit 180 in the first embodiment.
  • FIG. 10 shows the details of the process in step S17 of FIG. 8.
  • the road condition acquisition unit 181 determines whether or not the driver state indicated by the driver state information given by the driver state specifying unit 170 is clear consciousness (S30). If the driver state is clear consciousness, the process ends; if the driver state is not clear consciousness, the process proceeds to step S31.
  • the road condition acquisition unit 181 acquires the road condition indicating the condition of the surrounding road (S31).
  • the road condition acquisition unit 181 gives the acquired road condition to the stop position determination unit 182.
  • the stop position determination unit 182 determines a position where the vehicle can be safely stopped based on the given road condition (S32). Finally, the stop unit 183 stops the vehicle at the determined position (S33).
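The flow of FIG. 10 can be sketched as follows, assuming the road condition is represented as a list of candidate stop positions; the data shape and the selection rule (nearest safe candidate) are assumptions.

```python
# Illustrative sketch of the FIG. 10 flow. The road-condition data model is
# invented: a list of {"distance_m": ..., "safe": ...} candidates.

def choose_stop_position(road_condition):
    """Stop position determination unit 182: pick the nearest candidate
    that is marked safe, or None if there is none."""
    safe = [p for p in road_condition if p["safe"]]
    return min(safe, key=lambda p: p["distance_m"]) if safe else None

def control(driver_state, road_condition):
    if driver_state == "clear consciousness":
        return None                          # nothing to do; process ends
    return choose_stop_position(road_condition)  # acquire, determine, stop
```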
  • FIG. 11 is a schematic view showing an example of the inside of the cockpit 101 of a vehicle in which the vehicle control device 100 is mounted.
  • the cockpit 101 is provided with one sound collector 102 and one wide-angle imager 103.
  • the sound collector 102 is a sound collecting device that collects all the sounds from the driver's seat on the right side, the passenger seat on the left side, and the rear passenger seats.
  • the voice signal collected by the sound collector 102 is given to the voice detection unit 130.
  • the voice detection unit 130 can identify who uttered the voice from the reception direction of the voice signal. As a result, the voice detection unit 130 can distinguish the voice of the passenger from the other voices inside the vehicle.
  • the wide-angle imager 103 is an image pickup device as an image pickup unit capable of capturing images of all of the driver's seat on the right side, the passenger seat on the left side, and the rear passenger seats.
  • the image data of the image captured by the wide-angle imager 103 is given to the face image acquisition unit 123.
  • the face image acquisition unit 123 can identify the face of the passenger from the image shown by the image data.
  • in the above configuration, a person can be identified based on the data from one sound collector 102 and one wide-angle imager 103; the number of each is limited to one in order to reduce costs. If cost is not a concern, multiple sound collectors and imagers may be provided.
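One possible sketch of identifying the speaker from the reception direction of the voice signal at the single sound collector 102; the angular sectors below are invented and would have to be calibrated to the actual cabin layout.

```python
# Hypothetical seat mapping from the arrival azimuth of a voice signal,
# assuming a front-center microphone with the driver's seat on the right.
# The sector boundaries are illustrative assumptions.

def seat_from_direction(azimuth_deg):
    """Map an arrival azimuth (0 = straight back toward the front seats,
    positive = right) to a seat."""
    if abs(azimuth_deg) > 90:
        return "rear seats"
    if azimuth_deg > 0:
        return "driver's seat (front right)"
    if azimuth_deg < 0:
        return "passenger seat (front left)"
    return "unknown"
```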
  • as described above, in the first embodiment, the driver state can be specified from the passenger behavior and the driver consciousness level, and when the driver cannot drive the vehicle normally, the vehicle can be safely stopped at the stop position.
  • FIG. 12 is a block diagram schematically showing the configuration of the vehicle control device 200 according to the second embodiment.
  • the vehicle control device 200 includes a passenger behavior specifying unit 110, a driver consciousness level specifying unit 160, a driver state specifying unit 270, a vehicle control unit 280, and a driver state storage unit 290.
  • the passenger behavior specifying unit 110 and the driver consciousness level specifying unit 160 of the vehicle control device 200 according to the second embodiment are the same as the passenger behavior specifying unit 110 and the driver consciousness level specifying unit 160 of the vehicle control device 100 according to the first embodiment.
  • the driver state specifying unit 270 specifies a driver state, which is the state of the driver, based on a combination of the passenger behavior given by the passenger behavior specifying unit 110 and the driver consciousness level given by the driver consciousness level specifying unit 160.
  • when the amount of data stored in the driver state storage unit 290 is less than a predetermined threshold value and there is a passenger, the driver state specifying unit 270 specifies the driver state based on the combination of the passenger behavior and the driver consciousness level so that the driver state shown in the driver state result table 152 shown in FIG. 5 can be specified. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280, and stores the specified driver state in the driver state storage unit 290 as driver state-related information in which the specified driver state is associated with the combination of the passenger behavior and the driver consciousness level.
  • when the amount of data stored in the driver state storage unit 290 is less than the predetermined threshold value and there is no passenger, the driver state specifying unit 270 specifies the driver state based on the driver consciousness level so that the driver state shown in the driver state result table 152 shown in FIG. 5 can be specified. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280, and stores the specified driver state in the driver state storage unit 290 as driver state-related information in which the specified driver state is associated with the driver consciousness level.
  • when the amount of data stored in the driver state storage unit 290 is equal to or greater than the predetermined threshold value and there is a passenger, the driver state specifying unit 270 specifies, from the driver state-related information stored in the driver state storage unit 290, the driver state associated with the combination of the passenger behavior and the driver consciousness level. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280.
  • when the amount of data stored in the driver state storage unit 290 is equal to or greater than the predetermined threshold value and there is no passenger, the driver state specifying unit 270 specifies, from the driver state-related information stored in the driver state storage unit 290, the driver state associated with the driver consciousness level. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280.
  • the driver state storage unit 290 stores the driver state-related information given by the driver state specifying unit 270.
  • the vehicle control unit 280 controls the vehicle according to the driver state given by the driver state specifying unit 270.
  • the vehicle control unit 280 in the second embodiment automatically sets a nearby destination such as an emergency hospital according to the driver's condition, and automatically drives the vehicle to the set destination.
  • the vehicle control unit 280 in the second embodiment may stop the vehicle in a safe place when the driver cannot normally drive the vehicle, as in the vehicle control unit 180 in the first embodiment.
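The threshold-based switching described above, between rule-based identification and lookup from the driver state storage unit 290, can be sketched as follows; the class name, the threshold value, and the injected rule function are assumptions made for illustration.

```python
# Hedged sketch of the embodiment-2 switching logic: below a data-amount
# threshold the state is derived by the rule-based tables and the result is
# recorded; at or above the threshold, previously stored driver-state-related
# information is looked up instead. THRESHOLD is an invented value.

THRESHOLD = 3

class DriverStateSpecifier:
    def __init__(self, rule):
        self.rule = rule    # rule-based identification (e.g. the FIG. 5 table)
        self.store = {}     # stands in for the driver state storage unit 290

    def specify(self, consciousness_level, passenger_behavior=None):
        key = (consciousness_level, passenger_behavior)
        if len(self.store) >= THRESHOLD and key in self.store:
            # Enough data accumulated: look up the stored related information.
            return self.store[key]
        # Not enough data yet: apply the rule and record the association.
        state = self.rule(consciousness_level, passenger_behavior)
        self.store[key] = state
        return state
```

In this sketch the lookup avoids re-running the rule-based identification once enough associations have been accumulated, which is the processing-load reduction the second embodiment describes.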
  • FIG. 13 is a block diagram schematically showing the configuration of the vehicle control unit 280 according to the second embodiment.
  • the vehicle control unit 280 includes a destination automatic setting unit 284, a route search determination unit 285, and a travel control unit 286.
  • the destination automatic setting unit 284 identifies the position of the vehicle from an INS (Inertial Navigation System) or a GPS (Global Positioning System) receiver installed in the vehicle, and sets a destination according to the driver state by referring to map information for that position. For example, when the driver state is the "state in which the driver does not awaken even if stimulated" shown in FIG. 5, the destination automatic setting unit 284 searches for an emergency hospital near the vehicle and sets the emergency hospital as the destination. Then, the destination automatic setting unit 284 gives the destination information indicating the set destination to the route search determination unit 285. In addition, it is assumed that the type of destination to be set is predetermined in the destination automatic setting unit 284 for each driver state for which a destination should be set. The destination automatic setting unit 284 may then specify the type of destination according to the driver state, and select, from the destinations of the specified type in the map information, the destination closest to the vehicle.
  • the route search determination unit 285 searches for a route on which the vehicle should travel to the destination indicated by the destination information given by the destination automatic setting unit 284, and generates route information indicating the route. Then, the route search determination unit 285 gives the route information to the travel control unit 286.
  • the travel control unit 286 drives the vehicle to the destination according to the route information given by the route search determination unit 285.
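The destination automatic setting described above can be sketched as follows, assuming map information is a list of typed entries and using straight-line distance; the state-to-type table and all data below are illustrative assumptions.

```python
# Illustrative sketch of the destination automatic setting unit 284: choose
# the nearest map entry of the type predetermined for the driver state.
# The map data model, the TYPE_FOR_STATE table, and the use of Euclidean
# distance are all invented for this sketch.

import math

TYPE_FOR_STATE = {
    "state in which the driver does not awaken even if stimulated": "emergency hospital",
}

def nearest_destination(vehicle_pos, driver_state, map_info):
    dest_type = TYPE_FOR_STATE.get(driver_state)
    if dest_type is None:
        return None                        # no destination needs to be set
    candidates = [e for e in map_info if e["type"] == dest_type]
    # Select the destination of the specified type closest to the vehicle.
    return min(
        candidates,
        key=lambda e: math.dist(vehicle_pos, e["pos"]),
        default=None,
    )
```

A real implementation would use road-network distance from the route search determination unit 285 rather than straight-line distance.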
  • a part or all of the vehicle control device 200 described above can also be configured by, for example, a memory 10 and a processor 11 such as a CPU that executes a program stored in the memory 10, as shown in FIG. 7A. Further, a part or all of the vehicle control device 200 may be configured by, for example, a processing circuit 12 such as a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC, or an FPGA, as shown in FIG. 7B. As described above, the vehicle control device 200 can be realized by processing circuitry.
  • FIG. 14 is a state chart diagram showing an example of a process of controlling a vehicle from the specific results of the driver consciousness level specifying unit 160 and the passenger behavior specifying unit 110 in the second embodiment.
  • the steps that perform the same processing as the steps shown in FIG. 8 are designated by the same reference numerals as those in FIG. 8, and detailed description thereof will be omitted.
  • the process of steps S10 to S13 in FIG. 14 is the same as the process of steps S10 to S13 in FIG. 8. However, after steps S11 and S13 in FIG. 14, the process proceeds to step S44.
  • in step S44, the driver state specifying unit 270 determines whether or not the amount of data stored in the driver state storage unit 290 is sufficient. If the amount of data is sufficient, the process proceeds to step S45; if not, the process proceeds to step S48.
  • in step S45, the driver state specifying unit 270 determines whether or not there is a passenger.
  • if there is a passenger, the driver state specifying unit 270 specifies, from the driver state-related information stored in the driver state storage unit 290, the driver state associated with the combination of the driver consciousness level given by the driver consciousness level specifying unit 160 and the passenger behavior given by the passenger behavior specifying unit 110 (S46). Then, the driver state specifying unit 270 provides the vehicle control unit 280 with the driver state information indicating the specified driver state.
  • if there is no passenger, the driver state specifying unit 270 specifies, from the driver state-related information stored in the driver state storage unit 290, the driver state associated with the driver consciousness level given by the driver consciousness level specifying unit 160 (S47). Then, the driver state specifying unit 270 provides the vehicle control unit 280 with the driver state information indicating the specified driver state.
  • in step S48, the driver state specifying unit 270 determines whether or not there is a passenger.
  • if there is a passenger, the driver state specifying unit 270 specifies the driver state according to the driver consciousness level given by the driver consciousness level specifying unit 160 and the passenger behavior given by the passenger behavior specifying unit 110 (S49).
  • here, the driver state specifying unit 270 specifies the driver state shown in the driver state result table 152 shown in FIG. 5. Then, the driver state specifying unit 270 provides the vehicle control unit 280 with the driver state information indicating the specified driver state.
  • further, the driver state specifying unit 270 generates driver state-related information in which the combination of the driver consciousness level given by the driver consciousness level specifying unit 160 and the passenger behavior given by the passenger behavior specifying unit 110 is associated with the driver state specified in step S49, and stores the driver state-related information in the driver state storage unit 290 (S50).
  • if there is no passenger, the driver state specifying unit 270 specifies the driver state from the driver consciousness level given by the driver consciousness level specifying unit 160 (S51).
  • here, the driver state specifying unit 270 specifies the driver state shown in the driver state result table 152 shown in FIG. 5. Then, the driver state specifying unit 270 provides the vehicle control unit 280 with the driver state information indicating the specified driver state.
  • the driver state specifying unit 270 generates driver state-related information in which the driver consciousness level given by the driver consciousness level specifying unit 160 is associated with the driver state specified in step S51, and stores the driver state-related information in the driver state storage unit 290 (S52).
  • after step S46, S47, S50, or S52, the vehicle control unit 280 controls the vehicle according to the driver state (S53).
  • FIG. 15 is a state chart diagram showing an example of processing of the vehicle control unit 280 in the second embodiment.
  • FIG. 15 shows the details of the process in step S53 of FIG. 14.
  • the destination automatic setting unit 284 sets a nearby emergency hospital or the like as the destination according to the driver's condition (S60).
  • the route search determination unit 285 searches for a route to the set destination and determines the route (S61). Finally, the travel control unit 286 automatically drives the vehicle safely to the destination according to the determined route (S62).
  • as described above, in the second embodiment, the driver state can be specified from the passenger behavior and the driver consciousness level, a destination can be automatically set according to the driver state, and the vehicle can be automatically driven to the set destination. Further, when the stored data amount of the driver state-related information becomes equal to or greater than a predetermined threshold value, the method of specifying the driver state is switched to a method of specifying, from the driver state-related information, the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior, so that the processing load can be reduced.
  • the present disclosure is not limited to the first or second embodiment described above, and can be implemented in various forms without departing from the gist described above.
  • 100, 200 vehicle control device 110 passenger behavior identification unit, 111 load identification unit, 112 line-of-sight identification unit, 113 heart rate identification unit, 114 sweating amount identification unit, 115 voice identification unit, 120 passenger monitoring unit, 121 load detection Unit, 122 load determination unit, 123 face image acquisition unit, 124 line-of-sight detection unit, 125 line-of-sight determination unit, 126 heart rate detection unit, 127 heart rate determination unit, 128 sweating amount detection unit, 129 sweating amount determination unit, 130 voice detection Unit, 131 voice judgment unit, 140 load analysis unit, 141 line-of-sight analysis unit, 142 voice analysis unit, 143 language analysis unit, 144 dictionary storage unit, 145 dictionary reference unit, 146 specific unit, 160 driver awareness level specific unit, 170 , 270 Driver status identification unit, 180, 280 vehicle control unit, 181 road condition acquisition unit, 182 stop position determination unit, 183 stop unit, 284 destination automatic setting unit, 285 route search determination unit, 286 travel control unit, 290 Driver state storage unit.

Abstract

The present invention comprises: a driver awareness level specifying unit (160) that specifies a driver awareness level, which is the level of awareness of the driver of a vehicle; a passenger behavior specifying unit (110) that specifies passenger behavior, which is the behavior of a passenger in the vehicle other than the driver; a driver state specifying unit (170) that specifies a driver state, which is the state of the driver, based on a combination of the driver awareness level and the passenger behavior; and a vehicle control unit (180) that controls the vehicle in accordance with the driver state.

Description

Vehicle control device, program, and vehicle control method
 The present disclosure relates to a vehicle control device, a program, and a vehicle control method.
 Patent Document 1 describes a driving support device that determines whether or not a passenger is nervous when the direction of the passenger's line of sight is toward the outside of the vehicle or toward the front of the vehicle. This driving support device supports the driver's driving operation based on the degree of tension of the passenger.
Japanese Unexamined Patent Application Publication No. 2014-75008
 In the driving support device described in Patent Document 1, the direction of the passenger's line of sight is toward the outside of the vehicle or toward the front of the vehicle, so the passenger is not observing the driver; therefore, the driver's state information as seen by the passenger cannot be reflected in the determination of the driver's state.
 Accordingly, one or more aspects of the present disclosure aim to enable the driver's state to be determined more accurately by also using the passenger's behavior when determining the driver's state.
 A vehicle control device according to one aspect of the present disclosure includes: a driver consciousness level specifying unit that specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle; a passenger behavior specifying unit that specifies passenger behavior, which is the behavior of a passenger other than the driver in the vehicle; a driver state specifying unit that specifies a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior; and a vehicle control unit that controls the vehicle according to the driver state.
 A vehicle control device according to another aspect of the present disclosure includes: a driver state storage unit that stores driver state-related information associating a combination of a driver consciousness level, which is the level of consciousness of the driver of a vehicle, and passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, with a driver state, which is the state of the driver; a driver consciousness level specifying unit that specifies the driver consciousness level from the driver; a passenger behavior specifying unit that specifies the passenger behavior from the passenger; a driver state specifying unit that, when the amount of data stored in the driver state storage unit is less than a predetermined threshold value, specifies the driver state from the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit, and stores the combination and the specified driver state in association with each other in the driver state storage unit as the driver state-related information, and, when the amount of data stored in the driver state storage unit is equal to or greater than the predetermined threshold value, specifies, from the driver state-related information, the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior; and a vehicle control unit that controls the vehicle according to the driver state specified by the driver state specifying unit.
 A program according to one aspect of the present disclosure causes a computer to function as: a driver consciousness level specifying unit that specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle; a passenger behavior specifying unit that specifies passenger behavior, which is the behavior of a passenger other than the driver in the vehicle; a driver state specifying unit that specifies a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior; and a vehicle control unit that controls the vehicle according to the driver state.
 A program according to another aspect of the present disclosure causes a computer to function as: a driver state storage unit that stores driver state-related information associating a combination of a driver consciousness level, which is the level of consciousness of the driver of a vehicle, and passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, with a driver state, which is the state of the driver; a driver consciousness level specifying unit that specifies the driver consciousness level from the driver; a passenger behavior specifying unit that specifies the passenger behavior from the passenger; a driver state specifying unit that, when the amount of data stored in the driver state storage unit is less than a predetermined threshold value, specifies the driver state from the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit, and stores the combination and the specified driver state in association with each other in the driver state storage unit as the driver state-related information, and, when the amount of data stored in the driver state storage unit is equal to or greater than the predetermined threshold value, specifies, from the driver state-related information, the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior; and a vehicle control unit that controls the vehicle according to the driver state specified by the driver state specifying unit.
 A vehicle control method according to one aspect of the present disclosure includes: specifying a driver consciousness level, which is the level of consciousness of the driver of a vehicle; specifying passenger behavior, which is the behavior of a passenger other than the driver in the vehicle; specifying a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior; and controlling the vehicle according to the driver state.
 A vehicle control method according to another aspect of the present disclosure specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle, specifies passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, specifies a driver state, which is the state of the driver, from the combination of the specified driver consciousness level and the specified passenger behavior, and stores driver state-related information associating the combination of the specified driver consciousness level and the specified passenger behavior with the specified driver state, wherein, when the amount of data of the stored driver state-related information becomes equal to or greater than a predetermined threshold value, the method of specifying the driver state is switched to a method of specifying, from the driver state-related information, the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior.
 According to one or more aspects of the present disclosure, by also using the behavior of a passenger when determining the state of the driver, the state of the driver can be determined more accurately.
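As an illustrative sketch only, and not part of the disclosure, the two method aspects above can be outlined as follows. The rule, the threshold value, and the state and behavior labels are hypothetical stand-ins:

```python
# Hypothetical sketch of the claimed control flow: specify the driver
# consciousness level and the passenger behavior, combine them into a
# driver state, and switch to a lookup over stored driver-state-related
# information once the stored data amount reaches a threshold.
THRESHOLD = 3  # hypothetical data-amount threshold

def specify_driver_state(level, behavior, history):
    """Return a driver state for the (level, behavior) combination."""
    if len(history) >= THRESHOLD and (level, behavior) in history:
        # Enough driver-state-related information is stored:
        # specify the state associated with this combination.
        return history[(level, behavior)]
    # Otherwise apply a predetermined rule (invented example rule).
    state = ("unconscious"
             if level == "low" and behavior == "shaking_driver"
             else "normal")
    history[(level, behavior)] = state  # store driver-state-related info
    return state

def control_vehicle(state):
    # Control the vehicle according to the driver state.
    return "stop_at_safe_place" if state == "unconscious" else "continue"

history = {}
state = specify_driver_state("low", "shaking_driver", history)
assert control_vehicle(state) == "stop_at_safe_place"
```

The switch from the rule to the stored-information lookup mirrors the threshold condition in the second aspect; a real system would persist the history across trips.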
FIG. 1 is a block diagram schematically showing the configuration of the vehicle control device according to Embodiment 1.
FIG. 2 is a block diagram schematically showing the configuration of the passenger behavior specifying unit in Embodiment 1.
FIG. 3 is a schematic diagram showing an example of a passenger behavior result table for specifying passenger behavior.
FIG. 4 is a schematic diagram showing an example of a driver consciousness level result table for specifying the driver consciousness level.
FIG. 5 is a schematic diagram showing an example of a driver state result table for specifying the driver state.
FIG. 6 is a block diagram schematically showing the configuration of the vehicle control unit in Embodiment 1.
FIGS. 7(A) and 7(B) are block diagrams showing hardware configuration examples.
FIG. 8 is a state chart showing an example of processing for controlling the vehicle from the results specified by the driver consciousness level specifying unit and the passenger behavior specifying unit in Embodiment 1.
FIG. 9 is a state chart showing an example of processing in the passenger behavior specifying unit.
FIG. 10 is a state chart showing an example of processing of the vehicle control unit in Embodiment 1.
FIG. 11 is a schematic diagram showing an example of the interior of the cockpit of a vehicle equipped with the vehicle control device.
FIG. 12 is a block diagram schematically showing the configuration of the vehicle control device according to Embodiment 2.
FIG. 13 is a block diagram schematically showing the configuration of the vehicle control unit in Embodiment 2.
FIG. 14 is a state chart showing an example of processing for controlling the vehicle from the results specified by the driver consciousness level specifying unit and the passenger behavior specifying unit in Embodiment 2.
FIG. 15 is a state chart showing an example of processing of the vehicle control unit in Embodiment 2.
Embodiment 1.
 FIG. 1 is a block diagram schematically showing the configuration of the vehicle control device 100 according to Embodiment 1.
 The vehicle control device 100 includes a passenger behavior specifying unit 110, a driver consciousness level specifying unit 160, a driver state specifying unit 170, and a vehicle control unit 180.
 The passenger behavior specifying unit 110 specifies passenger behavior, which is the behavior of a passenger other than the driver in the vehicle.
 For example, the passenger behavior specifying unit 110 specifies the passenger behavior from information about the passenger, such as the passenger's line-of-sight information, voice information, or biological information, and gives the specified passenger behavior to the driver state specifying unit 170.
 Here, the passenger behavior is desirably behavior that the passenger performs while observing the driver.
 FIG. 2 is a block diagram schematically showing the configuration of the passenger behavior specifying unit 110 in Embodiment 1.
 The passenger behavior specifying unit 110 includes a passenger monitoring unit 120, a load analysis unit 140, a line-of-sight analysis unit 141, a voice analysis unit 142, and a specifying unit 146.
 The passenger monitoring unit 120 detects the state of the passenger.
 The passenger monitoring unit 120 includes a load detection unit 121, a load determination unit 122, a face image acquisition unit 123, a line-of-sight detection unit 124, a line-of-sight determination unit 125, a heart rate detection unit 126, a heart rate determination unit 127, a sweating amount detection unit 128, a sweating amount determination unit 129, a voice detection unit 130, and a voice determination unit 131.
 The load detection unit 121 detects the value of the load on the passenger seat from a signal detected by a load sensor (not shown) installed in the passenger seat of the vehicle. The load detection unit 121 then generates load detection data indicating the value of the load and gives the load detection data to the load determination unit 122.
 The load determination unit 122 gives the load detection data given from the load detection unit 121 to the load analysis unit 140.
 Further, as described later, the load determination unit 122 gives load information generated by the load analysis unit 140 to the specifying unit 146.
 The load analysis unit 140 detects the presence or absence of a passenger by analyzing the load detection data given from the load detection unit 121. Further, when there is a passenger, the load analysis unit 140 analyzes the load detection data to specify the position of the passenger in the vehicle, the direction in which the load is applied by the passenger, and the magnitude of the load, and generates load information indicating the specified position, direction, and magnitude. The load information is given to the load determination unit 122, and from the load determination unit 122 to the specifying unit 146.
 For example, when a passenger gets in, the load sensor installed in the passenger seat reacts, so the load analysis unit 140 can detect that the passenger has boarded. Further, when the passenger shifts his or her center of gravity toward the driver's seat or toward the window, the load analysis unit 140 can detect in which direction the passenger has shifted the center of gravity from the load state detected by the load sensor.
 When a passenger is on board, the face image acquisition unit 123 identifies the passenger's face from image data representing an image captured of the interior of the vehicle, and acquires face image data, which is data of a face image showing the identified face. The face image data is given to the line-of-sight detection unit 124.
 The line-of-sight detection unit 124 detects the passenger's line of sight from the face image indicated by the face image data given from the face image acquisition unit 123. The line-of-sight detection unit 124 generates line-of-sight data indicating the detected line of sight and gives the line-of-sight data to the line-of-sight determination unit 125.
 The line-of-sight determination unit 125 gives the line-of-sight data given from the line-of-sight detection unit 124 to the line-of-sight analysis unit 141.
 Then, as described later, the line-of-sight determination unit 125 gives line-of-sight information generated by the line-of-sight analysis unit 141 to the specifying unit 146.
 The line-of-sight analysis unit 141 specifies the direction in which the passenger is looking by analyzing the line-of-sight data given from the line-of-sight detection unit 124. The line-of-sight analysis unit 141 then generates line-of-sight information indicating the direction in which the passenger is looking and gives the line-of-sight information to the line-of-sight determination unit 125.
 The heart rate detection unit 126 detects the heart rate of the passenger and generates heart rate information indicating the detected heart rate. The heart rate information is given to the heart rate determination unit 127.
 The heart rate determination unit 127 gives the heart rate information to the specifying unit 146.
 The sweating amount detection unit 128 detects the sweating amount of the passenger and generates sweating amount information indicating the detected sweating amount. The sweating amount information is given to the sweating amount determination unit 129.
 The sweating amount determination unit 129 gives the sweating amount information to the specifying unit 146.
 The voice detection unit 130 detects voice uttered by the passenger and generates voice data indicating the detected voice. The voice data is given to the voice determination unit 131.
 The voice determination unit 131 gives the voice data to the voice analysis unit 142.
 Further, as described later, the voice determination unit 131 gives voice information generated by the voice analysis unit 142 to the specifying unit 146.
 The voice analysis unit 142 specifies the content of the voice uttered by the passenger by analyzing the voice data.
 The voice analysis unit 142 includes a language analysis unit 143, a dictionary storage unit 144, and a dictionary reference unit 145.
 The language analysis unit 143 divides the voice indicated by the voice data into morphemes by performing morphological analysis. The language analysis unit 143 then gives the divided morphemes to the dictionary reference unit 145.
 The dictionary storage unit 144 stores dictionary data indicating the meaning of each morpheme.
 The dictionary reference unit 145 specifies the meaning of each morpheme given from the language analysis unit 143 by referring to the dictionary data stored in the dictionary storage unit 144, and gives the meaning to the language analysis unit 143.
 The language analysis unit 143 specifies the content of the voice uttered by the passenger from the meaning of each morpheme, and generates voice information indicating the specified content. The voice information is given to the voice determination unit 131, and the voice determination unit 131 gives the voice information to the specifying unit 146. Here, the voice uttered by the passenger includes non-verbal sounds such as screams or sobs.
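As an illustrative sketch only: the toy tokenizer and dictionary below are hypothetical stand-ins for the language analysis unit 143 and the dictionary data of the dictionary storage unit 144 (a real implementation would use a Japanese morphological analyzer):

```python
# Toy sketch of the voice analysis flow: split an utterance into morphemes,
# look up each morpheme's meaning in dictionary data, and classify the
# content of the utterance, including non-verbal sounds such as screams.
DICTIONARY = {  # hypothetical dictionary data (morpheme -> meaning)
    "wake": "call_to_attention",
    "up": "direction",
    "help": "distress",
}

def analyze_utterance(utterance):
    morphemes = utterance.split()  # stand-in for morphological analysis
    meanings = [DICTIONARY.get(m) for m in morphemes]
    if "distress" in meanings:
        return "specific_words"    # e.g. a call for help
    if utterance.isupper():
        return "scream"            # non-verbal, e.g. a scream
    if not utterance:
        return "no_speech"
    return "arbitrary_words"

assert analyze_utterance("help") == "specific_words"
assert analyze_utterance("") == "no_speech"
```

The classification labels correspond to the voice-related passenger behaviors described below for FIG. 3.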
 As described above, the load detection unit 121, the load determination unit 122, and the load analysis unit 140 constitute a load specifying unit 111 that specifies the direction and magnitude of the load in the passenger's seat.
 Further, the face image acquisition unit 123, the line-of-sight detection unit 124, the line-of-sight determination unit 125, and the line-of-sight analysis unit 141 constitute a line-of-sight specifying unit 112 that specifies the direction of the passenger's line of sight.
 The heart rate detection unit 126 and the heart rate determination unit 127 constitute a heart rate specifying unit 113 that specifies the heart rate of the passenger.
 The sweating amount detection unit 128 and the sweating amount determination unit 129 constitute a sweating amount specifying unit 114 that specifies the sweating amount of the passenger.
 The heart rate specifying unit 113 and the sweating amount specifying unit 114 constitute a physical quantity specifying unit that specifies physical quantities relating to the passenger's body. In Embodiment 1, the heart rate and the sweating amount are specified as the physical quantities relating to the passenger's body, but Embodiment 1 is not limited to such an example. For example, blood pressure, respiratory rate, or the like may be specified. The physical quantity specifying unit gives biological information indicating the physical quantities relating to the passenger's body to the specifying unit 146.
 Further, the voice detection unit 130, the voice determination unit 131, and the voice analysis unit 142 constitute a voice specifying unit 115 that specifies the content of the voice uttered by the passenger.
 Note that the passenger behavior specifying unit 110 need not include all of the load specifying unit 111, the line-of-sight specifying unit 112, the heart rate specifying unit 113, the sweating amount specifying unit 114, and the voice specifying unit 115; it only needs to include at least one of them.
 The specifying unit 146 specifies the passenger behavior, which is the behavior of the passenger, based on the information given from the passenger monitoring unit 120. For example, the specifying unit 146 specifies passenger behavior as shown in FIG. 3 based on the information given from the passenger monitoring unit 120.
 FIG. 3 is a schematic diagram showing an example of a passenger behavior result table for specifying passenger behavior.
 The passenger behavior result table 150 shown in FIG. 3 includes an applicability column 150a, a PID (Passenger IDentification) column 150b, and a passenger behavior column 150c.
 The passenger behavior column 150c stores the passenger behavior.
 The PID column 150b stores a PID, which is passenger behavior identification information for identifying the passenger behavior.
 The applicability column 150a stores whether or not the passenger behavior applies, according to the information given from the passenger monitoring unit 120.
 For example, based on the load information, the specifying unit 146 specifies that "the passenger's center of gravity is leaning toward the driver," "the passenger's center of gravity is leaning toward the window," "the passenger is standing up from the seat," or "the passenger is shaking the driver." Specifically, when none of "the passenger's center of gravity is leaning toward the driver," "the passenger is standing up from the seat," and "the passenger is shaking the driver" applies, the specifying unit 146 may determine that "the passenger's center of gravity is leaning toward the window."
 Based on the line-of-sight information, the specifying unit 146 specifies that "the passenger's line of sight is directed forward" or "the passenger's line of sight is directed at the driver." Specifically, when "the passenger's line of sight is directed at the driver" does not apply, the specifying unit 146 may determine that "the passenger's line of sight is directed forward."
 Based on the heart rate information, the specifying unit 146 specifies that "the passenger's heart rate is high," "the passenger's heart rate is normal," or "the passenger's heart rate is low." For example, the specifying unit 146 may make this determination using predetermined threshold values.
 Based on the sweating amount information, the specifying unit 146 specifies that "the passenger's sweating amount is large," "the passenger's sweating amount is normal," or "the passenger's sweating amount is small." For example, the specifying unit 146 may make this determination using predetermined threshold values.
 Based on the voice information, the specifying unit 146 specifies that "the passenger is uttering arbitrary words," "the passenger is uttering specific words," "the passenger is screaming," or "the passenger is not speaking." Specifically, when none of "the passenger is uttering specific words," "the passenger is screaming," and "the passenger is not speaking" applies, the specifying unit 146 may determine that "the passenger is uttering arbitrary words."
 The specifying unit 146 gives passenger behavior information indicating the passenger behavior specified as described above to the driver state specifying unit 170. Although various data such as load, line of sight, heart rate, sweating amount, and voice are used here to determine the passenger's behavior, other data relating to the passenger, such as pupil diameter or the number of blinks, may also be used.
 Here, the specifying unit 146 acquires information from all of the load specifying unit 111, the line-of-sight specifying unit 112, the heart rate specifying unit 113, the sweating amount specifying unit 114, and the voice specifying unit 115 to specify the passenger behavior, but Embodiment 1 is not limited to such an example. The specifying unit 146 only needs to be able to specify the passenger behavior from the result specified by at least one of the load specifying unit 111, the line-of-sight specifying unit 112, the heart rate specifying unit 113, the sweating amount specifying unit 114, and the voice specifying unit 115.
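As an illustrative sketch only: the threshold values and behavior labels below are hypothetical, but the logic mirrors the threshold-based and fallback classifications described above for the specifying unit 146:

```python
# Hypothetical sketch of the specifying-unit logic: map monitored values to
# passenger-behavior items using predetermined thresholds and fallback rules.
HR_HIGH, HR_LOW = 100, 50  # hypothetical heart-rate thresholds (bpm)

def classify_heart_rate(bpm):
    if bpm > HR_HIGH:
        return "heart_rate_high"
    if bpm < HR_LOW:
        return "heart_rate_low"
    return "heart_rate_normal"

def classify_gaze(gaze_target):
    # Fallback rule: anything other than "driver" counts as looking forward.
    return "gaze_at_driver" if gaze_target == "driver" else "gaze_forward"

def specify_passenger_behavior(bpm, gaze_target):
    # A real specifying unit would combine load, sweating, and voice items
    # as well; only two of the monitored signals are sketched here.
    return {classify_heart_rate(bpm), classify_gaze(gaze_target)}

assert specify_passenger_behavior(120, "driver") == {
    "heart_rate_high", "gaze_at_driver"}
```

Each returned label would mark one row of the passenger behavior result table 150 as applicable.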
 Returning to FIG. 1, the driver consciousness level specifying unit 160 specifies the driver consciousness level, which is the level of the driver's consciousness. For example, similarly to the passenger behavior specifying unit 110, the driver consciousness level specifying unit 160 specifies a driver consciousness level shown in the table of FIG. 4 based on at least one of the driver's image, voice, heart rate, and sweating amount. The specified driver consciousness level is given to the driver state specifying unit 170.
 FIG. 4 is a schematic diagram showing an example of a driver consciousness level result table for specifying the driver consciousness level.
 The driver consciousness level result table 151 shown in FIG. 4 includes an applicability column 151a, a DID (Driver IDentification) column 151b, and a driver consciousness level column 151c.
 The driver consciousness level column 151c stores the driver consciousness level.
 The DID column 151b stores a DID, which is driver consciousness level identification information for identifying the driver consciousness level.
 The applicability column 151a stores whether or not the driver consciousness level applies, as determined by the driver consciousness level specifying unit 160.
 The driver state specifying unit 170 specifies the driver state, which is the state of the driver, from the combination of the passenger behavior given from the passenger behavior specifying unit 110 and the driver consciousness level given from the driver consciousness level specifying unit 160.
 Note that when no passenger is in the vehicle, the driver state specifying unit 170 may specify the driver state from the driver consciousness level alone.
 For example, the driver state specifying unit 170 specifies a driver state shown in the table of FIG. 5. The specified driver state is given to the vehicle control unit 180.
 FIG. 5 is a schematic diagram showing an example of a driver state result table for specifying the driver state.
 The driver state result table 152 shown in FIG. 5 includes an applicability column 152a, a JCS (Japan Coma Scale) column 152b, and a driver state column 152c.
 The JCS column 152b stores a consciousness level number specified by the JCS, indicating the driver state.
 The driver state column 152c stores the driver state result corresponding to the consciousness level number.
 The applicability column 152a stores whether or not the driver state applies, as determined by the driver state specifying unit 170.
 Here, it is assumed that each combination of one or more passenger behaviors specified in the passenger behavior result table shown in FIG. 3 and one or more driver consciousness levels specified in the driver consciousness level result table shown in FIG. 4 is associated with one of the driver states specified in the driver state table shown in FIG. 5.
 For example, when the passenger behavior corresponds to two or more of the items P01, P03, P04, P05, P07, P09, P11, P12, and P15 in the passenger behavior result table shown in FIG. 3, and the driver consciousness level corresponds to two or more of the items D02, D03, D05, D06, D08, D09, D10, D11, D13, D14, D15, and D16 in the driver consciousness level result table shown in FIG. 4, the combination corresponds to JCS "300" shown in FIG. 5. In this case, the driver state specifying unit 170 determines that the driver is unconscious.
 In the present embodiment, the JCS is adopted for the driver state, but the present embodiment is not limited to such an example. For example, the GCS (Glasgow Coma Scale) or another consciousness level index may be adopted.
 That is, the driver state specifying unit 170 outputs one of the driver states shown in FIG. 5 from the combination of the passenger behavior and the driver consciousness level.
 The vehicle control unit 180 controls the vehicle according to the driver state given from the driver state specifying unit 170. Here, the vehicle control unit 180 controls the vehicle safely.
 Specifically, the vehicle control unit 180 controls the vehicle when the driver state specified by the driver state specifying unit 170 indicates that the driver cannot drive the vehicle normally. Here, the vehicle control unit 180 stops the vehicle in a safe place.
 FIG. 6 is a block diagram schematically showing the configuration of the vehicle control unit 180 in Embodiment 1.
 The vehicle control unit 180 includes a road condition acquisition unit 181, a stop position determination unit 182, and a stop unit 183.
 The road condition acquisition unit 181 acquires road conditions indicating the state of the road. For example, although not shown, the road condition acquisition unit 181 acquires the road conditions around the vehicle from a front camera, side cameras, sonar, millimeter-wave radar, LIDAR (Light Detection and Ranging), or the like installed in the vehicle, and gives the road conditions to the stop position determination unit 182.
 The stop position determination unit 182 determines a stop position at which the vehicle can be stopped safely, according to the road conditions given from the road condition acquisition unit 181. The stop position determination unit 182 then gives stop position information indicating the determined stop position to the stop unit 183.
 The stop unit 183 stops the vehicle at the stop position indicated by the stop position information given from the stop position determination unit 182.
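As an illustrative sketch only: the candidate format and scoring below are invented for illustration; a real stop position determination unit would fuse camera, sonar, radar, and LIDAR data:

```python
# Hypothetical sketch of the vehicle control unit's safe-stop pipeline:
# pick a stop position from the acquired road conditions and stop there.
def determine_stop_position(road_conditions):
    # Prefer a clear candidate; among clear candidates, take the nearest.
    candidates = [c for c in road_conditions if c["clear"]]
    if not candidates:
        return None
    return min(candidates, key=lambda c: c["distance_m"])

def stop_vehicle(position):
    # With no safe stop position, fall back to slowing in the current lane.
    return f"stopping at {position['name']}" if position else "slowing in lane"

conditions = [{"name": "shoulder", "clear": True, "distance_m": 40},
              {"name": "bus stop", "clear": False, "distance_m": 10}]
assert stop_vehicle(determine_stop_position(conditions)) == "stopping at shoulder"
```

The in-lane fallback is an assumption of this sketch, not a behavior described in the disclosure.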
 A part or all of the vehicle control device 100 described above can be configured by, for example, a memory 10 and a processor 11 such as a CPU (Central Processing Unit) that executes a program stored in the memory 10, as shown in FIG. 7(A). Such a program may be provided through a network, or may be provided recorded on a recording medium. That is, such a program may be provided, for example, as a program product. In such a case, the vehicle control device 100 can be realized by, for example, a computer.
 Alternatively, a part or all of the vehicle control device 100 can be configured by a processing circuit 12 such as a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array), as shown in FIG. 7(B).
 As described above, the vehicle control device 100 can be realized by processing circuitry.
 FIG. 8 is a state chart showing an example of processing for controlling the vehicle from the results specified by the driver consciousness level specifying unit 160 and the passenger behavior specifying unit 110 in Embodiment 1.
 First, when the driver gets in, the load analysis unit 140 of the passenger behavior specifying unit 110 determines whether a passenger has boarded based on the load value detected by the load detection unit 121 (S10).
 Further, the driver consciousness level specifying unit 160 grasps the driver's behavior, specifies the driver consciousness level, and gives the specified driver consciousness level to the driver state specifying unit 170 (S11).
 When there is a passenger in step S12, the passenger behavior specifying unit 110 grasps the passenger's behavior, specifies the passenger behavior, and gives passenger behavior information indicating the specified passenger behavior to the driver state specifying unit 170 (S13).
 Note that when the driver has boarded but no passenger has, it is determined in step S12 that there is no passenger; the passenger behavior specifying unit 110 performs no processing, and the processing continues to the step following S11. Here, the processing in the passenger behavior specifying unit 110 and the processing in the driver consciousness level specifying unit 160 are assumed to be performed simultaneously.
 When there is a passenger in step S14, the driver state specifying unit 170 specifies the driver state according to the driver consciousness level given from the driver consciousness level specifying unit 160 and the passenger behavior given from the passenger behavior specifying unit 110 (S15). The driver state specifying unit 170 then gives driver state information indicating the specified driver state to the vehicle control unit 180.
When there is no passenger in step S14, the driver state specifying unit 170 specifies the driver state from the driver consciousness level given by the driver consciousness level specifying unit 160 (S16). Then, the driver state specifying unit 170 gives driver state information indicating the specified driver state to the vehicle control unit 180.
Then, after the processing of step S15 or step S16, the vehicle control unit 180 controls the vehicle according to the driver state (S17).
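The flow of steps S10 to S17 above can be sketched in code. This is an illustrative sketch only, not the patent's implementation: the function names, the state labels, and the simplified stand-in for the result table 152 are all hypothetical.

```python
# Illustrative sketch of the FIG. 8 control flow (units 110, 160, 170, 180).
# State labels and the table lookup are hypothetical stand-ins.

def specify_driver_state(consciousness_level, passenger_behavior=None):
    """Unit 170: combine the driver consciousness level with the passenger
    behavior when a passenger is present (S15), otherwise use the
    consciousness level alone (S16)."""
    if passenger_behavior is not None:
        # Simplified stand-in for the driver state result table 152.
        if consciousness_level == "awake" and passenger_behavior == "calm":
            return "conscious and clear"
        return "abnormal"
    return "conscious and clear" if consciousness_level == "awake" else "abnormal"

def control_cycle(passenger_on_board, consciousness_level, passenger_behavior):
    # S12/S13: passenger behavior is specified only when a passenger boarded.
    behavior = passenger_behavior if passenger_on_board else None
    # S14-S16: specify the driver state from the available inputs.
    state = specify_driver_state(consciousness_level, behavior)
    # S17: the vehicle control unit acts on the driver state.
    return "continue driving" if state == "conscious and clear" else "safe stop"
```

The branch on `passenger_on_board` mirrors the S12/S14 decisions: the same specifying unit handles both cases, with the passenger behavior simply absent when no passenger is present.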
FIG. 9 is a state chart diagram showing an example of processing in the passenger behavior specifying unit 110.
FIG. 9 shows the details of the process in step S13 of FIG. 8.
First, the face image acquisition unit 123 acquires a passenger's face image from an image captured by a camera, which is an image pickup unit (not shown) (S20).
Next, the line-of-sight detection unit 124 detects the line of sight of the passenger from the acquired face image, generates line-of-sight data indicating the detected line of sight, and gives the line-of-sight data to the line-of-sight determination unit 125 (S21). The line-of-sight determination unit 125 gives the line-of-sight data to the line-of-sight analysis unit 141.
The line-of-sight analysis unit 141 analyzes the line-of-sight data to specify the direction in which the passenger is looking (S22). Then, the line-of-sight analysis unit 141 generates line-of-sight information indicating the specified direction and gives the line-of-sight information to the line-of-sight determination unit 125. The line-of-sight determination unit 125 gives the line-of-sight information to the identification unit 146.
Further, in the processing related to heart rate data, the heart rate detection unit 126 detects the heart rate of the passenger and gives heart rate information indicating the detected heart rate to the heart rate determination unit 127 (S23). The heart rate determination unit 127 gives the heart rate information to the identification unit 146.
Further, in the processing related to sweating amount data, the sweating amount detection unit 128 detects the sweating amount of the passenger and gives sweating amount information indicating the detected sweating amount to the sweating amount determination unit 129 (S24). The sweating amount determination unit 129 gives the sweating amount information to the identification unit 146.
In the processing related to voice data, the voice detection unit 130 detects the voice of the passenger and generates voice data indicating the detected voice (S25). The voice detection unit 130 gives the generated voice data to the voice determination unit 131. The voice determination unit 131 gives the voice data to the language analysis unit 143.
The language analysis unit 143 analyzes the voice data to specify the content of the speech uttered by the passenger (S26). Specifically, the language analysis unit 143 performs morphological analysis on the voice indicated by the voice data, dividing it into morphemes. Then, the language analysis unit 143 gives the divided morphemes to the dictionary reference unit 145.
The dictionary reference unit 145 specifies the meaning of each morpheme given from the language analysis unit 143 by referring to the dictionary data stored in the dictionary storage unit 144, and gives the meanings to the language analysis unit 143.
The language analysis unit 143 specifies the content of the speech uttered by the passenger from the meaning of each morpheme, and generates voice information indicating the specified content.
Then, the language analysis unit 143 gives the voice information to the voice determination unit 131, and the voice determination unit 131 gives the voice information to the identification unit 146.
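The morpheme-to-meaning pipeline of step S26 can be sketched as follows. This is a rough assumption-laden illustration, not the patent's implementation: the tiny `DICTIONARY`, the whitespace tokenizer standing in for a real morphological analyzer, and the two content categories are all hypothetical.

```python
# Hypothetical sketch of the S26 pipeline: split an utterance into morphemes,
# look each one up in dictionary data (units 144/145), and derive the content
# of the passenger's speech. A real system would use a proper morphological
# analyzer; a whitespace split stands in for it here.

DICTIONARY = {  # stand-in for the dictionary storage unit 144
    "wake": "alert-request",
    "up": "direction",
    "are": "copula",
    "you": "person",
    "ok": "well-being",
}

def split_into_morphemes(utterance):
    # Stand-in for morphological analysis by the language analysis unit 143.
    return utterance.lower().replace("?", "").split()

def specify_speech_content(utterance):
    # Per-morpheme meanings via the dictionary reference unit 145.
    meanings = [DICTIONARY.get(m, "unknown") for m in split_into_morphemes(utterance)]
    # The speech content is specified from the combined meanings.
    if "alert-request" in meanings:
        return "passenger trying to wake the driver"
    if "well-being" in meanings:
        return "passenger checking on the driver"
    return "other"
```

The point of the two-stage design in the text is that the analysis unit owns segmentation and content decisions while the dictionary only maps morphemes to meanings, which is mirrored here by keeping the lookup a plain table access.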
The identification unit 146 identifies the passenger behavior, which is the behavior of the passenger, based on the information given from the passenger monitoring unit 120 (S27).
FIG. 10 is a state chart diagram showing an example of processing of the vehicle control unit 180 in the first embodiment.
FIG. 10 shows the details of the process in step S17 of FIG. 8.
First, the road condition acquisition unit 181 determines whether or not the driver state indicated by the driver state information given by the driver state specifying unit 170 is clear consciousness (S30).
Then, in step S31, if the driver state is clear consciousness, the processing ends; if the driver state is not clear consciousness, the processing proceeds to step S32.
Next, the road condition acquisition unit 181 acquires a road condition indicating the condition of the surrounding roads (S32). The road condition acquisition unit 181 gives the acquired road condition to the stop position determination unit 182.
Then, the stop position determination unit 182 determines a position where the vehicle can be safely stopped based on the given road condition (S33).
Finally, the stop unit 183 stops the vehicle at a position where it can be safely stopped (S34).
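The stop control flow of units 181 to 183 can be sketched as follows. This is a hedged illustration: the road-segment data format and the nearest-shoulder rule are assumptions for the sketch, not details stated in the patent.

```python
# Illustrative sketch of the FIG. 10 flow (units 181-183). The road data
# format and the selection rule are assumptions.

def control_for_stop(driver_state, road_segments):
    """road_segments: list of (position_m, has_shoulder) tuples ahead of
    the vehicle, as a stand-in for the acquired road condition."""
    # Do nothing while the driver is fully conscious (S30/S31).
    if driver_state == "conscious and clear":
        return None
    # Road condition acquisition unit 181: the condition is passed in here.
    # Stop position determination unit 182: pick the nearest position
    # where the vehicle can be stopped safely (S33).
    candidates = [pos for pos, has_shoulder in road_segments if has_shoulder]
    if not candidates:
        return None  # no safe stop position in the acquired range
    stop_position = min(candidates)
    # Stop unit 183: command the stop at the decided position (S34).
    return stop_position
```

Returning `None` both when no stop is needed and when no safe position exists keeps the sketch short; a real controller would distinguish those cases.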
FIG. 11 is a schematic view showing an example of the inside of the cockpit 101 of a vehicle in which the vehicle control device 100 is mounted.
The cockpit 101 is provided with one sound collector 102 and one wide-angle imager 103.
The sound collector 102 is a sound collecting device that collects all the sounds from the driver's seat on the right side, the front passenger seat on the left side, and the rear passenger seats. An audio signal of the sound collected by the sound collector 102 is given to the voice detection unit 130. The voice detection unit 130 can identify who uttered the voice from the direction from which the audio signal is received. As a result, the voice detection unit 130 can identify the voice of the passenger among the sounds inside the vehicle.
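The direction-based speaker identification described above can be illustrated with a simple sector mapping. The patent only states that the speaker is identified from the reception direction; the angle convention, the sector boundaries, and the function names below are entirely hypothetical assumptions for the sketch.

```python
# Assumed sketch: map a single sound collector's direction-of-arrival
# estimate to a seat, so the voice detection unit 130 can separate
# passenger speech from driver speech. Angles are in degrees, 0 = straight
# ahead of the collector, negative = toward the right-hand driver's seat.

SEAT_SECTORS = [
    ("driver", -90, -15),
    ("front passenger", 15, 90),
    ("rear passenger", -15, 15),  # roughly behind the collector
]

def seat_from_direction(angle_deg):
    for seat, lo, hi in SEAT_SECTORS:
        if lo <= angle_deg < hi:
            return seat
    return "unknown"

def is_passenger_voice(angle_deg):
    # Only passenger speech is forwarded to the behavior analysis.
    return seat_from_direction(angle_deg) in ("front passenger", "rear passenger")
```

A real system would estimate the arrival direction with a microphone array; the fixed sector table here only shows how one estimate could be turned into a seat label.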
The wide-angle imager 103 is an imaging device, serving as an imaging unit, capable of capturing images of all of the driver's seat on the right side, the front passenger seat on the left side, and the rear passenger seats. Image data of the image captured by the wide-angle imager 103 is given to the face image acquisition unit 123. The face image acquisition unit 123 can identify the face of the passenger from the image indicated by the image data.
In the present embodiment, the speaker can be identified based on the data from a single sound collector 102 and a single wide-angle imager 103; the number of sound collectors and wide-angle imagers is limited to one each in order to reduce cost. If cost is not a concern, a plurality of sound collectors and a plurality of imagers may be used.
As described above, according to the first embodiment, the driver state can be specified from the passenger behavior and the driver consciousness level, and when the driver cannot drive the vehicle normally, the vehicle can be safely stopped at a stop position.
Embodiment 2.
FIG. 12 is a block diagram schematically showing the configuration of the vehicle control device 200 according to the second embodiment.
The vehicle control device 200 includes a passenger behavior specifying unit 110, a driver consciousness level specifying unit 160, a driver state specifying unit 270, a vehicle control unit 280, and a driver state storage unit 290.
The passenger behavior specifying unit 110 and the driver consciousness level specifying unit 160 of the vehicle control device 200 according to the second embodiment are the same as the passenger behavior specifying unit 110 and the driver consciousness level specifying unit 160 of the vehicle control device 100 according to the first embodiment.
The driver state specifying unit 270 specifies the driver state, which is the state of the driver, from the combination of the passenger behavior given by the passenger behavior specifying unit 110 and the driver consciousness level given by the driver consciousness level specifying unit 160.
In the second embodiment, when the amount of data stored in the driver state storage unit 290 is less than a predetermined threshold value and there is a passenger, the driver state specifying unit 270 specifies the driver state based on the combination of the passenger behavior and the driver consciousness level, so that the driver state shown in the driver state result table 152 of FIG. 5 can be specified. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280, and also stores the specified driver state in the driver state storage unit 290 as driver state related information, in association with the combination of the passenger behavior and the driver consciousness level.
Further, when the amount of data stored in the driver state storage unit 290 is less than the predetermined threshold value and there is no passenger, the driver state specifying unit 270 specifies the driver state based on the driver consciousness level, so that the driver state shown in the driver state result table 152 of FIG. 5 can be specified. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280, and also stores the specified driver state in the driver state storage unit 290 as driver state related information, in association with the driver consciousness level.
On the other hand, when the amount of data stored in the driver state storage unit 290 is equal to or greater than the predetermined threshold value and there is a passenger, the driver state specifying unit 270 specifies, from the driver state related information stored in the driver state storage unit 290, the driver state associated with the combination of the passenger behavior and the driver consciousness level. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280.
Further, when the amount of data stored in the driver state storage unit 290 is equal to or greater than the predetermined threshold value and there is no passenger, the driver state specifying unit 270 specifies, from the driver state related information stored in the driver state storage unit 290, the driver state associated with the driver consciousness level. Then, the driver state specifying unit 270 gives the specified driver state to the vehicle control unit 280.
The driver state storage unit 290 stores the driver state related information given from the driver state specifying unit 270.
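The threshold-controlled switch between rule-based specification and lookup from stored results can be sketched as follows. The class, the tiny threshold, the state labels, and the rule table are hypothetical stand-ins; only the switching behavior itself comes from the description above.

```python
# Sketch of the second embodiment's driver state specifying unit 270:
# below the data-amount threshold it computes the driver state from a rule
# table and records the result; at or above the threshold it answers from
# the stored driver state related information instead.

THRESHOLD = 3  # predetermined threshold on the stored data amount

def rule_based_state(consciousness_level, passenger_behavior):
    # Stand-in for the driver state result table 152 of FIG. 5.
    if consciousness_level == "awake" and passenger_behavior in (None, "calm"):
        return "conscious and clear"
    return "abnormal"

class DriverStateSpecifier:
    def __init__(self):
        self.store = {}  # driver state storage unit 290

    def specify(self, consciousness_level, passenger_behavior=None):
        key = (consciousness_level, passenger_behavior)
        if len(self.store) >= THRESHOLD and key in self.store:
            return self.store[key]       # lookup from stored information
        state = rule_based_state(*key)   # rule-based specification
        self.store[key] = state          # record as driver state related info
        return state
```

Keying the stored information on the (consciousness level, passenger behavior) pair covers both the with-passenger and the no-passenger case, since the behavior entry is simply `None` when no passenger is present.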
The vehicle control unit 280 controls the vehicle according to the driver state given by the driver state specifying unit 270.
The vehicle control unit 280 in the second embodiment automatically sets a nearby destination such as an emergency hospital according to the driver's condition, and automatically drives the vehicle to the set destination.
The vehicle control unit 280 in the second embodiment may stop the vehicle in a safe place when the driver cannot normally drive the vehicle, as in the vehicle control unit 180 in the first embodiment.
FIG. 13 is a block diagram schematically showing the configuration of the vehicle control unit 280 according to the second embodiment.
The vehicle control unit 280 includes a destination automatic setting unit 284, a route search determination unit 285, and a travel control unit 286.
The destination automatic setting unit 284 identifies the position of the vehicle from an INS (Inertial Navigation System) or a GPS (Global Positioning System) receiver installed in the vehicle, and sets a destination according to the driver state by referring to map information. For example, when the driver state is the "state in which the driver does not awaken even when stimulated" shown in FIG. 5, the destination automatic setting unit 284 searches for an emergency hospital near the vehicle and sets that emergency hospital as the destination. Then, the destination automatic setting unit 284 gives destination information indicating the set destination to the route search determination unit 285. Note that, in the destination automatic setting unit 284, the type of destination to be set is predetermined for each driver state for which a destination should be set. The destination automatic setting unit 284 may then specify the type of destination according to the driver state and select, from the destinations of the specified type in the map information, the destination closest to the vehicle.
The route search determination unit 285 searches for a route along which the vehicle should travel to the destination indicated by the destination information given from the destination automatic setting unit 284, and generates route information indicating the route. Then, the route search determination unit 285 gives the route information to the travel control unit 286.
The travel control unit 286 causes the vehicle to travel to the destination according to the route information given from the route search determination unit 285.
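The destination-setting logic of units 284 to 286 can be sketched as follows. This is an assumption-heavy illustration: the state-to-destination-type table, the facility list, and the straight-line distance stand in for real map information and route search, and all names are hypothetical.

```python
# Hedged sketch of the FIG. 13 units 284-286: pick a destination type for
# the driver state, choose the nearest facility of that type, then hand off
# to route search and travel control (abstracted away here).

DESTINATION_TYPE = {  # predetermined type per driver state (unit 284)
    "does not awaken even when stimulated": "emergency hospital",
}

FACILITIES = [  # stand-in for map information: (type, name, x_km, y_km)
    ("emergency hospital", "North ER", 0.0, 5.0),
    ("emergency hospital", "East ER", 2.0, 1.0),
    ("gas station", "Corner Gas", 0.5, 0.5),
]

def set_destination(driver_state, vehicle_xy):
    kind = DESTINATION_TYPE.get(driver_state)
    if kind is None:
        return None  # no automatic destination for this state
    vx, vy = vehicle_xy
    # Choose the nearest facility of the required type.
    matches = [(name, ((x - vx) ** 2 + (y - vy) ** 2) ** 0.5)
               for t, name, x, y in FACILITIES if t == kind]
    return min(matches, key=lambda m: m[1])[0]

def drive_to(driver_state, vehicle_xy):
    dest = set_destination(driver_state, vehicle_xy)
    if dest is None:
        return "no automatic destination"
    # Route search determination unit 285 / travel control unit 286.
    return f"driving to {dest}"
```

A production system would use road-network distance rather than straight-line distance; the table-driven choice of destination type is the part that mirrors the description.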
A part or all of the vehicle control device 200 described above can also be configured by, for example, a memory 10 and a processor 11 such as a CPU that executes a program stored in the memory 10, as shown in FIG. 7(A).
Further, a part or all of the vehicle control device 200 can also be configured by a processing circuit 12 such as a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, or an FPGA, as shown in FIG. 7(B).
As described above, the vehicle control device 200 can be realized by processing circuitry.
FIG. 14 is a state chart diagram showing an example of a process of controlling the vehicle based on the results specified by the driver consciousness level specifying unit 160 and the passenger behavior specifying unit 110 in the second embodiment.
Among the steps shown in FIG. 14, the steps that perform the same processing as the steps shown in FIG. 8 are designated by the same reference numerals as those in FIG. 8, and detailed description thereof will be omitted.
The processing of steps S10 to S13 in FIG. 14 is the same as the processing of steps S10 to S13 in FIG. 8. However, after steps S11 and S13 in FIG. 14, the processing proceeds to step S44.
In step S44, the driver state specifying unit 270 determines whether or not the amount of data stored in the driver state storage unit 290 is sufficient. If the amount of data is sufficient, the processing proceeds to step S45; if the amount of data is not sufficient, the processing proceeds to step S48.
In step S45, the driver state specifying unit 270 determines whether or not there is a passenger. When there is a passenger in step S45, the driver state specifying unit 270 specifies, from the driver state related information stored in the driver state storage unit 290, the driver state associated with the combination of the driver consciousness level given by the driver consciousness level specifying unit 160 and the passenger behavior given by the passenger behavior specifying unit 110 (S46). Then, the driver state specifying unit 270 gives driver state information indicating the specified driver state to the vehicle control unit 280.
When there is no passenger in step S45, the driver state specifying unit 270 specifies, from the driver state related information stored in the driver state storage unit 290, the driver state associated with the driver consciousness level given by the driver consciousness level specifying unit 160 (S47). Then, the driver state specifying unit 270 gives driver state information indicating the specified driver state to the vehicle control unit 280.
In step S48, the driver state specifying unit 270 determines whether or not there is a passenger. When there is a passenger in step S48, the driver state specifying unit 270 specifies the driver state according to the driver consciousness level given by the driver consciousness level specifying unit 160 and the passenger behavior given by the passenger behavior specifying unit 110 (S49). Here, the driver state specifying unit 270 specifies the driver state shown in the driver state result table 152 of FIG. 5. Then, the driver state specifying unit 270 gives driver state information indicating the specified driver state to the vehicle control unit 280.
Next, the driver state specifying unit 270 generates driver state related information in which the driver consciousness level given by the driver consciousness level specifying unit 160 and the passenger behavior given by the passenger behavior specifying unit 110 are associated with the driver state specified in step S49, and stores the driver state related information in the driver state storage unit 290 (S50).
When there is no passenger in step S48, the driver state specifying unit 270 specifies the driver state from the driver consciousness level given by the driver consciousness level specifying unit 160 (S51). Here, the driver state specifying unit 270 specifies the driver state shown in the driver state result table 152 of FIG. 5. Then, the driver state specifying unit 270 gives driver state information indicating the specified driver state to the vehicle control unit 280.
Next, the driver state specifying unit 270 generates driver state related information in which the driver consciousness level given by the driver consciousness level specifying unit 160 is associated with the driver state specified in step S51, and stores the driver state related information in the driver state storage unit 290 (S52).
Then, after the processing of step S46, step S47, step S50, or step S52, the vehicle control unit 280 controls the vehicle according to the driver state (S53).
FIG. 15 is a state chart diagram showing an example of processing of the vehicle control unit 280 in the second embodiment.
FIG. 15 shows the details of the process in step S53 of FIG. 14.
First, the destination automatic setting unit 284 sets a nearby emergency hospital or the like as the destination according to the driver state (S60).
Then, the route search determination unit 285 searches for a route to the set destination and determines the route (S61).
Finally, the travel control unit 286 automatically drives the vehicle safely to the destination according to the determined route (S62).
As described above, according to the second embodiment, the driver state can be specified from the passenger behavior and the driver consciousness level, a destination can be set automatically according to the driver state, and the vehicle can be driven automatically to that destination.
In the second embodiment, when the amount of stored driver state related information becomes equal to or greater than the predetermined threshold value, the method of specifying the driver state is switched to a method of specifying, from the driver state related information, the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior, whereby the processing load can be reduced.
It should be noted that the embodiments are not limited to the above first or second embodiment, and can be implemented in various modes within a range not departing from the above gist.
100, 200 vehicle control device, 110 passenger behavior specifying unit, 111 load specifying unit, 112 line-of-sight specifying unit, 113 heart rate specifying unit, 114 sweating amount specifying unit, 115 voice specifying unit, 120 passenger monitoring unit, 121 load detection unit, 122 load determination unit, 123 face image acquisition unit, 124 line-of-sight detection unit, 125 line-of-sight determination unit, 126 heart rate detection unit, 127 heart rate determination unit, 128 sweating amount detection unit, 129 sweating amount determination unit, 130 voice detection unit, 131 voice determination unit, 140 load analysis unit, 141 line-of-sight analysis unit, 142 voice analysis unit, 143 language analysis unit, 144 dictionary storage unit, 145 dictionary reference unit, 146 identification unit, 160 driver consciousness level specifying unit, 170, 270 driver state specifying unit, 180, 280 vehicle control unit, 181 road condition acquisition unit, 182 stop position determination unit, 183 stop unit, 284 destination automatic setting unit, 285 route search determination unit, 286 travel control unit, 290 driver state storage unit.

Claims (13)

  1.  A vehicle control device comprising:
     a driver consciousness level specifying unit to specify a driver consciousness level, which is a level of consciousness of a driver of a vehicle;
     a passenger behavior specifying unit to specify a passenger behavior, which is a behavior of a passenger other than the driver in the vehicle;
     a driver state specifying unit to specify, from a combination of the driver consciousness level and the passenger behavior, a driver state, which is a state of the driver; and
     a vehicle control unit to control the vehicle in accordance with the driver state.
  2.  A vehicle control device comprising:
     a driver state storage unit to store driver state related information associating a combination of a driver consciousness level, which is a level of consciousness of a driver of a vehicle, and a passenger behavior, which is a behavior of a passenger other than the driver in the vehicle, with a driver state, which is a state of the driver;
     a driver consciousness level specifying unit to specify the driver consciousness level from the driver;
     a passenger behavior specifying unit to specify the passenger behavior from the passenger;
     a driver state specifying unit to, when an amount of data stored in the driver state storage unit is less than a predetermined threshold value, specify the driver state from the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit, and store, in the driver state storage unit as the driver state related information, the combination of the specified driver consciousness level and the specified passenger behavior in association with the specified driver state, and to, when the amount of data stored in the driver state storage unit is equal to or greater than the predetermined threshold value, specify, from the driver state related information, the driver state associated with the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit; and
     a vehicle control unit to control the vehicle in accordance with the driver state specified by the driver state specifying unit.
  3.  The vehicle control device according to claim 1 or 2, wherein the passenger behavior is a behavior performed by the passenger as a result of the passenger observing the driver.
  4.  前記同乗者行動特定部は、
     前記同乗者の座席における荷重の方向及び大きさを特定する荷重特定部、前記同乗者の視線の方向を特定する視線特定部、前記同乗者の生体に関する物理量を特定する物理量特定部、及び、前記同乗者が発した音声の内容を特定する音声特定部、の少なくとも何れか一つと、
     前記荷重特定部、前記視線特定部、前記物理量特定部及び前記音声特定部の前記少なくとも何れか一つで特定された結果から、前記同乗者行動を特定する特定部と、を備えること
     を特徴とする請求項1から3の何れか一項に記載の車両制御装置。
    The vehicle control device according to any one of claims 1 to 3, wherein the passenger behavior specifying unit includes:
    at least one of a load specifying unit that specifies the direction and magnitude of the load on the passenger's seat, a line-of-sight specifying unit that specifies the direction of the passenger's line of sight, a physical quantity specifying unit that specifies a physical quantity relating to the passenger's body, and a voice specifying unit that specifies the content of a voice uttered by the passenger; and
    a specifying unit that specifies the passenger behavior from the result specified by the at least one of the load specifying unit, the line-of-sight specifying unit, the physical quantity specifying unit, and the voice specifying unit.
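Claim 4 describes fusing up to four sensor-specific results into a single passenger-behavior determination. The specification excerpt does not give concrete rules, so the sketch below is purely illustrative: every name, threshold, and behavior label is an assumption, not taken from the patent.

```python
# Hypothetical sketch of the passenger-behavior specifying unit of claim 4:
# it fuses the outputs of the load, line-of-sight, physical-quantity, and
# voice specifying units into one coarse behavior label. All names,
# thresholds, and labels below are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReadings:
    seat_load_shift: float    # lateral load change in the passenger seat (kg)
    gaze_on_driver: bool      # line of sight directed at the driver
    heart_rate: Optional[int] # beats per minute, None if unavailable
    utterance: Optional[str]  # recognized speech content, None if silent

def specify_passenger_behavior(r: SensorReadings) -> str:
    """Return a coarse passenger-behavior label from the fused sensor results."""
    alarmed_words = ("wake up", "watch out", "are you okay")
    if r.utterance and any(w in r.utterance.lower() for w in alarmed_words):
        return "calling_out_to_driver"
    if r.gaze_on_driver and r.seat_load_shift > 5.0:
        return "leaning_toward_driver"  # e.g. reaching toward the driver
    if r.gaze_on_driver:
        return "observing_driver"
    if r.heart_rate is not None and r.heart_rate > 110:
        return "agitated"
    return "no_reaction"
```

Note that the claim requires only *at least one* sensor result, which is why each field is checked independently and missing values (`None`) simply fall through to the next rule.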
  5.  前記視線特定部は、一つの撮像装置により撮像された、前記車両の内部の画像から、前記同乗者の顔を特定し、前記特定された顔から前記同乗者の視線の方向を特定すること
     を特徴とする請求項4に記載の車両制御装置。
    The vehicle control device according to claim 4, wherein the line-of-sight specifying unit identifies the passenger's face from an image of the interior of the vehicle captured by a single imaging device, and specifies the direction of the passenger's line of sight from the identified face.
  6.  前記音声特定部は、一つの集音装置により取得された、前記車両の内部の音声から、前記同乗者の音声を特定し、前記特定された音声から前記内容を特定すること
     を特徴とする請求項4に記載の車両制御装置。
    The vehicle control device according to claim 4, wherein the voice specifying unit identifies the passenger's voice from the sound inside the vehicle acquired by a single sound collecting device, and specifies the content from the identified voice.
  7.  前記車両制御部は、前記運転者状態特定部により特定された前記運転者状態が、前記運転者が前記車両を正常に運転できないことを示している場合に、前記車両を制御すること
     を特徴とする請求項1から6の何れか一項に記載の車両制御装置。
    The vehicle control device according to any one of claims 1 to 6, wherein the vehicle control unit controls the vehicle when the driver state specified by the driver state specifying unit indicates that the driver cannot drive the vehicle normally.
  8.  前記車両制御部は、
     前記車両が走行する道路の状況である道路状況を取得する道路状況取得部と、
     前記道路状況に応じて、前記車両を安全に停止させることができる停止位置を決定する停止位置決定部と、
     前記車両を前記停止位置に停止させる停止部と、を備えること
     を特徴とする請求項1から7の何れか一項に記載の車両制御装置。
    The vehicle control device according to any one of claims 1 to 7, wherein the vehicle control unit includes:
    a road condition acquisition unit that acquires a road condition, which is the condition of the road on which the vehicle travels;
    a stop position determining unit that determines, according to the road condition, a stop position at which the vehicle can be stopped safely; and
    a stop unit that stops the vehicle at the stop position.
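Claims 7 and 8 together describe intervening only when the specified driver state shows the driver cannot drive normally, then choosing a safe stop position from the road conditions. The patent does not disclose a concrete decision table, so the following is a minimal sketch under assumed state labels and road categories.

```python
# Hypothetical sketch of the vehicle control unit of claims 7 and 8. The
# driver-state labels, road categories, and stopping strategies are all
# illustrative assumptions; the patent text does not enumerate them.
def choose_stop_position(road_condition: str) -> str:
    """Stop position determining unit: pick a safe stop location per road type."""
    return {
        "expressway": "nearest_shoulder",
        "urban_street": "nearest_curbside",
        "tunnel": "emergency_bay",
    }.get(road_condition, "gradual_stop_in_lane")

def control_vehicle(driver_state: str, road_condition: str) -> str:
    """Vehicle control unit: intervene only when the driver cannot drive normally."""
    if driver_state in ("unable_to_drive", "unconscious"):
        return f"stop_at:{choose_stop_position(road_condition)}"
    return "no_intervention"
```

The two-function split mirrors the claimed structure: the stop position determining unit is consulted only after the driver state has triggered an intervention.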
  9.  前記車両制御部は、
     前記運転者状態特定部により特定された前記運転者状態に応じて、目的地を設定する目的地自動設定部と、
     前記目的地へのルートを決定するルート探索決定部と、
     前記ルートで、前記車両を走行させる走行制御部と、を備えること
     を特徴とする請求項1から6の何れか一項に記載の車両制御装置。
    The vehicle control device according to any one of claims 1 to 6, wherein the vehicle control unit includes:
    an automatic destination setting unit that sets a destination according to the driver state specified by the driver state specifying unit;
    a route search determining unit that determines a route to the destination; and
    a travel control unit that causes the vehicle to travel along the route.
  10.  コンピュータを、
     車両の運転者の意識のレベルである運転者意識レベルを特定する運転者意識レベル特定部、
     前記車両における前記運転者以外の同乗者の行動である同乗者行動を特定する同乗者行動特定部、
     前記運転者意識レベル及び前記同乗者行動の組み合わせから、前記運転者の状態である運転者状態を特定する運転者状態特定部、及び、
     前記運転者状態に応じて、前記車両を制御する車両制御部、として機能させること
     を特徴とするプログラム。
    A program that causes a computer to function as:
    a driver consciousness level specifying unit that specifies a driver consciousness level, which is the level of consciousness of the driver of a vehicle;
    a passenger behavior specifying unit that specifies passenger behavior, which is the behavior of a passenger other than the driver in the vehicle;
    a driver state specifying unit that specifies a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior; and
    a vehicle control unit that controls the vehicle according to the driver state.
  11.  コンピュータを、
     車両の運転者の意識のレベルである運転者意識レベル及び前記車両における前記運転者以外の同乗者の行動である同乗者行動の組み合わせと、前記運転者の状態である運転者状態とを関連付ける運転者状態関連情報を記憶する運転者状態記憶部、
     前記運転者から前記運転者意識レベルを特定する運転者意識レベル特定部、
     前記同乗者から前記同乗者行動を特定する同乗者行動特定部、
     前記運転者状態記憶部に記憶されているデータ量が予め定められた閾値よりも少ない場合に、前記運転者意識レベル特定部により特定された前記運転者意識レベル及び前記同乗者行動特定部により特定された前記同乗者行動の組み合わせから、前記運転者状態を特定するとともに、前記運転者意識レベル特定部により特定された前記運転者意識レベル及び前記同乗者行動特定部により特定された前記同乗者行動の前記組み合わせと、前記特定された同乗者行動と、を関連付けて、前記運転者状態関連情報として、前記運転者状態記憶部に記憶させ、前記運転者状態記憶部に記憶されているデータ量が予め定められた閾値以上である場合に、前記運転者状態関連情報から、前記運転者意識レベル特定部により特定された前記運転者意識レベル及び前記同乗者行動特定部により特定された前記同乗者行動の組み合わせに関連付けられている前記運転者状態を特定する運転者状態特定部、及び、
     前記運転者状態特定部により特定された前記運転者状態に応じて、前記車両を制御する車両制御部、として機能させること
     を特徴とするプログラム。
    A program that causes a computer to function as:
    a driver state storage unit that stores driver state-related information associating a combination of a driver consciousness level, which is the level of consciousness of the driver of a vehicle, and passenger behavior, which is the behavior of a passenger other than the driver in the vehicle, with a driver state, which is the state of the driver;
    a driver consciousness level specifying unit that specifies the driver consciousness level from the driver;
    a passenger behavior specifying unit that specifies the passenger behavior from the passenger;
    a driver state specifying unit that, when the amount of data stored in the driver state storage unit is less than a predetermined threshold, specifies the driver state from the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit, and stores the combination and the specified passenger behavior in the driver state storage unit in association with each other as the driver state-related information, and, when the amount of data stored in the driver state storage unit is equal to or greater than the predetermined threshold, specifies, from the driver state-related information, the driver state associated with the combination of the driver consciousness level specified by the driver consciousness level specifying unit and the passenger behavior specified by the passenger behavior specifying unit; and
    a vehicle control unit that controls the vehicle according to the driver state specified by the driver state specifying unit.
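The threshold-switching behavior claimed in claims 2, 11, and 13 amounts to a two-phase lookup: derive and record the driver state while stored data is scarce, then switch to looking up the stored associations. The sketch below illustrates that control flow only; the threshold value, rule table, and labels are assumptions (the association stored is combination → driver state, following the wording of claim 13).

```python
# Illustrative sketch of the threshold-switching driver state specifying unit
# of claims 2/11/13. THRESHOLD, RULES, and all labels are assumed values;
# the patent specifies the mechanism, not concrete numbers or tables.
THRESHOLD = 100  # assumed minimum amount of stored driver-state-related data

# Fixed fallback rules: (consciousness_level, passenger_behavior) -> driver state
RULES = {
    ("low", "calling_out_to_driver"): "unable_to_drive",
    ("low", "no_reaction"): "drowsy",
    ("high", "no_reaction"): "normal",
}

class DriverStateSpecifier:
    def __init__(self):
        # Driver state storage unit: combination -> associated driver state
        self.store = {}

    def specify(self, consciousness_level: str, passenger_behavior: str) -> str:
        key = (consciousness_level, passenger_behavior)
        if len(self.store) < THRESHOLD:
            # Rule-based phase: derive the state, then record the association.
            state = RULES.get(key, "unknown")
            self.store[key] = state
            return state
        # Data-driven phase: look up the stored association instead.
        return self.store.get(key, "unknown")
```

In a real system the data-driven phase would presumably aggregate many observations per combination rather than a single dictionary entry, but the phase switch on stored data volume is the claimed idea.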
  12.  車両の運転者の意識のレベルである運転者意識レベルを特定し、
     前記車両における前記運転者以外の同乗者の行動である同乗者行動を特定し、
     前記運転者意識レベル及び前記同乗者行動の組み合わせから、前記運転者の状態である運転者状態を特定し、
     前記運転者状態に応じて、前記車両を制御すること
     を特徴とする車両制御方法。
    A vehicle control method comprising:
    specifying a driver consciousness level, which is the level of consciousness of the driver of a vehicle;
    specifying passenger behavior, which is the behavior of a passenger other than the driver in the vehicle;
    specifying a driver state, which is the state of the driver, from the combination of the driver consciousness level and the passenger behavior; and
    controlling the vehicle according to the driver state.
  13.  車両の運転者の意識のレベルである運転者意識レベルを特定し、
     前記車両における前記運転者以外の同乗者の行動である同乗者行動を特定し、
     前記特定された運転者意識レベル及び前記特定された同乗者行動の組み合わせから、前記運転者の状態である運転者状態を特定し、
     前記特定された運転者意識レベル及び前記特定された同乗者行動の組み合わせと、前記特定された運転者状態とを関連付ける運転者状態関連情報を記憶する車両制御方法であって、
     前記記憶された運転者状態関連情報のデータ量が予め定められた閾値以上となった場合に、前記運転者状態を特定する方法を、前記特定された運転者意識レベル及び前記特定された同乗者行動の組み合わせに関連付けられている前記運転者状態を特定する方法に切り替えること
     を特徴とする車両制御方法。
    A vehicle control method comprising:
    specifying a driver consciousness level, which is the level of consciousness of the driver of a vehicle;
    specifying passenger behavior, which is the behavior of a passenger other than the driver in the vehicle;
    specifying a driver state, which is the state of the driver, from the combination of the specified driver consciousness level and the specified passenger behavior; and
    storing driver state-related information associating the combination of the specified driver consciousness level and the specified passenger behavior with the specified driver state,
    wherein, when the amount of data of the stored driver state-related information becomes equal to or greater than a predetermined threshold, the method of specifying the driver state is switched to a method of specifying the driver state associated with the combination of the specified driver consciousness level and the specified passenger behavior.
PCT/JP2020/027990 2020-07-20 2020-07-20 Vehicle control device, program, and vehicle control method WO2022018781A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022538492A JP7361929B2 (en) 2020-07-20 2020-07-20 Vehicle control device, program and vehicle control method
PCT/JP2020/027990 WO2022018781A1 (en) 2020-07-20 2020-07-20 Vehicle control device, program, and vehicle control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/027990 WO2022018781A1 (en) 2020-07-20 2020-07-20 Vehicle control device, program, and vehicle control method

Publications (1)

Publication Number Publication Date
WO2022018781A1 true WO2022018781A1 (en) 2022-01-27

Family

ID=79728564

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/027990 WO2022018781A1 (en) 2020-07-20 2020-07-20 Vehicle control device, program, and vehicle control method

Country Status (2)

Country Link
JP (1) JP7361929B2 (en)
WO (1) WO2022018781A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005323691A (en) * 2004-05-12 2005-11-24 Toyota Motor Corp Condition display device and condition display method
JP2014019301A (en) * 2012-07-18 2014-02-03 Toyota Motor Corp Emergency evacuation device
JP2017136922A (en) * 2016-02-02 2017-08-10 富士通テン株式会社 Vehicle control device, on-vehicle device controller, map information generation device, vehicle control method, and on-vehicle device control method
JP2017138762A (en) * 2016-02-03 2017-08-10 トヨタ自動車株式会社 Driver's emotion estimation device
JP2018151693A (en) * 2017-03-09 2018-09-27 株式会社デンソーテン Drive supporting device and drive supporting method


Also Published As

Publication number Publication date
JP7361929B2 (en) 2023-10-16
JPWO2022018781A1 (en) 2022-01-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20946200

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022538492

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20946200

Country of ref document: EP

Kind code of ref document: A1