JPWO2020079755A1 - Information providing device and information providing method - Google Patents


Info

Publication number
JPWO2020079755A1
JPWO2020079755A1 (application number JP2020551635A)
Authority
JP
Japan
Prior art keywords
information
occupant
degree
confusion
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2020551635A
Other languages
Japanese (ja)
Inventor
匠 武井
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of JPWO2020079755A1 publication Critical patent/JPWO2020079755A1/en
Pending legal-status Critical Current

Classifications

    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60W40/09 Driving style or behaviour
    • B60W50/16 Tactile feedback to the driver, e.g. vibration or force feedback to the driver on the steering wheel or the accelerator pedal
    • B60W60/0013 Planning or execution of driving tasks specially adapted for occupant comfort
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06V40/174 Facial expression recognition
    • B60W2040/0872 Driver physiology
    • B60W2040/089 Driver voice
    • B60W2050/143 Alarm means
    • B60W2050/146 Display means
    • B60W2540/21 Voice
    • B60W2540/22 Psychological state; Stress level or workload
    • B60W2540/221 Physiology, e.g. weight, heartbeat, health or special needs
    • B60W2540/225 Direction of gaze
    • B60W2540/229 Attention level, e.g. attentive to driving, reading or sleeping
    • B60W2554/00 Input parameters relating to objects
    • B60W2556/10 Historical data

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

A confusion level determination unit (4) determines the occupant's confusion level using occupant state information acquired by an occupant state acquisition unit (3). A recognition level determination unit (5) determines the occupant's recognition level of the vehicle's surrounding conditions and of automatic control, using the surrounding condition information and control information acquired by a host-vehicle status acquisition unit (2) and the occupant state information acquired by the occupant state acquisition unit (3). An information generation unit (6) generates information to be provided to the occupant using the surrounding condition information and control information acquired by the host-vehicle status acquisition unit (2), the confusion level determined by the confusion level determination unit (4), and the recognition level determined by the recognition level determination unit (5).

Description

The present invention relates to an information providing device and an information providing method that provide an occupant with information about vehicle control.

Conventionally, information providing devices are known that, when the host vehicle makes a sudden movement due to automatic vehicle control, help the occupant easily grasp the cause of that sudden movement.
For example, the vehicle display device described in Patent Document 1 determines, according to the surrounding conditions of the host vehicle, whether automatic control of the host vehicle will be executed. When it determines that automatic control will be executed, it generates, based on the surrounding conditions, an image depicting those conditions including the cause of the automatic control, and displays the generated image on image display means provided in the host vehicle.

[Patent Document 1] Japanese Unexamined Patent Application Publication No. 2017-187839

A conventional information providing device such as the vehicle display device of Patent Document 1 displays an image only when a sudden movement occurs in the host vehicle. As a result, images are also displayed in unnecessary situations and with unnecessary content, for example when the occupant anticipated the sudden movement in advance or already recognized its cause, which annoys the occupant. Conversely, because no image is displayed for movements other than sudden ones, an occupant who cannot recognize the cause of such a movement is left feeling uneasy. Thus, conventional information providing devices have the problem that they cannot provide information in just the right amount, with neither excess nor deficiency.

The present invention has been made to solve the above problem, and its object is to provide information in just the right amount, with neither excess nor deficiency.

The information providing device according to the present invention includes: a host-vehicle status acquisition unit that acquires information indicating the surrounding conditions of a vehicle and information about automatic control of the vehicle; an occupant state acquisition unit that acquires information indicating the state of an occupant of the vehicle; a confusion level determination unit that determines the occupant's confusion level using the information acquired by the occupant state acquisition unit; a recognition level determination unit that determines the occupant's recognition level of the vehicle's surrounding conditions and of automatic control using the information acquired by the host-vehicle status acquisition unit and the occupant state acquisition unit; and an information generation unit that generates information to be provided to the occupant using the information acquired by the host-vehicle status acquisition unit, the confusion level determined by the confusion level determination unit, and the recognition level determined by the recognition level determination unit.

According to the present invention, the information to be provided to the occupant is generated based on information indicating the state of the occupant, in addition to information indicating the surrounding conditions of the vehicle and information about automatic control of the vehicle. It is therefore possible to provide information in just the right amount, with neither excess nor deficiency.

FIG. 1 is a block diagram showing a configuration example of the information providing device according to Embodiment 1.
FIG. 2 is a diagram showing an example of the information generation table of the information providing device according to Embodiment 1.
FIG. 3 is a flowchart showing an operation example of the information providing device according to Embodiment 1.
FIGS. 4A, 4B, and 4C are diagrams showing examples of information provision by the information providing device 1 according to Embodiment 1.
FIG. 5 is a diagram showing a hardware configuration example of the information providing device according to Embodiment 1.

Hereinafter, in order to explain the present invention in more detail, embodiments for carrying out the invention are described with reference to the accompanying drawings.

Embodiment 1.

FIG. 1 is a block diagram showing a configuration example of the information providing device 1 according to Embodiment 1. The information providing device 1 is mounted on a vehicle, and is connected to a vehicle control device 10, an input device 11, and an output device 12 mounted on the same vehicle.

The vehicle control device 10 is connected to various exterior sensors such as a millimeter-wave radar, LIDAR (Light Detection And Ranging), or corner sensors, and to various communication devices such as a V2X (Vehicle to Everything) communicator or a GNSS (Global Navigation Satellite System) receiver, and realizes automatic control of the vehicle (including driving assistance) while monitoring the surrounding conditions. The vehicle control device 10 may also realize automatic control of the vehicle (including driving assistance) while exchanging information with roadside units equipped with optical beacons, or with external devices mounted on other vehicles.

The vehicle control device 10 outputs information indicating the contents of automatic vehicle control such as acceleration, braking, and steering (hereinafter referred to as "control information"). The control information may include not only information about control that is currently active but also information about control that is scheduled to be activated. The vehicle control device 10 also outputs information indicating the surrounding conditions of the vehicle that trigger automatic control (hereinafter referred to as "surrounding condition information").

The input device 11 comprises a microphone, remote controller, touch sensor, or the like for accepting input from an occupant riding in the vehicle, and a camera, infrared sensor, biometric sensor, or the like for monitoring the occupant's state. The input device 11 outputs information indicating the occupant's state detected by these devices (hereinafter, "occupant state information"). The occupant state information includes at least one of the occupant's facial expression, line of sight, behavior, voice, heart rate, brain waves, and amount of perspiration. The input device 11 may also recognize the individual from the occupant's face image, voice, or the like, generate information indicating each occupant's experience with automatic vehicle control, such as the number of rides or riding time per occupant, and include it in the occupant state information. The occupant state information is not limited to these examples and may be any information indicating the occupant's state.

The output device 12 is an audio output device such as a speaker; a display device using liquid crystal or organic EL (Electro Luminescence); or a steering wheel, seat, or the like with a built-in actuator that enables it to vibrate.

The information providing device 1 includes a host-vehicle status acquisition unit 2, an occupant state acquisition unit 3, a confusion level determination unit 4, a recognition level determination unit 5, an information generation unit 6, and an information generation table 7. The information providing device 1 can provide information not only to the driver but to all of a plurality of occupants; here, however, to simplify the explanation, a single occupant is assumed.

The host-vehicle status acquisition unit 2 acquires, from the vehicle control device 10, the control information indicating the control contents of the host vehicle and the surrounding condition information that triggered the control, and outputs them to the occupant state acquisition unit 3 and the information generation unit 6.

The occupant state acquisition unit 3 acquires the occupant state information from the input device 11 and the control information and surrounding condition information from the host-vehicle status acquisition unit 2, and outputs the acquired information to the recognition level determination unit 5 or the confusion level determination unit 4. Specifically, the occupant state acquisition unit 3 acquires the control information from the host-vehicle status acquisition unit 2 and, based on this control information, detects whether automatic control of the vehicle has been activated. When it detects that automatic control has been activated, the occupant state acquisition unit 3 outputs time-series data of the occupant state information over a fixed period before and after the activation (for example, one minute) to the confusion level determination unit 4 and the recognition level determination unit 5, so that changes in the occupant's state over that period can be seen. Likewise, when it detects that automatic control has been activated, it outputs time-series data of the control information and surrounding condition information over a fixed period before and after the activation (for example, one minute) to the recognition level determination unit 5, so that changes in the state of the host vehicle and its surroundings can be seen.
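As a sketch of the windowing described above, a fixed-period slice of timestamped samples around the control activation time could be extracted as follows. This is illustrative only; the function name, sample layout, and one-minute default are assumptions matching the example in the text, not an implementation from the patent.

```python
# Hypothetical sketch of the occupant state acquisition unit's windowing:
# when automatic control activates at time t, pass on only the samples
# within a fixed period (here 60 s, the text's example) before and after t.

WINDOW_SECONDS = 60.0

def extract_window(samples, activation_time, window=WINDOW_SECONDS):
    """Return (timestamp, data) samples within +/- `window` seconds of activation."""
    return [(t, d) for (t, d) in samples
            if activation_time - window <= t <= activation_time + window]

# Usage: samples are (timestamp_seconds, occupant_state_dict) pairs.
samples = [(0.0, {"hr": 62}), (55.0, {"hr": 64}),
           (100.0, {"hr": 90}), (170.0, {"hr": 70})]
window = extract_window(samples, activation_time=100.0)
# Only the samples at t=55 and t=100 fall inside [40, 160].
```

The same slicing would be applied in parallel to the control-information and surrounding-condition time series forwarded to the recognition level determination unit 5.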

The confusion level determination unit 4 acquires the occupant state information from the occupant state acquisition unit 3 and determines the occupant's confusion level based on the occupant's state. For example, based on the volume of an utterance the occupant makes reflexively, the confusion level determination unit 4 judges the confusion level as "low" if the sound pressure is below 60 dB, "medium" if it is 60 dB or more and below 70 dB, and "high" if it is 70 dB or more. The confusion level determination unit 4 may use not only the voice volume but also prosodic or linguistic information. It may also integrate multiple pieces of information such as voice, camera images, and heart rate using a general method such as a DNN (Deep Neural Network) method, or it may judge individually from at least one non-voice signal among the occupant's facial expression, line of sight, behavior, heart rate, brain waves, and amount of perspiration, again using a general method such as the DNN method. Furthermore, when the occupant has a high level of experience with automatic vehicle control, the confusion level determination unit 4 may lower the confusion level, taking into account the occupant's high degree of understanding of automatic control; conversely, when the occupant's experience is low, it may raise the confusion level, taking into account the occupant's low degree of understanding. The confusion level determination unit 4 then outputs the determined confusion level to the information generation unit 6.
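The sound-pressure thresholds in the example above (60 dB and 70 dB) amount to a simple three-way classification, which can be sketched as below. The function name is an assumption for illustration; only the dB boundaries come from the text.

```python
def confusion_level_from_voice(sound_pressure_db: float) -> str:
    """Classify the occupant's confusion level from the volume of a
    reflexive utterance, using the thresholds given in the text:
    below 60 dB -> "low"; 60 dB to below 70 dB -> "medium";
    70 dB or more -> "high"."""
    if sound_pressure_db < 60.0:
        return "low"
    elif sound_pressure_db < 70.0:
        return "medium"
    return "high"
```

In practice this voice-only rule would be one input among several; as the text notes, prosodic or linguistic features, camera images, and physiological signals may be integrated, for example with a DNN.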

The recognition level determination unit 5 acquires the surrounding condition information, the control information, and the occupant state information from the occupant state acquisition unit 3. Using this information, it determines, based on the occupant's state, the occupant's recognition level of the vehicle's surrounding conditions and of automatic control. For example, the recognition level determination unit 5 detects how far the occupant's eyelids are open using a camera image or the like, determines from this whether the occupant is awake, and judges the occupant's confirmation of the vehicle's surroundings from the occupant's wakefulness, face orientation, gaze direction, and so on. It then judges the recognition level as "low" if the occupant is in a non-awake state such as sleeping; "medium" if the occupant is awake but cannot visually confirm the situation outside the vehicle, for example while operating a smartphone; and "high" if the occupant can visually confirm the situation outside the vehicle. Like the confusion level determination unit 4, the recognition level determination unit 5 may integrate multiple pieces of information using a general method such as the DNN method. The recognition level determination unit 5 then outputs the determined recognition level to the information generation unit 6.
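The sleeping/smartphone/watching-the-road example above reduces to a two-question decision, sketched here with hypothetical boolean inputs (in the device itself these would be derived from eyelid opening, face orientation, and gaze direction):

```python
def recognition_level(awake: bool, viewing_outside: bool) -> str:
    """Illustrative mapping of the rules in the text:
    non-awake (e.g. sleeping)                      -> "low"
    awake but not viewing outside (e.g. smartphone) -> "medium"
    awake and viewing the situation outside         -> "high"
    """
    if not awake:
        return "low"
    return "high" if viewing_outside else "medium"
```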

The information generation unit 6 acquires the surrounding condition information and control information from the host-vehicle status acquisition unit 2, the confusion level from the confusion level determination unit 4, and the recognition level from the recognition level determination unit 5. Referring to the information generation table 7, the information generation unit 6 then generates the information to be provided to the occupant based on the surrounding condition information and control information, the confusion level, and the recognition level. The information generation method of the information generation unit 6 is described later.

情報生成テーブル7は、例えば、制御内容に対し、混乱度と認識度に応じて乗員に提供する情報の情報量を規定したテーブルである。図2は、実施の形態1に係る情報提供装置1が有する情報生成テーブル7の一例を示す図である。図2の例では、制御内容「自動操舵」に対し、乗員の混乱度が高いほど、提供する情報量が増加する。また、制御内容「自動操舵」に対し、乗員の認識度が低いほど、提供する情報量が増加する。なお、混乱度の高低と認識度の高低との組み合わせに対する情報量の種類は、図2の例に限定されない。例えば、混乱度「低」に対して認識度「低」、「中」及び「高」の3種類の情報量、混乱度「中」に対して認識度「低」、「中」及び「高」の3種類の情報量、並びに、混乱度「高」に対して認識度「低」、「中」及び「高」の3種類の情報量の、合計9種類の情報量が、情報生成テーブル7によって規定されていてもよい。また、図2の例では、混乱度及び認識度が「低」、「中」及び「高」の3段階で表現されているが、この3段階に限定されるものではない。混乱度及び認識度は、例えば「1」から「100」までの数値で表現されてもよく、この場合には乗員に提供する情報のより細かな制御が可能である。 The information generation table 7 is, for example, a table that defines, for each control content, the amount of information to be provided to the occupant according to the degree of confusion and the degree of recognition. FIG. 2 is a diagram showing an example of the information generation table 7 included in the information providing device 1 according to the first embodiment. In the example of FIG. 2, for the control content "automatic steering", the higher the occupant's degree of confusion, the greater the amount of information provided; likewise, the lower the occupant's degree of recognition, the greater the amount of information provided. The kinds of information amounts assigned to the combinations of confusion levels and recognition levels are not limited to the example of FIG. 2. For example, the information generation table 7 may define nine kinds of information amounts in total: three for recognition degrees "low", "medium", and "high" against confusion degree "low", three against confusion degree "medium", and three against confusion degree "high". Further, although the degree of confusion and the degree of recognition are expressed in the three levels "low", "medium", and "high" in the example of FIG. 2, they are not limited to these three levels. The degree of confusion and the degree of recognition may instead be expressed as numerical values from "1" to "100", for example, which allows finer control of the information provided to the occupant.

また、図2では、乗員に提供する情報の情報量として、警告のみ、警告と制御内容、及び、警告と制御内容と制御要因の3種類を例示するが、情報量はこれに限定されない。警告、制御内容、及び制御要因の情報提供は、音若しくは音声、又は表示等によって行われる。図2では、制御内容が「自動操舵」の場合を例示するが、制御内容はこれに限定されず、「自動ブレーキ」及び「交差点右左折」等であってもよい。また、図2では、混乱度及び認識度の値によらず乗員に情報を提供するようになっているが、例えば混乱度「低」及び認識度「高」の場合には情報を提供しないようになっていてもよい。このように、情報生成部6は、混乱度が予め定められた値以上であり、かつ認識度が予め定められた値未満である場合に情報を提供し、混乱度が上記予め定められた値未満であり、かつ認識度が上記予め定められた値以上である場合に情報を提供しない構成であってもよい。 FIG. 2 illustrates three kinds of information amounts to be provided to the occupant: the warning only; the warning and the control content; and the warning, the control content, and the control factor. However, the information amount is not limited to these. The warning, the control content, and the control factor are provided by sound, voice, display, or the like. FIG. 2 illustrates the case where the control content is "automatic steering", but the control content is not limited to this and may be "automatic braking", "turning right or left at an intersection", or the like. Further, in FIG. 2, information is provided to the occupant regardless of the values of the degree of confusion and the degree of recognition, but the device may instead provide no information when, for example, the degree of confusion is "low" and the degree of recognition is "high". That is, the information generation unit 6 may be configured to provide information when the degree of confusion is equal to or greater than a predetermined value and the degree of recognition is less than a predetermined value, and to provide no information when the degree of confusion is less than the former predetermined value and the degree of recognition is equal to or greater than the latter predetermined value.
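Read together with FIG. 2, the table lookup and the optional gating rule can be sketched as below. The concrete 3x3 assignments are assumptions, since FIG. 2 itself is not reproduced in this text; only the monotonic tendency (more confusion or less recognition means more information) follows the description.

```python
LEVELS = {"low": 0, "medium": 1, "high": 2}

def select_items(confusion: str, recognition: str) -> list[str]:
    """Return the pieces of information to present for one control activation."""
    # Optional gating rule: a calm, fully aware occupant gets no notification.
    if LEVELS[confusion] < 1 and LEVELS[recognition] >= 2:
        return []
    items = ["warning"]
    if LEVELS[confusion] >= 1 or LEVELS[recognition] <= 1:
        items.append("control content")   # e.g. "automatic steering"
    if LEVELS[confusion] >= 2 or LEVELS[recognition] <= 0:
        items.append("control factor")    # e.g. "obstacle ahead detected"
    return items
```

Replacing the three discrete levels with 1-100 scores, as the text suggests, would simply turn these comparisons into numeric thresholds.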

次に、実施の形態1に係る情報提供装置1の動作を説明する。
図3は、実施の形態1に係る情報提供装置1の動作例を示すフローチャートである。情報提供装置1は、例えば車両のエンジンが作動している期間、図3のフローチャートに示される動作を繰り返す。
Next, the operation of the information providing device 1 according to the first embodiment will be described.
FIG. 3 is a flowchart showing an operation example of the information providing device 1 according to the first embodiment. The information providing device 1 repeats the operation shown in the flowchart of FIG. 3, for example, while the engine of the vehicle is operating.

図4A、図4B、及び図4Cは、実施の形態1に係る情報提供装置1の情報提供例を示す図である。ここでは、図4Aに示されるように、出力装置12の一種である表示装置が車両のインスツルメントパネルに設置されている場合を例に挙げる。また、このインスツルメントパネルには、出力装置12の一種であるスピーカ(図示せず)も設置されているものとする。
また、以下では、車両制御装置10が、車両前方に障害物を検出し、当該障害物を回避するために自動操舵を作動させる場合を想定する。この場合、車両制御装置10は、制御内容を「自動操舵」とした制御情報と、制御要因を「前方の障害物を検知」とした周辺状況情報とを自車状況取得部2へ出力する。
4A, 4B, and 4C are diagrams showing an example of providing information of the information providing device 1 according to the first embodiment. Here, as shown in FIG. 4A, a case where a display device, which is a kind of output device 12, is installed on an instrument panel of a vehicle will be taken as an example. Further, it is assumed that a speaker (not shown), which is a kind of output device 12, is also installed on the instrument panel.
Further, in the following, it is assumed that the vehicle control device 10 detects an obstacle in front of the vehicle and activates automatic steering in order to avoid the obstacle. In this case, the vehicle control device 10 outputs control information whose control content is "automatic steering" and peripheral situation information whose control factor is "detecting an obstacle in front" to the own vehicle status acquisition unit 2.

ステップST1において、自車状況取得部2は、上述の周辺状況情報及び制御情報を車両制御装置10から取得し、乗員状態取得部3は、乗員状態情報を入力装置11から取得する。 In step ST1, the own vehicle status acquisition unit 2 acquires the above-mentioned peripheral situation information and control information from the vehicle control device 10, and the occupant status acquisition unit 3 acquires the occupant status information from the input device 11.

ステップST2において、乗員状態取得部3は、自車状況取得部2が取得した制御情報に基づき車両の自動的な制御が作動したか否かを検知する。乗員状態取得部3は、車両の自動的な制御が作動したことを検知した場合（ステップST2“YES”）、作動時を含む前後一定時間内の乗員状態情報を、混乱度判定部4及び認識度判定部5へ出力し、それ以外の場合（ステップST2“NO”）、処理はステップST1へ戻る。 In step ST2, the occupant state acquisition unit 3 detects, based on the control information acquired by the own vehicle status acquisition unit 2, whether automatic control of the vehicle has been activated. When the occupant state acquisition unit 3 detects that automatic control of the vehicle has been activated (step ST2 "YES"), it outputs the occupant state information for a fixed period before and after the activation, including the time of activation, to the confusion degree determination unit 4 and the recognition degree determination unit 5; otherwise (step ST2 "NO"), the process returns to step ST1.

ステップST3において、混乱度判定部4は、乗員状態取得部3から取得した乗員状態情報に基づき、作動時を含む前後一定時間内の乗員の混乱度を判定する。例えば、図4Aのように、乗員の「どうしたの!?」というような驚いた音声の音量が70dBである場合、混乱度判定部4は、混乱度「高」と判定する。 In step ST3, the confusion degree determination unit 4 determines the occupant's degree of confusion within a fixed period before and after the activation, including the time of activation, based on the occupant state information acquired from the occupant state acquisition unit 3. For example, as shown in FIG. 4A, when the occupant utters a startled remark such as "What happened!?" at a volume of 70 dB, the confusion degree determination unit 4 determines the degree of confusion to be "high".

ステップST4において、認識度判定部5は、乗員状態取得部3から取得した周辺状況情報、制御情報、及び乗員状態情報に基づき、作動時を含む前後一定時間内の、車両の周辺状況及び車両の自動的な制御に対する乗員の認識度を判定する。例えば、乗員が覚醒状態であり自車両の制御状態をある程度把握できたものの、窓の外に視線を向けておらず車外の状況を完全には認識できていなかった場合、認識度判定部5は、認識度「中」と判定する。 In step ST4, the recognition degree determination unit 5 determines, based on the peripheral situation information, the control information, and the occupant state information acquired from the occupant state acquisition unit 3, the occupant's degree of recognition of the vehicle's surroundings and of the automatic control of the vehicle within a fixed period before and after the activation, including the time of activation. For example, if the occupant was awake and grasped the control state of the own vehicle to some extent but was not looking out the window and thus could not fully recognize the situation outside the vehicle, the recognition degree determination unit 5 determines the degree of recognition to be "medium".

ステップST5において、情報生成部6は、情報生成テーブル7を参照し、車両の自動的な制御の内容とステップST3で判定された混乱度とステップST4で判定された認識度とに応じた情報を生成する。例えば、制御内容が「自動操舵」であり、混乱度が「高」であり、認識度が「中」であった場合、情報生成部6は、図2のような情報生成テーブル7に基づき、乗員に提供する情報の情報量を警告と制御内容と制御要因と決定する。そして、情報生成部6は、自動的な制御が作動したことを乗員に知らせるための「ピピピ」等の警告音、又は警告画面を生成する。また、情報生成部6は、制御内容である「自動操舵」を乗員に知らせるための音声又は表示画面の少なくとも一方を生成する。さらに、情報生成部6は、制御要因である「前方の障害物を検知」を乗員に知らせるための音声又は表示画面の少なくとも一方を生成する。 In step ST5, the information generation unit 6 refers to the information generation table 7 and generates information according to the content of the automatic control of the vehicle, the degree of confusion determined in step ST3, and the degree of recognition determined in step ST4. For example, when the control content is "automatic steering", the degree of confusion is "high", and the degree of recognition is "medium", the information generation unit 6 determines, based on an information generation table 7 such as that of FIG. 2, that the information to be provided to the occupant comprises the warning, the control content, and the control factor. The information generation unit 6 then generates a warning sound such as a beep, or a warning screen, to notify the occupant that automatic control has been activated. The information generation unit 6 also generates at least one of a voice announcement or a display screen to notify the occupant of the control content "automatic steering", and at least one of a voice announcement or a display screen to notify the occupant of the control factor "obstacle ahead detected".

ステップST6において、情報生成部6は、ステップST5で生成した情報を、出力装置12へ出力する。混乱度「高」かつ認識度「低」の場合、出力装置12は、情報生成部6が生成した警告、制御内容及び制御要因の情報を、乗員に対して提供する。例えば、まず情報生成部6は、図4Bのように、出力装置12の一種であるスピーカに「ピピピ」という警告音を出力させる。同時に、情報生成部6は、出力装置12の一種である表示装置に、警告アイコンと「自動操舵」というテキストとを含む警告画面を表示させる。続いて情報生成部6は、図4Cのように、出力装置12の一種であるスピーカに、制御内容である「自動操舵」と制御要因である「前方の障害物を検知」とを表す「前方に障害物を検知したため、自動操舵により回避します。」という音声を出力させる。同時に、情報生成部6は、出力装置12の一種である表示装置に、制御内容である「自動操舵」と制御要因である「前方の障害物を検知」とを表す画面を表示させる。 In step ST6, the information generation unit 6 outputs the information generated in step ST5 to the output device 12. When the degree of confusion is "high" and the degree of recognition is "low", the output device 12 provides the occupant with the warning, the control content, and the control factor generated by the information generation unit 6. For example, as shown in FIG. 4B, the information generation unit 6 first causes a speaker, which is one kind of output device 12, to output a beeping warning sound. At the same time, the information generation unit 6 causes a display device, which is another kind of output device 12, to display a warning screen including a warning icon and the text "automatic steering". Next, as shown in FIG. 4C, the information generation unit 6 causes the speaker to output a voice announcement expressing the control content "automatic steering" and the control factor "obstacle ahead detected", namely "An obstacle has been detected ahead, so it will be avoided by automatic steering." At the same time, the information generation unit 6 causes the display device to display a screen expressing the control content "automatic steering" and the control factor "obstacle ahead detected".
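The ST1-ST6 cycle just walked through can be sketched end to end as follows. Every input, threshold, and helper rule here is an illustrative assumption standing in for the determination units described above; the real device would read sensors and drive the output device 12 instead of returning a list.

```python
def judge_confusion(voice_volume_db: float) -> str:
    # ST3 stand-in: a loud startled utterance (e.g. 70 dB) signals high confusion.
    return "high" if voice_volume_db >= 70 else "low"

def judge_recognition(awake: bool, gaze_on_road: bool) -> str:
    # ST4 stand-in: asleep -> low; awake but not watching the road -> medium.
    if not awake:
        return "low"
    return "high" if gaze_on_road else "medium"

def run_cycle(control_active: bool, voice_volume_db: float,
              awake: bool, gaze_on_road: bool):
    """One pass of the FIG. 3 loop; returns the items to present, or None."""
    if not control_active:                 # ST2: no automatic control, back to ST1
        return None
    confusion = judge_confusion(voice_volume_db)       # ST3
    recognition = judge_recognition(awake, gaze_on_road)  # ST4
    info = ["warning"]                     # ST5: at least a warning in this sketch
    if confusion == "high" or recognition != "high":
        info += ["control content", "control factor"]
    return info                            # ST6: handed to the output device
```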

なお、例えば、混乱度「低」かつ認識度「高」の場合、どうして自動操舵が作動したかを乗員は理解していると考えられる。そのため、情報生成部6は、自動操舵が作動したことを乗員に知らせるために、図4Bのような警告音又は警告表示の情報のみ生成すればよい。 For example, when the degree of confusion is "low" and the degree of recognition is "high", it is considered that the occupant understands why the automatic steering is activated. Therefore, the information generation unit 6 need only generate the warning sound or warning display information as shown in FIG. 4B in order to notify the occupant that the automatic steering has been activated.

また、図4A、図4B及び図4Cでは、情報提供装置1がスピーカ及び表示装置を利用して乗員に警告したが、ハンドル又はシート等に内蔵されたアクチュエータを振動させることにより乗員に警告してもよい。 In FIGS. 4A, 4B, and 4C, the information providing device 1 warns the occupant using the speaker and the display device, but it may instead warn the occupant by vibrating an actuator built into the steering wheel, the seat, or the like.

最後に、情報提供装置1のハードウェア構成を説明する。
図5は、実施の形態1に係る情報提供装置1のハードウェア構成例を示す図である。バス100には、CPU(Central Processing Unit)101、ROM(Read Only Memory)102、RAM(Random Access Memory)103、HDD(Hard Disk Drive)104、車両制御装置10、入力装置11、及び出力装置12が接続されている。情報提供装置1における自車状況取得部2、乗員状態取得部3、混乱度判定部4、認識度判定部5、及び情報生成部6の機能は、ROM102又はHDD104に格納されるプログラムを実行するCPU101により実現される。CPU101は、マルチコア等により複数処理を並行して実行することが可能であってもよい。RAM103は、CPU101がプログラム実行時に使用するメモリである。HDD104は、外部記憶装置の一例であり、情報生成テーブル7を記憶している。なお、外部記憶装置は、HDD104以外であってもよく、CD(Compact Disc)若しくはDVD(Digital Versatile Disc)等のディスク、又は、USB(Universal Serial Bus)メモリ若しくはSDカード等のフラッシュメモリを採用したストレージ等であってもよい。
Finally, the hardware configuration of the information providing device 1 will be described.
FIG. 5 is a diagram showing a hardware configuration example of the information providing device 1 according to the first embodiment. A CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, an HDD (Hard Disk Drive) 104, the vehicle control device 10, the input device 11, and the output device 12 are connected to the bus 100. The functions of the own vehicle status acquisition unit 2, the occupant state acquisition unit 3, the confusion degree determination unit 4, the recognition degree determination unit 5, and the information generation unit 6 in the information providing device 1 are realized by the CPU 101 executing programs stored in the ROM 102 or the HDD 104. The CPU 101 may be capable of executing a plurality of processes in parallel, for example by having multiple cores. The RAM 103 is the memory used by the CPU 101 when executing the programs. The HDD 104 is an example of an external storage device and stores the information generation table 7. The external storage device need not be the HDD 104; it may be a disc such as a CD (Compact Disc) or a DVD (Digital Versatile Disc), or storage employing flash memory such as a USB (Universal Serial Bus) memory or an SD card.

自車状況取得部2、乗員状態取得部3、混乱度判定部4、認識度判定部5、及び情報生成部6の機能は、ソフトウェア、ファームウェア、又はソフトウェアとファームウェアとの組み合わせにより実現される。ソフトウェア又はファームウェアはプログラムとして記述され、ROM102又はHDD104に格納される。CPU101は、ROM102又はHDD104に格納されたプログラムを読みだして実行することにより、各部の機能を実現する。即ち、情報提供装置1は、CPU101により実行されるときに、図3のフローチャートで示されるステップが結果的に実行されることになるプログラムを格納するためのROM102又はHDD104を備える。また、このプログラムは、自車状況取得部2、乗員状態取得部3、混乱度判定部4、認識度判定部5、及び情報生成部6の手順又は方法をコンピュータに実行させるものであるとも言える。 The functions of the own vehicle status acquisition unit 2, the occupant state acquisition unit 3, the confusion degree determination unit 4, the recognition degree determination unit 5, and the information generation unit 6 are realized by software, firmware, or a combination of software and firmware. The software or firmware is written as programs and stored in the ROM 102 or the HDD 104. The CPU 101 realizes the function of each unit by reading and executing the programs stored in the ROM 102 or the HDD 104. That is, the information providing device 1 includes the ROM 102 or the HDD 104 for storing programs that, when executed by the CPU 101, result in the steps shown in the flowchart of FIG. 3 being performed. It can also be said that these programs cause a computer to execute the procedures or methods of the own vehicle status acquisition unit 2, the occupant state acquisition unit 3, the confusion degree determination unit 4, the recognition degree determination unit 5, and the information generation unit 6.

以上のように、実施の形態1に係る情報提供装置1は、自車状況取得部2、乗員状態取得部3、混乱度判定部4、認識度判定部5、及び情報生成部6を備える。自車状況取得部2は、車両の周辺状況を示す情報及び車両の自動的な制御に関する情報を取得する。乗員状態取得部3は、車両の乗員の状態を示す情報を取得する。混乱度判定部4は、乗員状態取得部3により取得された情報を用いて乗員の混乱度を判定する。認識度判定部5は、自車状況取得部2及び乗員状態取得部3により取得された情報を用いて車両の周辺状況及び自動的な制御に対する乗員の認識度を判定する。情報生成部6は、自車状況取得部2により取得された情報、混乱度判定部4により判定された混乱度、及び認識度判定部5により判定された認識度を用いて乗員に提供する情報を生成する。この構成により、情報提供装置1は、車両の自動的な制御に伴う乗員の混乱度並びにその前後の車両の周辺状況及び自動的な制御に対する乗員の認識度に基づき、提供する情報を変更できる。そのため、車両制御装置10による自動運転又は運転支援の制御内容をよく知らず不安を感じている乗員に対しては、十分な情報を提供でき、安心感を与えることができる。また、車両制御装置10による自動運転又は運転支援の制御内容を理解した乗員に対しては、不要な情報提供を低減でき、煩わしさを抑制することができる。したがって、情報提供装置1は、過不足のない情報提供を行うことができる。 As described above, the information providing device 1 according to the first embodiment includes the own vehicle status acquisition unit 2, the occupant state acquisition unit 3, the confusion degree determination unit 4, the recognition degree determination unit 5, and the information generation unit 6. The own vehicle status acquisition unit 2 acquires information indicating the vehicle's surroundings and information on the automatic control of the vehicle. The occupant state acquisition unit 3 acquires information indicating the state of an occupant of the vehicle. The confusion degree determination unit 4 determines the occupant's degree of confusion using the information acquired by the occupant state acquisition unit 3. The recognition degree determination unit 5 determines the occupant's degree of recognition of the vehicle's surroundings and of the automatic control using the information acquired by the own vehicle status acquisition unit 2 and the occupant state acquisition unit 3. The information generation unit 6 generates the information to be provided to the occupant using the information acquired by the own vehicle status acquisition unit 2, the degree of confusion determined by the confusion degree determination unit 4, and the degree of recognition determined by the recognition degree determination unit 5.
With this configuration, the information providing device 1 can change the information it provides based on the occupant's degree of confusion accompanying the automatic control of the vehicle and on the occupant's degree of recognition, before and after that control, of the vehicle's surroundings and the automatic control. The device can therefore provide sufficient information, and a sense of security, to an occupant who feels uneasy because of unfamiliarity with the automated driving or driving assistance performed by the vehicle control device 10. It can also reduce unnecessary information provision, and thus annoyance, for an occupant who understands that control. The information providing device 1 can thus provide information that is neither excessive nor insufficient.

また、実施の形態1の情報生成部6は、混乱度判定部4により判定された混乱度が高いほど、又は認識度判定部5により判定された認識度が低いほど、乗員に提供する情報量を増加させる。これにより、情報提供装置1は、車両制御装置10による自動運転又は運転支援の制御内容に対する乗員の理解度に応じて提供する情報量を増減できるため、過不足のない情報提供を行うことができる。 The information generation unit 6 of the first embodiment increases the amount of information provided to the occupant as the degree of confusion determined by the confusion degree determination unit 4 is higher, or as the degree of recognition determined by the recognition degree determination unit 5 is lower. The information providing device 1 can thereby increase or decrease the amount of information it provides according to the occupant's understanding of the automated driving or driving assistance performed by the vehicle control device 10, and can therefore provide information that is neither excessive nor insufficient.

なお、上記では、混乱度判定部4は、車両の自動的な制御の作動時を含む前後一定時間内の乗員状態情報に基づき乗員の混乱度を判定したが、乗員に対する過去の混乱度の判定履歴を用いて、現在の自動的な制御の作動時の混乱度を推定するようにしてもよい。例えば、今回の自動操舵が乗員にとって3度目の作動である場合に、過去2回の自動操舵の作動時の混乱度がともに「高」であった場合、混乱度判定部4は、今回の乗員状態に関わらず作動時の混乱度を「高」と推定する。このようにすれば、情報提供装置1は、実際に乗員が混乱するよりも早く即座に適切な情報提供ができるため、乗員の混乱が発生すること自体を防ぐことができる。 In the above, the confusion degree determination unit 4 determined the occupant's degree of confusion based on the occupant state information within a fixed period before and after the activation of the vehicle's automatic control, including the time of activation; however, it may instead estimate the degree of confusion at the current activation using a history of past confusion determinations for the occupant. For example, if the current automatic steering is the third activation for the occupant and the degree of confusion was "high" at both of the previous two activations, the confusion degree determination unit 4 estimates the degree of confusion at this activation to be "high" regardless of the occupant's current state. In this way, the information providing device 1 can immediately provide appropriate information before the occupant actually becomes confused, and can thus prevent the occupant's confusion from arising in the first place.
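The history-based estimate above can be sketched as follows. The "both of the last two activations were high" rule follows the example in the text; the per-control-type history list is an assumed storage format.

```python
def estimate_confusion(history: list[str], current: str) -> str:
    """history: past confusion judgments for this control type, oldest first;
    current: the judgment from the occupant's present state."""
    if len(history) >= 2 and history[-1] == "high" and history[-2] == "high":
        return "high"   # pre-empt confusion regardless of the current state
    return current
```

A real device would keep one such history per occupant and per control content (automatic steering, automatic braking, and so on), so that a maneuver that confused the occupant before triggers fuller information immediately.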

また、上記では、情報生成部6は、単純に認識度の高低に応じて提供する情報を生成したが、乗員が具体的にどこまで周囲状況及び自動的な制御を認識していたかに基づき、提供する情報を生成するようにしてもよい。例えば、車両制御装置10が、車両前方に障害物を検出し、当該障害物を回避するために自動操舵を作動させる場合を想定する。この場合、認識度判定部5は、乗員の視線情報等を基に、乗員が車外に視線を向けて車両前方の障害物を視認しているか否かを判定する。認識度判定部5により乗員が車両前方の障害物を視認していると判定された場合、情報生成部6は、乗員が制御内容である「自動操舵」と制御要因である「前方の障害物を検知」とを認識できていると判断し、自動操舵についての情報を生成しない。一方、認識度判定部5により乗員が車両前方の障害物に気付いていないと判定された場合、情報生成部6は、乗員が制御内容である「自動操舵」と制御要因である「前方の障害物を検知」とを認識できていないと判断し、自動操舵についての情報を生成する。このように、認識度判定部5が、車両の周辺状況及び自動的な制御のうちの乗員が認識していた事柄を判定し、情報生成部6が、認識度判定部5により乗員が認識していたと判定された事柄以外の車両の周辺状況及び自動的な制御に関する情報を生成することにより、より過不足のない適切な情報提供が可能となる。 In the above, the information generation unit 6 simply generated the information to be provided according to the level of the degree of recognition; however, it may instead generate the information to be provided based on what, specifically, the occupant had recognized of the surroundings and the automatic control. For example, assume that the vehicle control device 10 detects an obstacle ahead of the vehicle and activates automatic steering to avoid it. In this case, the recognition degree determination unit 5 determines, based on the occupant's gaze information and the like, whether the occupant is looking outside the vehicle and visually recognizes the obstacle ahead. When the recognition degree determination unit 5 determines that the occupant is visually recognizing the obstacle ahead, the information generation unit 6 judges that the occupant has recognized both the control content "automatic steering" and the control factor "obstacle ahead detected", and generates no information about the automatic steering. Conversely, when the recognition degree determination unit 5 determines that the occupant has not noticed the obstacle ahead, the information generation unit 6 judges that the occupant has recognized neither the control content "automatic steering" nor the control factor "obstacle ahead detected", and generates information about the automatic steering. In this way, the recognition degree determination unit 5 determines which aspects of the vehicle's surroundings and the automatic control the occupant had recognized, and the information generation unit 6 generates information on the vehicle's surroundings and the automatic control other than the aspects determined to have been recognized, which enables information provision that is even more appropriate, with neither excess nor deficiency.

なお、情報生成部6は、自車状況取得部2により取得された周辺状況情報及び制御情報、混乱度判定部4により判定された混乱度、並びに認識度判定部5により取得された認識度を用いて乗員に提供する情報を生成し、かつ、乗員に提供した後、混乱度判定部4により新たに判定された混乱度が予め定められた値以上低下しない場合、乗員に提供する情報量を増加させるようにしてもよい。ここで、予め定められた値は、例えば、混乱度1段階分に相当するものとする。情報生成部6は、混乱度「中」及び認識度「高」に応じて提供する情報量を警告と制御内容とに決定し、出力装置12を用いて情報を提供した後、混乱度が「中」から「低」に低下しなければ、混乱度が「中」であっても、警告と制御内容とに制御要因を追加して再び情報提供を行う。このようにすることで、情報提供装置1は、提供した情報量が不足している場面で情報量を増加させることができ、より過不足のない適切な情報提供が可能となる。 The information generation unit 6 may also generate the information to be provided to the occupant using the peripheral situation information and the control information acquired by the own vehicle status acquisition unit 2, the degree of confusion determined by the confusion degree determination unit 4, and the degree of recognition determined by the recognition degree determination unit 5, and then, after providing it to the occupant, increase the amount of information provided if the degree of confusion newly determined by the confusion degree determination unit 4 has not fallen by at least a predetermined value. Here, the predetermined value corresponds to, for example, one level of the degree of confusion. The information generation unit 6 determines the amount of information to be provided for confusion degree "medium" and recognition degree "high" as the warning and the control content, and after providing the information using the output device 12, if the degree of confusion does not fall from "medium" to "low", it adds the control factor to the warning and the control content and provides information again, even though the degree of confusion is still "medium". In this way, the information providing device 1 can increase the amount of information in situations where the amount provided was insufficient, enabling more appropriate information provision with neither excess nor deficiency.
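The re-provision rule above can be sketched as follows, with the predetermined value fixed at one confusion level as in the example. The three-item escalation order is an assumption matching the table of FIG. 2.

```python
LEVEL = {"low": 0, "medium": 1, "high": 2}
ALL_ITEMS = ["warning", "control content", "control factor"]

def escalate(before: str, after: str, items: list[str]) -> list[str]:
    """before/after: confusion judged before and after presenting `items`.
    Returns the items to present next."""
    if LEVEL[before] - LEVEL[after] >= 1:        # fell by a full level: keep as is
        return items
    # Confusion did not subside: add the next item, up to all three.
    return ALL_ITEMS[:min(len(items) + 1, len(ALL_ITEMS))]
```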

また、情報生成部6は、乗員の混乱度が他の乗員との会話に起因して高くなった場合、当該乗員に提供する情報を、混乱度が高くなる前の情報のまま変更しないようにしてもよい。この場合、乗員状態取得部3は、入力装置11からの音声に基づき、複数の乗員が交互に発話している場合に、当該複数の乗員が会話をしていると判定する。または、乗員状態取得部3は、入力装置11からのカメラ画像及び音声に基づき、複数の乗員が互いを見ながら発話している場合に、当該複数の乗員が会話をしていると判定してもよい。さらに、乗員状態取得部3は、乗員の音声に基づき、乗員が車両の周辺状況及び自動的な制御とは関係ない内容の発話をしているか否かを判定する。そして、情報生成部6は、情報提供対象の乗員の混乱度が高くなった場合、かつ、乗員状態取得部3により複数の乗員が車両の周辺状況及び自動的な制御とは関係ない内容の会話をしていると判定された場合、情報提供対象の乗員の混乱度が他の乗員との会話に起因して高くなったと判断して、当該乗員に提供する情報を変更しない。一方、情報生成部6は、情報提供対象の乗員の混乱度が高くなった場合、かつ、当該乗員が発話していないか乗員状態取得部3により複数の乗員が車両の周辺状況及び自動的な制御に関係する内容の会話をしていると判定された場合、情報提供対象の乗員の混乱度が車両の自動的な制御に起因して高くなったと判断して、当該乗員に提供する情報を混乱度及び認識度に応じて変更する。このようにすることで、情報提供装置1は、自動的な制御とは無関係な混乱に対して不要に情報を提供してしまうことを防ぐことができ、乗員に煩わしさを与えてしまうことを防ぐことができる。 Further, when an occupant's degree of confusion has risen due to conversation with another occupant, the information generation unit 6 may leave the information provided to that occupant unchanged from what it was before the degree of confusion rose. In this case, the occupant state acquisition unit 3 determines, based on the voice from the input device 11, that a plurality of occupants are having a conversation when those occupants are speaking in turns. Alternatively, based on the camera image and voice from the input device 11, the occupant state acquisition unit 3 may determine that a plurality of occupants are having a conversation when they are speaking while looking at each other. The occupant state acquisition unit 3 further determines, based on an occupant's voice, whether the occupant is saying something unrelated to the vehicle's surroundings and the automatic control. Then, when the degree of confusion of the occupant to whom information is provided has risen and the occupant state acquisition unit 3 has determined that the plurality of occupants are having a conversation unrelated to the vehicle's surroundings and the automatic control, the information generation unit 6 judges that the occupant's degree of confusion has risen due to the conversation with the other occupants and does not change the information provided to that occupant. Conversely, when the degree of confusion of the occupant to whom information is provided has risen, and either that occupant is not speaking or the occupant state acquisition unit 3 has determined that the plurality of occupants are having a conversation related to the vehicle's surroundings and the automatic control, the information generation unit 6 judges that the occupant's degree of confusion has risen due to the automatic control of the vehicle and changes the information provided to that occupant according to the degree of confusion and the degree of recognition. In this way, the information providing device 1 can avoid providing information unnecessarily in response to confusion unrelated to the automatic control, and can thus avoid annoying the occupant.
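The attribution decision above can be sketched as follows. Classifying an utterance as related or unrelated to the vehicle's surroundings and control is done here with a naive keyword test, purely as a placeholder for the judgment the occupant state acquisition unit 3 is described as making; the vocabulary is an assumption.

```python
RELATED_WORDS = ("steering", "brake", "obstacle", "lane")  # assumed vocabulary

def should_update_info(confusion_rose: bool, in_conversation: bool,
                       utterance: str) -> bool:
    """True if the rise in confusion is attributed to the automatic control
    (so provided information should change), False if attributed to chat."""
    if not confusion_rose:
        return False
    related = any(word in utterance.lower() for word in RELATED_WORDS)
    if in_conversation and not related:
        return False    # confusion attributed to conversation between occupants
    return True         # confusion attributed to the vehicle's automatic control
```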

また、上記では、情報を提供する対象が乗員一人であるものとして情報提供装置1を説明したが、車両に搭乗している複数の乗員すべてを対象とすることが可能である。その場合、例えば、情報提供装置1は、各乗員の混乱度と認識度とに基づき、提供する情報量が最も多い乗員を選択し、その乗員の混乱度と認識度とに応じて情報を生成し、車両に1つ設置されている出力装置12を用いて当該情報を全乗員に提供する。または、情報提供装置1は、各乗員の混乱度と認識度とに応じて個別に情報を生成し、各座席に設置されている出力装置12を用いて個別に情報を提供するようにしてもよい。 In the above, the information providing device 1 was described as providing information to a single occupant, but it can also target all of a plurality of occupants aboard the vehicle. In that case, for example, the information providing device 1 selects, based on each occupant's degree of confusion and degree of recognition, the occupant to whom the largest amount of information would be provided, generates information according to that occupant's degree of confusion and degree of recognition, and provides that information to all occupants using a single output device 12 installed in the vehicle. Alternatively, the information providing device 1 may generate information individually according to each occupant's degree of confusion and degree of recognition, and provide the information individually using output devices 12 installed at the respective seats.
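The shared-output strategy above can be sketched as follows: with one output device, generate for the occupant who would receive the largest amount of information. Scoring "need" as confusion plus lack of recognition is an assumed concretization; the text only says the occupant with the most information is selected.

```python
LEVEL = {"low": 0, "medium": 1, "high": 2}

def pick_target(occupants: dict[str, tuple[str, str]]) -> str:
    """occupants: name -> (confusion, recognition).
    Returns the occupant whose state calls for the most information."""
    def need(state: tuple[str, str]) -> int:
        confusion, recognition = state
        return LEVEL[confusion] + (2 - LEVEL[recognition])
    return max(occupants, key=lambda name: need(occupants[name]))
```

With per-seat output devices, this selection step disappears and information is simply generated for each occupant independently.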

また、上記では、情報提供装置1、車両制御装置10、入力装置11、及び出力装置12が車両に搭載された例を説明したが、この構成に限定されない。例えば、情報提供装置1が、車外のサーバ装置として構成され、このサーバ装置が車両に搭載された車両制御装置10、入力装置11、及び出力装置12と無線通信を行うことにより情報提供を行ってもよい。また、情報提供装置1の自車状況取得部2、乗員状態取得部3、混乱度判定部4、認識度判定部5、情報生成部6、及び情報生成テーブル7が、サーバ装置とスマートフォン等の携帯端末と車載器とに分散されていてもよい。 In the above, an example was described in which the information providing device 1, the vehicle control device 10, the input device 11, and the output device 12 are mounted on the vehicle, but the configuration is not limited to this. For example, the information providing device 1 may be configured as a server device outside the vehicle, and this server device may provide information by communicating wirelessly with the vehicle control device 10, the input device 11, and the output device 12 mounted on the vehicle. The own vehicle status acquisition unit 2, the occupant state acquisition unit 3, the confusion degree determination unit 4, the recognition degree determination unit 5, the information generation unit 6, and the information generation table 7 of the information providing device 1 may also be distributed among a server device, a mobile terminal such as a smartphone, and an on-board unit.

また、上記では、情報提供装置1を四輪自動車に適用した場合を例に挙げて説明したが、二輪車、船舶、航空機、及びパーソナルモビリティ等の乗員が搭乗する移動体に適用してもよい。 Further, in the above description, the case where the information providing device 1 is applied to a four-wheeled vehicle has been described as an example, but it may be applied to a moving body such as a two-wheeled vehicle, a ship, an aircraft, and a personal mobility on which an occupant is on board.

なお、本発明はその発明の範囲内において、実施の形態の任意の構成要素の変形、又は実施の形態の任意の構成要素の省略が可能である。 It should be noted that, within the scope of the present invention, it is possible to modify any component of the embodiment or omit any component of the embodiment.

この発明に係る情報提供装置は、乗員の状態を考慮して情報を提供するようにしたので、自動運転又は運転支援を行う車両等の情報提供装置に用いるのに適している。 Since the information providing device according to the present invention provides information in consideration of the occupant's state, it is suitable for use as an information providing device for vehicles and the like that perform automated driving or driving assistance.

1 情報提供装置、2 自車状況取得部、3 乗員状態取得部、4 混乱度判定部、5 認識度判定部、6 情報生成部、7 情報生成テーブル、10 車両制御装置、11 入力装置、12 出力装置、100 バス、101 CPU、102 ROM、103 RAM、104 HDD。 1 Information providing device, 2 Own-vehicle status acquisition unit, 3 Occupant state acquisition unit, 4 Confusion degree determination unit, 5 Recognition degree determination unit, 6 Information generation unit, 7 Information generation table, 10 Vehicle control device, 11 Input device, 12 Output device, 100 Bus, 101 CPU, 102 ROM, 103 RAM, 104 HDD.

Claims (9)

車両の周辺状況を示す情報及び前記車両の自動的な制御に関する情報を取得する自車状況取得部と、
前記車両の乗員の状態を示す情報を取得する乗員状態取得部と、
前記乗員状態取得部により取得された情報を用いて前記乗員の混乱度を判定する混乱度判定部と、
前記自車状況取得部及び前記乗員状態取得部により取得された情報を用いて前記車両の周辺状況及び自動的な制御に対する前記乗員の認識度を判定する認識度判定部と、
前記自車状況取得部により取得された情報、前記混乱度判定部により判定された混乱度、及び前記認識度判定部により判定された認識度を用いて前記乗員に提供する情報を生成する情報生成部とを備える情報提供装置。
An information providing device comprising: an own-vehicle status acquisition unit that acquires information indicating a surrounding situation of a vehicle and information on automatic control of the vehicle; an occupant state acquisition unit that acquires information indicating a state of an occupant of the vehicle; a confusion degree determination unit that determines a degree of confusion of the occupant using the information acquired by the occupant state acquisition unit; a recognition degree determination unit that determines the occupant's degree of recognition of the surrounding situation and automatic control of the vehicle using the information acquired by the own-vehicle status acquisition unit and the occupant state acquisition unit; and an information generation unit that generates information to be provided to the occupant using the information acquired by the own-vehicle status acquisition unit, the degree of confusion determined by the confusion degree determination unit, and the degree of recognition determined by the recognition degree determination unit.

前記情報生成部は、前記混乱度判定部により判定された混乱度又は前記認識度判定部により判定された認識度に応じて、前記乗員に提供する情報量を増減させることを特徴とする請求項1記載の情報提供装置。
The information providing device according to claim 1, wherein the information generation unit increases or decreases the amount of information provided to the occupant according to the degree of confusion determined by the confusion degree determination unit or the degree of recognition determined by the recognition degree determination unit.

前記情報生成部は、前記混乱度判定部により判定された混乱度が高いほど、又は前記認識度判定部により判定された認識度が低いほど、前記乗員に提供する情報量を増加させることを特徴とする請求項2記載の情報提供装置。
The information providing device according to claim 2, wherein the information generation unit increases the amount of information provided to the occupant as the degree of confusion determined by the confusion degree determination unit is higher or as the degree of recognition determined by the recognition degree determination unit is lower.

前記混乱度判定部は、前記乗員状態取得部により取得された前記乗員の表情、視線、挙動、音声、心拍数、発汗量、及び乗車回数のうちの少なくとも1つの情報を用いて前記乗員の混乱度を判定することを特徴とする請求項1記載の情報提供装置。
The information providing device according to claim 1, wherein the confusion degree determination unit determines the degree of confusion of the occupant using at least one of the occupant's facial expression, line of sight, behavior, voice, heart rate, amount of perspiration, and number of rides acquired by the occupant state acquisition unit.

前記混乱度判定部は、前記乗員に対する過去の混乱度の判定履歴を用いて現在の混乱度を推定することを特徴とする請求項1記載の情報提供装置。
The information providing device according to claim 1, wherein the confusion degree determination unit estimates the current degree of confusion using a history of past confusion degree determinations for the occupant.

前記認識度判定部は、前記車両の周辺状況及び自動的な制御のうちの前記乗員が認識していた事柄を判定し、
前記情報生成部は、前記認識度判定部により前記乗員が認識していたと判定された事柄以外の前記車両の周辺状況及び自動的な制御に関する情報を生成することを特徴とする請求項1記載の情報提供装置。
The information providing device according to claim 1, wherein the recognition degree determination unit determines which matters of the vehicle's surrounding situation and automatic control the occupant has recognized, and the information generation unit generates information on the vehicle's surrounding situation and automatic control other than the matters determined by the recognition degree determination unit to have been recognized by the occupant.

前記情報生成部は、前記自車状況取得部により取得された情報、前記混乱度判定部により判定された混乱度、及び前記認識度判定部により判定された認識度を用いて前記乗員に提供する情報を生成し、かつ、前記乗員に提供した後、前記混乱度判定部により新たに判定された混乱度が予め定められた値以上低下しない場合、前記乗員に提供する情報量を増加させることを特徴とする請求項1記載の情報提供装置。
The information providing device according to claim 1, wherein the information generation unit generates the information to be provided to the occupant using the information acquired by the own-vehicle status acquisition unit, the degree of confusion determined by the confusion degree determination unit, and the degree of recognition determined by the recognition degree determination unit, and, after the information has been provided to the occupant, increases the amount of information provided to the occupant if the degree of confusion newly determined by the confusion degree determination unit does not decrease by a predetermined value or more.

前記情報生成部は、前記乗員の混乱度が他の乗員との会話に起因して高くなった場合、前記乗員に提供する情報を変更しないことを特徴とする請求項1記載の情報提供装置。
The information providing device according to claim 1, wherein the information generation unit does not change the information provided to the occupant when the occupant's degree of confusion has become high due to a conversation with another occupant.

自車状況取得部が、車両の周辺状況を示す情報及び前記車両の自動的な制御に関する情報を取得し、
乗員状態取得部が、前記車両の乗員の状態を示す情報を取得し、
混乱度判定部が、前記乗員状態取得部により取得された情報を用いて前記乗員の混乱度を判定し、
認識度判定部が、前記自車状況取得部及び前記乗員状態取得部により取得された情報を用いて前記車両の周辺状況及び自動的な制御に対する前記乗員の認識度を判定し、
情報生成部が、前記自車状況取得部により取得された情報、前記混乱度判定部により判定された混乱度、及び前記認識度判定部により判定された認識度を用いて前記乗員に提供する情報を生成する情報提供方法。
An information providing method wherein: an own-vehicle status acquisition unit acquires information indicating a surrounding situation of a vehicle and information on automatic control of the vehicle; an occupant state acquisition unit acquires information indicating a state of an occupant of the vehicle; a confusion degree determination unit determines a degree of confusion of the occupant using the information acquired by the occupant state acquisition unit; a recognition degree determination unit determines the occupant's degree of recognition of the surrounding situation and automatic control of the vehicle using the information acquired by the own-vehicle status acquisition unit and the occupant state acquisition unit; and an information generation unit generates information to be provided to the occupant using the information acquired by the own-vehicle status acquisition unit, the degree of confusion determined by the confusion degree determination unit, and the degree of recognition determined by the recognition degree determination unit.
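The re-provision rule of claim 7 — increase the amount of provided information when a newly determined degree of confusion has not decreased by a predetermined value — can be sketched as a small escalation step. The threshold value and the discrete detail levels below are illustrative assumptions; the claim itself fixes neither.

```python
# Sketch of the feedback rule in claim 7: after information is provided, if the
# newly judged confusion does not fall by at least DROP_THRESHOLD, escalate the
# amount (detail level) of the information provided to the occupant.
DROP_THRESHOLD = 0.2  # assumed "predetermined value" from claim 7

DETAIL_LEVELS = ["brief", "standard", "detailed"]  # assumed discrete amounts

def next_detail_level(level: str, confusion_before: float, confusion_after: float) -> str:
    """Return the detail level to use for the next provision of information."""
    if confusion_before - confusion_after >= DROP_THRESHOLD:
        return level  # confusion dropped enough; keep the current amount
    # Confusion persists: move one step up, saturating at the highest level.
    idx = DETAIL_LEVELS.index(level)
    return DETAIL_LEVELS[min(idx + 1, len(DETAIL_LEVELS) - 1)]
```

For example, a brief notification that leaves the occupant's confusion nearly unchanged would be followed by a standard-detail one, whereas a notification after which confusion clearly subsides leaves the detail level as it is.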
JP2020551635A 2018-10-16 2018-10-16 Information providing device and information providing method Pending JPWO2020079755A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/038502 WO2020079755A1 (en) 2018-10-16 2018-10-16 Information providing device and information providing method

Publications (1)

Publication Number Publication Date
JPWO2020079755A1 true JPWO2020079755A1 (en) 2021-02-15

Family

ID=70283827

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2020551635A Pending JPWO2020079755A1 (en) 2018-10-16 2018-10-16 Information providing device and information providing method

Country Status (5)

Country Link
US (1) US20220032942A1 (en)
JP (1) JPWO2020079755A1 (en)
CN (1) CN112823383A (en)
DE (1) DE112018008075T5 (en)
WO (1) WO2020079755A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113741485A (en) * 2021-06-23 2021-12-03 阿波罗智联(北京)科技有限公司 Control method and device for cooperative automatic driving of vehicle and road, electronic equipment and vehicle
KR20230000626A (en) * 2021-06-25 2023-01-03 현대자동차주식회사 Apparatus and method for generating warning vibration of steering wheel

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006042903A (en) * 2004-07-30 2006-02-16 Mazda Motor Corp Driving support device for vehicle
JP2008056059A (en) * 2006-08-30 2008-03-13 Equos Research Co Ltd Driver state estimation device and driving support device
JP2018151904A (en) * 2017-03-14 2018-09-27 オムロン株式会社 Driver monitoring device, driver monitoring method and program for driver monitoring

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9751534B2 (en) * 2013-03-15 2017-09-05 Honda Motor Co., Ltd. System and method for responding to driver state
JP6747019B2 (en) 2016-04-01 2020-08-26 日産自動車株式会社 Vehicle display method and vehicle display device
JP6822325B2 (en) * 2017-06-21 2021-01-27 日本電気株式会社 Maneuvering support device, maneuvering support method, program
US10807604B2 (en) * 2018-09-11 2020-10-20 Hyundai Motor Company Vehicle and method for controlling thereof

Also Published As

Publication number Publication date
CN112823383A (en) 2021-05-18
WO2020079755A1 (en) 2020-04-23
US20220032942A1 (en) 2022-02-03
DE112018008075T5 (en) 2021-07-08

Similar Documents

Publication Publication Date Title
US10377303B2 (en) Management of driver and vehicle modes for semi-autonomous driving systems
US10745032B2 (en) ADAS systems using haptic stimuli produced by wearable devices
US10300929B2 (en) Adaptive user interface for an autonomous vehicle
US10640123B2 (en) Driver monitoring system
US9613459B2 (en) System and method for in-vehicle interaction
CN107628033B (en) Navigation based on occupant alertness
JP2016124542A (en) Automatic operation interface
US10474145B2 (en) System and method of depth sensor activation
US10462281B2 (en) Technologies for user notification suppression
CN110182222B (en) Autonomous driving control apparatus and method for notifying departure of preceding vehicle
US10286781B2 (en) Method for the automatic execution of at least one driving function of a motor vehicle
US11532053B1 (en) Method and system for detecting use of vehicle safety systems
US10369943B2 (en) In-vehicle infotainment control systems and methods
US11092458B2 (en) Navigation system with operation obstacle alert mechanism and method of operation thereof
US9886034B2 (en) Vehicle control based on connectivity of a portable device
US10386853B2 (en) Method for accessing a vehicle-specific electronic device
JPWO2020079755A1 (en) Information providing device and information providing method
US11912267B2 (en) Collision avoidance system for vehicle interactions
CN116867697A (en) Physical feedback validation of traffic events by a driving assistance system
US20230356713A1 (en) Driver Assessment System to Determine a Driver's Ability to Perform Driving Tasks
JP2020125089A (en) Vehicle control device
WO2023035260A1 (en) Method and apparatus for prompting state information of vehicle
US20220262246A1 (en) Distraction-sensitive traffic drive-off alerts
JP2023006970A (en) Driving support control device and driving support control method
WO2019167561A1 (en) Driving assistance system and driving assistance method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20200828

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20210706

A601 Written request for extension of time

Free format text: JAPANESE INTERMEDIATE CODE: A601

Effective date: 20210709

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20220105