WO2018078889A1 - Information processing device, information processing method, and information processing program - Google Patents

Information processing device, information processing method, and information processing program Download PDF

Info

Publication number
WO2018078889A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
driver
information processing
data
Prior art date
Application number
PCT/JP2016/082876
Other languages
French (fr)
Japanese (ja)
Inventor
中村 成志
建太 手銭
Original Assignee
損害保険ジャパン日本興亜株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 損害保険ジャパン日本興亜株式会社
Publication of WO2018078889A1 publication Critical patent/WO2018078889A1/en

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and an information processing program.
  • Patent Document 1 discloses a technology for performing driving support based on map information generated by associating statistical information, obtained by statistically processing a driver's operation amounts and the like as time-series patterns, with vehicle position information.
  • An object of the present invention is to provide a technique for solving the above-described problems.
  • an information processing apparatus according to the present invention comprises: biometric information acquisition means for acquiring biometric information of a driver of a vehicle; vehicle information acquisition means for acquiring vehicle information of the vehicle; surrounding image acquisition means for acquiring, from imaging means attached to the vehicle, a surrounding image of the vehicle including the vehicle; storage means for storing a plurality of sets of the biometric information, the vehicle information, and the surrounding image; analysis means for analyzing the plurality of sets and generating related information between the vehicle information and the surrounding image that led to a change in the driver's biometric information; and action data output means for outputting, using the related information, action data in which at least one of the vehicle information and the surrounding image is associated with an action to be taken by the driver.
  • an information processing method according to the present invention includes: a biometric information acquisition step of acquiring biometric information of a driver of a vehicle; a vehicle information acquisition step of acquiring vehicle information of the vehicle; a surrounding image acquisition step of acquiring, from imaging means attached to the vehicle, a surrounding image of the vehicle including the vehicle; a storage step of storing a plurality of sets of the biometric information, the vehicle information, and the surrounding image; an analysis step of analyzing the plurality of sets and generating related information between the vehicle information and the surrounding image that led to a change in the driver's biometric information; and an action data output step of outputting, using the related information, action data in which at least one of the vehicle information and the surrounding image is associated with an action to be taken by the driver.
  • an information processing program according to the present invention causes a computer to execute: a biometric information acquisition step of acquiring biometric information of a driver of a vehicle; a vehicle information acquisition step of acquiring vehicle information of the vehicle; a surrounding image acquisition step of acquiring, from imaging means attached to the vehicle, a surrounding image of the vehicle including the vehicle; a storage step of storing a plurality of sets of the biometric information, the vehicle information, and the surrounding image; an analysis step of analyzing the plurality of sets and generating related information between the vehicle information and the surrounding image that led to a change in the driver's biometric information; and an action data output step of outputting, using the related information, action data in which at least one of the vehicle information and the surrounding image is associated with an action to be taken by the driver.
  • the information processing apparatus 100 is an apparatus that outputs action data regarding actions to be taken by the driver.
  • the information processing apparatus 100 includes a biological information acquisition unit 101, a vehicle information acquisition unit 102, a surrounding image acquisition unit 103, a storage unit 104, an analysis unit 105, and a behavior data output unit 106.
  • the biological information acquisition unit 101 acquires biological information of the driver of the vehicle.
  • the vehicle information acquisition unit 102 acquires vehicle information of the vehicle.
  • the surrounding image acquisition unit 103 acquires a surrounding image of the vehicle including the vehicle from an imaging unit attached to the vehicle.
  • the storage unit 104 stores a plurality of sets of biological information, vehicle information, and surrounding images.
  • the analysis unit 105 analyzes a plurality of sets and generates related information between vehicle information and surrounding images in which the driver's biometric information has changed.
  • the behavior data output unit 106 outputs behavior data in which at least one of the vehicle information and the surrounding image and the behavior to be taken by the driver are associated with each other using the related information.
  • FIG. 2A is a diagram illustrating an example of an outline of processing by the information processing apparatus 200 according to the present embodiment.
  • FIG. 2B is a diagram illustrating another example of the outline of processing by the information processing apparatus 200 according to the present embodiment.
  • FIG. 2A shows a situation where two vehicles are approaching the intersection from different directions.
  • FIG. 2B shows a situation where the vehicle is approaching the intersection and a pedestrian is walking on the pedestrian crossing.
  • images and videos are taken by a drive recorder attached to the vehicle.
  • the captured image or video is stored in association with biological information or vehicle information. Note that image and video imaging by the drive recorder may be always performed, or may be started when a predetermined trigger for imaging is detected.
  • when the driver recognizes the presence of the other vehicle and senses danger, for example, the information processing apparatus 200 analyzes the stored images in response to changes in biological information such as a rise in heart rate or blood pressure or an increase in sweating. It then generates related information indicating the relationship between the vehicle information at the time such a change in the biological information occurred and the captured images.
  • the information processing apparatus 200 associates either the vehicle information or the surrounding image with the action to be taken by the driver using the generated related information, and outputs it as action data.
  • the information processing apparatus 200 may provide the generated driving data to a vehicle capable of automatic driving.
  • the information processing apparatus 200 also generates instruction data for instructing the driver using the behavior data.
  • This instruction data is generated based on either vehicle information of the vehicle or a surrounding image captured by the imaging unit of the vehicle.
  • the information processing apparatus 200 may provide the generated instruction data to the vehicle.
  • the timing at which the information processing apparatus 200 provides the driving data and the instruction data is not particularly limited.
  • the information processing apparatus 200 may provide the driving data and the instruction data at a timing capable of dealing with an approaching event based on road conditions. Further, for example, it may be provided at the timing when the vehicle engine is started.
  • the actions to be taken by the driver include, for example, turning the steering wheel to the left when the other vehicle turns its steering wheel to the right, or stepping on the brake when there is a pedestrian, but are not limited to these.
  • the action to be taken by the driver varies depending on, for example, the positional relationship with the opponent vehicle, the speed of the host vehicle and the opponent vehicle, the vehicle weight, the vehicle body shape, the vehicle body color, and the like.
  • the information processing apparatus 200 may perform the above-described processing in real time.
  • FIG. 3 is a block diagram showing a configuration of the information processing apparatus 200 according to the present embodiment.
  • the information processing apparatus 200 includes a biological information acquisition unit 301, a vehicle information acquisition unit 302, a surrounding image acquisition unit 303, a storage unit 304, an analysis unit 305, a behavior data output unit 306, a driving data generation unit 307, and an instruction data generation unit 308.
  • the biometric information acquisition unit 301 acquires the biometric information of the driver of the vehicle from the wearable device.
  • the biometric information to be acquired is, for example, heart rate, blood pressure, sweating amount, respiratory rate, brain wave, etc., but is not limited thereto.
  • the biological information is acquired from, for example, a taxi or a bus driver, but the acquisition target of the biological information is not limited to these.
  • Vehicle information acquisition unit 302 acquires vehicle information of a vehicle driven by the driver.
  • the vehicle information includes, for example, vehicle position information, speed, acceleration, steering angle, remaining fuel, brake pedal force, and the like, but is not limited thereto.
  • the position information of the vehicle is acquired from, for example, a GPS (Global Positioning System) device attached to the vehicle, but the acquisition destination of the position information is not limited to this.
  • the vehicle speed, acceleration, steering angle, etc. are acquired from, for example, OBD (On-board Diagnostics), CAN (Controller Area Network), etc., but are not limited thereto.
  • the vehicle information is acquired from a business vehicle such as a taxi or a bus or a general vehicle.
  • the surrounding image acquisition unit 303 acquires an image of the surroundings of the vehicle (surrounding image) including the vehicle driven by the driver (host vehicle). That is, the surrounding image acquisition unit 303 acquires images captured by a camera such as a drive recorder attached to the vehicle. The surrounding image acquisition unit 303 may also acquire images captured by the camera of a mobile terminal, such as a smartphone owned by the driver or a passenger of the vehicle, attached to the vehicle with a predetermined fixture.
  • the storage unit 304 stores the biometric information acquired by the biometric information acquisition unit 301, the vehicle information acquired by the vehicle information acquisition unit 302, and the surrounding image acquired by the surrounding image acquisition unit 303 in association with each other.
  • the storage unit 304 may further store external information, including weather information such as the weather, in association with the biological information acquired by the biological information acquisition unit 301, the vehicle information acquired by the vehicle information acquisition unit 302, and the surrounding image acquired by the surrounding image acquisition unit 303.
  • the external information is, for example, probe data such as traffic jam information, congestion prediction information, and faulty vehicle information, but is not limited thereto.
  • the storage unit 304 may store the shooting date and time of the surrounding image, the shooting time zone (for example, morning, day, night, etc.), the region, the vehicle shape and the vehicle height of the host vehicle and the partner vehicle.
  • the analysis unit 305 analyzes the biological information, vehicle information, and surrounding images stored in the storage unit 304 in response to changes in the biological information. The analysis unit 305 then generates related information between the vehicle information that led to the change in the biological information and the surrounding image.
  • the analysis of the surrounding image is performed, for example, by dividing the image into predetermined time intervals.
  • successive actions are stochastically patterned by finely dividing images and videos.
  • for example, once the driver has taken a certain action, the next action can be patterned probabilistically; by shortening the time interval used to segment the images, the images can be analyzed in finer detail and the accuracy can be further increased.
  • the behavior data output unit 306 uses the related information generated by the analysis unit 305 to output behavior data about the action to be taken by the driver, determined based on, for example, the surrounding image, the vehicle information, or both.
  • the actions to be taken by the driver are, for example, operations and actions for avoiding danger such as turning a steering wheel, stepping on a brake, changing lanes, and the like.
  • the driving data generation unit 307 uses the behavior data output by the behavior data output unit 306 to generate driving data used for a vehicle capable of automatic driving.
  • the driving data includes switching data for switching from automatic driving to manual driving by the driver when it is determined that automatic driving is dangerous based on vehicle information and surrounding images. For example, when a plurality of events occur simultaneously and it is difficult to determine which event should be dealt with, automatic driving may be switched to manual driving in order to leave the determination to the driver.
  • the driving data also includes switching data for switching from manual driving to automatic driving when the biological information acquired from a wearable device worn by the driver of the vehicle indicates an abnormality of the driver, for example, a seizure or drowsiness. For example, when continuing manual driving would increase the probability of an accident, manual driving may be switched to automatic driving to avoid danger.
  • the instruction data generation unit 308 generates instruction data for instructing the driver using the behavior data.
  • the instruction data generation unit 308 generates instruction data based on at least one of the vehicle information of the vehicle and the surrounding image captured by the imaging unit of the vehicle.
  • the instruction data may be, for example, an alert such as a sound for instructing a driving operation to the driver of the vehicle, but is not limited thereto.
  • FIG. 4 is a diagram illustrating an example of the configuration of the image table 401 included in the information processing apparatus 200 according to the present embodiment.
  • the image table 401 stores an image ID 412, biological information 413, vehicle information 414, and weather information 415 in association with a driving ID (identifier) 411.
  • the driving ID 411 is information for identifying driving, and includes, for example, information related to the driver such as driver attributes, information related to the vehicle, information related to the driving date and time, and the like.
  • the image ID 412 is information for identifying the image, and is a number uniquely assigned to the image.
  • the biometric information 413 is biometric information data acquired from a wearable terminal worn by the driver.
  • the vehicle information 414 is data indicating the state of the vehicle acquired from the vehicle, and is data such as vehicle speed, acceleration, and steering angle.
  • the weather information 415 is information related to the weather when an image is captured. Then, the analysis unit 305 refers to the image table 401, analyzes the surrounding image, and generates related information.
  • FIG. 5 is a block diagram showing a hardware configuration of the information processing apparatus 200 according to the present embodiment.
  • a CPU (Central Processing Unit) 510 is a processor for arithmetic control, and implements a functional component of the information processing apparatus 200 in FIG. 3 by executing a program.
  • a ROM (Read Only Memory) 520 stores fixed data, such as initial data and programs, as well as other programs.
  • the network interface 530 communicates with other devices via the network. Note that the number of CPUs 510 is not limited to one, and a plurality of CPUs may be included, or a GPU (Graphics Processing Unit) for image processing may be included.
  • the network interface 530 preferably includes a CPU independent of the CPU 510 and writes or reads transmission / reception data in a RAM (Random Access Memory) 540 area. Also, it is desirable to provide a DMAC (Direct Memory Access Controller) that transfers data between the RAM 540 and the storage 550 (not shown). Furthermore, the input / output interface 560 preferably has a CPU independent of the CPU 510 and writes or reads input / output data in the RAM 540 area. Therefore, the CPU 510 recognizes that the data has been received or transferred to the RAM 540 and processes the data. Further, the CPU 510 prepares the processing result in the RAM 540 and leaves the subsequent transmission or transfer to the network interface 530, the DMAC, or the input / output interface 560.
  • the RAM 540 is a random access memory used by the CPU 510 as a temporary storage work area. In the RAM 540, an area for storing data necessary for realizing the present embodiment is secured.
  • the biometric information data 541 is data related to the driver's biometric information.
  • the vehicle information data 542 is data relating to the speed and acceleration of the vehicle.
  • the surrounding image data 543 is data relating to an image captured by a drive recorder or the like.
  • the related information data 544 is information related to the relationship between the vehicle information that has led to the change of the biological information and the surrounding image. These data are data developed from the image table 401.
  • the action data 545 is data about actions to be taken by the driver.
  • the driving data / instruction data 546 is data for automatic driving used for a vehicle capable of automatic driving and data for instructing the driver.
  • the input / output data 547 is data input / output via the input / output interface 560.
  • the transmission / reception data 548 is data transmitted / received via the network interface 530.
  • the RAM 540 has an application execution area 549 for executing various application modules.
  • the storage 550 stores a database, various parameters, or the following data or programs necessary for realizing the present embodiment.
  • the storage 550 stores the image table 401.
  • the image table 401 is a table for managing the relationship between the driving ID 411 and the image ID 412 shown in FIG.
  • the storage 550 further stores a biological information acquisition module 551, a vehicle information acquisition module 552, an imaging module 553, an analysis module 554, a behavior data output module 555, a driving data generation module 556, and an instruction data generation module 557. These modules are executed by the CPU 510.
  • the biometric information acquisition module 551 is a module that acquires biometric information of a vehicle driver.
  • the vehicle information acquisition module 552 is a module that acquires vehicle information of the vehicle.
  • the imaging module 553 is a module that captures an image around the vehicle including the vehicle.
  • the analysis module 554 is a module that analyzes biological information, vehicle information, and surrounding images in accordance with changes in biological information, and generates related information between the vehicle information that has led to changes in biological information and surrounding images.
  • the behavior data output module 555 is a module that outputs behavior data on the behavior to be taken by the driver determined based on at least one of the vehicle information and the surrounding image.
  • the driving data generation module 556 is a module that generates driving data used for a vehicle capable of automatic driving using behavior data.
  • the instruction data generation module 557 is a module that generates instruction data for instructing the driver using behavior data. These modules 551 to 557 are read by the CPU 510 into the application execution area 549 of the RAM 540 and executed.
  • the control program 557 is a program for controlling the entire information processing apparatus 200.
  • the input / output interface 560 interfaces input / output data with input / output devices.
  • a display unit 561 and an operation unit 562 are connected to the input / output interface 560.
  • a storage medium 564 may be further connected to the input / output interface 560.
  • a speaker 563 that is an audio output unit, a microphone that is an audio input unit, or a GPS position determination unit may be connected.
  • programs and data relating to the general-purpose functions and other feasible functions of the information processing apparatus 200 are not shown in the RAM 540 and the storage 550 of FIG. 5.
  • FIG. 6 is a flowchart for explaining the processing procedure of the information processing apparatus 200 according to this embodiment. This flowchart is executed by the CPU 510 using the RAM 540, and implements the functional components of the information processing apparatus 200 of FIG.
  • step S601 the information processing apparatus 200 acquires the biological information of the driver of the vehicle from, for example, a wearable terminal worn by the driver.
  • step S603 the information processing apparatus 200 acquires vehicle information of the vehicle.
  • step S605 the information processing apparatus 200 acquires a surrounding image of the host vehicle including the host vehicle. Note that the order of steps S601 to S605 is not limited to the order shown in FIG. 6, and the order may be changed as appropriate, or these steps may be executed simultaneously.
  • step S607 the information processing apparatus 200 stores biometric information, vehicle information, and surrounding images in association with each other.
  • step S609 the information processing apparatus 200 determines whether there is a change in the acquired biological information. If there is no change in the biological information (NO in step S609), the information processing apparatus 200 returns to step S601. When there is a change in the biological information (YES in step S609), the information processing apparatus 200 proceeds to step S611.
  • step S611 the information processing apparatus 200 analyzes the stored biological information, vehicle information, and surrounding images according to changes in the biological information, and obtains related information between the vehicle information and the surrounding images that have led to the change in the biological information. Generate.
  • step S613 the information processing apparatus 200 uses the generated related information to generate action data regarding actions to be taken by the driver determined based on at least one of vehicle information and surrounding images.
  • step S615 the information processing apparatus 200 generates driving data used for a vehicle capable of automatic driving using the output behavior data.
  • the information processing apparatus 200 generates instruction data for instructing the driver using the output behavior data.
  • since the action data on the action to be taken by the driver is output, safe driving can be sufficiently supported. Further, since the driving data is generated using the output action data, automatic driving technology can be advanced. Furthermore, since the instruction data is generated using the output action data, safe driving can be sufficiently supported.
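  • As a rough illustration only, the following Python sketch mirrors steps S601 to S615 above. The acquisition callbacks, the change threshold, and the placeholder action values are assumptions made for this example; the embodiment does not prescribe any particular implementation.

```python
def processing_loop(acquire_biometric, acquire_vehicle, acquire_image,
                    change_threshold: float = 15.0):
    """A hypothetical rendering of steps S601 to S615 of FIG. 6."""
    stored = []                            # sets of (biometric, vehicle, image)
    previous = None
    while True:
        bio = acquire_biometric()          # step S601: biometric information
        veh = acquire_vehicle()            # step S603: vehicle information
        img = acquire_image()              # step S605: surrounding image
        stored.append((bio, veh, img))     # step S607: store the set

        # Step S609: is there a change in the biometric information?
        changed = previous is not None and abs(bio - previous) >= change_threshold
        previous = bio
        if not changed:
            continue                       # NO: return to step S601

        # Step S611: related information between the vehicle information and the
        # surrounding images that led to the change (here, simply the last few sets).
        related = stored[-10:]

        # Step S613: action data associating the situation with an action to take.
        action_data = {"related": related, "action": "brake"}   # placeholder action

        # Step S615: driving data for an automatic-driving vehicle and
        # instruction data (e.g. a voice alert) for the driver.
        driving_data = {"switch_to_manual": False, "action": action_data["action"]}
        instruction_data = {"alert": "watch the vehicle approaching from the right"}
        yield driving_data, instruction_data
```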
  • FIG. 7 is a block diagram illustrating a configuration of the information processing apparatus 700 according to the present embodiment.
  • the information processing apparatus 700 according to the present embodiment is different from the second embodiment in that it includes a timing estimation unit. Since other configurations and operations are the same as those of the second embodiment, the same configurations and operations are denoted by the same reference numerals, and detailed description thereof is omitted.
  • the information processing apparatus 700 further includes a timing estimation unit 701.
  • the timing estimation unit 701 estimates the timing at which the driver senses the occurrence of an accident according to changes in the biological information.
  • the timing estimation unit 701 estimates, for example, a predetermined time before the change in biological information occurs as the detection timing, but the timing estimation method is not limited to this.
  • the analysis unit 305 generates related information based on the estimated timing.
  • FIG. 8 is a diagram illustrating an example of the image table 801 included in the information processing apparatus 700 according to the present embodiment.
  • the image table 801 stores a predetermined time 811 in association with the driving ID 411.
  • the timing estimation unit 701 refers to the image table 801 and estimates the timing at which the driver has detected an accident.
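  • As an illustration of how the predetermined time 811 could be used, the sketch below estimates the detection timing by looking back a fixed interval from the biometric change; the two-second value and the record format are assumptions for the example only.

```python
def estimate_detection_timing(change_time: float, predetermined_time: float = 2.0) -> float:
    """Estimate when the driver sensed the possible accident: a predetermined time
    (the value 811 of image table 801, assumed here to be 2 seconds) before the
    change in the biometric information."""
    return change_time - predetermined_time

def records_around_timing(records, change_time, predetermined_time=2.0, window=1.0):
    """Collect the stored sets around the estimated timing so that related information
    can be generated based on that timing (as the analysis unit 305 does)."""
    t0 = estimate_detection_timing(change_time, predetermined_time)
    return [r for r in records if t0 - window <= r["timestamp"] <= t0 + window]
```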
  • FIG. 9 is a block diagram illustrating a hardware configuration of the information processing apparatus 700 according to the present embodiment.
  • the RAM 940 is a random access memory that the CPU 910 uses as a work area for temporary storage. In the RAM 940, an area for storing data necessary for realizing the present embodiment is secured.
  • the predetermined time data 941 is time data used for estimation of the detection timing, and is data developed from the image table 801.
  • the storage 950 stores an image table 801.
  • the image table 801 is a table for managing the relationship between the driving ID 411 and the predetermined time 811 shown in FIG. 8.
  • the storage 550 further stores a timing estimation module 951.
  • the timing estimation module 951 is a module that estimates the timing at which the driver senses the occurrence of an accident.
  • the timing estimation module 951 is read by the CPU 510 into the application execution area 549 of the RAM 940 and executed.
  • FIG. 10 is a flowchart for explaining the processing procedure of the information processing apparatus 700 according to this embodiment. This flowchart is executed by the CPU 510 using the RAM 940, and implements the functional components of the information processing apparatus 700 in FIG. Note that the same steps as those in FIG. 6 are denoted by the same step numbers, and redundant description is omitted.
  • step S1001 the information processing apparatus 700 estimates the timing when the driver senses the occurrence of an accident.
  • step S1003 the information processing apparatus 700 generates related information between the vehicle information that led to the change in the biological information and the surrounding image, based on the estimated timing.
  • the accuracy of the action data regarding the action to be taken by the driver is increased, and the automatic driving technique can be further advanced.
  • the present invention may be applied to a system composed of a plurality of devices, or to a single device. Furthermore, the present invention can also be applied to a case where an information processing program that implements the functions of the embodiments is supplied directly or remotely to a system or apparatus. Therefore, a program installed on a computer to realize the functions of the present invention on that computer, a medium storing the program, and a WWW (World Wide Web) server from which the program is downloaded are also included in the scope of the present invention. In particular, at least a non-transitory computer readable medium storing a program that causes a computer to execute the processing steps included in the above-described embodiments is included in the scope of the present invention.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The purpose of the present invention is to output action data relating to an action that should be taken by a driver to sufficiently assist safe driving. An information processing device is provided with: a living body information acquisition unit for acquiring living body information relating to a driver of a vehicle; a vehicle information acquisition unit for acquiring vehicle information relating to the vehicle; a surrounding image acquisition unit for acquiring a vehicle surrounding image including the vehicle from an image capture unit attached to the vehicle; a storage unit for storing a plurality of sets of the living body information, the vehicle information, and the surrounding images; an analysis unit for analyzing the plurality of sets to generate relevance information between the vehicle information and the surrounding image in which the living body information of the driver is changed; and an action data output unit for outputting action data in which an action that should be taken by the driver and at least one of the vehicle information and the surrounding image are associated by using the relevance information.

Description

Information processing apparatus, information processing method, and information processing program
The present invention relates to an information processing apparatus, an information processing method, and an information processing program.
In the above technical field, Patent Document 1 discloses a technology for performing driving support based on map information generated by associating statistical information, obtained by statistically processing a driver's operation amounts and the like as time-series patterns, with vehicle position information.
International Publication No. 2014/013985
However, the technique described in the above document does not output action data on the actions to be taken by the driver, and therefore cannot sufficiently support safe driving.
An object of the present invention is to provide a technique for solving the above-described problem.
In order to achieve the above object, an information processing apparatus according to the present invention comprises:
biometric information acquisition means for acquiring biometric information of a driver of a vehicle;
vehicle information acquisition means for acquiring vehicle information of the vehicle;
surrounding image acquisition means for acquiring, from imaging means attached to the vehicle, a surrounding image of the vehicle including the vehicle;
storage means for storing a plurality of sets of the biometric information, the vehicle information, and the surrounding image;
analysis means for analyzing the plurality of sets and generating related information between the vehicle information and the surrounding image that led to a change in the driver's biometric information; and
action data output means for outputting, using the related information, action data in which at least one of the vehicle information and the surrounding image is associated with an action to be taken by the driver.
In order to achieve the above object, an information processing method according to the present invention includes:
a biometric information acquisition step of acquiring biometric information of a driver of a vehicle;
a vehicle information acquisition step of acquiring vehicle information of the vehicle;
a surrounding image acquisition step of acquiring, from imaging means attached to the vehicle, a surrounding image of the vehicle including the vehicle;
a storage step of storing a plurality of sets of the biometric information, the vehicle information, and the surrounding image;
an analysis step of analyzing the plurality of sets and generating related information between the vehicle information and the surrounding image that led to a change in the driver's biometric information; and
an action data output step of outputting, using the related information, action data in which at least one of the vehicle information and the surrounding image is associated with an action to be taken by the driver.
In order to achieve the above object, an information processing program according to the present invention causes a computer to execute:
a biometric information acquisition step of acquiring biometric information of a driver of a vehicle;
a vehicle information acquisition step of acquiring vehicle information of the vehicle;
a surrounding image acquisition step of acquiring, from imaging means attached to the vehicle, a surrounding image of the vehicle including the vehicle;
a storage step of storing a plurality of sets of the biometric information, the vehicle information, and the surrounding image;
an analysis step of analyzing the plurality of sets and generating related information between the vehicle information and the surrounding image that led to a change in the driver's biometric information; and
an action data output step of outputting, using the related information, action data in which at least one of the vehicle information and the surrounding image is associated with an action to be taken by the driver.
According to the present invention, since action data on the actions to be taken by the driver is output, safe driving can be sufficiently supported.
FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to the first embodiment of the present invention.
FIG. 2A is a diagram illustrating an example of an outline of processing by an information processing apparatus according to the second embodiment of the present invention.
FIG. 2B is a diagram illustrating another example of an outline of processing by the information processing apparatus according to the second embodiment of the present invention.
FIG. 3 is a block diagram showing the configuration of the information processing apparatus according to the second embodiment of the present invention.
FIG. 4 is a diagram showing an example of the configuration of an image table included in the information processing apparatus according to the second embodiment of the present invention.
FIG. 5 is a block diagram showing the hardware configuration of the information processing apparatus according to the second embodiment of the present invention.
FIG. 6 is a flowchart explaining the processing procedure of the information processing apparatus according to the second embodiment of the present invention.
FIG. 7 is a block diagram showing the configuration of an information processing apparatus according to the third embodiment of the present invention.
FIG. 8 is a diagram showing an example of an image table included in the information processing apparatus according to the third embodiment of the present invention.
FIG. 9 is a block diagram showing the hardware configuration of the information processing apparatus according to the third embodiment of the present invention.
FIG. 10 is a flowchart explaining the processing procedure of the information processing apparatus according to the third embodiment of the present invention.
Hereinafter, modes for carrying out the present invention will be described in detail, by way of example, with reference to the drawings. However, the configurations, numerical values, process flows, functional elements, and the like described in the following embodiments are merely examples; they may be freely modified or changed, and the technical scope of the present invention is not intended to be limited by the following description.
[First Embodiment]
An information processing apparatus 100 as a first embodiment of the present invention will be described with reference to FIG. 1. The information processing apparatus 100 is an apparatus that outputs action data regarding the actions to be taken by a driver.
As illustrated in FIG. 1, the information processing apparatus 100 includes a biometric information acquisition unit 101, a vehicle information acquisition unit 102, a surrounding image acquisition unit 103, a storage unit 104, an analysis unit 105, and a behavior data output unit 106.
The biometric information acquisition unit 101 acquires biometric information of the driver of the vehicle. The vehicle information acquisition unit 102 acquires vehicle information of the vehicle. The surrounding image acquisition unit 103 acquires, from an imaging unit attached to the vehicle, a surrounding image of the vehicle including the vehicle. The storage unit 104 stores a plurality of sets of the biometric information, the vehicle information, and the surrounding image. The analysis unit 105 analyzes the plurality of sets and generates related information between the vehicle information and the surrounding image that led to a change in the driver's biometric information. The behavior data output unit 106 uses the related information to output behavior data in which at least one of the vehicle information and the surrounding image is associated with an action to be taken by the driver.
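To make the roles of the units 101 to 106 concrete, the following is a minimal Python sketch of the data flow they describe. All class, field, and threshold choices are hypothetical illustrations added here, not part of the patent; the sketch only shows one way the acquired sets could be stored, checked for a biometric change, and turned into action data.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RecordSet:
    """One stored set: biometric information, vehicle information, and a surrounding image."""
    timestamp: float
    heart_rate: float        # biometric information (e.g. from a wearable device)
    speed: float             # vehicle information (e.g. km/h)
    steering_angle: float    # vehicle information (degrees)
    image_path: str          # surrounding image from the on-board camera

class InformationProcessor:
    """Hypothetical counterpart of units 101 to 106 of the first embodiment."""

    def __init__(self) -> None:
        self.storage: List[RecordSet] = []           # storage unit 104

    def acquire(self, record: RecordSet) -> None:
        """Units 101 to 103 acquire one set; unit 104 stores it."""
        self.storage.append(record)

    def analyze(self, hr_jump: float = 20.0) -> List[RecordSet]:
        """Unit 105: pick out the sets at which the driver's biometric information changed."""
        related = []
        for prev, cur in zip(self.storage, self.storage[1:]):
            if cur.heart_rate - prev.heart_rate >= hr_jump:
                related.append(cur)                  # vehicle info + image at the change
        return related

    def output_action_data(self, related: List[RecordSet]) -> List[dict]:
        """Unit 106: associate the vehicle information or image with an action to take."""
        return [{"image": r.image_path, "speed": r.speed,
                 "action": "brake" if r.speed > 40 else "steer_left"}   # placeholder rule
                for r in related]
```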
According to this embodiment, since action data on the actions to be taken by the driver is output, safe driving can be sufficiently supported.
[Second Embodiment]
Next, an information processing apparatus according to the second embodiment of the present invention will be described with reference to FIGS. 2A to 6. FIG. 2A is a diagram illustrating an example of an outline of processing by the information processing apparatus 200 according to this embodiment, and FIG. 2B is a diagram illustrating another example of that outline.
FIG. 2A shows a situation in which two vehicles are approaching an intersection from different directions. FIG. 2B shows a situation in which a vehicle is approaching an intersection while a pedestrian is walking on the pedestrian crossing. In these situations, for example, images and video are captured by a drive recorder attached to the vehicle. The captured images and video are stored in association with biometric information and vehicle information. Note that the capture of images and video by the drive recorder may be performed continuously, or may be started when a predetermined trigger for capturing is detected.
For example, when the driver recognizes the presence of the other vehicle and senses danger, the information processing apparatus 200 analyzes the stored images in response to changes in biometric information, such as a rise in heart rate or blood pressure or an increase in sweating. It then generates related information indicating the relationship between the vehicle information at the time such a change in biometric information occurred and the captured images. Using the generated related information, the information processing apparatus 200 associates either the vehicle information or the surrounding image with the action to be taken by the driver and outputs the result as action data.
Then, using the action data, driving data used for a vehicle capable of automatic driving is generated. The information processing apparatus 200 may provide the generated driving data to a vehicle capable of automatic driving.
The information processing apparatus 200 also generates, using the action data, instruction data for instructing the driver. This instruction data is generated based on either the vehicle information of the vehicle or a surrounding image captured by the imaging unit of the vehicle. The information processing apparatus 200 may provide the generated instruction data to the vehicle. The timing at which the information processing apparatus 200 provides the driving data and the instruction data is not particularly limited; for example, they may be provided at a timing that allows an approaching event to be dealt with based on road conditions, or at the timing when the vehicle engine is started.
The actions to be taken by the driver include, for example, turning the steering wheel to the left when the other vehicle turns its steering wheel to the right, or stepping on the brake when a pedestrian is present, but are not limited to these. The action to be taken by the driver varies depending on, for example, the positional relationship with the other vehicle, the speeds of the host vehicle and the other vehicle, the vehicle weights, body shapes, body colors, and the like. Although the example described here has the information processing apparatus 200 analyze stored surrounding images, the information processing apparatus 200 may also perform the above-described processing in real time.
FIG. 3 is a block diagram showing the configuration of the information processing apparatus 200 according to this embodiment. The information processing apparatus 200 includes a biometric information acquisition unit 301, a vehicle information acquisition unit 302, a surrounding image acquisition unit 303, a storage unit 304, an analysis unit 305, a behavior data output unit 306, a driving data generation unit 307, and an instruction data generation unit 308.
The biometric information acquisition unit 301 acquires the biometric information of the driver of the vehicle from a wearable device. The biometric information to be acquired includes, for example, heart rate, blood pressure, amount of sweating, respiratory rate, and brain waves, but is not limited to these. The biometric information is acquired from, for example, taxi or bus drivers, but the persons from whom biometric information is acquired are not limited to these.
The vehicle information acquisition unit 302 acquires vehicle information of the vehicle driven by the driver. The vehicle information includes, for example, position information of the vehicle, speed, acceleration, steering angle, remaining fuel, and brake pedal force, but is not limited to these. The position information of the vehicle is acquired from, for example, a GPS (Global Positioning System) device attached to the vehicle, but the source of the position information is not limited to this. The speed, acceleration, steering angle, and the like of the vehicle are acquired from, for example, OBD (On-board Diagnostics) or CAN (Controller Area Network), but are not limited to these sources. The vehicle information is acquired from commercial vehicles such as taxis and buses, or from general vehicles.
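As one concrete example of the acquisition path described above, the sketch below reads speed and engine RPM over OBD-II using the third-party python-OBD library; the library choice and the returned field names are assumptions for illustration only, since the embodiment simply names OBD, CAN, and GPS as possible sources.

```python
import obd  # third-party python-OBD library (pip install obd); assumes an ELM327-style adapter

def read_vehicle_info():
    """Read a small subset of the vehicle information (speed and RPM) over OBD-II."""
    connection = obd.OBD()                         # auto-detects the adapter
    speed = connection.query(obd.commands.SPEED)   # vehicle speed
    rpm = connection.query(obd.commands.RPM)       # engine revolutions per minute
    return {
        "speed": None if speed.is_null() else speed.value.magnitude,
        "rpm": None if rpm.is_null() else rpm.value.magnitude,
    }
```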
The surrounding image acquisition unit 303 acquires images of the surroundings of the vehicle (surrounding images) including the vehicle driven by the driver (the host vehicle). That is, the surrounding image acquisition unit 303 acquires images captured by a camera such as a drive recorder attached to the vehicle. The surrounding image acquisition unit 303 may also acquire images captured by the camera of a mobile terminal, such as a smartphone owned by the driver or a passenger, attached to the vehicle with a predetermined fixture. The surrounding image acquisition unit 303 may acquire surrounding images continuously, or may start acquiring them when a predetermined trigger is detected. The surrounding images may also include images of accidents involving automobiles. The surrounding images are acquired from commercial vehicles such as taxis and buses, or from general vehicles.
The storage unit 304 stores the biometric information acquired by the biometric information acquisition unit 301, the vehicle information acquired by the vehicle information acquisition unit 302, and the surrounding image acquired by the surrounding image acquisition unit 303 in association with one another. The storage unit 304 may further store external information, including weather information such as the weather, in association with the biometric information, the vehicle information, and the surrounding image. The external information is, for example, probe data such as traffic jam information, congestion prediction information, and disabled vehicle information, but is not limited to these. The storage unit 304 may also store the shooting date and time of the surrounding image, the shooting time period (for example, morning, daytime, or night), the region, and the body shapes and heights of the host vehicle and the other vehicle.
The analysis unit 305 analyzes the biometric information, vehicle information, and surrounding images stored in the storage unit 304 in response to changes in the biometric information. The analysis unit 305 then generates related information between the vehicle information that led to the change in the biometric information and the surrounding image. That is, for the process leading up to the change in the biometric information and the process following it, the analysis unit 305 generates related information indicating the relationship between the vehicle information and what appears in the surrounding image, such as the behavior of the host vehicle, the behavior of the other vehicle, the size and shape of the other vehicle, and the behavior of pedestrians.
The analysis of the surrounding images is performed, for example, by dividing the images into predetermined time intervals. That is, by finely segmenting the images and video, successive actions are patterned probabilistically. For example, once the driver has taken a certain action, the next action can be patterned probabilistically, and by shortening the time interval used to segment the images, the images can be analyzed in finer detail and the accuracy can be further increased.
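One way to read the probabilistic patterning described above is as transition counting over fixed-length time windows. The sketch below is an assumed illustration of that idea, not the claimed analysis itself: frames are grouped into windows, each window is labelled with its most frequent action, and the empirical probability of the next action given the current one is estimated.

```python
from collections import Counter, defaultdict

def segment(frames, interval=0.5):
    """Group (timestamp, action_label) frames into windows of `interval` seconds
    and label each window with its most frequent action."""
    windows = defaultdict(list)
    for t, label in frames:
        windows[int(t // interval)].append(label)
    return [Counter(labels).most_common(1)[0][0] for _, labels in sorted(windows.items())]

def transition_probabilities(action_sequence):
    """Estimate P(next action | current action) from consecutive windows."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(action_sequence, action_sequence[1:]):
        counts[cur][nxt] += 1
    return {cur: {nxt: n / sum(c.values()) for nxt, n in c.items()}
            for cur, c in counts.items()}

# Shorter intervals segment the video more finely; with enough data this
# sharpens the estimated pattern of successive actions.
probs = transition_probabilities(
    segment([(0.1, "approach"), (0.6, "brake"), (1.2, "brake"), (1.8, "steer_left")]))
```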
The behavior data output unit 306 uses the related information generated by the analysis unit 305 to output behavior data about the action to be taken by the driver, determined based on, for example, the surrounding image, the vehicle information, or both. The actions to be taken by the driver are, for example, operations and actions for avoiding danger, such as turning the steering wheel, stepping on the brake, and changing lanes.
The driving data generation unit 307 uses the behavior data output by the behavior data output unit 306 to generate driving data used for a vehicle capable of automatic driving.
The driving data includes switching data for switching from automatic driving to manual driving by the driver when it is determined, based on the vehicle information and the surrounding image, that automatic driving is dangerous. For example, when a plurality of events occur simultaneously and it is difficult to determine which event should be dealt with first, automatic driving may be switched to manual driving in order to leave the judgment to the driver.
Furthermore, the driving data includes switching data for switching from manual driving to automatic driving when the biometric information acquired from a wearable device or the like worn by the driver indicates an abnormality of the driver, for example, a seizure or drowsiness. For example, when continuing manual driving would increase the probability of an accident, manual driving may be switched to automatic driving to avoid danger.
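A minimal sketch of the two switching rules just described might look like the following; the event count and the single boolean abnormality flag are simplifying assumptions made for the example.

```python
def decide_driving_mode(current_mode: str,
                        simultaneous_events: int,
                        driver_abnormal: bool) -> str:
    """Return 'automatic' or 'manual' according to the switching data described above.

    Automatic -> manual when several events occur at once, so that the judgment
    is left to the driver; manual -> automatic when the wearable device indicates
    a driver abnormality such as a seizure or drowsiness.
    """
    if current_mode == "automatic" and simultaneous_events >= 2:
        return "manual"
    if current_mode == "manual" and driver_abnormal:
        return "automatic"
    return current_mode
```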
The instruction data generation unit 308 generates, using the behavior data, instruction data for instructing the driver. The instruction data generation unit 308 generates the instruction data based on at least one of the vehicle information of the vehicle and the surrounding image captured by the imaging unit of the vehicle. The instruction data may be, for example, an alert such as a voice message instructing the driver of the vehicle to perform a driving operation, but is not limited to this.
FIG. 4 is a diagram showing an example of the configuration of an image table 401 included in the information processing apparatus 200 according to this embodiment. The image table 401 stores an image ID 412, biometric information 413, vehicle information 414, and weather information 415 in association with a driving ID (identifier) 411. The driving ID 411 is information for identifying a drive and includes, for example, information about the driver such as driver attributes, information about the vehicle, and information about the date and time of driving. The image ID 412 is information for identifying an image, such as a number uniquely assigned to the image. The biometric information 413 is biometric information data acquired from a wearable terminal or the like worn by the driver. The vehicle information 414 is data indicating the state of the vehicle acquired from the vehicle, such as vehicle speed, acceleration, and steering angle. The weather information 415 is information about the weather at the time the image was captured. The analysis unit 305 refers to the image table 401, analyzes the surrounding images, and generates the related information.
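The image table 401 can be pictured as an ordinary relational table keyed by the driving ID. The sqlite3 sketch below is only one assumed rendering of that layout; the column types and example values are not specified by the patent.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE image_table_401 (
        driving_id   TEXT,  -- driving ID 411: driver attributes, vehicle, date and time
        image_id     TEXT,  -- image ID 412: number uniquely assigned to the image
        biometric    TEXT,  -- biometric information 413 from the wearable terminal
        vehicle_info TEXT,  -- vehicle information 414: speed, acceleration, steering angle
        weather      TEXT   -- weather information 415 at the time of capture
    )
""")
conn.execute("INSERT INTO image_table_401 VALUES (?, ?, ?, ?, ?)",
             ("drive-001", "img-0001", '{"heart_rate": 92}',
              '{"speed": 42, "steering_angle": -3}', "rain"))
# The analysis unit can then pull the rows it needs by driving ID.
rows = conn.execute("SELECT * FROM image_table_401 WHERE driving_id = ?",
                    ("drive-001",)).fetchall()
```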
FIG. 5 is a block diagram showing the hardware configuration of the information processing apparatus 200 according to this embodiment. A CPU (Central Processing Unit) 510 is a processor for arithmetic control and implements the functional components of the information processing apparatus 200 in FIG. 3 by executing programs. A ROM (Read Only Memory) 520 stores fixed data, such as initial data and programs, as well as other programs. A network interface 530 communicates with other devices via a network. Note that the number of CPUs 510 is not limited to one; there may be a plurality of CPUs, and a GPU (Graphics Processing Unit) for image processing may also be included. The network interface 530 preferably has a CPU independent of the CPU 510 and writes or reads transmission and reception data to or from an area of a RAM (Random Access Memory) 540. It is also desirable to provide a DMAC (Direct Memory Access Controller), not shown, that transfers data between the RAM 540 and a storage 550. Furthermore, an input/output interface 560 preferably has a CPU independent of the CPU 510 and writes or reads input/output data to or from an area of the RAM 540. The CPU 510 therefore recognizes that data has been received in or transferred to the RAM 540 and processes that data. The CPU 510 also prepares processing results in the RAM 540 and leaves subsequent transmission or transfer to the network interface 530, the DMAC, or the input/output interface 560.
 The RAM 540 is a random access memory used by the CPU 510 as a work area for temporary storage. An area for storing the data necessary to realize the present embodiment is reserved in the RAM 540. The biometric information data 541 is data related to the driver's biometric information. The vehicle information data 542 is data related to the speed, acceleration, and the like of the vehicle. The surrounding image data 543 is data related to images captured by a drive recorder or the like. The related information data 544 is information on the relationship between the vehicle information and the surrounding images that led to the change in the biometric information. These data are expanded from the image table 401. The action data 545 is data about the action the driver should take. The driving data / instruction data 546 is data for automatic driving used in a vehicle capable of automatic driving and data for giving instructions to the driver.
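 As a rough illustration of how the working data 544 to 546 might be organized, the sketch below defines records for the related information 544, the action data 545, and the driving data / instruction data 546; all names and fields are assumptions made for this example.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class RelatedInfo:
        """Related information 544: links a biometric change to vehicle data and images."""
        driving_id: str
        image_ids: List[int]      # surrounding images around the biometric change
        vehicle_snapshot: dict    # vehicle information at the time of the change
        biometric_change: dict    # e.g. {"heart_rate_delta": 25}

    @dataclass
    class ActionData:
        """Action data 545: the action the driver should take in this situation."""
        trigger: RelatedInfo
        action: str               # e.g. "decelerate before the blind intersection"

    @dataclass
    class DrivingInstructionData:
        """Driving data / instruction data 546 derived from the action data."""
        for_autonomous_vehicle: Optional[dict] = None   # control parameters for automatic driving
        for_driver: Optional[str] = None                # message presented to the driver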
 The input/output data 547 is data input and output via the input/output interface 560. The transmission/reception data 548 is data transmitted and received via the network interface 530. The RAM 540 also has an application execution area 549 for executing various application modules.
 The storage 550 stores databases, various parameters, and the following data and programs necessary to realize the present embodiment. The storage 550 stores the image table 401, which is the table shown in FIG. 4 for managing the relationship between the driving ID 411, the image ID 412, and the other items. The storage 550 further stores a biometric information acquisition module 551, a vehicle information acquisition module 552, an imaging module 553, an analysis module 554, an action data output module 555, a driving data generation module 556, and an instruction data generation module 557. These modules are executed by the CPU 510.
 The biometric information acquisition module 551 is a module that acquires the biometric information of the driver of the vehicle. The vehicle information acquisition module 552 is a module that acquires the vehicle information of the vehicle. The imaging module 553 is a module that captures surrounding images of the vehicle, including the vehicle itself. The analysis module 554 is a module that analyzes the biometric information, the vehicle information, and the surrounding images in response to a change in the biometric information, and generates related information between the vehicle information and the surrounding images that led to the change in the biometric information. The action data output module 555 is a module that outputs action data on the action the driver should take, determined based on at least one of the vehicle information and the surrounding images. The driving data generation module 556 is a module that uses the action data to generate driving data used in a vehicle capable of automatic driving. The instruction data generation module 557 is a module that uses the action data to generate instruction data for giving instructions to the driver. These modules 551 to 557 are read by the CPU 510 into the application execution area 549 of the RAM 540 and executed. The control program 557 is a program for controlling the entire information processing apparatus 200.
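 The division of labor among the modules 551 to 557 can be pictured with the interface sketch below; the method names, the dictionary-based record format, and the placeholder decision rules are hypothetical and only indicate the role of each module as described above.

    class AnalysisModule:
        """Analysis module 554: reacts to a biometric change and produces related information."""
        def analyze(self, records):
            # 'records' is a list of dicts, each holding one stored set of
            # biometric information, vehicle information, and a surrounding image ID.
            changed = [r for r in records if r["biometric_changed"]]
            return [{"vehicle": r["vehicle"], "image_id": r["image_id"]} for r in changed]

    class ActionDataOutputModule:
        """Action data output module 555: maps each situation to the action the driver should take."""
        def output(self, related_info):
            # Placeholder rule; the specification leaves the actual decision logic open.
            return [{"situation": info, "action": "decelerate and increase following distance"}
                    for info in related_info]

    class DrivingDataGenerationModule:
        """Driving data generation module 556: turns action data into data for automatic driving."""
        def generate(self, action_data):
            return [{"control": "limit_speed", "basis": a["action"]} for a in action_data]

    class InstructionDataGenerationModule:
        """Instruction data generation module 557: turns action data into driver instructions."""
        def generate(self, action_data):
            return ["Caution: " + a["action"] for a in action_data]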
 The input/output interface 560 interfaces input/output data with input/output devices. A display unit 561 and an operation unit 562 are connected to the input/output interface 560. A storage medium 564 may also be connected to the input/output interface 560. Furthermore, a speaker 563 serving as an audio output unit, a microphone serving as an audio input unit, or a GPS position determination unit may be connected. Note that the RAM 540 and the storage 550 shown in FIG. 5 omit programs and data related to the general-purpose functions and other realizable functions of the information processing apparatus 200.
 FIG. 6 is a flowchart explaining the processing procedure of the information processing apparatus 200 according to the present embodiment. This flowchart is executed by the CPU 510 using the RAM 540 and realizes the functional components of the information processing apparatus 200 in FIG. 3.
 In step S601, the information processing apparatus 200 acquires the biometric information of the driver of the vehicle from, for example, a wearable terminal worn by the driver. In step S603, the information processing apparatus 200 acquires the vehicle information of the vehicle. In step S605, the information processing apparatus 200 acquires surrounding images of the host vehicle, including the host vehicle itself. The order of steps S601 to S605 is not limited to the order shown in FIG. 6; the order may be changed as appropriate, or these steps may be executed simultaneously.
 In step S607, the information processing apparatus 200 stores the biometric information, the vehicle information, and the surrounding images in association with one another. In step S609, the information processing apparatus 200 determines whether there is a change in the acquired biometric information. If there is no change in the biometric information (NO in step S609), the information processing apparatus 200 returns to step S601. If there is a change in the biometric information (YES in step S609), the information processing apparatus 200 proceeds to step S611.
 In step S611, the information processing apparatus 200 analyzes the stored biometric information, vehicle information, and surrounding images in response to the change in the biometric information, and generates related information between the vehicle information and the surrounding images that led to the change in the biometric information. In step S613, the information processing apparatus 200 uses the generated related information to generate action data on the action the driver should take, determined based on at least one of the vehicle information and the surrounding images. In step S615, the information processing apparatus 200 uses the output action data to generate driving data used in a vehicle capable of automatic driving. The information processing apparatus 200 also uses the output action data to generate instruction data for giving instructions to the driver.
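 A minimal sketch of the loop of FIG. 6, assuming callable sensor readers, a simple heart-rate threshold for step S609, and placeholder rules for steps S611 to S615, might look as follows; none of these choices are prescribed by the specification.

    import time

    def detect_change(previous, current, threshold=20):
        # Step S609: hypothetical change test on a single biometric value (heart rate).
        return previous is not None and abs(current["heart_rate"] - previous["heart_rate"]) >= threshold

    def analyze(stored):
        # Step S611: relate recent vehicle information and images to the change (placeholder).
        return [{"veh": s["veh"], "img": s["img"]} for s in stored[-10:]]

    def build_action_data(related):
        # Step S613: placeholder mapping from situation to the action the driver should take.
        return [{"situation": r, "action": "slow down"} for r in related]

    def build_driving_data(actions):
        # Step S615: placeholder driving data for a vehicle capable of automatic driving.
        return [{"control": "limit_speed", "basis": a["action"]} for a in actions]

    def build_instruction_data(actions):
        # Step S615: placeholder instruction presented to the driver.
        return ["Advisory: " + a["action"] for a in actions]

    def main_loop(read_biometric, read_vehicle, read_image):
        stored, previous_bio = [], None
        while True:
            bio = read_biometric()                                 # step S601
            veh = read_vehicle()                                   # step S603
            img = read_image()                                     # step S605
            stored.append({"bio": bio, "veh": veh, "img": img})    # step S607
            if detect_change(previous_bio, bio):                   # step S609
                related = analyze(stored)                          # step S611
                actions = build_action_data(related)               # step S613
                yield build_driving_data(actions), build_instruction_data(actions)  # step S615
            previous_bio = bio
            time.sleep(0.1)   # sampling interval (assumed)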
 According to the present embodiment, action data on the action the driver should take is output, so safe driving can be sufficiently supported. In addition, since driving data is generated using the output action data, automatic driving technology can be advanced. Furthermore, since instruction data is generated using the output action data, safe driving can be sufficiently supported.
 [Third Embodiment]
 Next, an information processing apparatus according to a third embodiment of the present invention will be described with reference to FIGS. 7 to 10. FIG. 7 is a block diagram showing the configuration of an information processing apparatus 700 according to the present embodiment. The information processing apparatus 700 according to the present embodiment differs from the second embodiment in that it has a timing estimation unit. The other configurations and operations are the same as in the second embodiment, so the same configurations and operations are given the same reference numerals and their detailed description is omitted.
 The information processing apparatus 700 further includes a timing estimation unit 701. The timing estimation unit 701 estimates the timing at which the driver sensed the occurrence of an accident, based on the change in the biometric information. For example, the timing estimation unit 701 estimates a point a predetermined time before the change in the biometric information occurred as the sensing timing, but the timing estimation method is not limited to this. The analysis unit 305 then generates the related information based on the estimated timing.
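 One way to realize the fixed-offset estimate described above is sketched below; the two-second offset and the use of datetime values are assumptions for this example, and, as noted, other estimation methods may be used.

    from datetime import datetime, timedelta

    def estimate_sensing_timing(change_time, predetermined_seconds=2.0):
        # Timing estimation unit 701: assume the driver sensed the accident a fixed
        # (predetermined) time before the biometric change was observed.
        return change_time - timedelta(seconds=predetermined_seconds)

    # Example: a biometric change observed at 09:30:15 gives an estimated sensing time of 09:30:13.
    sensed_at = estimate_sensing_timing(datetime(2016, 10, 25, 9, 30, 15))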
 FIG. 8 is a diagram showing an example of the image table 801 provided in the information processing apparatus 700 according to the present embodiment. The image table 801 stores a predetermined time 811 in association with the driving ID 411. The timing estimation unit 701 refers to the image table 801 and estimates the timing at which the driver sensed the occurrence of the accident.
 FIG. 9 is a block diagram showing the hardware configuration of the information processing apparatus 700 according to the present embodiment. The RAM 940 is a random access memory used by the CPU 910 as a work area for temporary storage. An area for storing the data necessary to realize the present embodiment is reserved in the RAM 940. The predetermined time data 941 is time data used to estimate the sensing timing, and is expanded from the image table 801. The storage 950 stores the image table 801, which is the table shown in FIG. 8 for managing the relationship between the driving ID 411, the predetermined time 811, and the other items. The storage 950 further stores a timing estimation module 951, which is a module that estimates the timing at which the driver sensed the occurrence of an accident. The timing estimation module 951 is read by the CPU 510 into the application execution area 549 of the RAM 940 and executed.
 FIG. 10 is a flowchart explaining the processing procedure of the information processing apparatus 700 according to the present embodiment. This flowchart is executed by the CPU 510 using the RAM 940 and realizes the functional components of the information processing apparatus 700 in FIG. 7. Steps that are the same as in FIG. 6 are given the same step numbers, and duplicate description is omitted.
 In step S1001, the information processing apparatus 700 estimates the timing at which the driver sensed the occurrence of the accident. In step S1003, the information processing apparatus 700 generates, based on the estimated timing, the related information between the vehicle information and the surrounding images that led to the change in the biometric information.
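 A sketch combining steps S1001 and S1003 is given below; the time-stamped record format, the offset, and the window width are assumptions introduced for this example.

    from datetime import timedelta

    def generate_related_info(records, change_time, predetermined_seconds=2.0, window_seconds=3.0):
        # Step S1001: estimate the timing at which the driver sensed the accident.
        sensed_at = change_time - timedelta(seconds=predetermined_seconds)
        # Step S1003: relate the vehicle information and surrounding images recorded
        # around the estimated timing to the biometric change (placeholder selection).
        window = [r for r in records
                  if abs((r["timestamp"] - sensed_at).total_seconds()) <= window_seconds]
        return [{"vehicle": r["vehicle"], "image_id": r["image_id"]} for r in window]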
 According to the present embodiment, the accident-sensing timing is estimated, so the accuracy of the action data on the action the driver should take is improved, and automatic driving technology can be further advanced.
 [Other Embodiments]
 While the present invention has been described above with reference to embodiments, the present invention is not limited to the above embodiments. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention. A system or an apparatus that combines in any way the separate features included in the respective embodiments is also included in the scope of the present invention.
 The present invention may be applied to a system composed of a plurality of devices, or may be applied to a single apparatus. Furthermore, the present invention is also applicable when an information processing program that realizes the functions of the embodiments is supplied to a system or apparatus directly or remotely. Therefore, a program installed on a computer to realize the functions of the present invention on the computer, a medium storing that program, and a WWW (World Wide Web) server from which that program is downloaded are also included in the scope of the present invention. In particular, at least a non-transitory computer readable medium storing a program that causes a computer to execute the processing steps included in the above-described embodiments is included in the scope of the present invention.
This application claims priority based on Japanese Patent Application No. 2016-208590 filed on October 25, 2016, the entire disclosure of which is incorporated herein.

Claims (12)

  1.  An information processing apparatus comprising:
     biometric information acquisition means for acquiring biometric information of a driver of a vehicle;
     vehicle information acquisition means for acquiring vehicle information of the vehicle;
     surrounding image acquisition means for acquiring, from imaging means attached to the vehicle, surrounding images of the vehicle including the vehicle;
     storage means for storing a plurality of sets of the biometric information, the vehicle information, and the surrounding images;
     analysis means for analyzing the plurality of sets and generating related information between the vehicle information and the surrounding images that led to a change in the driver's biometric information; and
     action data output means for outputting, using the related information, action data in which at least one of the vehicle information and the surrounding images is associated with an action to be taken by the driver.
  2.  The information processing apparatus according to claim 1, further comprising driving data generation means for generating, using the action data, driving data used in a vehicle capable of automatic driving.
  3.  The information processing apparatus according to claim 2, wherein the driving data generation means generates the driving data based on at least one of vehicle information of a vehicle capable of automatic driving and surrounding images captured by imaging means of the vehicle capable of automatic driving.
  4.  The information processing apparatus according to claim 3, wherein the driving data includes first switching data for switching to manual driving when it is determined, based on the vehicle information and the surrounding images, that automatic driving is dangerous.
  5.  The information processing apparatus according to claim 3 or 4, wherein the driving data includes second switching data for switching to automatic driving when the biometric information indicates an abnormality of the driver.
  6.  The information processing apparatus according to claim 1 or 2, further comprising instruction data generation means for generating, using the action data, instruction data for giving instructions to the driver.
  7.  The information processing apparatus according to claim 6, wherein the instruction data generation means generates the instruction data based on at least one of vehicle information of a vehicle and surrounding images captured by imaging means of the vehicle.
  8.  The information processing apparatus according to any one of claims 1 to 7, wherein
     the storage means stores external information including weather information in association with the biometric information, the vehicle information, and the surrounding images, and
     the analysis means analyzes the biometric information, the vehicle information, the surrounding images, and the weather information stored in the storage means, and generates related information among the vehicle information, the surrounding images, and the weather information that led to the change in the biometric information.
  9.  The information processing apparatus according to any one of claims 1 to 8, wherein
     the surrounding images include an image related to an accident,
     the information processing apparatus further comprises timing estimation means for estimating, based on the change in the biometric information, a timing at which the driver sensed the occurrence of the accident, and
     the analysis means generates, based on the timing, the related information between the vehicle information and the surrounding images that led to the change in the biometric information.
  10.  The information processing apparatus according to claim 9, wherein the timing estimation means estimates a point a predetermined time before the change in the biometric information as the timing.
  11.  An information processing method comprising:
     a biometric information acquisition step of acquiring biometric information of a driver of a vehicle;
     a vehicle information acquisition step of acquiring vehicle information of the vehicle;
     a surrounding image acquisition step of acquiring, from imaging means attached to the vehicle, surrounding images of the vehicle including the vehicle;
     a storage step of storing a plurality of sets of the biometric information, the vehicle information, and the surrounding images;
     an analysis step of analyzing the plurality of sets and generating related information between the vehicle information and the surrounding images that led to a change in the driver's biometric information; and
     an action data output step of outputting, using the related information, action data in which at least one of the vehicle information and the surrounding images is associated with an action to be taken by the driver.
  12.  An information processing program that causes a computer to execute:
     a biometric information acquisition step of acquiring biometric information of a driver of a vehicle;
     a vehicle information acquisition step of acquiring vehicle information of the vehicle;
     a surrounding image acquisition step of acquiring, from imaging means attached to the vehicle, surrounding images of the vehicle including the vehicle;
     a storage step of storing a plurality of sets of the biometric information, the vehicle information, and the surrounding images;
     an analysis step of analyzing the plurality of sets and generating related information between the vehicle information and the surrounding images that led to a change in the driver's biometric information; and
     an action data output step of outputting, using the related information, action data in which at least one of the vehicle information and the surrounding images is associated with an action to be taken by the driver.
PCT/JP2016/082876 2016-10-25 2016-11-04 Information processing device, information processing method, and information processing program WO2018078889A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-208590 2016-10-25
JP2016208590A JP6749210B2 (en) 2016-10-25 2016-10-25 Information processing apparatus, information processing method, and information processing program

Publications (1)

Publication Number Publication Date
WO2018078889A1 true WO2018078889A1 (en) 2018-05-03

Family

ID=62024642

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/082876 WO2018078889A1 (en) 2016-10-25 2016-11-04 Information processing device, information processing method, and information processing program

Country Status (2)

Country Link
JP (1) JP6749210B2 (en)
WO (1) WO2018078889A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012164237A (en) * 2011-02-08 2012-08-30 Honda Motor Co Ltd Driving support device for vehicle
JP2014081947A (en) * 2013-12-04 2014-05-08 Denso Corp Information distribution device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111611330A (en) * 2019-02-26 2020-09-01 丰田自动车株式会社 Information processing system, program, and control method
CN111611330B (en) * 2019-02-26 2023-09-26 丰田自动车株式会社 Information processing system, program, and control method

Also Published As

Publication number Publication date
JP6749210B2 (en) 2020-09-02
JP2018072922A (en) 2018-05-10

Similar Documents

Publication Publication Date Title
CN109421630A (en) For monitoring the controller architecture of the health of autonomous vehicle
JP6432490B2 (en) In-vehicle control device and in-vehicle recording system
US10640123B2 (en) Driver monitoring system
DE102021121558A1 (en) NEURAL NETWORK BASED GAZE DIRECTION DETERMINATION USING SPATIAL MODELS
US11400944B2 (en) Detecting and diagnosing anomalous driving behavior using driving behavior models
DE102021100065A1 (en) USE OF NEURONAL NETWORKS FOR ERROR DETECTION IN APPLICATIONS FOR AUTONOMOUS DRIVING
EP2781979B1 (en) Real-time monitoring of vehicle
JP7191752B2 (en) Vehicle control system and vehicle
US11308357B2 (en) Training data generation apparatus
JP6458579B2 (en) Image processing device
JP2016119547A (en) Remote collection system for vehicle data
US20200294329A1 (en) Data collecting device, data collecting system, data collecting method, and on-vehicle device
KR101802858B1 (en) Integrated data processing system and method for vehicle
WO2018078889A1 (en) Information processing device, information processing method, and information processing program
JP2018139070A (en) Vehicle display control device
US20220222936A1 (en) Outside environment recognition device
CN116071949A (en) Augmented reality method and device for driving assistance
WO2019131388A1 (en) Drive assistance device, drive assistance system, drive assistance method, and recording medium in which drive assistance program is stored
JP6915982B2 (en) Information processing equipment, information processing methods and information processing programs
CN115951599A (en) Unmanned aerial vehicle-based driving capability test system, method and device and storage medium
CN109144070A (en) Mobile device assists automatic Pilot method, automobile and storage medium
CN115599460A (en) State suspension for optimizing a start-up process of an autonomous vehicle
JP6960220B2 (en) Information processing equipment, information processing methods and information processing programs
CN115344117A (en) Adaptive eye tracking machine learning model engine
CN210781107U (en) Vehicle-mounted data processing terminal and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16919961

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16919961

Country of ref document: EP

Kind code of ref document: A1