WO2021024497A1 - Operation incident image display system, method, and program - Google Patents

Operation incident image display system, method, and program

Info

Publication number
WO2021024497A1
WO2021024497A1 (PCT/JP2019/031529)
Authority
WO
WIPO (PCT)
Prior art keywords
incident
image
driver
information
acquired
Prior art date
Application number
PCT/JP2019/031529
Other languages
English (en)
Japanese (ja)
Inventor
佐藤 公則
秀明 南雲
圭 石山
憲之 根本
将仁 谷口
Original Assignee
株式会社日立物流
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立物流 filed Critical 株式会社日立物流
Priority to JP2021537554A priority Critical patent/JP7369776B2/ja
Priority to PCT/JP2019/031529 priority patent/WO2021024497A1/fr
Publication of WO2021024497A1 publication Critical patent/WO2021024497A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/40Business processes related to the transportation industry
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/123Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • G08G1/127Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams to a central station ; Indicators in a central station
    • G08G1/13Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams to a central station ; Indicators in a central station the indicator being in the form of a map
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • The present invention relates to an operation incident image display system, method, and program that allow a driver to look back on his or her own driving video in chronological order along the operation route on a map.
  • the present invention is related to IoT (Internet of Things), and the technical field corresponds to G06Q or the like in the IPC classification.
  • An object of the present invention is to provide an operation incident image display system, method, and program that allow a driver to look back on his or her own driving video in chronological order along the operation route on a map.
  • According to one aspect, there is provided an operation incident image display system (the system may be a single computer) comprising: first acquisition means for acquiring biological information of a driver during driving; first detection means for analyzing the acquired biological information to detect the physical condition of the driver; first determination means for determining a first incident when the detected physical condition satisfies a predetermined condition; first cutting means for cutting out an image captured by a drive recorder as a first image in association with the first incident; second acquisition means for acquiring motion information of the driver; second detection means for analyzing the acquired motion information to detect the motion of the driver; second determination means for determining a second incident when the detected motion satisfies a predetermined condition; second cutting means for cutting out an image captured by the drive recorder as a second image in association with the second incident; third acquisition means for acquiring behavior information of the vehicle driven by the driver; third detection means for analyzing the acquired behavior information to detect the behavior of the vehicle; third determination means for collating the detected behavior with road sign information and/or signal information and determining a third incident when there is a violation; third cutting means for cutting out an image captured by the drive recorder as a third image in association with the third incident; fourth detection means for analyzing the acquired behavior information of the vehicle to detect a dangerous driving degree; fourth determination means for determining a fourth incident when the dangerous driving degree satisfies a predetermined condition; fourth cutting means for cutting out an image captured by the drive recorder as a fourth image in association with the fourth incident; and display means for displaying the occurrence locations of the first to fourth incidents on a map together with the operation route, and for displaying the cut-out first to fourth images in the time series of incident occurrence.
  • According to another aspect, there is provided an operation incident image display method comprising the steps of: acquiring biological information of a driver during driving; analyzing the acquired biological information to detect the physical condition of the driver; determining a first incident when the detected physical condition satisfies a predetermined condition; cutting out an image captured by a drive recorder as a first image in association with the first incident; acquiring motion information of the driver; analyzing the acquired motion information to detect the motion of the driver; determining a second incident when the detected motion satisfies a predetermined condition; cutting out an image captured by the drive recorder as a second image in association with the second incident; acquiring behavior information of the vehicle driven by the driver; analyzing the acquired behavior information to detect the behavior of the vehicle; collating the detected behavior with road sign information and/or signal information and determining a third incident when there is a violation; cutting out an image captured by the drive recorder as a third image in association with the third incident; analyzing the acquired behavior information of the vehicle to detect a dangerous driving degree; determining a fourth incident when the dangerous driving degree satisfies a predetermined condition; cutting out an image captured by the drive recorder as a fourth image in association with the fourth incident; and displaying the occurrence locations of the first to fourth incidents on a map together with the operation route, and displaying the cut-out first to fourth images in the time series of incident occurrence.
  • According to yet another aspect, there is provided an operation incident image display program for causing a computer to execute the steps of: acquiring biological information of a driver during driving; analyzing the acquired biological information to detect the physical condition of the driver; determining a first incident when the detected physical condition satisfies a predetermined condition; cutting out an image captured by a drive recorder as a first image in association with the first incident; acquiring motion information of the driver; analyzing the acquired motion information to detect the motion of the driver; determining a second incident when the detected motion satisfies a predetermined condition; cutting out an image captured by the drive recorder as a second image in association with the second incident; acquiring behavior information of the vehicle driven by the driver; analyzing the acquired behavior information to detect the behavior of the vehicle; collating the detected behavior with road sign information and/or signal information and determining a third incident when there is a violation; cutting out an image captured by the drive recorder as a third image in association with the third incident; analyzing the acquired behavior information of the vehicle to detect a dangerous driving degree; determining a fourth incident when the dangerous driving degree satisfies a predetermined condition; cutting out an image captured by the drive recorder as a fourth image in association with the fourth incident; and displaying the occurrence locations of the first to fourth incidents on a map together with the operation route, and displaying the cut-out first to fourth images in the time series of incident occurrence.
  • In the present invention, the biological information of the driver during driving is acquired and analyzed to detect the driver's physical condition; when the physical condition satisfies a predetermined condition, a first incident is determined and the image captured by the drive recorder is cut out as a first image in association with the first incident. The driver's motion information is acquired and analyzed to detect the driver's motion; when the motion satisfies a predetermined condition, a second incident is determined and the image captured by the drive recorder is cut out as a second image in association with the second incident. The behavior information of the vehicle driven by the driver is acquired and analyzed to detect the behavior of the vehicle; the behavior is collated with road sign information and/or signal information, and if there is a violation, a third incident is determined and the image captured by the drive recorder is cut out as a third image in association with the third incident. The acquired behavior information of the vehicle is also analyzed to detect a dangerous driving degree; when the dangerous driving degree satisfies a predetermined condition, a fourth incident is determined and the image captured by the drive recorder is cut out as a fourth image in association with the fourth incident. The cut-out first to fourth images are then displayed on the map along the operation route in chronological order. The driver can therefore look back on his or her own driving video in chronological order along the operation route on the map, which is useful for preventing accidents.
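  • As an illustrative sketch only (not part of the publication), the four incident determinations and the chronological display described above can be outlined in code; all names, dictionary keys, and thresholds below are hypothetical assumptions.

```python
from dataclasses import dataclass

# Hypothetical incident record: type 1-4, occurrence time (s), GPS position.
@dataclass
class Incident:
    kind: int
    time: float
    pos: tuple

def detect_incidents(samples):
    """Classify time-stamped, already-analyzed sensor samples into the four
    incident types; keys and thresholds are illustrative assumptions."""
    incidents = []
    for s in samples:
        if s.get('fatigue', 0) >= 0.8:       # physical condition (first incident)
            incidents.append(Incident(1, s['t'], s['pos']))
        if s.get('gaze_off', 0) >= 2.0:      # line of sight off >= 2 s (second incident)
            incidents.append(Incident(2, s['t'], s['pos']))
        if s.get('violation', False):        # sign/signal violation (third incident)
            incidents.append(Incident(3, s['t'], s['pos']))
        if s.get('danger', 0) >= 0.7:        # dangerous driving degree (fourth incident)
            incidents.append(Incident(4, s['t'], s['pos']))
    # The display shows the cut-out images in the time series of occurrence.
    return sorted(incidents, key=lambda i: i.time)
```

The sorted list corresponds to the time-series display of the first to fourth images along the operation route.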
  • Fig. 1 is a conceptual diagram showing an outline of the operation incident image display system according to the present embodiment.
  • The server 50, the terminal 20 mounted on the vehicle 10, and the terminal 110 used by the administrator or the driver 12 to look back on the operation contents after returning are connected to a network including the Internet, so that data communication is possible among them.
  • The vehicle 10 is equipped with a driver biometric information sensing unit 40, a vehicle behavior sensing unit 42, and a driver motion sensing unit 44; the information detected by these units is transmitted to the terminal 20 by short-range wireless communication, and the terminal 20 sends the received information to the server 50 via its communication unit and the network.
  • The driver biometric information sensing unit 40 includes, for example, a steering wheel cover type electrocardiograph 22 and a seat cover type electrocardiograph 24, from which an electrocardiographic waveform, the biometric information of the driver 12, is obtained. The driver biometric information sensing unit 40 may also acquire biometric information such as pulse wave, electroencephalogram, blood pressure, and body temperature in addition to the electrocardiographic waveform.
  • the acquired biometric information of the driver 12 during driving is transmitted to the server 50 via the terminal 20, for example (step S1).
  • The server 50 analyzes the acquired biological information to detect the physical condition of the driver 12 (for example, fatigue, drowsiness, dementia, depression, concentration, possibility of sudden change, allergic symptoms, etc.), and determines a first incident when the detected physical condition satisfies a predetermined condition.
  • the server 50 cuts out an image (moving image or still image) captured by the drive recorder as a first image in association with the first incident.
  • The driver motion sensing unit 44 includes, for example, IoT (Internet of Things) drive recorders 30A and 30B equipped with a vehicle-exterior camera and a vehicle-interior camera, from which the vehicle-interior camera image, the motion information of the driver 12, is obtained. The driver motion sensing unit 44 may also acquire motion information of the driver 12 from a steering wheel sensor, a motion sensor, or the like (not shown) in addition to the vehicle-interior image. The acquired motion information of the driver 12 may be transmitted to the server 50 via the communication units of the IoT drive recorders 30A and 30B, or via the terminal 20 (step S2).
  • The server 50 analyzes the acquired motion information and detects the motion of the driver 12; when the detected motion satisfies a predetermined condition (for example, the line of sight deviating for 2 seconds or more due to inattentive driving or driving while operating a mobile terminal, or a drowsy motion), it is determined as a second incident.
  • The IoT drive recorders 30A and 30B are also used for vehicle behavior detection, as described later, and perform constant recording and video cut-out. The server 50 cuts out an image (moving image or still image) captured by the drive recorder as a second image in association with the second incident.
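  • A minimal sketch of the 2-second line-of-sight criterion above, assuming per-frame gaze labels derived from the vehicle-interior camera (the frame rate, labels, and function name are hypothetical):

```python
def gaze_off_intervals(frames, fps=10, threshold_s=2.0):
    """Return (start_s, end_s) intervals where the gaze stayed off the road
    for threshold_s or longer; frames is a list of booleans
    (True = gaze off the road in that frame)."""
    intervals, run_start = [], None
    for i, off in enumerate(frames):
        if off and run_start is None:
            run_start = i                       # a new off-road run begins
        elif not off and run_start is not None:
            if (i - run_start) / fps >= threshold_s:
                intervals.append((run_start / fps, i / fps))
            run_start = None
    # Handle a run that continues to the end of the recording.
    if run_start is not None and (len(frames) - run_start) / fps >= threshold_s:
        intervals.append((run_start / fps, len(frames) / fps))
    return intervals
```

Each returned interval would mark a candidate second incident and the span of footage to cut out.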
  • The vehicle behavior sensing unit 42 includes, for example, a dangerous behavior detector 26 equipped with a vehicle-exterior camera, a danger notification button 28, and the IoT drive recorders 30A and 30B. From these, the inter-vehicle distance, lane departures, the date and time the button was pressed, constant recordings, and cut-out moving images are obtained, and the behavior information of the vehicle 10 (for example, speed, acceleration, braking, position information, etc.) is acquired.
  • The acquired behavior information of the vehicle 10 may be transmitted to the server 50 via the terminal 20, or directly from the sensors to the server 50 when the sensors have a communication function (step S3).
  • The server 50 analyzes the acquired behavior information, detects the behavior of the vehicle, and collates the detected behavior with the road sign information/signal information (for example, road signs identified by image analysis of the vehicle-exterior camera or specified from the map information, and specified signal display contents). If these are not observed (that is, in the case of a violation), a third incident is determined. The server 50 then cuts out an image (moving image or still image) captured by the drive recorder as a third image in association with the third incident.
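  • The collation of detected behavior against sign/signal information could look like the following sketch; the sign types, field names, and two violation labels are hypothetical illustrations, not the publication's actual rules:

```python
def check_violation(behavior, sign):
    """Collate detected vehicle behavior with road sign information.

    behavior: dict with hypothetical keys 'speed_kmh' and 'stopped' (bool).
    sign: dict with hypothetical keys 'type' ('speed_limit' or 'stop')
          and, for speed limits, 'limit_kmh'.
    Returns a violation label, or None when the sign is observed.
    """
    if sign['type'] == 'speed_limit' and behavior['speed_kmh'] > sign['limit_kmh']:
        return 'speeding'
    if sign['type'] == 'stop' and not behavior['stopped']:
        return 'failure_to_stop'
    return None
```

A non-None result corresponds to determining a third incident at that point of the route.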
  • The server 50 analyzes the vehicle behavior information acquired in step S3 to detect a dangerous driving degree (for example, sudden start, sudden braking, sudden steering, etc.), and determines a fourth incident when the detected dangerous driving degree satisfies a predetermined condition.
  • the server 50 cuts out an image (moving image or still image) captured by the drive recorder as a fourth image in association with the fourth incident.
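  • A sketch of detecting the dangerous-driving events mentioned above (sudden start, sudden braking, sudden steering) from acceleration samples; the g-force thresholds and field names are illustrative assumptions:

```python
def dangerous_driving_events(samples, accel_g=0.4, brake_g=0.5, lateral_g=0.4):
    """Flag sudden start, sudden braking, and sudden steering from
    longitudinal/lateral acceleration in g; thresholds are illustrative."""
    events = []
    for s in samples:
        if s['long_g'] >= accel_g:            # strong forward acceleration
            events.append((s['t'], 'sudden_start'))
        elif s['long_g'] <= -brake_g:         # strong deceleration
            events.append((s['t'], 'sudden_braking'))
        if abs(s['lat_g']) >= lateral_g:      # strong lateral acceleration
            events.append((s['t'], 'sudden_steering'))
    return events
```

Each flagged event would be checked against the fourth incident determination condition 64 before a fourth image is cut out.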
  • The server 50 displays the occurrence locations of the first, second, third, and fourth incidents on the map together with the operation route, and displays the cut-out first, second, third, and fourth images in the time series of incident occurrence (step S4). Specifically, after the driver 12 returns, the map 118 is displayed on the display unit 112 of the terminal 110 used to review the operation (at roll call) together with the administrator. On the map 118, the locations where the first to fourth incidents occurred are displayed as icons together with the operation route 116. The display unit 112 also shows incident information, such as "13:01 failure to make a complete stop", and the cut-out incident image 130 together with the map 118.
  • The incident image 130 may be a moving image or a still image, and a moving image may be played back by selecting the incident image.
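  • The review screen described above amounts to arranging incident records in occurrence order with map icons and clip references; the following sketch shows one way to assemble that payload (all field names are hypothetical):

```python
def build_review_payload(incidents, route):
    """Arrange incidents for the look-back screen: icons along the operation
    route on the map plus clips in time series; field names are hypothetical."""
    ordered = sorted(incidents, key=lambda i: i['time'])
    return {
        'route': route,                                                # polyline drawn on the map
        'icons': [{'pos': i['pos'], 'kind': i['kind']} for i in ordered],
        'timeline': [{'time': i['time'], 'clip': i['clip']} for i in ordered],
    }
```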
  • server 50 may also display the transition of the detected physical condition of the driver 12 on the map 118 in chronological order along the operation route.
  • the server 50 may learn the incident for each driver 12 and give advice to the driver 12.
  • The advice may be displayed on the terminal 110 after the driver 12 is authenticated with an ID and password, may be delivered as a pop-up notification to the terminal of the driver 12, or may be sent by e-mail or the like; giving the advice in other ways is not precluded.
  • The server 50 may compile statistics on all incidents, visualize dangerous locations and times, and give advice to the driver 12.
  • This advice, too, may be displayed on the terminal 110 after authentication with an ID and password for each driver 12, or delivered as a pop-up notification or e-mail to the terminal of the driver 12; giving the advice in other ways is not precluded.
  • the server 50 may evaluate the growth of the driver 12 by comparing the past incident with the current incident for each driver 12. The evaluation result may be notified to the driver 12 or the administrator.
  • The server 50 may score drivers 12 according to how few incidents they have, or score the plurality of bases to which the vehicles 10 belong according to how few incidents occur at each.
  • the scoring result may be displayed on the terminal 110, or may be notified to the driver 12 or the administrator.
  • Drivers may also be sorted in order of score to display the top-ranking drivers and bases.
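  • The scoring described above reduces to counting incidents per driver or per base and sorting ascending; a sketch (field names hypothetical):

```python
from collections import Counter

def rank_by_incident_count(incidents, key='driver'):
    """Rank drivers (or bases, via key='base') so the fewest incidents come
    first; entities with zero recorded incidents are not listed."""
    counts = Counter(i[key] for i in incidents)
    return sorted(counts.items(), key=lambda kv: kv[1])
```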
  • The server 50 described above may be a single computer, for example a terminal, or the functional configuration described later may be realized as a computer system (cloud) in which different computers execute different functions. In the present embodiment, many processes are executed on the server 50, but information acquisition, analysis/detection, and incident determination may instead be performed on the edge (sensor) side; performing more processing on the edge side reduces the amount of data transmitted to the server 50.
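  • The edge-side variant amounts to running the determinations locally and uploading only triggering samples; a sketch under the same hypothetical keys and thresholds as before:

```python
def edge_filter(samples, fatigue_threshold=0.8, danger_threshold=0.7):
    """Edge-side sketch: apply incident determinations locally and keep only
    samples that triggered one, reducing data sent to the server;
    keys and thresholds are illustrative assumptions."""
    return [
        s for s in samples
        if s.get('fatigue', 0) >= fatigue_threshold
        or s.get('danger', 0) >= danger_threshold
    ]
```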
  • The vehicle 10 is provided with the driver biometric information sensing unit 40 for obtaining the biological information of the driver 12, the vehicle behavior sensing unit 42 for obtaining the behavior information of the vehicle 10 driven by the driver 12, and the driver motion sensing unit 44 for obtaining the motion information of the driver 12.
  • the driver biometric information sensing unit 40 includes a steering wheel type electrocardiograph 22 and a seat cover type electrocardiograph 24.
  • the vehicle behavior sensing unit 42 includes a danger behavior detector 26 equipped with a vehicle outward-facing camera, a danger notification button 28, IoT drive recorders 30A, 30B, and the like.
  • the driver motion sensing unit 44 includes IoT drive recorders 30A and 30B provided with a camera for the outside of the vehicle and a camera for the inside of the vehicle.
  • The danger notification button 28 is a button that the driver 12 can press upon noticing an incident inside or outside the vehicle, independently of the various sensing devices; it obtains incident occurrence information at the discretion of the driver 12.
  • The communication unit 46 transmits the information obtained by the various sensors and the images (moving images or still images) captured by the drive recorders to the server 50 via the network. Other information may of course be provided as needed.
  • the server 50 includes a processor 52, a memory 54, a storage 56, and a communication unit 69, which are connected by a bus (not shown).
  • the processor 52 is configured by, for example, a CPU (Central Processing Unit), and performs various processes by reading and executing various programs stored in the memory 54.
  • the memory 54 stores a program to be executed by the processor 52, and is composed of, for example, a ROM (Read Only Memory) or a RAM (Random Access Memory). For example, various means shown in FIG. 4 are stored.
  • The storage 56 stores acquired information 58, a first incident determination condition 60, a second incident determination condition 62, road sign information/signal information 63, a fourth incident determination condition 64, incident information 66, map information 67, cut-out images 68, and a control program (not shown).
  • The acquired information 58 includes, for example, the biological information of the driver 12 while driving (electrocardiogram, pulse wave, brain wave, blood pressure, body temperature, etc.), the motion information of the driver 12 (vehicle-interior camera image, steering wheel sensor information, motion sensor information, etc.), and the behavior information of the vehicle 10 driven by the driver 12 (speed, acceleration, braking, position information, etc.).
  • The first incident determination condition 60 is a criterion for analyzing the acquired biological information to detect the physical condition of the driver 12 and determining whether the detected physical condition corresponds to a first incident. For example, one criterion is that when the fatigue level of the driver 12 is high, it is better for him or her not to drive.
  • The second incident determination condition 62 is a criterion for analyzing the acquired motion information of the driver 12 to detect the motion of the driver 12 and determining whether the detected motion corresponds to a second incident. For example, one criterion determines a second incident when the line of sight is off the road for 2 seconds or more due to inattentive driving or driving while operating a mobile terminal, or when a drowsy motion is detected.
  • The road sign information/signal information 63 is a criterion for analyzing the acquired behavior information to detect the behavior of the vehicle 10 and determining whether the detected behavior corresponds to a third incident, according to whether the behavior complies with road signs and signals. For example, whether the behavior of the vehicle 10 complies with the law is determined by identifying road signs through image analysis of the vehicle-exterior camera, identifying road signs from the map information, identifying signal display contents, and the like.
  • the fourth incident determination condition 64 is a determination standard for analyzing the acquired behavior information of the vehicle 10 to detect the degree of dangerous driving and determining whether or not the dangerous driving corresponds to the fourth incident.
  • For the dangerous driving degree, for example, driving states for each condition quantified according to the degree of danger can be used, as in the fourth incident determination condition 64 shown in FIG. 9.
  • The judgment value is written as "mn" (m and n are both numbers): the former digit "m" indicates the type of device (sensor) from which the data was acquired, and the latter digit "n" expresses the degree of risk as a numerical value. A criterion is set so that judgment values satisfying a predetermined condition correspond to a fourth incident.
  • the notation method and condition contents of the judgment value are examples, and are not limited to these methods and conditions.
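  • Decoding such a two-digit judgment value can be sketched as follows; the text only states that "m" is the device type and "n" the risk degree, so the risk threshold used here is purely a hypothetical illustration:

```python
def parse_judgment_value(value, risk_threshold=3):
    """Split a judgment value 'mn' into device type m and risk degree n, and
    decide whether it meets a fourth-incident condition; the threshold is an
    assumption, since the publication does not fix concrete values."""
    m, n = int(value[0]), int(value[1])
    return {'device': m, 'risk': n, 'fourth_incident': n >= risk_threshold}
```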
  • The incident information 66 includes information related to the occurrence of the first, second, third, and fourth incidents (for example, the occurrence time, occurrence location, driver involved, vehicle involved, incident contents, drive recorder image (moving image or still image), etc.).
  • the map information 67 is information used when displaying the occurrence locations of the first incident to the fourth incident together with the operation route on the terminal 110 used for reviewing the operation.
  • the map information 67 may be used to specify the road sign information / signal information 63.
  • The cut-out images 68 include a first image associated with a first incident, a second image associated with a second incident, a third image associated with a third incident, and a fourth image associated with a fourth incident.
  • the communication unit 69 receives and acquires the information obtained by the various sensors of the vehicle 10 and the image (moving image or still image) captured by the drive recorder via the network. Of course, other information may be acquired as needed.
  • The server 50 includes first acquisition means 70, first detection means 71, first determination means 72, first cutting means 73, second acquisition means 74, second detection means 75, second determination means 76, second cutting means 77, third acquisition means 78, third detection means 80, third determination means 82, third cutting means 84, fourth detection means 86, fourth determination means 88, fourth cutting means 90, display means 92, first advice means 94, second advice means 95, evaluation means 96, and scoring means 97.
  • the first acquisition means 70 acquires biological information (electrocardiogram, pulse wave, brain wave, blood pressure, body temperature, etc.) of the driver 12 while driving.
  • the biometric information obtained by the driver biometric information sensing unit 40 of the vehicle 10 is transmitted to the server 50 via the terminal 20, and the server 50 acquires the biometric information by receiving the transmitted information.
  • the acquired biometric information is stored in the acquired information 58 of the storage 56.
  • The first detection means 71 analyzes the biological information acquired by the first acquisition means 70 and detects the physical condition of the driver 12 (fatigue, drowsiness, dementia, depression, concentration, possibility of sudden change, allergic symptoms, etc.).
  • The first determination means 72 determines a first incident when the physical condition detected by the first detection means 71 satisfies a predetermined condition (for example, the fatigue level is so high that it is better not to drive). Whether or not it is a first incident is determined by whether the first incident determination condition 60 is satisfied.
  • the first cutting means 73 cuts out an image (moving image or still image) captured by the drive recorder as a first image in association with the first incident.
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
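  • Cutting out footage around an incident amounts to computing a time window within the constant recording; the following sketch illustrates this, with the pre/post margins as hypothetical values not specified in the publication:

```python
def clip_window(incident_s, recording_len_s, pre_s=10.0, post_s=5.0):
    """Compute the (start, end) seconds of the cut-out around an incident,
    clamped to the constant recording's bounds; margins are illustrative."""
    start = max(0.0, incident_s - pre_s)
    end = min(recording_len_s, incident_s + post_s)
    return start, end
```

The same windowing would apply to the second to fourth images as well.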
  • the IoT drive recorders 30A and 30B of the driver motion sensing unit 44 are used as the drive recorder.
  • The second acquisition means 74 acquires the motion information of the driver 12 (vehicle-interior camera image, steering wheel sensor information, motion sensor information, etc.). For example, the motion information of the driver 12 obtained by the driver motion sensing unit 44 of the vehicle 10 is transmitted to the server 50 directly by the IoT drive recorders 30A and 30B or via the terminal 20, and the server 50 acquires the motion information by receiving the transmitted information. The acquired motion information is stored in the acquired information 58 of the storage 56.
  • the second detection means 75 analyzes the operation information acquired by the second acquisition means 74 and detects the operation of the driver 12.
  • The second determination means 76 determines a second incident when the motion detected by the second detection means 75 satisfies a predetermined condition (for example, the line of sight being off the road for 2 seconds or more due to inattentive driving or driving while operating a mobile terminal, or a drowsy motion). Whether or not it is a second incident is determined by whether the second incident determination condition 62 is satisfied.
  • the second cutting means 77 cuts out an image (moving image or still image) captured by the drive recorder as a second image in association with the second incident.
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
  • IoT drive recorders 30A, 30B and the like are used as the drive recorder.
  • the third acquisition means 78 acquires behavior information (speed, acceleration, brake, position information, etc.) of the vehicle 10 being driven by the driver 12.
  • the behavior information of the vehicle 10 obtained by the vehicle behavior sensing unit 42 of the vehicle 10 is transmitted to the server 50 via the terminal 20.
  • the behavior information of the vehicle 10 may be transmitted directly from the edge side to the server 50.
  • the server 50 acquires the behavior information of the vehicle 10 by receiving the transmitted information.
  • the acquired behavior information is stored in the acquired information 58 of the storage 56.
  • the third detection means 80 analyzes the behavior information acquired by the third acquisition means 78 and detects the behavior of the vehicle 10.
  • the third determination means 82 collates the behavior detected by the third detection means 80 with the road sign information / signal information 63 (identification of road signs and signal display content by image analysis of the vehicle's outward-facing camera, identification of road signs from map information, etc.). If these instructions are not observed, it is determined as a third incident.
  • the third cutting means 84 cuts out an image (moving image or still image) captured by the drive recorder as a third image in association with the third incident.
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
  • IoT drive recorders 30A, 30B and the like are used as the drive recorder.
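The collation of vehicle behavior against a road sign or signal instruction described above could look roughly like this sketch. The dictionary representations of the behavior and the sign are hypothetical, not taken from the patent:

```python
# Hypothetical sketch of the third-incident check: collate detected vehicle
# behavior with a road sign recognized by the outward camera or taken from
# map information, and report any violation.

def check_sign_compliance(behavior, sign):
    """behavior: {'speed_kmh': float, 'stopped': bool}.
    sign: {'type': 'speed_limit' or 'stop', 'limit_kmh': float (for limits)}.
    Returns a violation label, or None when the instruction was observed."""
    if sign["type"] == "speed_limit" and behavior["speed_kmh"] > sign["limit_kmh"]:
        return "speeding"
    if sign["type"] == "stop" and not behavior["stopped"]:
        return "failure to stop"
    return None
```

A non-None result would correspond to determining a third incident.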
  • the fourth detection means 86 analyzes the behavior information of the vehicle 10 acquired by the third acquisition means 78 to detect the degree of dangerous driving (sudden start, sudden braking, sudden steering, etc.).
  • the fourth determination means 88 determines as a fourth incident when the dangerous driving degree detected by the fourth detection means 86 satisfies a predetermined condition. The determination as to whether or not it is the fourth incident is performed depending on whether or not the fourth incident determination condition 64 is satisfied.
  • the fourth cutting means 90 cuts out an image (moving image or still image) captured by the drive recorder as a fourth image in association with the fourth incident.
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
  • IoT drive recorders 30A, 30B and the like are used as the drive recorder.
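A minimal sketch of detecting the degree of dangerous driving (sudden start, sudden braking) from longitudinal acceleration follows. The g-force thresholds are invented for illustration; the patent leaves the actual condition to the fourth incident determination condition 64:

```python
# Illustrative grading of dangerous driving from longitudinal acceleration.
# Thresholds are assumptions, not values from the patent.

def classify_harsh_events(accel_samples, brake_g=-0.40, start_g=0.35):
    """accel_samples: list of (timestamp, longitudinal_g) pairs.
    Returns (timestamp, label) pairs for samples past either threshold."""
    events = []
    for t, g in accel_samples:
        if g <= brake_g:
            events.append((t, "sudden braking"))
        elif g >= start_g:
            events.append((t, "sudden start"))
    return events

events = classify_harsh_events([(0.0, 0.10), (1.0, -0.55), (2.0, 0.40)])
```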
  • the display means 92 displays the occurrence locations of the first incident, the second incident, the third incident, and the fourth incident on the map together with the operation route, and displays the cut-out first image, second image, third image, and fourth image in chronological order of incident occurrence.
  • the map 118 is displayed on the display unit 112 of the terminal 110 of the administrator or the driver 12 (see FIG. 1).
  • the locations where the first to fourth incidents occur are displayed as icons together with the operation route 116.
  • the incident information "13:01 failure to stop" and the incident image 130 are displayed together with the map 118.
  • the incident image 130 may be a moving image or a still image, and the moving image may be reproduced by selecting the incident image.
  • the display means 92 may also display the transition of the physical condition of the driver 12 in chronological order along the operation route 116 on the map.
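The time-series ordering described above amounts to sorting the incident records by occurrence time before display. A sketch, with the dict layout and label strings as illustrative assumptions:

```python
# Sketch of building the chronological incident list shown beside the map.

def incident_timeline(incidents):
    """incidents: list of {'time': 'HH:MM' (zero-padded), 'label': str}.
    Returns display lines sorted in chronological order of occurrence."""
    ordered = sorted(incidents, key=lambda i: i["time"])
    return [f"{i['time']} {i['label']}" for i in ordered]

lines = incident_timeline([
    {"time": "12:49", "label": "sudden deceleration"},
    {"time": "09:17", "label": "short inter-vehicle distance"},
    {"time": "15:32", "label": "drowsiness detected"},
])
```

Zero-padded "HH:MM" strings sort correctly as plain strings, which keeps the example free of date parsing.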
  • the first advice means 94 learns from the incidents of each driver 12 and gives advice to that driver.
  • the advice may be displayed on the terminal 110 after the driver 12 is authenticated by ID and password, may be notified by a pop-up on the driver 12's terminal, or may be notified by e-mail or the like. Of course, providing the advice by other means is not precluded.
  • the second advice means 95 aggregates statistics over all incidents, visualizes dangerous places and time periods, and gives advice to the driver 12.
  • the advice is given by displaying it on the terminal 110 after the driver 12 is authenticated by ID and password, or by sending a pop-up notification or an e-mail to the driver 12's terminal. Of course, providing the advice by other means is not precluded.
  • the evaluation means 96 compares past incidents with current incidents for each driver 12 and evaluates the driver's growth. The evaluation result may be notified to the driver 12 or the administrator.
  • the scoring means 97 scores which driver 12 has the fewest incidents, or which of the plurality of bases to which the vehicles 10 belong has the fewest incidents.
  • the scoring result may be displayed on the terminal 110 or notified to the driver 12 or the administrator.
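Scoring drivers and bases by fewest incidents reduces to counting incidents per entity and ranking. A sketch under an assumed (driver_id, base_id) log format:

```python
# Sketch of the scoring means: count incidents per driver and per base,
# then rank fewest-first. The log format is an illustrative assumption.
from collections import Counter

def rank_by_incident_count(incident_log):
    """incident_log: list of (driver_id, base_id) pairs, one per incident.
    Returns (drivers, bases), each a list of (id, count) ranked fewest-first."""
    drivers = Counter(d for d, _ in incident_log)
    bases = Counter(b for _, b in incident_log)
    return (sorted(drivers.items(), key=lambda kv: kv[1]),
            sorted(bases.items(), key=lambda kv: kv[1]))

drivers, bases = rank_by_incident_count([("d1", "b1"), ("d2", "b1"), ("d2", "b2")])
```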
  • the terminal 110 is mainly a PC or a tablet, but a smartphone may also be used in combination with these.
  • the terminal 110 includes a display unit 112, an input unit 114, and a communication unit 115.
  • the display unit 112 and the input unit 114 are, for example, a touch panel, but the present invention is not limited thereto.
  • the communication unit 115 receives the incident information 66 and the clipped image 68 from the server 50 via the network. Of course, other information may be received, or information may be provided (transmitted) to the server 50.
  • the first acquisition means 70 of the server 50 acquires the biological information (electrocardiogram, pulse wave, brain wave, blood pressure, body temperature, etc.) of the driver 12 during driving (step S10).
  • the biometric information obtained by the driver biometric information sensing unit 40 of the vehicle 10 is transmitted to the server 50 via the terminal 20, and the server 50 receives the transmitted information to acquire the biometric information.
  • the acquired biometric information is stored in the acquired information 58 of the storage 56.
  • the first detection means 71 of the server 50 analyzes the biological information acquired by the first acquisition means 70 and detects the physical condition of the driver 12 (fatigue, drowsiness, dementia, depression, concentration, possibility of a sudden change in condition, allergy determination, etc.) (step S12).
  • the first determination means 72 of the server 50 determines a first incident when the physical condition detected by the first detection means 71 satisfies a predetermined condition (e.g., the degree of fatigue is so high that it is better not to drive) (step S14). Whether or not it is the first incident is determined based on whether or not the first incident determination condition 60 is satisfied.
  • the first cutting means 73 of the server 50 cuts out the image (moving image or still image) captured by the drive recorder as the first image in association with the first incident (step S16).
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
  • the IoT drive recorders 30A and 30B of the driver operation sensing unit 44 are used as the drive recorder.
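A hypothetical sketch of the first-incident decision in steps S12 to S14: the physical condition derived from biometric analysis is checked against a predetermined condition such as "fatigue so high that it is better not to drive". The score scale, field names, and both thresholds are assumptions, not from the patent:

```python
# Illustrative first-incident check on a detected physical condition.
# Thresholds and field names are assumptions.

def detect_first_incident(condition, fatigue_threshold=0.8, heart_rate_limit=110):
    """condition: {'fatigue': score in [0, 1] from biometric analysis,
    'heart_rate': beats per minute}.
    True when the condition suggests that driving is inadvisable."""
    return (condition["fatigue"] >= fatigue_threshold
            or condition["heart_rate"] >= heart_rate_limit)
```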
  • the second acquisition means 74 acquires the operation information of the driver 12 (images from the inward-facing camera, steering wheel sensor information, motion sensor information, etc.) (step S20).
  • the operation information of the driver 12 obtained by the driver operation sensing unit 44 of the vehicle 10 is transmitted to the server 50 either directly by the IoT drive recorders 30A and 30B or via the terminal 20, and the server 50 acquires the operation information of the driver 12 by receiving the transmitted information.
  • the acquired operation information is stored in the acquired information 58 of the storage 56.
  • the second detection means 75 of the server 50 analyzes the operation information acquired by the second acquisition means 74 and detects the operation of the driver 12 (step S22).
  • the second determination means 76 determines a second incident when the operation detected by the second detection means 75 satisfies a predetermined condition (e.g., inattentive driving in which the line of sight is off the road for 2 seconds or more, driving while operating a mobile terminal, or drowsy behavior) (step S24). Whether or not it is a second incident is determined based on whether or not the second incident determination condition 62 is satisfied.
  • the second cutting means 77 of the server 50 cuts out the image (moving image or still image) captured by the drive recorder as the second image in association with the second incident (step S26).
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
  • IoT drive recorders 30A, 30B and the like are used as the drive recorder.
  • the third acquisition means 78 of the server 50 acquires the behavior information (speed, acceleration, brake, position information, etc.) of the vehicle 10 being driven by the driver 12.
  • the behavior information of the vehicle 10 obtained by the vehicle behavior sensing unit 42 of the vehicle 10 is transmitted to the server 50 via the terminal 20.
  • the behavior information of the vehicle 10 may be transmitted directly from the edge side to the server 50.
  • the server 50 acquires the behavior information of the vehicle 10 by receiving the transmitted information.
  • the acquired behavior information is stored in the acquired information 58 of the storage 56.
  • the third detection means 80 of the server 50 analyzes the behavior information acquired by the third acquisition means 78 and detects the behavior of the vehicle 10 (step S32).
  • the third determination means 82 of the server 50 collates the behavior detected by the third detection means 80 with the road sign information / signal information 63 (identification of road signs and signal display content by image analysis of the vehicle's outward-facing camera, identification of road signs from map information, etc.), and if the vehicle does not comply with them (if there is a violation), it is determined as a third incident (step S34).
  • the third cutting means 84 of the server 50 cuts out the image (moving image or still image) captured by the drive recorder as the third image in association with the third incident (step S36).
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
  • IoT drive recorders 30A, 30B and the like are used as the drive recorder.
  • the fourth detection means 86 of the server 50 analyzes the behavior information of the vehicle 10 acquired by the third acquisition means 78 and detects the degree of dangerous driving (sudden start, sudden braking, sudden steering, etc.) (step S38).
  • the fourth determination means 88 of the server 50 determines a fourth incident when the degree of dangerous driving detected by the fourth detection means 86 satisfies a predetermined condition (step S40). Whether or not it is the fourth incident is determined based on whether or not the fourth incident determination condition 64 is satisfied.
  • the fourth cutting means 90 of the server 50 cuts out the image (moving image or still image) captured by the drive recorder as the fourth image in association with the fourth incident (step S42).
  • the cut-out image is stored in the cut-out image 68 of the storage 56.
  • IoT drive recorders 30A, 30B and the like are used as the drive recorder.
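The "cutting means" in each of the flows above can be sketched as extracting a time window of recorder frames around the incident timestamp. The window lengths and the (timestamp, frame) representation are illustrative assumptions:

```python
# Sketch of cutting out an image in association with an incident: keep the
# drive recorder frames inside a time window around the incident timestamp.

def clip_window(frames, incident_time, before=10.0, after=10.0):
    """frames: list of (timestamp_seconds, frame) pairs, sorted by time.
    Returns the frames within [incident_time - before, incident_time + after],
    i.e. the cut-out image stored for the incident."""
    lo, hi = incident_time - before, incident_time + after
    return [(t, f) for t, f in frames if lo <= t <= hi]

clip = clip_window([(t, f"frame-{t}") for t in range(0, 30, 5)], 12.0,
                   before=5.0, after=5.0)
```

The clipped list would then be stored, as the text puts it, in the cut-out image 68 of the storage 56.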
  • the display means 92 of the server 50 displays the occurrence locations of the first incident, the second incident, the third incident, and the fourth incident on the map together with the operation route, and displays the cut-out first image, second image, third image, and fourth image in chronological order of incident occurrence (step S50).
  • the map 118 is displayed on the display unit 112 of the terminal 110 of the administrator or the driver 12.
  • FIG. 7 shows an example of displaying an operation incident image displayed on the terminal 110 of the administrator or the driver 12.
  • the map 118 and the incident image 130 are displayed on the display unit 112 of the terminal 110.
  • the locations where the first to fourth incidents occur are displayed together with the operation route 116 by icons 120A to 120C.
  • the incident image 130 includes incident information and cutout images 132A to 132C.
  • the incident information is "09:17 short inter-vehicle distance", "12:49 sudden deceleration", and "15:32 drowsiness detected", and the locations of these incidents are indicated by the icons 120A to 120C shown on the map 118. Further, the type of each icon 120A to 120C differs depending on the type of incident.
  • Incident images 132A to 132C may be moving images or still images, and moving images may be played back by selecting an incident image.
  • the display means 92 may also display the transition of the physical condition of the driver 12 on the map 118 in chronological order along the operation route 116. The transition of the physical condition may be indicated by the type of icon, or character information may be displayed on the map 118.
  • FIG. 8 shows another display example of the operation incident image displayed on the terminal 110.
  • FIG. 8 shows an image in the case of non-compliance with laws and regulations as another example of the incident image 140 displayed on the display unit 112 (map 118 is omitted).
  • the image corresponding to the "09:17 failure to stop" incident is the cutout image 142A, and the image corresponding to the "11:25 speeding" incident is the cutout image 142B.
  • the incident images 142A and 142B may be moving images or still images, and moving images may be reproduced by selecting the incident images.
  • the server 50 may learn from the incidents of each driver 12 and give advice to the driver.
  • the advice may be displayed on the terminal 110 after the driver 12 is authenticated by ID and password, may be notified by a pop-up on the driver 12's terminal, or may be notified by e-mail or the like. Providing the advice by other means is not precluded.
  • the server 50 may aggregate statistics over all incidents, visualize dangerous places and time periods, and give advice to the driver 12.
  • the advice is given by displaying it on the terminal 110 after the driver 12 is authenticated by ID and password, or by sending a pop-up notification or an e-mail to the driver 12's terminal. Providing the advice by other means is not precluded.
  • the server 50 may compare past incidents with current incidents for each driver 12 and evaluate the driver's growth.
  • the evaluation result may be notified to the driver 12 or the administrator.
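Evaluating a driver's growth by comparing past and current incidents could, for example, compare per-category incident counts between two periods. A hypothetical sketch; the category names and dict representation are assumptions:

```python
# Illustrative growth evaluation: report categories in which the driver
# had fewer incidents in the current period than in the past period.

def evaluate_growth(past_counts, current_counts):
    """Both arguments map incident category -> count for one driver.
    Returns {category: reduction} for categories that improved."""
    categories = set(past_counts) | set(current_counts)
    return {c: past_counts.get(c, 0) - current_counts.get(c, 0)
            for c in categories
            if current_counts.get(c, 0) < past_counts.get(c, 0)}

improved = evaluate_growth({"speeding": 3, "sudden braking": 2},
                           {"speeding": 1, "sudden braking": 2})
```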
  • the server 50 may score which driver 12 has the fewest incidents, or which of the plurality of bases to which the vehicles 10 belong has the fewest incidents.
  • the scoring result may be displayed on the terminal 110 or notified to the driver 12 or the administrator.
  • the behavior is collated with the road sign information and/or the signal information, and if there is a violation it is determined as a third incident, and the image captured by the drive recorder is cut out as a third image in association with the third incident.
  • the acquired behavior information is analyzed, the degree of dangerous driving is detected, and when the degree of dangerous driving satisfies a predetermined condition it is determined as a fourth incident; the image captured by the drive recorder is cut out as a fourth image in association with the fourth incident, and the cut-out first image, second image, third image, and fourth image are displayed in chronological order along the operation route on the map. Therefore, the driver can look back on his or her own driving video in chronological order along the operation route on the map.
  • the server 50 may be a single computer, for example, a terminal.
  • the above-mentioned functional configuration may be executed as a computer system (cloud) distributed over different computers.
  • information acquisition, analysis / detection, and incident determination may be performed on the edge (sensor, etc.) side. By performing much of the processing on the edge side, the amount of data transmitted to the server 50 can be reduced.
  • the present invention may be provided as a program executed by the server 50.
  • the program may be provided as recorded on a computer-readable recording medium or may be downloaded over a network.
  • the present invention may also be provided as a method invention.
  • the behavior of the vehicle is detected, the behavior is collated with the road sign information and/or the signal information, and if there is a violation it is determined as a third incident; the image captured by the drive recorder is cut out as a third image in association with the third incident; the acquired behavior information is analyzed, the degree of dangerous driving is detected, and when the degree of dangerous driving satisfies a predetermined condition it is determined as a fourth incident.
  • the image captured by the drive recorder is cut out as a fourth image in association with the fourth incident, and the cut-out first image, second image, third image, and fourth image are displayed in chronological order along the operation route on the map.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)

Abstract

The problem addressed by the present invention is to allow a driver to review the driver's driving video in chronological order along an operation route on a map. To this end, vital-sign information of a driver during driving is acquired (step S1) and analyzed, and the driver's physical condition is detected to determine a first incident, for which a first image is cut out; action information of the driver is acquired (step S2) and analyzed, and the driver's action is detected to determine a second incident, for which a second image is cut out; behavior information of the vehicle the driver is driving is acquired (step S3) and analyzed, and the vehicle's behavior is detected and collated with road sign information and signal information to determine a third incident, for which a third image is cut out; the acquired vehicle behavior information is analyzed and a degree of dangerous driving is detected to determine a fourth incident, for which a fourth image is cut out; and the first image through the fourth image are displayed chronologically along the operation route on a map (step S4).
PCT/JP2019/031529 2019-08-08 2019-08-08 Système, procédé et programme d'affichage des images d'incident d'exploitation WO2021024497A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021537554A JP7369776B2 (ja) 2019-08-08 2019-08-08 運行インシデント画像表示システム、方法及びプログラム
PCT/JP2019/031529 WO2021024497A1 (fr) 2019-08-08 2019-08-08 Système, procédé et programme d'affichage des images d'incident d'exploitation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/031529 WO2021024497A1 (fr) 2019-08-08 2019-08-08 Système, procédé et programme d'affichage des images d'incident d'exploitation

Publications (1)

Publication Number Publication Date
WO2021024497A1 true WO2021024497A1 (fr) 2021-02-11

Family

ID=74503362

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/031529 WO2021024497A1 (fr) 2019-08-08 2019-08-08 Système, procédé et programme d'affichage des images d'incident d'exploitation

Country Status (2)

Country Link
JP (1) JP7369776B2 (fr)
WO (1) WO2021024497A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004234260A (ja) * 2003-01-29 2004-08-19 Hitachi Ltd 安全運転診断方法および省燃費運転診断方法、ならびに装置、安全運転診断プログラムおよび省燃費運転診断プログラム
JP2008210375A (ja) * 2007-02-01 2008-09-11 Denso Corp ドライバ管理装置および運行管理システム
JP2008234414A (ja) * 2007-03-22 2008-10-02 Equos Research Co Ltd データベース作成装置、及びデータベース作成プログラム
US20170061222A1 (en) * 2015-08-31 2017-03-02 Lytx, Inc. Detecting risky driving with machine vision
WO2017134897A1 (fr) * 2016-02-02 2017-08-10 ソニー株式会社 Appareil et procédé de traitement vidéo
JP2017191368A (ja) * 2016-04-11 2017-10-19 株式会社デンソー 運転支援システム、運転支援装置及び運転支援プログラム
JP2017204104A (ja) * 2016-05-10 2017-11-16 エヌ・ティ・ティ・コミュニケーションズ株式会社 制御装置、車載装置、映像配信方法、及びプログラム
JP2018055445A (ja) * 2016-09-29 2018-04-05 株式会社デンソー 車両運行管理システム
WO2019146488A1 (fr) * 2018-01-25 2019-08-01 日本電気株式会社 Dispositif, procédé et système de surveillance d'état de conduite

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019069732A1 (fr) 2017-10-06 2019-04-11 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Also Published As

Publication number Publication date
JPWO2021024497A1 (fr) 2021-02-11
JP7369776B2 (ja) 2023-10-26

Similar Documents

Publication Publication Date Title
Braunagel et al. Ready for take-over? A new driver assistance system for an automated classification of driver take-over readiness
US20220095975A1 (en) Detection of cognitive state of a driver
Horrey et al. On-board safety monitoring systems for driving: Review, knowledge gaps, and framework
Verwey et al. Detecting short periods of elevated workload: A comparison of nine workload assessment techniques.
EP2924624A1 (fr) Procédé de diagnostic de caractéristiques de fonctionnement
EP1774492A1 (fr) Système et procede de surveillance de la conduite
Gaspar et al. Evaluating driver drowsiness countermeasures
Heikoop et al. Effects of mental demands on situation awareness during platooning: A driving simulator study
Lenné et al. Predicting drowsiness-related driving events: a review of recent research methods and future opportunities
JP6593011B2 (ja) 安全運転促進装置及び安全運転促進方法
WO2012077234A1 (fr) Système de collecte d'informations à l'usage d'un véhicule
JP6458438B2 (ja) 情報処理装置、警告方法、およびプログラム
Jansen et al. Does agreement mean accuracy? Evaluating glance annotation in naturalistic driving data
DE102017216328B3 (de) Verfahren zum Überwachen eines Aufmerksamkeitszustandes einer Person, Verarbeitungseinrichtung, Speichermedium, und Kraftfahrzeug
US20210237744A1 (en) Management assistance system
WO2021024497A1 (fr) Système, procédé et programme d'affichage des images d'incident d'exploitation
JP2019061498A (ja) 点呼・点検支援装置、点呼・点検支援システム及び点呼・点検支援プログラム
JP7369775B2 (ja) 運転中インシデント通知システム、方法及びプログラム
EP4126620B1 (fr) Groupes à risque des conducteurs encadrés
WO2021024496A1 (fr) Système, procédé et programme d'affichage d'information sur le fonctionnement du véhicule
JP7496484B2 (ja) 車両運行情報表示システム、方法及びプログラム
Kim Effectiveness of Collision Avoidance Technology
WO2021095154A1 (fr) Système, procédé et programme de variation de points de conducteur
CN113744498B (zh) 驾驶员注意力监测的系统和方法
WO2024062769A1 (fr) Dispositif d'aide au conducteur, système d'aide au conducteur et procédé d'aide au conducteur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19940331

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021537554

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19940331

Country of ref document: EP

Kind code of ref document: A1